From 1ed1362bbd047a7e8897e511dfd12b8c328577cf Mon Sep 17 00:00:00 2001
From: Nutomic
Date: Tue, 19 Nov 2024 14:51:50 +0100
Subject: [PATCH] Update troubleshooting.md

---
 src/administration/troubleshooting.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/administration/troubleshooting.md b/src/administration/troubleshooting.md
index e3a70ef..e5f9750 100644
--- a/src/administration/troubleshooting.md
+++ b/src/administration/troubleshooting.md
@@ -225,7 +225,7 @@ If you notice that your lemmy-ui sometimes becomes sluggish or unresponsive over
 
 There's a lot of scraping bots online and they can easily overwhelm your site when they're behaving too "greedily". Unfortunately the existing lemmy-ui has a habit of falling over when polled too eagerly, while the backend still continues to work.
 
-A solution is to cache responses in nginx as seen in lemmy-ansible [here](https://github.com/LemmyNet/lemmy-ansible/blob/1.5.3/templates/nginx.conf#L2-L3) and [here](https://github.com/LemmyNet/lemmy-ansible/blob/1.5.3/templates/nginx.conf#L66-L71). This way lemmy-ui doesn't have to generate all responses from scratch, which reduces CPU load.
+A solution is to cache responses in nginx as seen in lemmy-ansible [here](https://github.com/LemmyNet/lemmy-ansible/blob/1.5.3/templates/nginx.conf#L2-L3) and [here](https://github.com/LemmyNet/lemmy-ansible/blob/1.5.3/templates/nginx.conf#L66-L71). This way lemmy-ui doesn't have to generate all responses from scratch, which reduces CPU load. However it won't help if a single crawler goes through thousands of unique urls in a short time.
 
 Another option is to block the scraper's user agents. To do so, you can modify your `nginx_internal.conf` to block some of the usual suspects, with this line under `server`
 
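For readers seeing only this hunk: the concrete nginx line that the final context sentence refers to lies past the end of the diff, so it is not reproduced here. As a rough sketch of the two mitigations the documentation describes (illustrative only; the cache zone name, paths, upstream name, and bot list below are assumptions, not the exact lemmy-ansible or Lemmy documentation values), the configuration could look roughly like this:

```nginx
# Illustrative sketch only; names, paths, and the bot list are placeholders,
# not the exact lemmy-ansible / Lemmy documentation configuration.

# In the http {} context: define a small cache for lemmy-ui responses.
proxy_cache_path /var/cache/nginx/lemmy_ui levels=1:2 keys_zone=lemmy_ui_cache:10m
                 max_size=1g inactive=60m use_temp_path=off;

server {
    # ...

    # Block some common scraper user agents (example list, adjust as needed).
    if ($http_user_agent ~* "(SemrushBot|AhrefsBot|MJ12bot|DotBot|PetalBot)") {
        return 403;
    }

    location / {
        # Serve cached copies of lemmy-ui pages so repeated hits don't
        # force lemmy-ui to render everything from scratch.
        proxy_cache lemmy_ui_cache;
        proxy_cache_valid 200 1m;
        proxy_cache_use_stale error timeout updating;
        proxy_cache_lock on;

        proxy_pass http://lemmy-ui;
    }
}
```

Caching reduces CPU load for repeated requests but, as the added sentence notes, does not help when a single crawler walks thousands of unique URLs; the user-agent block addresses that case, and the exact line Lemmy's docs recommend continues beyond this hunk.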