double number of concurrent crawlers
parent 41ac4ca9a8
commit 37c00908ec
@@ -12,6 +12,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ### Changed
 
 - Bring back `develop` staging backup (now managed in DNS)
+- Increase default number of concurrent crawlers to 100
 
 ### Deprecated
@@ -60,7 +60,7 @@ config :backend, :crawler,
   status_count_limit: 5000,
   personal_instance_threshold: 10,
   crawl_interval_mins: 30,
-  crawl_workers: 50,
+  crawl_workers: 100,
   blacklist: [
     # spam
     "gab.best",
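For context, a minimal sketch of how a worker pool might consume the `crawl_workers` setting doubled above, assuming the backend reads the `:crawler` keyword list with `Application.get_env/3`. The module name `Backend.Crawler.Pool` and the functions `start_workers/0` and `run_worker/1` are hypothetical; only the `:backend`/`:crawler` keys and the `:crawl_workers` value come from the diff.

```elixir
# Hypothetical sketch: spawn one crawl task per configured worker slot.
# Only the :backend/:crawler config keys and :crawl_workers are taken
# from the commit; the module and function names are illustrative.
defmodule Backend.Crawler.Pool do
  def start_workers do
    # Read the keyword list defined with `config :backend, :crawler, ...`.
    crawler_config = Application.get_env(:backend, :crawler, [])
    workers = Keyword.get(crawler_config, :crawl_workers, 100)

    # Doubling :crawl_workers from 50 to 100 doubles how many of these
    # tasks run concurrently.
    for index <- 1..workers do
      Task.start(fn -> run_worker(index) end)
    end
  end

  # Placeholder for the actual crawl logic.
  defp run_worker(_index), do: :ok
end
```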