#clj-http
2024-05-27
Nim Sadeh 21:05:49

I have a web scraper making async requests, and I get a lot of the following error: `Caused by: java.io.IOException: Too many open files`. Is this caused by having too many concurrent requests? Is there a way to set an application-level limit on the number of concurrent requests?

Nim Sadeh 01:05:44

To anyone wondering: https://github.com/dakrone/clj-http?tab=readme-ov-file#persistent-connections worked like a charm and I never saw these errors again.
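For anyone landing here later, a minimal sketch of the persistent-connection approach from the README section linked above: `clj-http.client/with-connection-pool` routes all requests in its body through a single bounded pool of keep-alive connections, so the scraper stops opening a fresh socket (and file descriptor) per request. The pool option values and URLs below are illustrative, not tuned recommendations.

```clojure
(require '[clj-http.client :as client])

;; All requests inside the body share one pooled connection manager.
;; Option values here are examples only; tune them for your workload.
(client/with-connection-pool {:timeout 5           ; seconds an idle connection stays open
                              :threads 4           ; pool's connection/thread limit
                              :insecure? false
                              :default-per-route 10} ; max simultaneous connections per host
  (doseq [url ["http://example.org/1"   ; placeholder URLs
               "http://example.org/2"]]
    (client/get url)))
```

Once the body exits, the pool is shut down and its connections are released, which is what keeps the open-file count bounded.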