2024-05-27
Channels
- # announcements (8)
- # babashka (11)
- # beginners (34)
- # clerk (11)
- # clj-http (2)
- # clojure (5)
- # clojure-europe (9)
- # clojure-gamedev (1)
- # clojure-nl (1)
- # clojure-norway (17)
- # clojure-poland (1)
- # clojure-sweden (5)
- # clojure-uk (9)
- # clojurescript (17)
- # core-typed (12)
- # cursive (4)
- # datahike (4)
- # datalevin (2)
- # datomic (7)
- # emacs (8)
- # events (8)
- # graphql (5)
- # gratitude (1)
- # hyperfiddle (19)
- # jobs-discuss (4)
- # leiningen (4)
- # lsp (21)
- # meander (2)
- # off-topic (9)
- # play-clj (1)
- # polylith (10)
- # releases (1)
- # sci (18)
- # vim (10)
I have a web scraper making async requests, and I frequently get the following error: Caused by: java.io.IOException: Too many open files
Is this caused by having too many concurrent requests? Is there a way to set an application-level limit on the number of concurrent requests?
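One way to cap concurrency at the application level is to gate requests behind a JVM semaphore. This is a minimal sketch, not from the thread: the `fetch` function, the permit count of 20, and the use of blocking `clj-http.client/get` are all illustrative assumptions.

```clojure
(ns scraper.limit
  (:require [clj-http.client :as http])
  (:import (java.util.concurrent Semaphore)))

;; At most 20 requests may be in flight at once (20 is an arbitrary example).
(def ^:private permits (Semaphore. 20))

(defn fetch
  "Blocks until a permit is free, performs the GET, then releases the permit."
  [url]
  (.acquire permits)
  (try
    (http/get url)
    (finally
      (.release permits))))
```

Each in-flight request holds one file descriptor for its socket, so bounding concurrency also bounds open files.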
To anyone wondering: https://github.com/dakrone/clj-http?tab=readme-ov-file#persistent-connections worked like a charm and I never saw these errors again.
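For reference, the persistent-connections approach from the linked README section looks roughly like the sketch below: reuse a single connection manager (or a scoped pool) instead of opening a fresh socket per request. The namespace name and the `:threads` / `:default-per-route` values are illustrative; this assumes a recent clj-http 3.x.

```clojure
(ns scraper.pool
  (:require [clj-http.client :as http]
            [clj-http.conn-mgr :as conn-mgr]))

;; A reusable connection manager shared across all requests.
(def cm
  (conn-mgr/make-reusable-conn-manager {:threads 10 :default-per-route 10}))

(defn fetch [url]
  (http/get url {:connection-manager cm}))

;; Alternatively, scope the pool to a block of requests:
(http/with-connection-pool {:threads 10 :default-per-route 10}
  (http/get "https://example.com"))
```

Because connections are pooled and reused, the number of open sockets (and thus open file descriptors) stays bounded even under heavy scraping.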