2022-03-30
Channels
- # announcements (8)
- # babashka (102)
- # beginners (312)
- # calva (9)
- # clj-kondo (9)
- # cljfx (7)
- # clojure (128)
- # clojure-europe (52)
- # clojure-nl (2)
- # clojure-norway (2)
- # clojure-spec (5)
- # clojure-uk (4)
- # clojurescript (13)
- # conjure (5)
- # cursive (5)
- # datalog (18)
- # datomic (8)
- # emacs (1)
- # events (3)
- # fulcro (16)
- # graphql (2)
- # gratitude (1)
- # helix (16)
- # inf-clojure (17)
- # introduce-yourself (9)
- # java (11)
- # lambdaisland (3)
- # leiningen (3)
- # lsp (8)
- # malli (3)
- # membrane (7)
- # missionary (26)
- # nextjournal (1)
- # off-topic (19)
- # pathom (3)
- # polylith (13)
- # portal (16)
- # reagent (39)
- # reitit (2)
- # releases (23)
- # remote-jobs (1)
- # shadow-cljs (40)
- # specter (3)
- # sql (12)
- # tools-deps (8)
- # tree-sitter (1)
- # vim (3)
- # web-security (6)
- # xtdb (16)
I don't think it has to do with that. It's fine to pull data in chunks (a small number of rows at a time, say 4000), as long as we expose them as an iterator and discard the processed ones from memory. The problem is that they stay in memory.
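A minimal sketch of the chunked-streaming idea being described, assuming next.jdbc (the original code and library aren't shown in this log; the datasource, table, and column names below are illustrative only). `plan` returns a reducible that walks the result set row by row, so each processed row can be discarded instead of accumulating a fully realized result:

```clojure
(require '[next.jdbc :as jdbc])

;; Hypothetical datasource; stands in for whatever the real code uses.
(def ds (jdbc/get-datasource {:dbtype "postgresql" :dbname "app"}))

(defn sum-amounts
  "Reduce over rows as they stream from the database instead of
   realizing the whole result set in memory."
  [ds]
  (reduce
    (fn [acc row]
      ;; each `row` is processed and then eligible for GC
      (+ acc (:amount row)))
    0
    (jdbc/plan ds ["select amount from orders"]
               {:fetch-size 4000})))
```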
Oh yes.
I can't use with-open, though. The cursors are closed inside .onComplete of the Scala future returned when the flow graph begins to run. Otherwise, they'd be closed before the flow is done.
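To illustrate the constraint being described, here is a sketch of closing the cursor in a completion callback rather than with with-open. It uses a java.util.concurrent.CompletableFuture as a stand-in for the Scala future mentioned above, and `open-cursor!` / `run-flow-graph!` are hypothetical helpers, not functions from the original code:

```clojure
(import '(java.util.concurrent CompletableFuture)
        '(java.util.function BiConsumer))

(defn stream-with-cleanup!
  "Run an asynchronous flow over an open cursor and close the cursor
   only when the flow's completion future fires."
  [open-cursor! run-flow-graph!]
  (let [^java.lang.AutoCloseable cursor (open-cursor!)
        ^CompletableFuture done (run-flow-graph! cursor)]
    ;; with-open can't be used here: it would close `cursor` as soon as
    ;; this function returned, while the flow is still reading from it.
    (.whenComplete done
                   (reify BiConsumer
                     (accept [_ _result _error]
                       ;; close only after the flow has finished (or failed)
                       (.close cursor))))
    done))
```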
Huh, interesting. Thanks!
Hmm, but opts in my code above is actually {:fetch-size 4000}, anyway.
Just need auto-commit.
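For context on why fetch-size alone may not be enough here: the PostgreSQL JDBC driver only streams rows with a cursor (honoring fetchSize) when auto-commit is off and the result set is forward-only; otherwise it buffers the whole result. A minimal sketch of setting both options, assuming next.jdbc and an illustrative datasource and table; it uses a synchronous reduce and with-open for brevity, whereas in the async flow above the connection would instead be closed in the completion callback:

```clojure
(require '[next.jdbc :as jdbc])

(def ds (jdbc/get-datasource {:dbtype "postgresql" :dbname "app"}))

;; Open the connection with auto-commit disabled, then pass :fetch-size
;; so the driver fetches rows in 4000-row chunks instead of buffering
;; the entire result set.
(with-open [conn (jdbc/get-connection ds {:auto-commit false})]
  (reduce (fn [n _row] (inc n))
          0
          (jdbc/plan conn ["select * from big_table"]
                     {:fetch-size 4000})))
```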