#sql
2022-06-03
Sathiya 12:06:14

Hi. I have a scenario where I have to process records from a table with around 10 million rows. I want the records to be fetched and processed in batches rather than all at once, since loading everything runs into heap space and memory issues. I tried jdbc/fetch-lazy and jdbc/cursor->lazyseq from funcool/clojure.jdbc, but neither quite seems to do the job. Any help would be appreciated. Thank you.

seancorfield 16:06:09

That library hasn't had any updates in six years. The person who created it started by copying code from the Clojure Contrib library org.clojure/clojure.java.jdbc and removing all the copyright/license information, until he was called out about it on the Clojure mailing list. Most people use either org.clojure/clojure.java.jdbc (which is also no longer maintained) or next.jdbc, which is actively maintained and supports the scenario you want via plan.
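
For reference, here is a minimal sketch of how plan could be used for this kind of streaming, reduce-per-row processing. The db-spec, table, column names, and process-row! are made up for illustration; the auto-commit and fetch-size settings reflect the documented requirements for streaming with the PostgreSQL driver and may differ for other databases.

```clojure
(require '[next.jdbc :as jdbc])

;; Hypothetical connection details -- substitute your own.
(def db-spec {:dbtype "postgresql" :dbname "mydb"
              :user "app" :password "secret"})

(defn process-row!
  "Placeholder for whatever per-row work you need to do."
  [id payload]
  (println id (count (str payload))))

;; plan returns a reducible: rows stream off the server-side cursor one
;; at a time, so the full 10M-row result set never has to fit in memory.
;; For PostgreSQL, streaming only happens when auto-commit is off and
;; :fetch-size is set; other JDBC drivers have their own requirements.
(with-open [conn (jdbc/get-connection db-spec {:auto-commit false})]
  (reduce
    (fn [total row]
      ;; plan's rows support direct keyword access for individual
      ;; columns without realizing a full hash map per row.
      (process-row! (:id row) (:payload row))
      (inc total))
    0
    (jdbc/plan conn
               ["SELECT id, payload FROM big_table"]
               {:fetch-size  1000
                :concurrency :read-only
                :cursors     :close
                :result-type :forward-only})))
```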

Sathiya 13:06:20

Thanks @U04V70XH6. Let me try with plan.