This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2020-02-18
Channels
- # announcements (43)
- # aws (28)
- # babashka (32)
- # beginners (80)
- # calva (13)
- # chlorine-clover (2)
- # cider (11)
- # clj-kondo (15)
- # cljs-dev (1)
- # clojure (151)
- # clojure-dev (11)
- # clojure-europe (11)
- # clojure-italy (3)
- # clojure-losangeles (3)
- # clojure-nl (4)
- # clojure-spec (20)
- # clojure-uk (58)
- # clojured (3)
- # clojuredesign-podcast (2)
- # clojurescript (37)
- # core-async (4)
- # core-typed (1)
- # cursive (53)
- # datascript (5)
- # datomic (26)
- # duct (23)
- # emacs (3)
- # fulcro (22)
- # graalvm (1)
- # jobs (2)
- # joker (11)
- # juxt (24)
- # lumo (1)
- # mid-cities-meetup (2)
- # nyc (1)
- # off-topic (54)
- # parinfer (1)
- # reagent (13)
- # shadow-cljs (16)
- # sql (9)
- # tree-sitter (9)
- # vim (9)
Hello everyone!
(-> (sql/query ds ["select * from customers"] {:builder-fn rs/as-unqualified-lower-maps}) lazy-seq)
Would it be a good idea to transform this returned data into a lazy-seq, considering that I could get a huge amount of data, and thereby avoid issues?
Yup, what @dharrigan said @ramon.rios -- query (which is the same as execute! really) gives you a complete, realized vector of hash maps -- converting that to a lazy-seq is pointless: the data is already all in memory. What you want is plan and then reduce over that. plan also lets you take advantage of lazy/streaming result sets on the JDBC side (even though reducing is eager on the Clojure side), because you can set options to get the JDBC driver/database to read results N rows at a time and process them in a reduction, so you can deal with result sets much larger than will fit in memory. See the Tips & Tricks section of Friendly SQL Functions (in the GitHub repo -- it has doc changes that have not yet been published in a release to http://cljdoc.org).
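A minimal sketch of that plan-plus-reduce approach, assuming a next.jdbc datasource ds and a customers table with an amount column (the :fetch-size value and column name are illustrative, not from the original thread):

```clojure
(require '[next.jdbc :as jdbc])

;; plan returns a reducible, not a realized vector: rows are streamed
;; from the JDBC driver N at a time (per :fetch-size) as the reduction
;; consumes them, so the full result set never has to fit in memory.
;; Note: some databases (e.g. PostgreSQL) also need the connection's
;; :auto-commit set to false before fetch-size streaming takes effect.
(reduce (fn [total row]
          ;; columns are accessed by keyword directly on the row
          ;; abstraction, without building a full hash map per row
          (+ total (:amount row)))
        0
        (jdbc/plan ds
                   ["select * from customers"]
                   {:fetch-size 1000}))
```

This reduces eagerly on the Clojure side, as noted above, but the JDBC side only holds a batch of rows at a time.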
Ok, I got it: plan is the one I need to pick up. I'll try it out for what I want to achieve, thanks 🙂