This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2018-11-20
Channels
- # announcements (2)
- # architecture (5)
- # beginners (118)
- # cider (5)
- # clara (13)
- # cljdoc (8)
- # cljs-dev (49)
- # cljsjs (2)
- # clojure (107)
- # clojure-dev (9)
- # clojure-europe (3)
- # clojure-italy (58)
- # clojure-japan (2)
- # clojure-nl (6)
- # clojure-spec (89)
- # clojure-uk (27)
- # clojurescript (9)
- # core-async (33)
- # cursive (2)
- # datascript (2)
- # datomic (31)
- # duct (4)
- # emacs (1)
- # events (1)
- # figwheel-main (1)
- # fulcro (69)
- # hoplon (7)
- # hyperfiddle (16)
- # incanter (4)
- # instaparse (4)
- # kaocha (1)
- # mount (2)
- # nrepl (19)
- # off-topic (40)
- # onyx (6)
- # other-languages (3)
- # pedestal (2)
- # re-frame (48)
- # reagent (2)
- # reitit (10)
- # ring-swagger (9)
- # shadow-cljs (63)
- # spacemacs (13)
- # sql (8)
Hi all, I’d like to process each of the 12,000,000 records in my table (MySQL InnoDB). Is this code efficient?
(run! (fn [row]
        (async/>!! chan row))
      (jdbc/reducible-query
        db
        ["select * from table where checked_at is NULL order by postcode limit 100000"]
        {:fetch-size   10000
         :max-rows     10000
         :cursors      :hold
         :result-type  :forward-only
         :concurrency  :read-only
         :auto-commit? false}))
because right now MySQL spends quite some time processing the query.
Do I need the limit? Do I need the cursor? (Is the cursor actually used?) Every record is first put onto a channel for processing (the channel is limited to 1000).
Should I use the same limit in the query as the channel has? (1000)
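(For context, a minimal sketch, not from the thread: with MySQL Connector/J, true row-by-row streaming of a large result set is conventionally requested with a forward-only, read-only statement whose fetch size is Integer/MIN_VALUE, which would make the LIMIT unnecessary. Names like db and chan are taken from the question above; this is untested against a live database.)

```clojure
;; Hypothetical sketch: MySQL's JDBC driver streams rows instead of
;; buffering the whole result set client-side when the statement is
;; forward-only, read-only, and fetch size is Integer/MIN_VALUE.
;; `db`, `chan`, and the table/column names come from the question.
(run! (fn [row]
        (async/>!! chan row))               ; block when consumers fall behind
      (jdbc/reducible-query
        db
        ["select * from table where checked_at is NULL order by postcode"]
        {:fetch-size  Integer/MIN_VALUE     ; MySQL's "stream rows" signal
         :result-type :forward-only
         :concurrency :read-only}))
```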
@seancorfield I still have some misunderstanding about performing a batch insert. Consider the following:
(def routing-guide (sql/query warp-db [(slurp "src/bayesianic/routing-guide.sql")] {:as-arrays? true}))
(time
  (sql/insert-multi! prem-db :routing_guide (first routing-guide) (rest routing-guide)))
(time
  (batch-insert prem-db "routing_guide" (first routing-guide) (rest routing-guide)))
where batch-insert is a function that I wrote myself. The timings are as follows:
"Elapsed time: 26476.902023 msecs"
"Elapsed time: 98.909443 msecs"
my batch insert function looks like this:
(defn batch-insert [db table colnames rows]
  (sql/execute! db
    [(str "insert into " table
          " (" (clojure.string/join ", " (map name colnames)) ") "
          " values "
          (clojure.string/join ", "
            (map (fn [x] (str "(" (clojure.string/join ", " x) ")")) rows)))]))
@michaellindon Your function isn't equivalent tho' -- it's assuming it can convert arbitrary Clojure values to legal SQL literal values with str -- whereas insert-multi! creates a PreparedStatement with a parameter for every value and sets all those values as parameters in the JDBC call. Yours will work for numbers but not strings (for example).
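(A middle-ground sketch, not from the thread: keep the single multi-row INSERT for speed, but use ? placeholders and pass the values as parameters, so the driver handles strings, dates, etc. The function name batch-insert-params is hypothetical; it mirrors the batch-insert above and is untested against a live database.)

```clojure
;; Hypothetical sketch: one multi-row INSERT with a "?" placeholder per
;; value, executed as a single PreparedStatement so the JDBC driver
;; escapes strings, dates, etc. Shape mirrors batch-insert above.
(defn batch-insert-params [db table colnames rows]
  (let [row-ph (str "(" (clojure.string/join ", " (repeat (count colnames) "?")) ")")
        sql    (str "insert into " table
                    " (" (clojure.string/join ", " (map name colnames)) ")"
                    " values "
                    (clojure.string/join ", " (repeat (count rows) row-ph)))]
    ;; clojure.java.jdbc's execute! takes [sql & params], so the row
    ;; values are flattened into one parameter sequence.
    (sql/execute! db (into [sql] (mapcat identity rows)))))
```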