2022-11-01
Channels
- # announcements (10)
- # asami (2)
- # babashka (10)
- # beginners (55)
- # biff (37)
- # calva (9)
- # cherry (1)
- # clj-kondo (11)
- # clojure (221)
- # clojure-bay-area (12)
- # clojure-europe (77)
- # clojure-hungary (3)
- # clojure-nl (5)
- # clojure-norway (12)
- # clojurescript (11)
- # cursive (1)
- # data-science (11)
- # emacs (27)
- # figwheel (3)
- # fulcro (11)
- # graphql (5)
- # helix (7)
- # honeysql (3)
- # humbleui (9)
- # interceptors (2)
- # introduce-yourself (2)
- # kaocha (12)
- # lsp (27)
- # malli (6)
- # nbb (70)
- # off-topic (6)
- # re-frame (6)
- # react (3)
- # reitit (9)
- # releases (2)
- # scittle (29)
- # shadow-cljs (26)
- # sql (13)
- # tools-deps (61)
(next.jdbc.sql/insert-multi! ... {:batch true :batch-size 2000})
How do you deal with the situation when an index key is duplicated, but you want to insert all the others? Is there any magic PostgreSQL setting, or do you switch to adding each row one by one? Or filter the batch before insert-multi!
by querying the DB first and getting the ids to remove from the batch. Quite an abstract question, but maybe there is something I don't know about.
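One way to read the "magic PostgreSQL setting" part: assuming a unique index on the key column, the insert statement itself can tell PostgreSQL to skip conflicting rows with ON CONFLICT DO NOTHING. A minimal sketch using next.jdbc; mytable, its id/name columns, and pool are placeholders standing in for the real schema:
(require '[next.jdbc :as jdbc])

;; Sketch (assumes PostgreSQL and a unique index on id): the database itself
;; skips rows whose id already exists, so no exception handling is needed.
;; `pool`, `mytable`, and the columns are placeholders.
(defn insert-skipping-dupes! [rows]
  (with-open [con (jdbc/get-connection (pool))
              ps  (jdbc/prepare con ["insert into mytable (id, name) values (?, ?) on conflict (id) do nothing"])]
    (jdbc/execute-batch! ps (map (juxt :id :name) rows) {:batch-size 2000})))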
(defn insert-multi! [table cols rows]
  (try
    (jdbc-sql/insert-multi! (pool) table cols rows {:batch true
                                                    :batch-size 2000})
    (catch BatchUpdateException e
      (if (postgresql-utils/duplicated-key? (ex-cause e))
        (doseq [row rows]
          (try
            (jdbc-sql/insert! (pool) table (zipmap cols row))
            (catch PSQLException e
              (if (postgresql-utils/duplicated-key? e)
                (l/warn e)
                (throw e)))))
        (throw e)))))
looks pretty ugly, but probably can’t be much better
This looks pretty janky. I'd suggest the following steps:
• begin transaction
• search for existing duplicates in the db
• remove the duplicates from the input (or from the db, depending on what you are trying to accomplish)
• insert the filtered input
• commit the txn
right now I have v2:
(defn- insert! [table m]
  (try
    (jdbc-sql/insert! (pool) table m)
    (catch PSQLException e
      (if (postgresql-utils/duplicated-key? e)
        (l/warn e)
        (throw e)))))
(defn insert-multi! [table cols rows]
  (try
    (jdbc-sql/insert-multi! (pool) table cols rows {:batch true :batch-size 2000})
    (catch BatchUpdateException e
      (if (postgresql-utils/duplicated-key? (ex-cause e))
        (doseq [row rows]
          (insert! table (zipmap cols row)))
        (throw e)))))
I have to think about this. What you are saying is correct too, but it will take more lines and probably end up with more complex code than what I have now.
Also I will have to use a transaction and block other threads.
This is not an easy choice.
Actually the only fragment which would have to change then is
(doseq [row rows]
  (insert! table (zipmap cols row)))
Because I have to check whether the cause of the issue is a duplicate key or not, so this code stays as it is.
So the
(doseq [row rows]
  (insert! table (zipmap cols row)))
at first glance looks simpler than:
https://clojurians.slack.com/archives/C1Q164V29/p1667313560042789?thread_ts=1667309672.219459&cid=C1Q164V29
Do I use {:batch true :batch-size 2000} correctly here? I mean, did the map set batch-size correctly?
I have a feeling it is going too slow compared to when I partitioned this myself.
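For comparison with the :batch / :batch-size options, a minimal sketch of the manual partitioning mentioned above, with jdbc-sql and pool standing in for the same things as in the earlier snippets:
;; Sketch: chunk the rows by hand and insert each chunk as one multi-row
;; statement, instead of passing :batch / :batch-size to insert-multi!.
(defn insert-multi-chunked! [table cols rows]
  (doseq [chunk (partition-all 2000 rows)]
    (jdbc-sql/insert-multi! (pool) table cols chunk)))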
(defn insert-but-drop-dupes
  [txable rows]
  (jdbc/with-transaction [txn txable]
    (let [dupes (into #{}
                      (map :mytable/id)
                      (jdbc/execute!
                        txn
                        ["select id from mytable where id = any(?)"
                         (into-array Long (map :mytable/id rows))]))
          unique-rows (remove (comp dupes :mytable/id) rows)]
      (sql/insert-multi! txn :mytable unique-rows))))
thank you, I think I can mix these 2 into 1 better. This situation happens only from time to time, so I don't know if I want to always check for dupes instead of relying on the duplicate-key exception.
this is just a proof of concept and there are problems like:
• the rows aren't deduplicated, so if there's a duplicate id in the input, the fn will barf
• I assume there could be problems if the rows coll was exceedingly large
• the txn handling probably should be done somewhere else (for instance, if this was part of a web app, I'd probably want to run the whole handler in a single txn to provide atomicity)
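A possible tweak for the first point, a small sketch that drops repeated ids within the input before calling the function (note that group-by does not preserve the original row order):
;; Sketch: keep one row per :mytable/id from the input itself, so the
;; existing-id check only has to deal with ids already present in the db.
(defn dedupe-by-id [rows]
  (->> rows
       (group-by :mytable/id)
       vals
       (map first)))

;; usage: (insert-but-drop-dupes (pool) (dedupe-by-id rows))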