This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2018-11-15
is it possible with clojure.java.jdbc to stream data into the database?
@peterwestmacott Not quite sure what you mean... could you elaborate?
I can use e.g. reducible-query to run a select statement and put each of the values onto e.g. a manifold stream or core.async channel
if I have a stream or channel of values, can I combine those with an insert statement to stream the rows into a database table?
I can fall back to executing the same insert statement multiple times, each with a batch of rows
…but I wondered if maybe there was a better way
SQL/JDBC isn't really a "streaming" target so I'm not sure how you could do any better than just a loop reading from a channel and calling insert!
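A minimal sketch of that loop, assuming row maps arriving on a core.async channel; `db-spec` and the `:events` table name are hypothetical placeholders:

```clojure
(require '[clojure.core.async :as async :refer [<!!]]
         '[clojure.java.jdbc :as jdbc])

;; db-spec, :events, and rows-ch are hypothetical placeholders.
(defn drain-into-db!
  "Blocking loop: take row maps from ch and insert! each one
  until the channel is closed (a take then returns nil)."
  [db-spec table ch]
  (loop []
    (when-let [row (<!! ch)]
      (jdbc/insert! db-spec table row)
      (recur))))

;; usage (hypothetical):
;; (drain-into-db! db-spec :events rows-ch)
```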
...
...if you do think of anything, please report back. It's an interesting question.
okay thanks
it seems asymmetrical to be able to execute one SQL query and get an ongoing sequence of data, but not be able to take an ongoing sequence of data and insert it back into the DB
but I guess the use-case for writing like that is rarer
I didn't find a good way to "pour" a result set into a channel either (beyond an explicit reduce that puts values)...
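For the read side, a sketch of that explicit reduce, pouring a reducible-query's rows onto a core.async channel from a background thread; `db-spec` and the SQL are hypothetical:

```clojure
(require '[clojure.core.async :as async :refer [>!!]]
         '[clojure.java.jdbc :as jdbc])

;; db-spec and sql-params are hypothetical placeholders.
(defn pour-result-set!
  "On a background thread, reduce over a reducible-query and
  blocking-put each row onto ch; close ch when the rows run out."
  [db-spec sql-params ch]
  (async/thread
    (reduce (fn [_ row] (>!! ch row)) nil
            (jdbc/reducible-query db-spec sql-params {:fetch-size 500}))
    (async/close! ch)))
```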
and if I’m pooling my connections then the temporal overhead of doing multiple batch inserts (e.g. insert-multi!) probably isn’t so bad
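That batched fallback could be sketched like this, again assuming row maps on a channel; `db-spec` and `:events` are hypothetical, and the whole drain runs inside one transaction (per the transaction concern mentioned below):

```clojure
(require '[clojure.core.async :as async :refer [<!!]]
         '[clojure.java.jdbc :as jdbc])

(defn take-batch!
  "Take up to n values from ch; a short batch means ch was closed."
  [ch n]
  (loop [batch []]
    (if (= n (count batch))
      batch
      (if-let [row (<!! ch)]
        (recur (conj batch row))
        batch))))

;; db-spec and :events are hypothetical placeholders.
(defn insert-in-batches!
  "Drain ch into table, n rows per insert-multi! call,
  all inside a single transaction."
  [db-spec table ch n]
  (jdbc/with-db-transaction [tx db-spec]
    (loop []
      (let [batch (take-batch! ch n)]
        (when (seq batch)
          (jdbc/insert-multi! tx table batch)
          (when (= n (count batch))
            (recur)))))))
```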
For writing to the DB, there are additional concerns, such as transactions, exception handling, etc.
yes, I would expect to need a long running transaction to attempt such a thing
thank you for your time, both now and generally in maintaining such a well-depended-upon library!
I'm using postgres with a column of timestamptz type and I have a java.time.ZonedDateTime object in my clojure code, but using an object with that type throws an exception:
1. Unhandled org.postgresql.util.PSQLException
Can't infer the SQL type to use for an instance of
java.time.ZonedDateTime. Use setObject() with an explicit Types
value to specify the type to use.
How do I use this setObject()? Or should I convert to java.sql.Timestamp first (which appears to work)?
@madstap You should convert to java.sql.Timestamp first, I think. You can also extend the protocol so the conversion is done automatically.
At some point I may extend java.jdbc to do the inbound conversions automatically, but that will require Java 8+...
Depends on exactly what behavior you want -- see the docs http://clojure-doc.org/articles/ecosystem/java_jdbc/using_sql.html#protocol-extensions-for-transforming-values
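A sketch of that protocol extension, converting a ZonedDateTime to a java.sql.Timestamp via its Instant (this uses the ISQLValue mechanism described in the linked docs; the parsed date string is just an example):

```clojure
(require '[clojure.java.jdbc :as jdbc])
(import '(java.sql Timestamp)
        '(java.time ZonedDateTime))

;; Teach java.jdbc how to turn a ZonedDateTime into a SQL value,
;; so insert!/update! accept it directly for a timestamptz column.
(extend-protocol jdbc/ISQLValue
  ZonedDateTime
  (sql-value [zdt]
    (Timestamp/from (.toInstant zdt))))

;; e.g. (jdbc/sql-value (ZonedDateTime/parse "2018-11-15T12:00:00Z"))
;; now yields a java.sql.Timestamp at the same instant.
```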