#sql
2021-07-25
indy06:07:44

Does it make sense to have a flag that automatically applies datafiable-row to each row when reducing over the result set using plan?

indy06:07:56

My main use case for plan is streaming a large number of rows, but I'm fine with each row being implicitly realized during the reduction step.

indy06:07:02

I have something like this, which will live in a db utils ns, so that consumers can call these instead of using plan in a bespoke way everywhere. But as it stands, the consumers need to deal with rs/datafiable-row in the reducing functions they pass, which I don't want them to do.

(extend-protocol ReducibleStream
  java.sql.Connection
  (reduce-stream
    [this sql-params f init]
    (with-autocommit-off [conn this]
      (reduce f init (jdbc/plan conn sql-params (stream-options)))))
  (transduce-stream
    ([this sql-params xf f]
     (transduce-stream this sql-params xf f (f)))
    ([this sql-params xf f init]
     (with-autocommit-off [conn this]
       (transduce xf f init (jdbc/plan conn sql-params (stream-options))))))

  javax.sql.DataSource
  (reduce-stream
    [this sql-params f init]
    (with-open [conn (jdbc/get-connection this)]
      (reduce-stream conn sql-params f init)))
  (transduce-stream
    ([this sql-params xf f]
     (transduce-stream this sql-params xf f (f)))
    ([this sql-params xf f init]
     (with-open [conn (jdbc/get-connection this)]
       (transduce-stream conn sql-params xf f init)))))
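The snippet above assumes two helpers, with-autocommit-off and stream-options, whose definitions aren't shown in the thread. A plausible sketch, based on how next.jdbc's documentation describes streaming result sets (PostgreSQL only streams when auto-commit is off and a fetch size is set); the fetch size of 1000 is an arbitrary choice here:

(defmacro with-autocommit-off
  "Bind `sym` to `conn` with auto-commit disabled for the extent of
   `body`, restoring the previous setting afterwards. PostgreSQL only
   streams results when auto-commit is off."
  [[sym conn] & body]
  `(let [~sym ~conn
         ac# (.getAutoCommit ~sym)]
     (try
       (.setAutoCommit ~sym false)
       ~@body
       (finally
         (.setAutoCommit ~sym ac#)))))

(defn stream-options
  "Statement options that make the driver fetch rows in batches
   instead of realizing the whole result set up front."
  []
  {:fetch-size  1000               ;; arbitrary batch size
   :concurrency :read-only
   :cursors     :close
   :result-type :forward-only})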

indy06:07:38

I know I could use the raw underscored column names directly in the reducing function, but I'm just wondering whether it makes sense to have a flag to automatically datafy the row.

indy08:07:38

That way the transducing functions are reusable and deal with the decorated column names (qualified, kebab-cased).

indy09:07:52

Nvm, I guess I have a workaround for now:

(reduce-stream
    [this sql-params f init]
    (with-autocommit-off [conn this]
      (reduce (fn [acc row] (f acc (rs/datafiable-row row conn (stream-options))))
              init
              (jdbc/plan conn sql-params (stream-options)))))
  (transduce-stream
    ([this sql-params xf f]
     (transduce-stream this sql-params xf f (f)))
    ([this sql-params xf f init]
     (with-autocommit-off [conn this]
       (transduce (comp (map #(rs/datafiable-row % conn (stream-options))) xf)
                  f
                  init
                  (jdbc/plan conn sql-params (stream-options))))))
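With the rows datafied before the caller's transducer runs, consumers can work with ordinary maps and the decorated keys. A hypothetical usage sketch (the table, column, and datasource names are invented; the qualified kebab-case key assumes a kebab-maps builder in the options):

;; Sum a column over a large table without realizing it all in memory.
(transduce-stream datasource
                  ["select amount from payments"]
                  (map :payments/amount)
                  +)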

zackteo16:07:16

Does anyone have any examples of inserting json or jsonb into postgresql? Via next.JDBC ?
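One approach, following the technique in next.jdbc's "Tips & Tricks" documentation: wrap the value in a PGobject with the json or jsonb type and extend SettableParameter so next.jdbc does it automatically. This sketch assumes clojure.data.json for encoding and an invented events table; jsonista or cheshire would work the same way:

(ns db.json
  (:require [clojure.data.json :as json]
            [next.jdbc :as jdbc]
            [next.jdbc.prepare :as prepare])
  (:import (java.sql PreparedStatement)
           (org.postgresql.util PGobject)))

(defn ->pgobject
  "Wrap a Clojure value as a PGobject of the given type (\"json\" or \"jsonb\")."
  [pg-type value]
  (doto (PGobject.)
    (.setType pg-type)
    (.setValue (json/write-str value))))

;; Teach next.jdbc to send Clojure maps as jsonb parameters.
(extend-protocol prepare/SettableParameter
  clojure.lang.IPersistentMap
  (set-parameter [m ^PreparedStatement s i]
    (.setObject s i (->pgobject "jsonb" m))))

;; usage (hypothetical table):
;; (jdbc/execute! ds ["insert into events (payload) values (?)"
;;                    {:type "click" :x 12 :y 34}])

Reading jsonb back out as Clojure data takes a matching ReadableColumn extension, also covered in the same documentation.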

zackteo01:07:16

Nice! 😄