
Hi, do I understand correctly that if I want to use a cursor, I should pass a prepared statement here? Otherwise the connection is closed and the result set is not accessible any more. Thanks!


If you pass a PreparedStatement it is used as-is. If you pass a db-spec containing an active Connection, a new PreparedStatement is created and executed and then that statement is closed, but not the original Connection. Otherwise, a new Connection is created, and a new PreparedStatement is created from that, and executed, and then both are closed.
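To make those three cases concrete, here is a hedged sketch (the db-spec, table, and parameters are made up; adjust for your own database and driver):

```clojure
(require '[clojure.java.jdbc :as jdbc])

;; Hypothetical db-spec for illustration only.
(def db-spec {:dbtype "postgresql" :dbname "mydb"})

;; 1. Plain db-spec: a new Connection and a new PreparedStatement are
;;    created, and both are closed when query returns.
(jdbc/query db-spec ["SELECT * FROM items WHERE id = ?" 42])

;; 2. db-spec containing an active Connection: a new PreparedStatement
;;    is created and closed, but the Connection stays open.
(with-open [conn (jdbc/get-connection db-spec)]
  (jdbc/query {:connection conn} ["SELECT * FROM items WHERE id = ?" 42]))

;; 3. A PreparedStatement you created yourself is used as-is; you are
;;    responsible for closing it and its Connection.
(with-open [conn (jdbc/get-connection db-spec)]
  (let [stmt (jdbc/prepare-statement conn "SELECT * FROM items WHERE id = ?")]
    (jdbc/query {:connection conn} [stmt 42])))
```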


You might get more helpful answers if you shared the code you tried and explained what is and isn't working (as well as explaining what problem you're actually trying to solve). The reason I didn't reply to your previous question was because there wasn't enough information in it for me to understand what you were trying to do. And that's why I can't answer the cursor portion of this question -- only what that specific function does in terms of creating/closing things.


> If you pass a PreparedStatement it is used as-is.

(defn- execute-query-with-params
  "Given a prepared statement, a set of parameters, a parameter setting
  function, and a function to process the result set, execute the query and
  apply the processing function."
  [^PreparedStatement stmt params set-parameters func]
  (set-parameters stmt params)
  (with-open [rset (.executeQuery stmt)]
    (func rset)))


Sorry for not being clear. What I want to achieve is to lazily fetch a query result and serialize it to CSV (it is pretty big, so I'd like to avoid keeping it entirely in memory).


There is not much code, to be honest: I just call query within a transaction with the following opts: `{:result-type :forward-only :concurrency :read-only :cursors :hold :fetch-size 50}`
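Given the answer above, those opts only survive the call if you create the PreparedStatement yourself and pass it in, since a statement created internally by query is closed when query returns. A hedged sketch (db-spec and table name are assumptions):

```clojure
(require '[clojure.java.jdbc :as jdbc])

;; Hypothetical db-spec for illustration only.
(def db-spec {:dbtype "postgresql" :dbname "mydb"})

(with-open [conn (jdbc/get-connection db-spec)]
  (let [stmt (jdbc/prepare-statement
              conn
              "SELECT * FROM big_table"
              {:result-type :forward-only
               :concurrency :read-only
               :cursors     :hold
               :fetch-size  50})]
    ;; The statement is used as-is; only the ResultSet is closed by query.
    (jdbc/query {:connection conn} [stmt])))
```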


But what I see so far is that the PreparedStatement/`Connection` will be closed right after jdbc/query executes. Did I miss anything?


Also, one should pass the `:result-set-fn identity` option, otherwise doall or vec will be invoked (please correct me if I am wrong).


The only way to "stream" large result sets in c.j.j is via the reducible query machinery. You asked about a specific function, as regards closing things, and I answered based on that.
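For example, a hedged sketch of streaming a large result set to CSV with reducible-query (the db-spec, table, columns, and output file are assumptions, and clojure.data.csv is an extra dependency):

```clojure
(require '[clojure.java.jdbc :as jdbc]
         '[clojure.java.io :as io]
         '[clojure.data.csv :as csv])

;; Hypothetical db-spec for illustration only.
(def db-spec {:dbtype "postgresql" :dbname "mydb"})

;; reducible-query returns a reducible, not a realized sequence: rows
;; are fetched in :fetch-size batches as the reduction consumes them,
;; and the Connection/PreparedStatement are closed when it finishes.
(with-open [writer (io/writer "out.csv")]
  (reduce (fn [_ row]
            (csv/write-csv writer [[(:id row) (:name row)]]))
          nil
          (jdbc/reducible-query db-spec
                                ["SELECT id, name FROM big_table"]
                                {:fetch-size 50})))
```

The reduction itself is eager, but only one batch of rows is in memory at a time.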


next.jdbc has `plan` for this, and the docs talk specifically about how to stream large result sets -- but in both libraries you must perform an eager reduction over the (reducible) result set to do your processing, because of resource management.
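A corresponding hedged sketch with next.jdbc's `plan` (datasource, table, and output file are assumptions; note that for PostgreSQL, cursor-based fetching also typically requires `:auto-commit false` on the connection):

```clojure
(require '[next.jdbc :as jdbc]
         '[clojure.java.io :as io]
         '[clojure.data.csv :as csv])

;; Hypothetical datasource for illustration only.
(def ds (jdbc/get-datasource {:dbtype "postgresql" :dbname "mydb"}))

;; plan returns an IReduceInit: nothing runs until you reduce it, the
;; reduction must happen eagerly while the connection is open, and rows
;; are streamed rather than held in memory all at once.
(with-open [conn   (jdbc/get-connection ds {:auto-commit false})
            writer (io/writer "out.csv")]
  (run! (fn [row]
          (csv/write-csv writer [[(:id row) (:name row)]]))
        (jdbc/plan conn
                   ["SELECT id, name FROM big_table"]
                   {:fetch-size  50
                    :concurrency :read-only
                    :result-type :forward-only})))
```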