#sql
2019-10-15
seancorfield18:10:20

Who uses CLOB or BLOB columns? clojure.java.jdbc did not do anything to support those so I'm curious as to how folks deal with them using c.j.j.? I've just added a helper function and an example column reader for CLOB (that calls .getObject and then slurps from the CLOB's InputStream to produce a string) and added a brief mention in the result set builders docs: https://github.com/seancorfield/next-jdbc/blob/master/doc/result-set-builders.md (master, at the end of the first section which describes all the public functions).
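A minimal sketch of how such a column reader could be wired in (ds and the documents table are made up here; the reader just mirrors the "slurp the CLOB's stream" approach described above, with as-maps-adapter as the hook for overriding how column values are read):

(require '[next.jdbc :as jdbc]
         '[next.jdbc.result-set :as rs])

;; Convert CLOB columns to Strings while the ResultSet is still open;
;; everything else passes through unchanged.
(defn clob-reader [^java.sql.ResultSet rset _rsmeta ^Integer i]
  (let [v (.getObject rset i)]
    (if (instance? java.sql.Clob v)
      (slurp (.getCharacterStream ^java.sql.Clob v))
      v)))

(jdbc/execute! ds ["select * from documents"]
               {:builder-fn (rs/as-maps-adapter rs/as-maps clob-reader)})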

jumar07:10:04

We use CLOBs with H2, unfortunately much more than we should (we should have used plain VARCHAR in pretty much every case). We have this little function for converting them back to strings when they're read:

(defn declob
  "Turn a CLOB into a String."
  [^java.sql.Clob clob]
  (when clob
    ;; don't strip any newlines! (as shown here: )
    (slurp (.getCharacterStream clob))))

seancorfield17:10:37

That's pretty much what the new clob->string helper does on master of next.jdbc...

👍 4
seancorfield18:10:24

Is there a sensible "default" helper/example for BLOB, perhaps producing a byte[]? How do folks normally process BLOB columns? What would you expect to find in the documentation for CLOB/BLOB support/usage? (the reason it matters is that once the result set is closed, you can't read these large fields on several databases, so you have to turn them into data while you are processing the result set)
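One possible "default", sketched here purely as a strawman (blob->bytes is a hypothetical helper, not something confirmed in the library): read the whole BLOB into a byte[] while the result set is still open, via a column reader like the CLOB one above.

;; Blob positions are 1-based; the length has to fit in an int for getBytes.
(defn blob->bytes ^bytes [^java.sql.Blob blob]
  (when blob
    (.getBytes blob 1 (int (.length blob)))))

;; Companion column reader: materialize BLOBs as byte arrays during row building.
(defn blob-reader [^java.sql.ResultSet rset _rsmeta ^Integer i]
  (let [v (.getObject rset i)]
    (if (instance? java.sql.Blob v)
      (blob->bytes v)
      v)))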

dcj18:10:27

@seancorfield With c.j.j I have a column which is a Postgres bytea.

dcj18:10:01

In the not-too-distant future, I will migrate this usage to next.jdbc

dcj19:10:24

I haven't had any issues so far (not that my use is extensive). For insert/update, I do this: (.toByteArray bs). Reading it back out, IIRC I just get a byte array without doing anything special. In my application, this column contains a highly optimized representation of something I am visualizing -- it's kind of an app-specific materialized view. On the read side, performance is critical. When I migrate this to next.jdbc I've been planning to take a look at the more primitive functions to ensure that Clojure does as little as possible to this result set (not sure that's necessary). In my app, I send this over the wire to the browser pretty much as-is, and I don't want anything to get in the way of that.
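For the "do as little as possible" part, one thing worth evaluating later is next.jdbc's plan, which reduces directly over the ResultSet without building full row maps -- keyword access by column label reads straight from the ResultSet. A sketch with made-up table/column names:

(require '[next.jdbc :as jdbc])

;; Pull the bytea column off the first matching row and stop;
;; reduced short-circuits the reduction after one row.
(defn fetch-payload [ds id]
  (reduce (fn [_ row] (reduced (:payload row)))
          nil
          (jdbc/plan ds ["select payload from views where id = ?" id])))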

seancorfield19:10:09

@dcj Thanks. So I suspect that PG data type does not depend on the result set remaining open to "inflate" the value from the SQL type to the JDBC type (and hence into a "Clojure" data type).

seancorfield19:10:19

I guess I could add some CLOB tests to c.j.j. and see whether the same result set closing problem exists there -- I'm just sort of assuming it does right now since the machinery is similar at the value-reading level in both libraries.

dcj19:10:22

@seancorfield my only nagging concern is that generally I use some sort of coercion assistance to convert certain data types between Clojure and PG, e.g. times, maps and vectors <-> jsonb/json, etc. (you may recall numerous questions a while back when I was migrating some of that to next.jdbc...). Anyway, I am virtually certain that I did not dive into that code when I added my bytea column, and IIRC everything "just worked". It is possible that something in the coercion code is "assisting" with my bytea usage... I will go take a quick look.

seancorfield19:10:18

That "coercion assistance" should be easy to migrate to next.jdbc -- happy to take a look and help whenever you get around to that.

dcj19:10:33

@seancorfield you already helped! The first version is working... Anyway, long story short, for pre-next.jdbc work, I used a private fork of this: https://github.com/atsman/clj-postgresql This file is key: https://github.com/atsman/clj-postgresql/blob/master/src/clj_postgresql/types.clj I just took a quick look and it's not obvious to me whether that lib is adding any support for my bytea usage... FYI only, I am re-implementing that library for next.jdbc, and the data types I need seem to be working fine (time, clj <-> json(b), arrays, etc.). I haven't started on the PostGIS part yet; I need to understand better what I really want on both the Clojure side and the PostGIS/Postgres side before working on that part...

seancorfield19:10:25

Yeah, there are a bunch of extensions of c.j.j's ISQLValue and ISQLParameter protocols in there.

dcj19:10:17

getting my head wrapped around that took a while, and you answered a bunch of questions WRT all that re next.jdbc

seancorfield19:10:01

So you'll need equivalent extensions against next.jdbc's protocols for that library in order to migrate completely.
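Concretely, the rough mapping is: c.j.j's ISQLValue/ISQLParameter extensions become extensions of next.jdbc.prepare/SettableParameter, and result-set reading (c.j.j's IResultSetReadColumn) becomes next.jdbc.result-set/ReadableColumn. A sketch for the json/jsonb case (assumes clojure.data.json and the PostgreSQL driver are on the classpath; ->pgobject and pgobject->clj are illustrative names):

(require '[clojure.data.json :as json]
         '[next.jdbc.prepare :as prepare]
         '[next.jdbc.result-set :as rs])
(import '(java.sql PreparedStatement)
        '(org.postgresql.util PGobject))

(defn ->pgobject
  "Wrap a Clojure value as a jsonb PGobject."
  [x]
  (doto (PGobject.)
    (.setType "jsonb")
    (.setValue (json/write-str x))))

(defn pgobject->clj
  "Unwrap json/jsonb PGobjects back into Clojure data; pass others through."
  [^PGobject v]
  (if (#{"json" "jsonb"} (.getType v))
    (json/read-str (.getValue v) :key-fn keyword)
    v))

;; Outbound: how Clojure maps are set as statement parameters.
(extend-protocol prepare/SettableParameter
  clojure.lang.IPersistentMap
  (set-parameter [m ^PreparedStatement ps i]
    (.setObject ps i (->pgobject m))))

;; Inbound: how PGobject column values are turned back into Clojure data.
(extend-protocol rs/ReadableColumn
  PGobject
  (read-column-by-label [v _] (pgobject->clj v))
  (read-column-by-index [v _ _] (pgobject->clj v)))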

dcj19:10:49

yes, I've got that all working for the data types that I really need

dcj19:10:58

I will try and "get busy" and move my bytea usage to next.jdbc, but to set expectations, it may be 2-3 weeks before I can do that. I have a bunch of other higher priority things going on ATM.

seancorfield19:10:58

No problem. next.jdbc evolution is slow and gentle -- and purely accretive. It already does everything I need for work so this is mostly about making it easier for others to use for more exotic things.