#sql
2020-05-07
fabrao00:05:23

Hello all, do you think this charset problem with next.jdbc comes from the JDBC driver?

{:chamado "241740",
  :descricao
  "LOJA 39 - copia de seguranca do backup do servidor ********** falhou.",
  :categoria "Bay.Banco de Dados.Backup com Falha",
  :abertura "26/04/2020 08:53",
  :fechamento "02/05/2020 22:04",
  :local "39 **** FLORIANÃ"POLIS IGUATEMI",  <----}]

fabrao00:05:02

is there any way to fix this?

hiredman01:05:35

it is very unlikely to be something with next.jdbc

hiredman01:05:05

but character encoding is tricky

hiredman01:05:24

it could even be due to a mismatch of your terminal and the jvm's default character encoding
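
A quick, hedged way to check that particular suspect from a REPL (plain JDK calls, nothing next.jdbc-specific):

;; the charset the JVM is actually defaulting to
(str (java.nio.charset.Charset/defaultCharset))
;; and the file.encoding property it was started with
(System/getProperty "file.encoding")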

fabrao01:05:52

do you think I have to encode after result?

seancorfield01:05:42

@fabrao You may need to specify :characterEncoding "utf8" in your db spec or ?characterEncoding=utf8" in your JDBC URL, depending on your database and how you are setting up your data source.
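
A minimal sketch of that db-spec suggestion, assuming a MySQL-style driver where the property is called characterEncoding (the database name and credentials are placeholders, and SQL Server may use a different property, or ignore this one entirely):

(require '[next.jdbc :as jdbc])

;; extra keys in the db-spec map are passed through as connection properties
(def ds
  (jdbc/get-datasource {:dbtype "mysql" :dbname "mydb"
                        :user "me" :password "secret"
                        :characterEncoding "utf8"}))

;; or put it directly in the JDBC URL
(def ds*
  (jdbc/get-datasource
    {:jdbcUrl "jdbc:mysql://localhost:3306/mydb?characterEncoding=utf8"}))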

fabrao01:05:02

well, it didn't work

fabrao01:05:01

maybe it's caused by the SQL Server collation

seancorfield01:05:06

I don't know if what I suggested would work with MS SQL Server. You'll need to read the relevant docs for that DB and figure it out.

seancorfield01:05:30

It could be failing to handle the character encoding at any number of points along the way... It could be the data insertion point, it could be the data retrieval point, it could be how your JVM is configured on your system, it could be your terminal...

fabrao01:05:45

Yes, I'm trying to figure out where the problem is.

fabrao01:05:29

the same result shows up in the GraphQL output

fabrao01:05:00

"local": "39 FLORIAN�POLIS IGUATEMI"

seancorfield01:05:10

If you have some native SQL Server "workbench" you should use that to verify the data that is actually in the DB.

fabrao01:05:44

in SQL Server Management Studio it shows up OK

maxp10:05:23

Is there an example of how to handle PostgreSQL arrays in a result set?

seancorfield17:05:00

I'm surprised the solution needs to be that complicated...

seancorfield17:05:52

...oh, because you are deliberately deferring the expansion of the array data from JDBC? What's the reasoning behind that @U0HJNJWJH?

seancorfield17:05:03

It seems like any operation you perform on it will expand the whole array anyway, and it seems like you would run the risk of the call to .getArray happening after the connection closes with your code, which would fail in most cases I think?

seancorfield17:05:04

I think the minimum version would be something like:

;; assumes (:require [next.jdbc.result-set :as rs]) and (:import (java.sql Array))
(extend-protocol rs/ReadableColumn
  Array
  (read-column-by-label [^Array v _]   (vec (.getArray v)))
  (read-column-by-index [^Array v _ _] (vec (.getArray v))))
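
Once an extension like that is loaded, a hedged usage sketch (assuming next.jdbc is required as jdbc; the datasource ds and the table/column names are made up) would look something like:

;; tags is assumed to be a PostgreSQL integer[] column
(jdbc/execute! ds ["SELECT id, tags FROM items"])
;; => [#:items{:id 1, :tags [2 3 4]} ...]  ; array columns come back as Clojure vectors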

serioga17:05:43

The reason is to avoid an expensive (.getArray a) call if I don't access the data in that result column.

serioga17:05:17

> you would run the risk of the call to .getArray happening after the connection closes
let me check 🙂

serioga17:05:40

Usually I work with the result after the connection closes, so I think the data is kept inside the java.sql.Array instance internally...

seancorfield17:05:02

If you use plan and don't reference the array column, it wouldn't actually be read from the result set anyway. And if you use execute!/`execute-one!` the assumption is that you should get a fully-realized Clojure datastructure that is completely detached from JDBC (so don't select array columns if you don't need them 🙂 ).

seancorfield17:05:32

It feels like you're fighting against the simple path here... but, hey, if it works for your use case.

serioga17:05:44

Yes, I'm just trying to avoid converting all the array values when the code doesn't use them. It pains me to waste CPU time on a useless conversion 🙂

serioga17:05:27

> If you use plan and don't reference the array column, it wouldn't actually be read from the result set anyway.
@seancorfield is that also the case if I fetch the row but ignore the column?

seancorfield17:05:26

ReadableColumn only applies if plan's reduction references that specific column by name (actually by label)

seancorfield17:05:13

In execute!/`execute-one!`, every column mentioned in the result set, i.e., select'd in the SQL, is read (by index).

seancorfield17:05:19

Both of those build a full hash map from each row, with all columns. plan doesn't even build a hash map if you only reference columns directly by label and don't try any map-like operations on the row itself. That's why plan is so efficient.
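
A hedged sketch of that difference, assuming next.jdbc is required as jdbc and reusing column names from earlier in the thread purely as an illustration:

;; plan: reduce over the raw ResultSet; only the :local column is read,
;; and no hash maps are built because rows are only touched by keyword
(into [] (map :local) (jdbc/plan ds ["SELECT local, categoria FROM chamado"]))

;; execute!: every selected column is read and realized into Clojure data,
;; fully detached from the connection
(jdbc/execute! ds ["SELECT local, categoria FROM chamado"])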

maxp04:05:50

Great examples! Thank you.

maxp10:05:34

especially varchar[] and integer[]

souenzzo15:05:24

How do I pass a "collection" to next.jdbc with PostgreSQL? Something like this:

(jdbc/execute! conn ["SELECT * FROM app_todo WHERE id IN (?)" [1]])

seancorfield15:05:02

@souenzzo No, @maxp was asking about arrays in result sets not parameters.

souenzzo15:05:43

so there is no way to do this? how should I "get many vals from many inputs" in next.jdbc?

seancorfield15:05:14

Sure, the docs tell you how to deal with additional data types in general in result sets. There's just no specific example for PostgreSQL array types.

seancorfield16:05:12

(and it links to deeper discussions and some examples -- but it would be nice to have specific examples for array types in the PostgreSQL Tips & Tricks section if someone is willing to write them up: I do not use PostgreSQL!)

seancorfield16:05:45

@souenzzo Oh, were you asking your own question, separate from @maxp’s question?

👍 4
seancorfield16:05:21

(if so, the answer is in the Getting Started section I pointed to since that covers both input parameter types and output result set column types)

seancorfield16:05:40

I thought you were trying to answer his question...

souenzzo16:05:33

@seancorfield I also don't know SQL well enough to know where to search. Is my case about "input parameter types"?

seancorfield16:05:00

@souenzzo The two use cases are:
* ReadableColumn protocol (for converting database-specific types to Clojure values -- reading from a JDBC result set back into Clojure data)
* SettableParameter protocol (for converting Clojure values to database-specific types -- passing Clojure data as input parameters into JDBC operations)
Does that help?
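
For illustration of the second protocol's shape only, here is a sketch that converts a Clojure vector into a PostgreSQL array by asking the driver to build one (the "integer" type name is hard-coded for the example, and note Sean's caveat below: extending this to plain vectors can interfere with the = ANY(?) approach, so treat it as a demonstration of the mechanism rather than a recommendation):

(require '[next.jdbc.prepare :as p])
(import '(java.sql PreparedStatement))

(extend-protocol p/SettableParameter
  clojure.lang.IPersistentVector
  (set-parameter [v ^PreparedStatement ps ^long i]
    ;; delegate to the driver: build a java.sql.Array and bind it
    (.setArray ps i (.createArrayOf (.getConnection ps) "integer" (to-array v)))))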

souenzzo16:05:40

SettableParameter is the thing. I will search. Thanks 🙂

seancorfield16:05:33

@maxp is asking about the first case -- turning PostgreSQL array column types into Clojure vectors; I think you're asking about the second case -- passing Clojure collections into JDBC (presumably as PostgreSQL arrays)?

seancorfield16:05:26

If you extend SettableParameter to Clojure vectors, the = ANY(?) trick will stop working I suspect. But you may not need to extend SettableParameter for your use case in PostgreSQL: simply using ANY(?) instead of just ? may be sufficient for the PostgreSQL JDBC driver to accept a vector type and convert it to an array type for the column...

seancorfield16:05:14

e.g., ["INSERT INTO my_table (id, array_data) VALUES (?, ANY(?))" 1, [2 3 4]]

seancorfield16:05:55

I've no idea whether that would work (like I say, I don't use PostgreSQL at all) but given how the IN/`= ANY(?)` trick works, I suspect it might.

souenzzo16:05:57

My solution

(jdbc/execute!
  @conn
  ["SELECT *
    FROM app_todo
    WHERE id IN (SELECT unnest(?))"
   (int-array [1 2])])
                  

seancorfield17:05:59

@souenzzo WHERE id = ANY(?) should work -- which is the example in the docs.

seancorfield17:05:16

Note: = rather than IN.

seancorfield17:05:49

I thought you were asking about passing a collection into a single column for an insert or similar.

souenzzo17:05:49

["SELECT *
  FROM app_todo
  WHERE author = ANY(?)"
 (int-array [1 3])]
This works! But without int-array I get
Execution error (PSQLException) at org.postgresql.jdbc.PgPreparedStatement/setObject (PgPreparedStatement.java:978).
Can't infer the SQL type to use for an instance of clojure.lang.PersistentVector. Use setObject() with an explicit Types value to specify the type to use.

seancorfield17:05:39

Yes, that's expected.

seancorfield17:05:01

(and that's exactly what the documentation says to do)

seancorfield17:05:22

> What does this mean for your use of next.jdbc? In plan, execute!, and execute-one!, you can use col = ANY(?) in the SQL string and a single primitive array parameter, such as (int-array [1 2 3 4]). That means that in next.jdbc.sql's functions that take a where clause (`find-by-keys`, update!, and delete!) you can specify ["col = ANY(?)" (int-array data)] for what would be a col IN (?,?,?,,,?) where clause for other databases and require multiple values.
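
Putting that quoted advice into a sketch with next.jdbc.sql (the datasource ds and the app_todo/author names are reused from earlier in the thread, purely as an illustration):

(require '[next.jdbc.sql :as sql])

;; "author = ANY(?)" plus a primitive array stands in for author IN (?,?,...,?)
(sql/find-by-keys ds :app_todo ["author = ANY(?)" (int-array [1 3])])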