#datomic
2021-05-24
plexus08:05:53

I'm having trouble trying to query Datomic Analytics / Presto via JDBC. I have a decimal(38,2) field, and it's causing an exception in the Presto JDBC driver.

(def conn (java.sql.DriverManager/getConnection presto-url "." ""))

(let [stmt (.createStatement conn)]
  (.executeQuery stmt "SELECT credit_amount FROM journal_entry_line"))

;;=>
1. Caused by java.lang.IllegalArgumentException
   ParameterKind is [TYPE] but expected [LONG]

TypeSignatureParameter.java:  110  com.facebook.presto.jdbc.internal.common.type.TypeSignatureParameter/getValue
TypeSignatureParameter.java:  122  com.facebook.presto.jdbc.internal.common.type.TypeSignatureParameter/getLongLiteral
           ColumnInfo.java:  194  com.facebook.presto.jdbc.ColumnInfo/setTypeInfo
      PrestoResultSet.java: 1869  com.facebook.presto.jdbc.PrestoResultSet/getColumnInfo
      PrestoResultSet.java:  123  com.facebook.presto.jdbc.PrestoResultSet/<init>
      PrestoStatement.java:  272  com.facebook.presto.jdbc.PrestoStatement/internalExecute
      PrestoStatement.java:  230  com.facebook.presto.jdbc.PrestoStatement/execute
      PrestoStatement.java:   79  com.facebook.presto.jdbc.PrestoStatement/executeQuery
If I `cast(credit_amount AS varchar)` then it works. `getLongLiteral` looks suspicious since it's a decimal field... Not sure if the issue lies with Presto, Datomic Analytics, or the Presto JDBC driver. So I'm mainly asking: what would be the best place(s) to report this?
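
A minimal sketch of that workaround, reusing the connection and column/table names from the snippet above, and coercing the varchar result back to a decimal on the Clojure side:

(let [stmt (.createStatement conn)
      rs   (.executeQuery stmt "SELECT cast(credit_amount AS varchar) AS credit_amount FROM journal_entry_line")]
  (when (.next rs)
    ;; read the casted value as a string, then coerce to BigDecimal
    (bigdec (.getString rs "credit_amount"))))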

plexus08:05:23

This seems to be the central line in the stack trace: https://github.com/prestodb/presto/blob/2ad67dcf000be86ebc5ff7732bbb9994c8e324a8/presto-jdbc/src/main/java/com/facebook/presto/jdbc/ColumnInfo.java#L194

case "decimal":
                builder.setSigned(true);
                builder.setColumnDisplaySize(type.getParameters().get(0).getLongLiteral().intValue() + 2); // dot and sign
                builder.setPrecision(type.getParameters().get(0).getLongLiteral().intValue());
                builder.setScale(type.getParameters().get(1).getLongLiteral().intValue()); // <----- getLongLiteral ->  ParameterKind is [TYPE] but expected [LONG]

futuro14:05:12

I'm splitting my initial Marketplace master Datomic Cloud stack into a split-stack solo topology. I didn't provide an ApplicationName in my initial setup from the Marketplace (so the System Name is used, as I understand it); should I provide one now?

futuro14:05:46

Have folks found it beneficial to provide the ApplicationName even when it's the same as the SystemName?

pinkfrog16:05:22

Say I want to connect to the in-memory database of an on-prem datomic peer server. What’s the value to specify for :endpoint? (see https://docs.datomic.com/client-api/datomic.client.api.html#var-client)
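
For reference, a minimal sketch of the client config shape for an on-prem peer server; the access key, secret, port, and database name below are placeholders for whatever the peer server was started with:

(require '[datomic.client.api :as d])

;; :endpoint is host:port of the running peer server
(def client
  (d/client {:server-type        :peer-server
             :access-key         "myaccesskey"
             :secret             "mysecret"
             :endpoint           "localhost:8998"
             :validate-hostnames false}))

(def conn (d/connect client {:db-name "mydb"}))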

pinkfrog16:05:53

Fundamentally, I want to run unit tests with Datomic (on-prem).

pinkfrog16:05:20

One issue with spinning up an in-memory peer server and connecting to it is that the peer server listens on a TCP port, so we can't really run two instances of the tests in parallel because they collide on the same port.
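
One possible workaround sketch, assuming the tests can use the peer library (datomic.api) directly instead of going through a peer server: an in-memory database needs no TCP port, so parallel test runs don't collide.

(require '[datomic.api :as d])

;; a fresh, uniquely named in-memory database per test run
(let [uri (str "datomic:mem://" (java.util.UUID/randomUUID))]
  (d/create-database uri)
  (d/connect uri))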

cjsauer18:05:54

Is it a bad idea to rely on :db/txInstant as the “created at” time for an entity? The instant at which an entity’s :thing/id datom was asserted is a nice natural creation date, but I’m getting the sense that I’m abusing it a bit. For example, I can’t use index-pull to get the “10 latest” things, because that txInstant datom is on a separate entity (the transaction)…
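
To make that concrete, getting the creation instant means joining through the transaction entity rather than pulling it off the thing itself; a sketch assuming `db` is a current database value and `:thing/id` is the attribute from the message above:

(d/q '[:find ?e ?created
       :where
       [?e :thing/id _ ?tx]          ; the tx that asserted the id
       [?tx :db/txInstant ?created]] ; the instant lives on the tx entity
     db)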

jcromartie18:05:18

If the entity is created and submitted by an external system, it's best to require a creation/event time as an input and to verify that it is at a point in the recent past.
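
A minimal validation sketch of that idea (the names and tolerance are illustrative, not from the thread): reject creation times that are in the future or too far in the past.

(defn plausible-creation-time?
  [^java.util.Date created-at max-age-ms]
  (let [now (System/currentTimeMillis)
        t   (.getTime created-at)]
    (and (<= t now)                   ; not in the future
         (<= (- now t) max-age-ms)))) ; not older than the allowed window

;; e.g. (plausible-creation-time? incoming-date (* 1000 60 60)) ; within the last hour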

cjsauer18:05:33

That’s a good point. Or I suppose import jobs are another reason why one shouldn’t overload the :db/txInstant attribute. It’s really more “this is when it entered the system”, whereas creation time is a domain concern.

jcromartie18:05:22

> wall clock times specified by `:db/txInstant` are imprecise as more than one transaction can be recorded in the same millisecond

jcromartie18:05:29

even in systems with an RDBMS, I like users of a system to provide specific times with their data, and also to record transaction timestamps

cjsauer19:05:17

What if the entity is created by users? Should I be managing `created-at`/`updated-at` times manually?

joshkh08:05:25

we tend to explicitly add dates because the historical data is not accessible to our external integrations via Datomic Analytics. also, if you ever replay your tx log from one database to another then the dates of the transactions will differ
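
A sketch of the "explicitly add dates" approach with the client API; `:thing/created-at` is a hypothetical attribute, not one from the thread:

(d/transact conn {:tx-data [{:thing/id         (java.util.UUID/randomUUID)
                             :thing/created-at (java.util.Date.)}]}) ; domain creation time, independent of :db/txInstant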

favila12:05:58

@U0GC1C09L “if you ever replay your tx log from one database to another then the dates of the transactions will differ” that’s not completely correct. the :db/txInstant assertion is in the tx log, so it will copy over unless you filter it out

favila12:05:11

The use case for allowing this is to back-date data, but the :db/txInstant of any new transaction must be >= the previous one, so this technique is limited to fresh databases
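
Back-dating is done by asserting :db/txInstant on the transaction entity itself via the "datomic.tx" tempid; a client-API sketch, keeping in mind the instant must still be >= the previous transaction's:

(d/transact conn {:tx-data [{:db/id        "datomic.tx"
                             :db/txInstant #inst "2015-01-01"} ; explicit, back-dated tx instant
                            {:thing/id (java.util.UUID/randomUUID)}]})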

joshkh12:05:39

that's news to me, thanks for the correction @U09R86PA4