This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2021-05-24
Channels
- # announcements (1)
- # babashka (86)
- # beginners (75)
- # boot-dev (1)
- # cljdoc (18)
- # cljs-dev (1)
- # cljsrn (67)
- # clojure (127)
- # clojure-australia (1)
- # clojure-dev (2)
- # clojure-europe (9)
- # clojure-nl (2)
- # clojure-serbia (2)
- # clojure-spec (11)
- # clojure-uk (14)
- # clojurescript (17)
- # code-reviews (4)
- # conjure (37)
- # core-async (11)
- # datomic (21)
- # emacs (1)
- # helix (36)
- # jobs (6)
- # malli (1)
- # meander (20)
- # re-frame (13)
- # reagent (49)
- # remote-jobs (11)
- # rum (1)
- # sci (1)
- # shadow-cljs (29)
- # sql (17)
- # vim (2)
I'm having trouble querying Datomic Analytics / Presto via JDBC. I have a decimal(38,2) field, and it's causing an exception in the Presto JDBC driver.
;; presto-url is the JDBC connection string for the Datomic Analytics / Presto endpoint
(def conn (java.sql.DriverManager/getConnection presto-url "." ""))
(let [stmt (.createStatement conn)]
  (.executeQuery stmt "SELECT credit_amount FROM journal_entry_line"))
;;=>
1. Caused by java.lang.IllegalArgumentException
ParameterKind is [TYPE] but expected [LONG]
TypeSignatureParameter.java: 110 com.facebook.presto.jdbc.internal.common.type.TypeSignatureParameter/getValue
TypeSignatureParameter.java: 122 com.facebook.presto.jdbc.internal.common.type.TypeSignatureParameter/getLongLiteral
ColumnInfo.java: 194 com.facebook.presto.jdbc.ColumnInfo/setTypeInfo
PrestoResultSet.java: 1869 com.facebook.presto.jdbc.PrestoResultSet/getColumnInfo
PrestoResultSet.java: 123 com.facebook.presto.jdbc.PrestoResultSet/<init>
PrestoStatement.java: 272 com.facebook.presto.jdbc.PrestoStatement/internalExecute
PrestoStatement.java: 230 com.facebook.presto.jdbc.PrestoStatement/execute
PrestoStatement.java: 79 com.facebook.presto.jdbc.PrestoStatement/executeQuery
If I `cast(credit_amount AS varchar)` then it works (see the workaround query below). This seems to be the central line in the stacktrace: https://github.com/prestodb/presto/blob/2ad67dcf000be86ebc5ff7732bbb9994c8e324a8/presto-jdbc/src/main/java/com/facebook/presto/jdbc/ColumnInfo.java#L194
case "decimal":
builder.setSigned(true);
builder.setColumnDisplaySize(type.getParameters().get(0).getLongLiteral().intValue() + 2); // dot and sign
builder.setPrecision(type.getParameters().get(0).getLongLiteral().intValue());
builder.setScale(type.getParameters().get(1).getLongLiteral().intValue()); // <----- getLongLiteral -> ParameterKind is [TYPE] but expected [LONG]
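For reference, the workaround query looks like this (same conn as above; just a sketch):
;; workaround: cast the decimal(38,2) column to varchar so the Presto JDBC
;; driver never has to interpret the decimal type parameters
(let [stmt (.createStatement conn)]
  (.executeQuery stmt "SELECT cast(credit_amount AS varchar) AS credit_amount FROM journal_entry_line"))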
I'm splitting my initial Marketplace master Datomic Cloud stack into a split-stack solo topology. I didn't provide an ApplicationName in my initial setup from the Marketplace (so the System Name is used, as I understand it); should I provide one now?
Have folks found it beneficial to provide the ApplicationName even when it's the same as the SystemName?
Say I want to connect to the in-memory database of an on-prem datomic peer server. What’s the value to specify for :endpoint? (see https://docs.datomic.com/client-api/datomic.client.api.html#var-client)
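Something along these lines is what I'm trying; the access key, secret, db name, and port below are just placeholders matching how I start the peer server:
(require '[datomic.client.api :as d])

;; placeholders: access key, secret, and host:port come from the options the
;; peer server was started with (e.g. -a myaccesskey,mysecret -p 8998)
(def client
  (d/client {:server-type        :peer-server
             :access-key         "myaccesskey"
             :secret             "mysecret"
             :endpoint           "localhost:8998"
             :validate-hostnames false}))

(def conn (d/connect client {:db-name "hello"}))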
One issue with spinning up an in-memory peer server and connecting to it is that the peer server listens on a TCP port, so we can't really run two instances of the test because they collide on the same port.
As a docs heads up, there's an empty bullet-point at https://docs.datomic.com/cloud/getting-started/configure-access.html#authorize-gateway
Is it a bad idea to rely on `:db/txInstant` as the “created at” time for an entity? The instant at which an entity’s `:thing/id` datom was asserted is a nice natural creation date, but I’m getting the sense that I’m abusing it a bit. For example, I can’t use `index-pull` to get the “10 latest” things, because that txInstant datom is on a separate entity (the transaction)…
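For context, this is roughly how I pull a creation time today (just a sketch; `:thing/id` is my own attribute):
;; the creation instant lives on the transaction entity (?tx), not on the
;; domain entity, which is why index-pull over :thing/id can't sort by it
(d/q '[:find ?e ?created
       :where
       [?e :thing/id _ ?tx]
       [?tx :db/txInstant ?created]]
     db)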
If the entity is created and submitted by an external system, it’s best to require a creation/event time as an input and to verify that it is at a point in the recent past.
That’s a good point. Or I suppose import jobs are another reason why one shouldn’t overload the `:db/txInstant` attribute. It’s really more “this is when it entered the system”, whereas creation time is a domain concern.
> wall clock times specified by `:db/txInstant` are imprecise as more than one transaction can be recorded in the same millisecond
you would want to set txInstant on imports, too https://docs.datomic.com/cloud/best.html#set-txinstant-on-imports
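i.e. something like this on each imported transaction (a sketch; the instant is the source system's original time, and the entity data is just a placeholder):
;; assert :db/txInstant on the reified transaction ("datomic.tx") so the
;; imported datoms keep their original wall-clock time; each imported tx's
;; txInstant must be >= the previous transaction's
(d/transact conn
  {:tx-data [{:db/id        "datomic.tx"
              :db/txInstant #inst "2015-01-01T00:00:00.000-00:00"}
             {:thing/id "abc-123"}]})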
even in systems with an RDBMS I like users of a system to provide specific times with their data, and also record transaction timestamps
What if the entity is created by users? Should I be managing `created-at`/`updated-at` times manually?
Ah found some good material on the matter: https://vvvvalvalval.github.io/posts/2017-07-08-Datomic-this-is-not-the-history-youre-looking-for.html
we tend to explicitly add dates because the historical data is not accessible to our external integrations via Datomic Analytics. also, if you ever replay your tx log from one database to another then the dates of the transactions will differ
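e.g. something roughly like this (the attribute names here are just illustrative):
;; an explicit, domain-level creation time, separate from :db/txInstant
(d/transact conn
  {:tx-data [{:db/ident       :thing/created-at
              :db/valueType   :db.type/instant
              :db/cardinality :db.cardinality/one
              :db/doc         "Creation time supplied by the caller"}]})

;; then transact it alongside the domain data
(d/transact conn
  {:tx-data [{:thing/id         "abc-123"
              :thing/created-at #inst "2021-05-24T12:00:00.000-00:00"}]})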
@U0GC1C09L “if you ever replay your tx log from one database to another then the dates of the transactions will differ” that’s not completely correct. the :db/txInstant assertion is in the tx log, so it will copy over unless you filter it out
The use case for allowing this is to back-date data, but the :db/txInstant of any new transaction must be >= the previous one, so this technique is limited to fresh databases
that's news to me, thanks for the correction @U09R86PA4