#datomic
2020-05-29
arohner11:05:26

hrm, it seems like my tuple write was failing, and I don’t understand why.

dmarjenburgh12:05:20

I'm trying to do an index-pull but running into a Datomic Client Exception:

clojure.lang.ExceptionInfo: Datomic Client Exception {:cognitect.anomalies/category :cognitect.anomalies/forbidden, :http-result {:status 403, :headers {"server" "Jetty(9.4.24.v20191120)", "content-length" "19", "date" "Fri, 29 May 2020 12:34:05 GMT", "content-type" "application/transit+msgpack"}, :body nil}}
	at datomic.client.api.async$ares.invokeStatic(async.clj:58)
	at datomic.client.api.async$ares.invoke(async.clj:54)
	at datomic.client.api.sync$channel__GT_seq.invokeStatic(sync.clj:72)
	at datomic.client.api.sync$channel__GT_seq.invoke(sync.clj:69)
	at datomic.client.api.sync$eval20791$fn__20808.invoke(sync.clj:113)
	at datomic.client.api.protocols$fn__11940$G__11875__11947.invoke(protocols.clj:126)
	at datomic.client.api$index_pull.invokeStatic(api.clj:293)
	at datomic.client.api$index_pull.invoke(api.clj:272)
I can query the db normally otherwise.
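For reference, a minimal `index-pull` call looks roughly like this (the attribute names are made up for illustration; `index-pull` also requires a client library and server that support it, which turns out to be relevant below):

```clojure
;; Sketch of datomic.client.api/index-pull over the :avet index.
;; :artist/name and :artist/country are hypothetical attributes,
;; and `conn` is assumed to be an existing client connection.
(require '[datomic.client.api :as d])

(def db (d/db conn))

(d/index-pull db
              {:index    :avet
               :selector '[:artist/name :artist/country]
               :start    [:artist/name]})
;; returns a lazy seq of pulled entity maps, in index order
```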

favila12:05:34

are you sure the target server supports it?

dmarjenburgh13:05:09

Haha, I was under the impression the upgrade was already deployed, but it was still in the pipeline :face_palm::skin-tone-3: . Works now.

arohner12:05:18

The fn is either a fully qualified function allowed under the :xforms key in resources/datomic/extensions.edn, or one of the following built-ins:
I can’t find anything else in the docs that references extensions.edn. Where can I learn more about that?

favila13:05:41

doubling down on this question, it’s also not clear to me whether the extension function needs to exist on the client’s classpath or the client-server’s classpath

favila13:05:00

or why this is necessary at all for the on-prem api

marshall15:05:34

The extensions.edn file needs to be available in the classpath at that relative path (`resources/datomic/extensions.edn`)

marshall15:05:46

it needs to be there in the system that will be doing the work

marshall15:05:51

so if you’re using peer, in the peer process

marshall15:05:08

for client, it needs to be in the cp of the peer-server process

marshall15:05:44

if you’re using it inside a transaction function, it would need to be in the transactor cp

favila16:05:37

And it looks like {:xforms #{var/name ,,,}} ?

marshall16:05:18

i believe the value is a vector (or list) of symbols

marshall16:05:27

set may work too

marshall16:05:42

based on cloud, I would say a vector of fully qualified symbols

marshall16:05:59

I’ll look at adding that detail in onprem docs
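Based on the description above, the file would presumably look something like this (the namespaces and function names are hypothetical; the shape is an assumption from this discussion, not from the docs):

```clojure
;; resources/datomic/extensions.edn — assumed shape: a vector of
;; fully qualified symbols under the :xforms key
{:xforms [my.app.xforms/trim-strings
          my.app.xforms/normalize-email]}
```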

jaret15:05:10

Howdy! We just released a fix for Datomic On-Prem Console. The latest release had a bug that caused console to fail to start. https://forum.datomic.com/t/datomic-console-0-1-225-now-available/1472

arohner15:05:37

Is it possible to use a lookup ref in the same transaction that creates the unique identity? It seems like the answer is no

marshall15:05:39

No, but you can use a tempid for that

arohner15:05:05

But then I need to know whether the unique identity already exists or not

marshall15:05:33

I think I’d need more detail. If you have one entity being asserted that has a unique ID and another that references it via tempid, Datomic’s entity resolution should handle that correctly whether or not the entity with the unique ID already exists. If it does, it becomes an upsert; if it doesn’t, it will be created

favila16:05:54

I’m guessing from our earlier conversation that Allen wants to use this with a unique-identity composite attr. I think this doesn’t work unless you assert the composite. e.g. {:db/id "tempid" :attr-a 123 :attr-b 456} where the upsert attr is :attr-a+b

marshall16:05:32

yes, agreed if you’re upserting you need to include the :attr-a+b in the transaction
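Putting the two comments above together, a sketch might look like this (`:attr/a`, `:attr/b`, and the composite `:attr/a+b` are hypothetical names; this follows the discussion here rather than a documented recipe):

```clojure
;; Schema sketch: a unique-identity composite tuple attribute.
[{:db/ident       :attr/a+b
  :db/valueType   :db.type/tuple
  :db/tupleAttrs  [:attr/a :attr/b]
  :db/cardinality :db.cardinality/one
  :db/unique      :db.unique/identity}]

;; Transaction sketch: assert both component attrs on one tempid and
;; include the composite value itself so upsert can resolve against an
;; existing entity.
[{:db/id    "tempid"
  :attr/a   123
  :attr/b   456
  :attr/a+b [123 456]}]
```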

arohner20:05:01

AFAICT, it doesn’t work with a scalar unique attribute either

marshall20:05:36

Can you provide your txn data and results you see not working?

arohner20:05:12

The code is kind of lengthy and it’s late here (London). I’m trying to build a ledger. When inserting transaction items:

{:db/ensure ::accounts/tx-item
 ::accounts/account [::accounts/account-id (::accounts/account-id i)]
 ::money/currency (-> i ::accounts/tx-amount :currency keyword)
 ::money/value (-> i ::accounts/tx-amount :value)}

arohner20:05:13

I’m trying to insert the :accounts/account entity in the same transaction as the tx-items. Inserting the tx-items fails with

:cognitect.anomalies/category :cognitect.anomalies/incorrect, :cognitect.anomalies/message "Unable to resolve entity: [:griffin.proc.accounts/account-id #uuid \"17af261f-9ad5-58e6-938f-b3b7a0ffee22\"] in datom [-9223301668109421343 :griffin.proc.accounts/account [:griffin.proc.accounts/account-id #uuid \"17af261f-9ad5-58e6-938f-b3b7a0ffee22\"]]"

marshall20:05:26

You can’t use the lookup ref

marshall20:05:33

You need to use the tempid

marshall20:05:45

If you’re creating the entity in the same transaction

marshall20:05:43

Create the account with a :db/id of "foo"

marshall20:05:58

And "foo" in place of your lookup ref

arohner20:05:10

Right. That’s not convenient, because it requires me to know whether the entity already exists, which requires an extra query

marshall20:05:35

Not if you have the account entity in the same txn

marshall20:05:16

[{:account/id "someuniquevalue"
  :db/id "foo"}
 {:transaction/value 20
  :transaction/account "foo"}]

marshall20:05:32

if account/id “someuniquevalue” exists, it will upsert

marshall20:05:35

if not it will create

marshall20:05:56

either way, the txn with value 20 will have a ref attr pointing to that account

arohner20:05:41

It’s been several years since I used datomic in anger. At the time, the advice was don’t assert facts unnecessarily. Won’t that create new datoms every time, even if the account already exists?

marshall20:05:32

datomic does redundancy elimination

marshall20:05:40

if the acct entity exists it will upsert

marshall20:05:44

if it doesn’t it will be created

marshall20:05:05

any attr/val pairs that already exist for that entity will be eliminated if the value is identical

marshall20:05:17

if the value is different it will retract the old value and assert the new value

marshall20:05:31

if the attr is not present at all on that entity it will assert the attr/value for that entity

marshall20:05:28

Not sure where “don’t assert facts unnecessarily” would come from. Certainly doing the work of redundancy elimination has some cost, but I would not expect it to be prohibitive, especially in this case, as you have to “find” the account entity either way, whether it’s via the entity being asserted or via the lookup ref

marshall20:05:19

a completely redundant txn would still create a :db/txInstant datom

marshall20:05:40

so if everything you assert is duplicate, you’d be accumulating an “unnecessary” couple of datoms

marshall20:05:02

which, again, is not a big deal as long as you aren’t doing it in huge numbers

marshall20:05:25

i.e. here and there: totally nbd. Every single minute, 10 times a minute, all the time… maybe not so great
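The redundancy-elimination behavior described above can be sketched like this (assuming a live client connection `conn` and a unique-identity `:account/id` attribute; not verified against a running system):

```clojure
(require '[datomic.client.api :as d])

;; First transact asserts the new datoms (plus the :db/txInstant datom):
(d/transact conn {:tx-data [{:account/id "acct-1" :account/name "Ops"}]})

;; Transacting the identical map again: the redundant assertions are
;; eliminated, so :tx-data should contain only the new :db/txInstant datom.
(:tx-data
 (d/transact conn {:tx-data [{:account/id "acct-1" :account/name "Ops"}]}))
```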

arohner20:05:42

That’s good to know

arohner20:05:57

The rest of the transaction definitely has to happen and will have novelty, so it sounds like nbd

kschltz21:05:17

Hi there! We're currently using Datomic Cloud and I've been stuck with the following: we have several source applications providing financial data, each one with its own payload, so we decided to have a 'normalizer' service to convert each format to a common payload, so we can build our products in an agnostic manner. To illustrate this:

;; input from source A:
{:source.a/name   "John Doe"
 :source.a/amount 44.50}

;; would become something like:
{:common/name   "John Doe"
 :common/amount 4450
 :common/source {:source.a/name   "John Doe"
                 :source.a/amount 44.50}}
We chose to keep the original format in the final structure to maintain some backtracking and to ease integration with legacy systems. Now, say there is a buggy implementation in this conversion function, rounding floats or making some other error, and we end up with incorrect values in the common payload, but we still have the original data. Does Datomic have any support for bulk-'altering' that data? The first solution that came to mind was to query all the incorrect data, extract the source info, pass it through the corrected function, then transact it back to Datomic. But I wonder: does Datomic have any feature to better support that, something closer to a "compare and swap-like" feature? Thanks to you all, patient readers 😄

marshall21:05:41

You'd need to handle the bulk nature yourself. Also, if your entities are cardinality one, you could just reassert all the values

marshall21:05:21

Ones that were the same would be unchanged (redundancy elimination)

marshall21:05:46

Ones that differ would be "upserted"

marshall21:05:10

If the attributes are cardinality-many, you’d need to retract them explicitly
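For the cardinality-many case, the explicit retraction might look like this (entity and attribute names are hypothetical):

```clojure
;; Reasserting a cardinality-many attr does not replace old values, so the
;; stale value must be retracted explicitly alongside the corrected assertion:
[[:db/retract [:common/id "payment-1"] :common/tags "wrong-tag"]
 [:db/add     [:common/id "payment-1"] :common/tags "right-tag"]]
```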

kschltz21:05:51

thanks a lot