2020-10-08
# datomic
Is it a known bug that when there's a bunch of datums that get transacted simultaneously, it can randomly cause a :db.error/tempid-not-an-entity tempid '17503138' used only as value in transaction error?
The meaning of this error is that the string “17503138” is used as a tempid that appears as the value of an assertion, but there is no place where the tempid is used as the entity id of an assertion; the latter is necessary for Datomic to decide whether to mint a new entity id or resolve it to an existing one
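For illustration, a minimal sketch of the failing shape ("company" is a hypothetical tempid; the attributes come from the maps posted below):

;; The tempid "17503138" appears only on the value side of assertions, so
;; Datomic has no basis for minting or resolving an entity id for it:
[{:db/id "company", :account/accounts ["17503138"]}]

;; Adding a map that uses the same tempid as its :db/id resolves it:
[{:db/id "17503138", :account/email "REDACTED"}
 {:db/id "company", :account/accounts ["17503138"]}]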
Well, as you can see in the actual datums I posted, it clearly is being used as :db/id.
I had my program dump all datums into a file before transacting, and I copied the two that refer to this string over into here
In your example, I see the second item says :account/accounts “17503138”. Are both these maps together in the same transaction?
(Btw a map is not a datum but syntax sugar for many assertions—it’s a bit confusing to call it that)
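For reference, a quick sketch of that desugaring, reusing attributes from the maps below: a map form expands into one :db/add assertion per attribute value.

;; The entity map
{:db/id "17503138", :account/job-title "Investor", :account/location 2643743}
;; is sugar for the list-form assertions
[:db/add "17503138" :account/job-title "Investor"]
[:db/add "17503138" :account/location 2643743]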
Yes, they are both together in the same transaction. True, I mixed up the terminology... Entity would be more fitting
Yes, reliably, every time with the same dataset. Both locally with a dev database as well as on our staging server using PostgreSQL.
I had that same issue a while back in a normal transaction without conformity as well though
I’ve only ever used conformity for schema migrations; using it for data seems novel; but I’m suspicious that these are really not in the same transaction
See if you can get it to dump the full transaction that fails and make sure both maps mentioning that tempid are in the same transaction
It is often caused by one single entry that is the same structure as many others. Everything is fine, but for some reason, Datomic doesn't like it. Removing that one entry solves the problem.
why are both of those entity maps in separate vectors?
If you’re adding them with d/transact, all of the entity maps and/or datoms passed under the :tx-data key need to be in the same collection
based on the problem you described, I would expect that error if you transacted the first of those, and then tried the second of those in a separate transaction
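A sketch of that difference, assuming the client API's :tx-data shape and hypothetical account-map and company-map bindings for the two maps below:

;; Both maps in one :tx-data collection: the tempid "17503138" resolves
;; within the single transaction.
(d/transact conn {:tx-data [account-map company-map]})

;; Split across two transactions, the second only ever sees "17503138"
;; as a value, producing :db.error/tempid-not-an-entity:
(d/transact conn {:tx-data [account-map]})
(d/transact conn {:tx-data [company-map]})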
The two datums causing problems:
[{:account/photo "REDACTED",
  :account/first-name "REDACTED",
  :account/bio "REDACTED",
  :account/email-verified? false,
  :account/location 2643743,
  :account/vendor-skills [17592186045491],
  :account/id #uuid "dd33747e-5c13-4779-8c23-9042460eb3f3",
  :account/vendor-industry-experiences [],
  :account/languages [17592186045618 17592186045620],
  :account/vendor-specialism 17592186045640,
  :account/links
  [{:db/id "REDACTED",
    :link/id #uuid "ea51184c-d027-44d0-8f20-df222e58daf3",
    :link/type :link-type/twitter,
    :link/url "REDACTED"}
   {:db/id "REDACTED",
    :link/id #uuid "c9577ca4-332d-41f0-b617-c00e89fc94b4",
    :link/type :link-type/linkedin,
    :link/url "REDACTED"}],
  :account/last-name "REDACTED",
  :account/email "REDACTED",
  :account/vendor-geo-expertises [17592186045655 17592186045740 17592186045648],
  :db/id "17503138",
  :account/vendor-type 17592186045484,
  :account/roles [:account.role/vendor-admin],
  :account/job-title "Investor"}]
and
[{:account/primary-account "17503138",
:company/headline "REDACTED",
:account/accounts ["17503138"],
:tenant/tenants [[:tenant/name "REDACTED"]],
:company/name "REDACTED",
:company/types [:company.type/contact],
:db/id "REDACTED",
:company/id #uuid "ee26b11f-53ba-43f9-a59b-f7ad1a408d41",
:company/domain "REDACTED"}]
During a meetup recording that I haven't uploaded yet, I captured my own Maven private token from https://cognitect.com/dev-tools/view-creds.html. Is there a way I can regenerate that token?
Can you send an email to [email protected] and we will help with this?
thank you, I've just sent an email over
Hey, I'm missing something and can't figure out what. I am running a transaction on Datomic:
(defn add-source [conn {:keys [id name]
:or {id (d/squuid)}}]
(let [tx {;; Source initial state
:db/id (d/tempid :db.part/user)
:source/id id
:source/storage-type :source.storage-type/disk
:source/job-status :source.job-status/dispatched
:source/created (java.util.Date.)
:source/name name}]
@(d/transact conn [tx])))
;; and then later API will call
(add-source conn entity-data)
After I call add-source, an entity is created, but after another call is made the old entity is rewritten. Only if I call transact with multiple transactions can I create multiple entities; otherwise the old entity keeps being rewritten. I am new to Datomic and can't find any resources about this. Can anyone help?
Tempids resolve to existing entities if you assert a :db.unique/identity attribute value on them that already exists. Are any of these attributes :db.unique/identity? Are you sure you are not supplying an id argument to your function?
(btw I would separate transaction data creation into a separate function so it’s easier to inspect)
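A sketch of the upsert behavior being described, assuming the on-prem peer API and that one of the asserted attributes (here :source/name, per the schema that comes up below) is :db.unique/identity:

;; First call mints a brand-new entity.
@(d/transact conn [{:db/id (d/tempid :db.part/user)
                    :source/id (d/squuid)
                    :source/name "my-source"}])

;; A second call asserting the same :db.unique/identity value resolves the
;; tempid to the EXISTING entity, so its other attributes are overwritten.
@(d/transact conn [{:db/id (d/tempid :db.part/user)
                    :source/id (d/squuid)
                    :source/name "my-source"}])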
{:db/doc "Source ID"
:db/ident :source/id
:db/valueType :db.type/uuid
:db/cardinality :db.cardinality/one
:db/id #db/id [:db.part/db]
:db.install/_attribute :db.part/db}
If I removed :db/id from the transaction, I should still be able to create a new entity, right? But every time the first one is rewritten
Can you share something that shows you calling add-source twice with the returned tx data, pointing out what you think is wrong with the result of the second call?
Ok, I had unique on another attribute:
{:db/doc "Source name"
:db/ident :source/name
:db/unique :db.unique/identity
:db/valueType :db.type/string
:db/cardinality :db.cardinality/one
:db/id #db/id [:db.part/db]
:db.install/_attribute :db.part/db}
If I remove it, all entities are created and it works how I expected. So I will read more about unique attributes. Thanks @U09R86PA4, I would not have noticed it without your help!
Well, I guess I am going to do my migrations using a home-made solution now. I just lost all trust in Conformity. I noticed it doesn't write anything to the database most of the time.
I have a migration that is in a function. Conformity runs the function normally, but instead of transacting the data returned from it, it just discards it. The data is definitely valid; I made my migration so it also dumps the data into a file. I can load that file as EDN and transact it to the db using d/transact perfectly fine.
not sure what to tell you. you need to analyze this further before throwing up your hands
Conformity does bookkeeping to decide whether a “conform” was already run on that database. If you’re running the same key name against the same database a second time, it won’t run again. Is that what you are doing?
Well, the transaction is changing the schema, and then transforming the data that is in there.
you can use conforms-to? to test whether conformity thinks the db already has the norm you are trying to transact
Well, what is the second argument to conforms-to?? It's neither the file name nor the output of c/read-resource
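For reference, a sketch based on conformity's README: the second argument to conforms-to? is the keyword name of a norm, i.e. one of the top-level keys of the norms map, not the file name or the map itself (the norm name here is hypothetical):

(require '[io.rkn.conformity :as c]
         '[datomic.api :as d])

(def norms-map (c/read-resource "migrations.edn"))

;; Transact any norms not yet recorded on this database.
(c/ensure-conforms conn norms-map [:my-project/my-migration])

;; Ask conformity's bookkeeping whether a single norm has been run.
(c/conforms-to? (d/db conn) :my-project/my-migration)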
heya, coming here for a question about Datomic Cloud. I've noticed that while developing on a REPL, I get exceptions as described in the datomic.client.api docs:
All errors are reported via ex-info exceptions, with map contents as specified by cognitect.anomalies.
But on the live system, these exceptions don't seem to be ex-info exceptions, just normal errors. At any rate, ex-data returns nil for them. Does anyone know if this is intended? I couldn't find information about this differing behaviour.
A good example of these exceptions is malformed queries for q. On the REPL, connected via the datomic binary, I get this return from ex-data:
{:cognitect.anomalies/category :cognitect.anomalies/incorrect, :cognitect.anomalies/message "Query is referencing unbound variables: #{?string}", :variables #{?string}, :db/error :db.error/unbound-query-variables, :dbs [{:database-id "48e8dd4d-84bb-4216-a9d7-4b4d17867050", :t 97901, :next-t 97902, :history false}]}
But on the live system, I get nil.
I think so, yeah
I have an Ion handling HTTP requests directly, and the REPL is calling the handler that's registered on the Ion
so it should be the same code running
we can see on the aws logs that the error is of a different shape
let me dig it up
on the aws logs, logging the exception, shows this
{
"Msg": "Alpha API Failed",
"Ex": {
"Via": [
{
"Type": "com.google.common.util.concurrent.UncheckedExecutionException",
"Message": "clojure.lang.ExceptionInfo: :db.error/not-a-binding-form Invalid binding form: :entity/graph {:cognitect.anomalies/category :cognitect.anomalies/incorrect, :cognitect.anomalies/message \"Invalid binding form: :entity/graph\", :db/error :db.error/not-a-binding-form}",
"At": [
"com.google.common.cache.LocalCache$Segment",
"get",
"LocalCache.java",
2051
]
},
{
"Type": "clojure.lang.ExceptionInfo",
"Message": ":db.error/not-a-binding-form Invalid binding form: :entity/graph",
"Data": {
"CognitectAnomaliesCategory": "CognitectAnomaliesIncorrect",
"CognitectAnomaliesMessage": "Invalid binding form: :entity/graph",
"DbError": "DbErrorNotABindingForm"
},
"At": [
"datomic.core.error$raise",
"invokeStatic",
"error.clj",
55
]
}
],
(note: this was not the same unbound var query as above)
printing the error on the repl, we see this instead
#error {
:cause "Invalid binding form: :entity/graph"
:data {:cognitect.anomalies/category :cognitect.anomalies/incorrect, :cognitect.anomalies/message "Invalid binding form: :entity/graph", :db/error :db.error/not-a-binding-form, :dbs [{:database-id "48e8dd4d-84bb-4216-a9d7-4b4d17867050", :t 97058, :next-t 97059, :history false}]}
:via
[{:type clojure.lang.ExceptionInfo
:message "Invalid binding form: :entity/graph"
:data {:cognitect.anomalies/category :cognitect.anomalies/incorrect, :cognitect.anomalies/message "Invalid binding form: :entity/graph", :db/error :db.error/not-a-binding-form, :dbs [{:database-id "48e8dd4d-84bb-4216-a9d7-4b4d17867050", :t 97058, :next-t 97059, :history false}]}
:at [datomic.client.api.async$ares invokeStatic "async.clj" 58]}]
more precisely, (ex-data e) returns the anomaly inside that exception
I imagine the datomic client wraps the exception doing something like (ex-info e anomaly cause)
we're not wrapping it on our end, just calling ex-data over it to get the anomaly
but on the live system, ex-data over the exception returns nil
which I think means it wasn't created with ex-info
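That reasoning is easy to check: ex-data returns the data map only for exceptions created via ex-info (anything implementing IExceptionInfo), and nil for everything else.

(ex-data (ex-info "boom" {:a 1}))    ;=> {:a 1}
(ex-data (RuntimeException. "boom")) ;=> nil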
I mean, I wouldn't be surprised if this is indeed intended to not leak information on the live system
that anomaly contains database ids, time info, and history info
just wanted to make sure if it was intended or not before working around it
@filipematossilva are you saying that you are not able to get a :cognitect.anomalies/incorrect from your failing query on the client side?
if by client side you mean "what calls the live datomic cloud system", then yes, that's it
@filipematossilva so what's different about your "live system" vs. the repl?
I really don't know, that's what prompted this question
regarding printing the error
I'm printing the exception proper like this:
(cast/alert {:msg "Alpha API Failed"
:ex e})
on the live system the cast prints this
oh, yeah that's a com.google.common.util.concurrent.UncheckedExecutionException at the outermost layer
on the repl, when cast is redirected to stderr, the datomic binary shows this
just realized that the logged response there on the live system wasn't complete, let me fetch the full thing
ok this is the full casted thing on aws logs
now that I look at the full cast on live, I can definitely see the cause and data fields there
which leaves me extra confused 😐
in your REPL, you are getting an exception that is:
- clojure.lang.ExceptionInfo + anomaly data
in your live system you are getting:
- com.google.common.util.concurrent.UncheckedExecutionException
- clojure.lang.ExceptionInfo + anomaly data
to work around temporarily, you can do (-> e ex-cause ex-data) to unwrap the outer layer
I can see that via indeed shows different things, as you say, but the toplevel still shows data and cause for both situations. I imagine that data would be returned from ex-data
let me edit those code blocks to remove the trace, I think it's adding a lot of noise and not helping
I think it's important to separate the exception object chain from the data that represents it (which may pull data from the root exception, not from the top exception)
Throwable->map for example pulls :cause, :data, :via from the root exception (deepest in the chain)
@alexmiller it's not clear to me what you mean by that in the current context
(besides the factual observation)
is it that you also think that the different behaviour between the repl+datomic binary and live system should be overcome by calling Throwable->map prior to extracting the data via ex-data?
I’m just saying that the data you’re seeing is consistent with what Ghadi is saying
Even though that may be confusing
ok I think I understand what you mean now
thank you for explaining
currently deploying your workaround, and testing
@filipematossilva this is in an Ion correct?
the workaround is fine enough for me, but maybe you'd like more information about this?
@marshall correct
in a handler-fn for http-direct
@ghadi I replaced my (ex-data e) with this fn:
(defn error->error-data [e]
;; Workaround for a difference in the live datomic system where clojure exceptions
;; are wrapped in a com.google.common.util.concurrent.UncheckedExecutionException.
;; To get the ex-data on live, we must convert it to a map and access :data directly.
(or (ex-data e)
(-> e Throwable->map :data)))
I can confirm this gets me the anomaly for the live system
slightly different than on the repl still
live:
{:cognitect.anomalies/category :cognitect.anomalies/incorrect, :cognitect.anomalies/message "Invalid binding form: :entity/graph", :db/error :db.error/not-a-binding-form}
repl:
{:cognitect.anomalies/category :cognitect.anomalies/incorrect, :cognitect.anomalies/message "Invalid binding form: :entity/graph", :db/error :db.error/not-a-binding-form, :dbs [{:database-id "48e8dd4d-84bb-4216-a9d7-4b4d17867050", :t 97901, :next-t 97902, :history false}]}
which makes sense, because in the live exception the :dbs property just isn't there
but tbh that's the one that really shouldn't be exposed
so that's fine enough for me
thank you
Does anyone know how I get the t from a tx? I tried (d/tx->t tx), but my tx is a map, and the conversion throws an error:
{:db-before datomic.db.Db@..., :db-after datomic.db.Db@..., :tx-data [#datom[13194139534369 50 #inst "2020-10-08T19:30:19.852-00:00" 13194139534369 true] #datom[277076930200610 169 #inst "2020-10-08T06:00:59.275-00:00" 13194139534369 true] #datom[277076930200610 163 17592186045452 13194139534369 true] #datom[277076930200610 165 277076930200584 13194139534369 true] #datom[277076930200610 170 17592186045454 13194139534369 true] #datom[277076930200610 162 277076930200581 13194139534369 true] #datom[277076930200610 167 #inst "2020-10-08T19:30:19.850-00:00" 13194139534369 true] #datom[277076930200610 168 17592186045432 13194139534369 true] #datom[277076930200610 166 #uuid "5f7f68cb-08f0-4cb2-964b-4e811a34a949" 13194139534369 true]], :tempids {-9223090561879066169 277076930200610}}
java.lang.ClassCastException: clojure.lang.PersistentArrayMap cannot be cast to java.lang.Number
You need to grab the tx from a datom in :tx-data, in your case 13194139534369. I think something like (-> result :tx-data first :tx) will give you it
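Putting that together, a sketch assuming the on-prem peer API and a tx-data binding for the transaction data:

(let [result @(d/transact conn tx-data)
      ;; every datom in :tx-data carries the transaction entity id in its :tx field
      tx-eid (-> result :tx-data first :tx)] ; here 13194139534369
  (d/tx->t tx-eid))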
Q: I want to store 3rd party oauth tokens in Datomic. Storing them as cleartext is not secure enough so I plan to use KMS to symmetrically encrypt them before storage. Has anyone done something like this before? If so, any advice? Or is there an alternative you would recommend?
KMS interaction patterns are not meant for encrypting/decrypting fine-grained items; the usual approach is envelope encryption, where KMS encrypts a data-encryption key (DEK) rather than each item
so when you boot up, you ask KMS to decrypt the DEK, then you use the DEK to decrypt fine-grained things in the application
if you talk to KMS every time you want to decrypt a token, you'll pay a fortune and add a ton of latency
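For illustration, a sketch of that envelope-encryption pattern using Cognitect's aws-api plus JVM AES-GCM; the key alias, bindings, and helper name are hypothetical:

(require '[cognitect.aws.client.api :as aws])
(import '(javax.crypto Cipher)
        '(javax.crypto.spec SecretKeySpec GCMParameterSpec)
        '(java.security SecureRandom))

(def kms (aws/client {:api :kms}))

;; One-time setup: have KMS mint a data-encryption key (DEK). Persist the
;; :CiphertextBlob of the response; never persist the :Plaintext key.
(aws/invoke kms {:op :GenerateDataKey
                 :request {:KeyId "alias/my-app", :KeySpec "AES_256"}})

;; At boot: a single KMS call recovers the plaintext DEK from the stored blob
;; (blob fields come back as streams; coerce to a byte array as needed).
(aws/invoke kms {:op :Decrypt, :request {:CiphertextBlob stored-encrypted-dek}})

;; From then on, encrypt/decrypt each token locally with the in-memory DEK.
(defn encrypt-token
  "AES-GCM encrypt token with the DEK; returns the random IV plus ciphertext."
  [^bytes dek ^String token]
  (let [iv     (byte-array 12)
        _      (.nextBytes (SecureRandom.) iv)
        cipher (doto (Cipher/getInstance "AES/GCM/NoPadding")
                 (.init Cipher/ENCRYPT_MODE
                        (SecretKeySpec. dek "AES")
                        (GCMParameterSpec. 128 iv)))]
    {:iv iv, :ciphertext (.doFinal cipher (.getBytes token "UTF-8"))}))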
if I am weighing pros/cons of DEK/Datomic vs Secrets Manager, what are the advantages of using Datomic?
It’s a Salesforce OAuth, so the refresh period is configurable, I believe. Would need to check.