This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2021-11-02
Channels
- # announcements (11)
- # aws (2)
- # babashka (42)
- # beginners (26)
- # calva (17)
- # cider (1)
- # clara (2)
- # clj-kondo (44)
- # clojars (30)
- # clojure (43)
- # clojure-australia (6)
- # clojure-europe (29)
- # clojure-gamedev (4)
- # clojure-greece (1)
- # clojure-nl (4)
- # clojure-spec (4)
- # clojure-uk (6)
- # clojurescript (28)
- # cursive (16)
- # data-science (1)
- # datahike (4)
- # datomic (26)
- # emacs (6)
- # events (3)
- # fulcro (11)
- # graalvm (7)
- # holy-lambda (118)
- # java (9)
- # jobs (1)
- # leiningen (3)
- # lsp (21)
- # luminus (2)
- # malli (13)
- # membrane-term (1)
- # music (1)
- # nrepl (3)
- # off-topic (38)
- # pedestal (2)
- # polylith (39)
- # re-frame (33)
- # reagent (7)
- # releases (1)
- # remote-jobs (4)
- # rewrite-clj (28)
- # ring (21)
- # sql (2)
- # tools-deps (23)
- # vim (4)
- # xtdb (15)
Hey @jaret, thanks for your reply! It was a database used in our dev environment. I find it hard to believe that it is not possible to restore backups in Datomic Cloud. No, we only deleted a backed-up table in DynamoDB and then restored it. The purpose was to import a production table to recreate a bug from the production environment.
There is a chance the previous developer created Datomic backups - but backup/restore has not been added for Datomic Cloud yet? https://forum.datomic.com/t/cloud-backups-recovery/370/2 It is kind of crazy that a database does not support backup/restore.
Hi @allard.valtech. Thanks for using Datomic! A few points: 1. You can use https://docs.datomic.com/cloud/dev-local.html#import-cloud to make a local copy of a production database for dev purposes. 2. The resources created with a Datomic Cloud system (e.g. DDB, EFS, and S3) should be managed only through Datomic tools. AWS tools are not aware of Datomic semantics and cannot preserve system invariants between different resources. We will look at making the documentation clearer on this point. 3. We are working on new ideas for data mobility and would love to hear about your specific use case if it is not covered by import-cloud.
Thanks for the clarification @U072WS7PE 🙂 I think import-cloud should cover our use case. Being new to Datomic, it seems like we've made some incorrect assumptions. Do you have any pointers on what action we could take in order to be able to write to the restored table? The restored table has the same ARN, so it seems like it should work
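For anyone following along, an import-cloud call looks roughly like this. Every system, region, endpoint, and db name below is a placeholder (not from this thread), and the exact arg-map keys should be checked against the dev-local page linked above:

```clojure
(require '[datomic.dev-local :as dl])

;; copy a Cloud database into dev-local for local experimentation
;; (all names and the endpoint below are placeholders)
(dl/import-cloud
 {:source {:system      "my-cloud-system"
           :db-name     "production-db"
           :server-type :ion
           :region      "eu-west-1"
           :endpoint    "https://entry.my-cloud-system.eu-west-1.datomic.net:8182/"}
  :dest   {:system      "dev"
           :server-type :dev-local
           :db-name     "production-copy"}})
```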
Hi Everyone,
I wonder if anyone might be able to help me with something?
I am trying to use an entity spec with Datomic to provide constraint checking when entities are transacted/retracted from the database using :db/ensure. This works fine - if my predicate fails, the entities are not transacted into Datomic. But in the case where the predicate passes, the :db/ensure key/value also appears to be stored in the database - it's visible when I pull the entity out again. This isn't what I expected to happen according to the docs:
https://docs.datomic.com/cloud/schema/schema-reference.html#entity-specs
":db/ensure is a virtual attribute. It is not added in the database; instead it triggers checks based on the named entity."
Have I done something wrong causing :db/ensure to be added to the database? Or have I misunderstood the docs?
I've got a simple test case showing this behaviour if it helps see what I mean.
If anyone can point me in the right direction that would be much appreciated, cheers!
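For context, the setup I'm describing looks roughly like this (Cloud client API; all names are illustrative, not the actual schema from my project):

```clojure
;; a predicate the entity spec calls at transaction time
(defn scores-ordered? [db eid]
  (let [{:keys [score/low score/high]} (d/pull db [:score/low :score/high] eid)]
    (< low high)))

;; install the entity spec: required attrs plus the predicate
(d/transact conn {:tx-data [{:db/ident        :score/guard
                             :db.entity/attrs [:score/low :score/high]
                             :db.entity/preds 'my.app/scores-ordered?}]})

;; asserting :db/ensure triggers the checks; per the docs it is "virtual"
;; and should not be stored on the entity - which is what I'm questioning
(d/transact conn {:tx-data [{:score/low  5
                             :score/high 10
                             :db/ensure  :score/guard}]})
```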
I get the same behaviour in on-prem. Maybe the documentation is only valid for Datomic Cloud?
Thanks Fredrik 🙂
Good thinking, but looking at the on-prem docs it seems the behaviour should be the same (that is, it sounds like the :db/ensure key shouldn't end up in the db), so I'm not sure that's the reason:
https://docs.datomic.com/on-prem/schema/schema.html#entity-specs
I'm just guessing, but maybe being "virtual" has to do with the way the value of the attribute is resolved: The fact such an attribute exists is in fact recorded to the database, but the lookup is done at runtime using the classpath.
Thanks Fredrik. I'll keep looking and let you know if I work out what's up/stop it behaving like this/hear why it is like it is :)
Hi Cognitect, since Datomic Cloud 2021/09/29 936-9118, is this still true for Ion apps?
> There is no redirection when running in Datomic Cloud, so it is ok to leave calls to initialize-redirect in your production code.
I'm investigating why our log casts no longer appear in CloudWatch.
We do init our app, even in prod deployments, with (cast/initialize-redirect :tap), and if I add-tap using a remote REPL connection to the Ion app, I see that the casts are indeed tapped. Given the documentation I'd expect them not to be tapped, but sent to CloudWatch instead. That's what happened until recently, and I wonder if it correlates with our update to 936-9118.
Or the correlation could be to ion 1.0.57, even though the changelog isn't exactly about my issue.
I tried reverting to v 0.9.50 of ion.
I tried not running (cast/initialize-redirect :tap) in deployed ion code.
Logs are not back up yet.
Continuing to try to find a proper cause.
As of now I'm certain the new version of Datomic is not the cause. False alarm, Cognitect. 🙂
I found the cause. Sometime ago I started loading this file upon booting, even when in deployment in Datomic Cloud Ion. I shouldn't have.
Cognitect, please note that your documentation appears to be out of date. That is, calling initialize-redirect in production does have an effect, and blocks logs from appearing in CloudWatch.
The dark background screenshot shows that I needed to comment out these forms to bring back our logging capabilities. So I'll need to load this only in local dev, like I did before.
It was ok to bring back the ns with its require during boot, but not both expressions below.
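A sketch of the fix, assuming a hypothetical LOCAL_DEV environment variable that is only set on developer machines; the point is simply that deployed Ion code must not call initialize-redirect:

```clojure
(require '[datomic.ion.cast :as cast])

;; only redirect casts to tap> in local dev; in a deployed Ion, leave
;; casts alone so they flow to CloudWatch
;; (LOCAL_DEV is a made-up flag - use whatever dev/prod signal you have)
(when (System/getenv "LOCAL_DEV")
  (cast/initialize-redirect :tap))
```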
I'm having trouble connecting to local postgres.
I have a docker-compose with hosts postgres and datomicdb.
My connection string is "datomic:sql://?jdbc:
My transactor has
protocol=sql
host=postgres
port=4334
license-key=LICENSE_KEY
sql-url=jdbc:
sql-user=datomic
sql-password=datomic
sql-driver-class=org.postgresql.Driver
memory-index-threshold=32m
memory-index-max=256m
object-cache-max=128m
When I try to connect from Clojure I get:
clojure | Error communicating with HOST postgres on PORT 4334
So I try swapping out the postgres host for datomicdb and I get:
clojure | :db.error/read-transactor-location-failed Could not read transactor location from storage
figured it out ...
The issue was I had put my host as postgres. The host was supposed to be datomicdb.
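For reference, an on-prem SQL-storage connection URI has the shape datomic:sql://<db-name>?<jdbc-url>, so with the compose host from this thread it would look roughly like this (the db name, user, and password are assumptions):

```clojure
(require '[datomic.api :as d])

;; the JDBC host must be the Postgres service name from docker-compose
;; ("datomicdb" here, per the thread), matching the transactor's sql-url
(def uri
  "datomic:sql://my-db?jdbc:postgresql://datomicdb:5432/datomic?user=datomic&password=datomic")

(d/create-database uri)
(def conn (d/connect uri))
```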
Hey, I’m writing a translation from malli schemas to Datomic schema. I’m puzzled about how to translate [:vector string?].
I think it should be a homogeneous tuple like:
{:db/ident :order/comments
:db/valueType :db.type/tuple
:db/cardinality :db.cardinality/one
:db/tupleType :db.type/string}
But my colleague suggests it may be just a cardinality/many string type prop.
Tuples only support up to 8 values, so I think your colleague might be on to something.
thanks! I missed it
order is important though
Ok, if order is important, then you need to represent an ordinal somehow.
:order/comments #{{:ordinal 1 :comment/text ":100:"} {:ordinal 2 :comment/text "Meh."}}
Hmm, yes, thanks for the note! I must specify that I’m writing a transformer function, to translate an arbitrary malli schema to a datomic schema, so I’m talking about potential input. I guess I should add warnings data to the output then.
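For the ordered, unbounded case, the component-entity-with-ordinal idea sketched above could look like this as Datomic schema (attribute names are illustrative):

```clojure
[{:db/ident       :order/comments
  :db/valueType   :db.type/ref
  :db/cardinality :db.cardinality/many
  :db/isComponent true}
 {:db/ident       :comment/ordinal
  :db/valueType   :db.type/long
  :db/cardinality :db.cardinality/one}
 {:db/ident       :comment/text
  :db/valueType   :db.type/string
  :db/cardinality :db.cardinality/one}]
```

Sorting pulled comments by :comment/ordinal at read time then recovers the original order.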