This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2018-07-03
Channels
- # aleph (3)
- # beginners (139)
- # boot (3)
- # cider (12)
- # cljs-dev (18)
- # clojure (100)
- # clojure-dev (21)
- # clojure-dusseldorf (5)
- # clojure-germany (1)
- # clojure-italy (35)
- # clojure-nl (26)
- # clojure-spec (4)
- # clojure-uk (60)
- # clojurescript (11)
- # clojutre (4)
- # cursive (21)
- # data-science (21)
- # datomic (47)
- # editors (3)
- # emacs (2)
- # events (4)
- # figwheel (2)
- # fulcro (28)
- # jobs (27)
- # jobs-discuss (21)
- # lein-figwheel (3)
- # midje (2)
- # off-topic (20)
- # om-next (4)
- # onyx (10)
- # overtone (1)
- # pedestal (2)
- # portkey (14)
- # re-frame (71)
- # reagent (44)
- # reitit (11)
- # remote-jobs (1)
- # ring-swagger (4)
- # shadow-cljs (64)
- # spacemacs (11)
- # testing (2)
- # tools-deps (8)
- # vim (8)
What are the recommended ways to test a transaction function with Datomic Cloud? I can see that a traditional fixture approach will work against a Datomic Cloud service (create-database, add schema, add data, run tests, drop database). I can't see how generative testing could work because of the db parameter, but I do see we can treat that as static at least (ref https://docs.datomic.com/cloud/transactions/transaction-functions.html#testing). Feels like using Datomic Free in-mem would be faster, but I think that doesn't have the cloud API (likely premature optimisation in that thought).
Instead of trying to interpret/validate the data returned I guess I could be observing the normalised data associated with the change it makes on the database via the Log API.
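A minimal sketch of that fixture approach with the datomic.client.api Cloud client. The system, region, endpoint, db-name, and schema below are all illustrative assumptions, not values from this conversation:

```clojure
(ns tx-fn-fixture-test
  (:require [clojure.test :refer [use-fixtures]]
            [datomic.client.api :as d]))

;; Assumed connection details -- replace with your own system's values.
(def client
  (d/client {:server-type :cloud
             :system      "my-system"
             :region      "us-east-1"
             :endpoint    "https://entry.my-system.us-east-1.datomic.net:8182/"}))

(def db-name "tx-fn-test")

;; A hypothetical minimal schema for the tests.
(def schema
  [{:db/ident       :player/name
    :db/valueType   :db.type/string
    :db/cardinality :db.cardinality/one}])

(defn db-fixture
  "create-database -> add schema -> run tests -> drop database."
  [run-tests]
  (d/create-database client {:db-name db-name})
  (let [conn (d/connect client {:db-name db-name})]
    (d/transact conn {:tx-data schema})
    (run-tests))
  (d/delete-database client {:db-name db-name}))

(use-fixtures :once db-fixture)
```

This only runs against a real Datomic Cloud system; it is a shape sketch, not a drop-in test.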
sorry if someone already asked this but…the result of a datomic client query is a vector of vector tuples. in the old (peer) api it was a set of vector tuples. is this deliberate? the reason I ask is because the docs https://docs.datomic.com/cloud/query/query-executing.html show two behaviours - first a set and then a vector. which is correct or is there some way to control this?
I could imagine this being related to the new :limit and :offset features (presumably for pagination) although that would also imply some kind of “order by” feature as well but I can’t find docs on using these for pagination
I found the docs for :limit and :offset https://docs.datomic.com/cloud/client/client-api.html although I’m still unsure about the lack of “order-by” like behaviour when using these features. Is it just arbitrary?
so, in summary: 1/ is the set result here a doc bug? https://docs.datomic.com/cloud/query/query-executing.html and 2/ when paginating, is there an implied ordering or can we control this now?
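For reference, :limit and :offset go in the arg-map form of the client q call. A sketch, assuming a db value bound to `db` and a :player/name attribute that is not from this conversation; note that without an "order by" the ordering between pages is whatever the query engine produces:

```clojure
(require '[datomic.client.api :as d])

;; Page through results 10 at a time, skipping the first 20.
(d/q {:query  '[:find ?name
                :where [_ :player/name ?name]]
      :args   [db]
      :offset 20
      :limit  10})
```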
another possible doc bug? : should the :server-type be :cloud in the “Connect and use Datomic” section (instead of :ion)?
@steveb8n check if you are on an older version of client pre ion support, see https://docs.datomic.com/cloud/releases.html#0-8-54
I suspect a bug in the client api as well. it doesn’t mention :ion here (throw (impl/incorrect ":server-type must be :cloud, :peer-server, or :local")))
I can’t see any change in behaviour between :cloud and :ion. is there any difference?
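The connection map in question looks like this (the system, region, and endpoint values are placeholders, not real ones); the open question above is whether :server-type :ion behaves any differently from :cloud:

```clojure
(require '[datomic.client.api :as d])

(def cfg
  {:server-type :cloud ; the Ions tutorial shows :ion here
   :region      "us-east-1"
   :system      "my-system"
   :endpoint    "https://entry.my-system.us-east-1.datomic.net:8182/"})

(def client (d/client cfg))
```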
Released Datomock v0.2.2. This solves bugs with Datomock's Log implementation, which failed to accept nil, Dates and tx-entids for txRange bounds.
https://github.com/vvvvalvalval/datomock
Perhaps a typo in the Ions Tutorial. In the Deploy section it gives an example of using curl to make sure everything is okay:
curl https://$(obfuscated-name). -d :hat
Which returned {"message":"Missing Authentication Token"}
I think that should be:
curl https://$(obfuscated-name). -d :hat
Cool. Glad I could help. Just noticed the formatting of the following section isn't quite right. The HTML for the link is showing as text:
The API Gateway is an external connection point, not managed by Datomic. If you created an API Gateway in the previous step, you can select and delete it <a href="" target="_awsconsole">in the console</a>.
@olivergeorge thanks! I’ve updated the malformed link.
Clearly that's not quite right but it works.
hi @steveb8n Query is documented to return a collection of tuples: https://docs.datomic.com/client-api/datomic.client.api.html#var-q, so your consuming code should not know/care about sets vs vectors. There is no "order by" in query (yet).
hi @olivergeorge You can generate db values by picking from a set of premade example values, and those example values do not need to be constructed every time you run the test. Why not construct fixture dbs once when you write a test, give them good db-names (db-in-state-A, db-in-state-B)?
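One way to read that suggestion (a sketch; the db and helper names are assumptions): construct each fixture db value once, then have generative tests draw from that set instead of rebuilding state on every run, since db values are immutable:

```clojure
(require '[clojure.test.check.generators :as gen]
         '[datomic.client.api :as d])

;; Premade, immutable db values from named fixture databases.
(def db-in-state-A (d/db (d/connect client {:db-name "db-in-state-A"})))
(def db-in-state-B (d/db (d/connect client {:db-name "db-in-state-B"})))

;; A generator that picks one of the example dbs for each test case.
(def db-gen (gen/elements [db-in-state-A db-in-state-B]))
```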
I'll give that a try. Thanks.
@olivergeorge what does your tx fn do?
At this stage I'm doing simple things. But thinking ahead to more complex systems. Still lots to learn.
Thanks @stuarthalloway for the clarification
any advice/strategies on env config/parameterization for ions? I know we can obviously just stick the info in datomic itself
The “as a standalone Clojure API” link at the very top of https://docs.datomic.com/on-prem/pull.html is broken
@eoliphant I recommend using AWS Systems Manager parameter store. Hm, maybe Datomic should have a feature making that easier... 🙂
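A sketch of reading ion config from Parameter Store with Cognitect's aws-api; the parameter path is a made-up example:

```clojure
(require '[cognitect.aws.client.api :as aws])

(def ssm (aws/client {:api :ssm}))

(defn get-param
  "Fetch a (possibly encrypted) parameter value from AWS SSM Parameter Store."
  [param-name]
  (-> (aws/invoke ssm {:op      :GetParameter
                       :request {:Name           param-name
                                 :WithDecryption true}})
      (get-in [:Parameter :Value])))

;; e.g. (get-param "/my-app/prod/db-name")
```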
Hmm... I gather that Datomic doesn't store duplicate values. But could it be done? Use case: I want to store the last n dice rolls for one of my players, but duplicate values should still count.
Currently I'm just using the following schema:
{:db/ident :player/last-rolls
:db/valueType :db.type/long
:db/cardinality :db.cardinality/many
:db/doc "The last couple of dice rolls for this player"}
Hey @rhansen, you could change :db/valueType to :db.type/ref, then create an entity to represent a roll, including a timestamp on the entity.
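That suggestion as a schema sketch (the attribute names are assumptions, chosen to mirror the :player/last-rolls schema above):

```clojure
[{:db/ident       :player/rolls
  :db/valueType   :db.type/ref
  :db/cardinality :db.cardinality/many
  :db/doc         "Roll entities for this player; duplicate values are distinct entities"}
 {:db/ident       :roll/value
  :db/valueType   :db.type/long
  :db/cardinality :db.cardinality/one
  :db/doc         "The face value of one dice roll"}
 {:db/ident       :roll/at
  :db/valueType   :db.type/instant
  :db/cardinality :db.cardinality/one
  :db/doc         "When the roll happened; also gives an ordering"}]
```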
Do you have an example of this? Do you mean encode as a series of bytes or a string? Just looking for some clarity here. Seems like a workaround to encode an ordered collection, am I right?
the point is only that datomic doesn't see into the value; because it's a blob to datomic you can store whatever you want in it that datomic couldn't represent natively
as well as an edn string too I suppose. Neat. That may change the way I model a schema today. I’ll give it a shot.
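For example, storing the ordered rolls as a single EDN string (a workaround sketch with an assumed attribute name; Datomic treats the value as an opaque blob):

```clojure
{:db/ident       :player/last-rolls-edn
 :db/valueType   :db.type/string
 :db/cardinality :db.cardinality/one
 :db/doc         "EDN-encoded vector of recent rolls, e.g. \"[4 6 6 1]\""}

;; write with (pr-str [4 6 6 1])
;; read back with (clojure.edn/read-string stored-value)
```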