
@mpenet @favila I’m exploring DynamoDB as an alternative (cheating with Amazonica for now). Here’s the code I’m trying out at the moment. It encodes and decodes the EDN blob using Transit in MessagePack mode.
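The round trip described above could look something like this minimal sketch, assuming `cognitect/transit-clj` is on the classpath (the function names are mine, not from the actual code):

```clojure
(require '[cognitect.transit :as transit])
(import '[java.io ByteArrayInputStream ByteArrayOutputStream])

(defn edn->msgpack-bytes
  "Encode an EDN value to a Transit/MessagePack byte array."
  [edn]
  (let [out    (ByteArrayOutputStream. 4096)
        writer (transit/writer out :msgpack)]
    (transit/write writer edn)
    (.toByteArray out)))

(defn msgpack-bytes->edn
  "Decode a Transit/MessagePack byte array back to an EDN value."
  [^bytes bs]
  (let [in     (ByteArrayInputStream. bs)
        reader (transit/reader in :msgpack)]
    (transit/read reader)))

;; Round trip:
(msgpack-bytes->edn (edn->msgpack-bytes {:a 1 :b #{"x" "y"}}))
;; => {:a 1, :b #{"x" "y"}}
```

The resulting byte array is what would go into a DynamoDB binary attribute.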


There are a lot of casts and pprint calls in there so I can follow along with what’s happening just now; they can be stripped out.


I think for dealing with versioning, you’d use a new UUID every time you update the blob and keep all versions intact in DynamoDB. That way it can follow along with the versioning happening in Datomic.
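A hedged sketch of that idea using Amazonica’s `dynamodbv2` namespace — the table name `"blobs"` and attribute names are assumptions for illustration, not from the actual code:

```clojure
(require '[amazonica.aws.dynamodbv2 :as ddb])

(defn put-blob-version!
  "Write a new, immutable version of the blob under a fresh UUID.
   Returns the id so a Datomic datom can point at this exact version;
   old versions are never overwritten."
  [^bytes transit-bytes]
  (let [id (str (java.util.UUID/randomUUID))]
    (ddb/put-item :table-name "blobs"          ; assumed table name
                  :item {:id   id              ; hash key: one UUID per version
                         :blob transit-bytes}) ; Transit/MessagePack payload
    id))
```

Because each write gets its own key, a historical Datomic db value that still references an old UUID keeps resolving to the bytes it was transacted against.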


For Amazonica, I’m stripping out all AWS deps and pulling DynamoDB back in at the same version that Ions uses.


Just wondering, why the 4k string limit and the lack of byte[] support in Cloud? Is that to artificially limit the memory usage of databases/entities?


Hi folks, I'm having a problem using db/fn with Datomic in-memory connections: they don't throw an exception or warning if a transaction function uses dependencies that aren't on the Datomic classpath (for example, calling the same function via a Dynamo connection gives a "not a classpath" error).


Is there some way to surface this kind of error when using a Datomic memory connection?


String size limit is probably to keep segment size down