
HoneySQL "2.0 Gold" -- com.github.seancorfield/honeysql {:mvn/version "2.0.783"} -- is available: SQL as Clojure data structures. Build queries programmatically, even at runtime, without having to bash strings together! • Uses different coordinates and namespaces from 1.0.x so that you can use both together and migrate on a per-query basis! • Completely rewritten to make user-level extension much easier and to fully support PostgreSQL without needing additional libraries -- while still maintaining compatibility with the data DSL and most of the helpers from 1.0.x -- see for more details

🎉 104
🍯 18
sheepy 12
catjam 9
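For readers new to the 2.x coordinates, a minimal sketch of the data DSL (this assumes the com.github.seancorfield/honeysql dependency above is on the classpath; the table and column names are made up, and the output shape follows the 2.x defaults):

```clojure
(require '[honey.sql :as sql])

;; A query is just a Clojure map; sql/format compiles it into a
;; parameterized SQL vector suitable for JDBC.
(sql/format {:select [:id :name]
             :from   [:users]
             :where  [:= :status "active"]})
;; => ["SELECT id, name FROM users WHERE status = ?" "active"]
```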

Kudos! "fully support PostgreSQL without needing additional libraries" caught my attention. Does this refer to the revamped extension API, or can I actually use vendor-specific syntax without any extension whatsoever? Operators like @> come to mind as would-be tricky things.


Because of Clojure's restrictions on symbols/keywords, for the @> and <@ operators, you need to define a var as an alias:

(def at> (keyword "@>"))
but you can just register that as an operator and use at> in the DSL.

👌 3

The primary goal was to implement everything from the nilenso extension library out of the box and add more over time as folks need them.

bananadance 3

(sql/register-op! at>)
That's all that is needed to register a binary op for @>.
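Putting the two snippets together, an end-to-end sketch (the docs/tags names are made up for illustration; the output shape assumes the 2.x default formatting described above):

```clojure
(require '[honey.sql :as sql])

;; @> is not a readable keyword literal, so create it at runtime:
(def at> (keyword "@>"))

;; Register it once as a binary operator:
(sql/register-op! at>)

;; Now it works like any built-in operator in a WHERE clause;
;; [:lift x] passes the collection through as a single parameter.
(sql/format {:select [:*]
             :from   [:docs]
             :where  [at> :tags [:lift ["clojure"]]]})
;; => ["SELECT * FROM docs WHERE tags @> ?" ["clojure"]]
```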


Thanks! Looking forward to giving it a spin. Pleasantly surprised by how simple it is.

➕ 9

I've done a bit of work on the datomic-cloud-backup library and am up to version 0.0.5. The new version adds support for writing backups to a local filesystem, a more general-purpose backup-segment! function, and a parallelized backup function for doing the initial backup of large databases in less time. My preliminary tests show a backup speed in the Cloud (writing to S3 in an alternate region) of at least 40k transactions per minute. I expect that to get faster as the I/O subsystems scale with demand.

šŸ‘ 15

Hm. The parallel stuff seems to push Cloud over the DynamoDB throughput limit, and then it fails. Looks like there needs to be some kind of back-off when that exception happens.


Or a read-speed rate limiter to keep it under the limit. That might be better for exporting from prod databases, so that normal operations are less at risk.
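As a rough illustration of the back-off idea discussed above, a plain-Clojure sketch (with-backoff and the throttled? predicate are hypothetical helpers, not datomic-cloud-backup API):

```clojure
(defn with-backoff
  "Calls (f); on an exception matching throttled?, sleeps with
  exponentially growing delays and retries, up to max-retries times.
  Any other exception (or exhausting the retries) is rethrown."
  [f {:keys [max-retries base-ms throttled?]
      :or   {max-retries 5, base-ms 100, throttled? (constantly true)}}]
  (loop [attempt 0]
    (let [result (try
                   {:ok (f)}
                   (catch Exception e
                     (if (and (< attempt max-retries) (throttled? e))
                       {:retry true}
                       (throw e))))]
      (if (contains? result :ok)
        (:ok result)
        (do
          ;; Exponential delay: base-ms, 2*base-ms, 4*base-ms, ...
          (Thread/sleep (long (* base-ms (Math/pow 2 attempt))))
          (recur (inc attempt)))))))
```

A reader would wrap each segment read in it, with throttled? recognizing the DynamoDB throughput exception, e.g. (with-backoff #(read-one-segment) {:base-ms 200 :throttled? dynamo-throttle?}) where read-one-segment and dynamo-throttle? are stand-ins for whatever the caller actually has.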