This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2020-03-17
Channels
- # announcements (7)
- # babashka (56)
- # beginners (114)
- # bristol-clojurians (4)
- # calva (22)
- # cider (7)
- # clara (1)
- # clj-kondo (17)
- # cljs-dev (1)
- # clojure (93)
- # clojure-europe (8)
- # clojure-italy (5)
- # clojure-nl (2)
- # clojure-uk (79)
- # clojuredesign-podcast (18)
- # clojurescript (108)
- # code-reviews (6)
- # cursive (3)
- # data-science (16)
- # datomic (151)
- # duct (7)
- # emacs (10)
- # events (1)
- # fulcro (76)
- # luminus (8)
- # off-topic (3)
- # other-lisps (2)
- # pathom (8)
- # re-frame (5)
- # reitit (8)
- # schema (9)
- # shadow-cljs (37)
- # specter (3)
- # sql (17)
- # tree-sitter (2)
- # yada (9)
Hi everyone Is there a way in Clojure to wrap an arbitrary kind of value such that some special function I designate gets run when the value is accessed? My situation: I have a lot of different functions that process items in batches; every time one of them is called, it's handed a collection of these items. I want to capture the timestamp for when each item in a batch gets processed, ideally without having to modify the guts of every one of these functions. If each of these functions handled a single item instead of a batch, I could wrap them in something that captured the timestamp and then did the work, but alas.
it seems like you could reify IDeref with that behavior, and just ensure that the caller uses deref to get the value from the container (the same way you would with an atom)
Is any sneakier approach possible, without having to visit each function and make sure it does anything special?
I doubt it
user=> (defn into-container [x] (reify clojure.lang.IDeref (deref [this] (println "some arbitrary side effect") x)))
#'user/into-container
user=> (def v (into-container 31))
#'user/v
user=> (+ @v @v)
some arbitrary side effect
some arbitrary side effect
62
AFAIK there's no container type other than vars that are implicitly accessed, and yeah you can add a watch to a var, but you aren't likely writing your functions such that they use the vars
though if you passed in the var itself rather than its symbol (via var-quote) and the client always dereferenced it, it could work with add-watch - but then your client has to change as much as it had to for the reify of IDeref anyway
Do the possibilities open up at all if it鈥檚 known ahead of time these items are maps and not some arbitrary kind of value?
in that case you could use reify or deftype and implement valAt
such that it does some extra step when something accesses a key
the kind of trick you can do with https://github.com/ztellman/potemkin#def-map-type
(with def-map-type I guess you'd override get?)
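A minimal sketch of the valAt idea: a map wrapper that invokes a caller-supplied callback on every key lookup. The names TrackedMap and log! are made up for illustration, not from any library.

```clojure
;; Sketch: wrap a map so every key lookup triggers a side effect.
;; TrackedMap and log! are hypothetical names.
(deftype TrackedMap [m log!]
  clojure.lang.ILookup
  (valAt [_ k]
    (log! k (System/currentTimeMillis))
    (get m k))
  (valAt [_ k not-found]
    (log! k (System/currentTimeMillis))
    (get m k not-found)))

(def accesses (atom []))
(def tm (->TrackedMap {:a 1 :b 2} (fn [k _ts] (swap! accesses conj k))))

(:a tm)     ;; returns 1, and records :a in accesses
```

Keyword invocation and get both go through ILookup, so callers don't have to change their access style - but this still only covers lookups, not other ways of walking the map (seq, reduce, etc.).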
along with all this I should probably say this isn't a typical style of programming in clojure, and there's likely a way to untangle your system using first class functions that gets the same property (logging accesses) without the "spookiness" of value access that has implicit side effects
you're right, the result of that untangling would be less surprising to any new team member working on the code
If I have a set of keys and a vector of values, how might I generate a vector of maps with the appropriate keys / values? [:name :phone] ["Joe" "555-1212" "Kim" "555-1212" "Pat" "555-1212"] [{:name "Joe" :phone "555-1212"} {:name "Kim" :phone "555-1212"} {:name "Pat" :phone "555-1212"}]
Part of an answer could be zipmap
, e.g.:
user=> (zipmap [:name :phone] ["Joe" "555-1212"])
{:name "Joe", :phone "555-1212"}
Hence why I said "part of an answer"
A more complete answer:
user=> (defn repeated-zipmap [key-seq val-seq]
(let [n (count key-seq)]
(map zipmap (repeat key-seq) (partition-all n val-seq))))
#'user/repeated-zipmap
user=> (repeated-zipmap [:name :phone] ["Joe" "555-1212" "Kim" "555-1212" "Pat" "555-1212"])
({:name "Joe", :phone "555-1212"} {:name "Kim", :phone "555-1212"} {:name "Pat", :phone "555-1212"})
Mine is essentially the same, except it also works for other lengths of the key sequence.
Set of keys and vector of values seems like a mismatch. How can you know which keys go to which values?
ScArcher probably meant "sequence" rather than "set", given the example input described.
hello all, how do I transform this
["SCHWEPPES CITRUS LT (Código: 2582287 )"
"Qtde.:1 UN: un Vl. Unit.: 1,99 Vl. Total"
"1,99"
"OVO GRANDE BRANCO CR (Código: 5286387 )"
"Qtde.:1 UN: un Vl. Unit.: 11,99 Vl. Total"
"11,99"
"PAO FRANCES CARREF K (Código: 168076 )"
"Qtde.:0,326 UN: kg Vl. Unit.: 12,99 Vl. Total"
"4,23"
"PAO FRANCES CARREF K (Código: 168076 )"
"Qtde.:0,354 UN: kg Vl. Unit.: 12,99 Vl. Total"
"4,60"
"SUCO REFR NATURAL XA (Código: 7100698 )"
"Qtde.:4 UN: un Vl. Unit.: 7,79 Vl. Total"
"31,16"]
into
[["SCHWEPPES CITRUS LT (Código: 2582287 )"
"Qtde.:1 UN: un Vl. Unit.: 1,99 Vl. Total"
"1,99"]
["OVO GRANDE BRANCO CR (Código: 5286387 )"
"Qtde.:1 UN: un Vl. Unit.: 11,99 Vl. Total"
"11,99"]
["PAO FRANCES CARREF K (Código: 168076 )"
"Qtde.:0,326 UN: kg Vl. Unit.: 12,99 Vl. Total"
"4,23"]
["PAO FRANCES CARREF K (Código: 168076 )"
"Qtde.:0,354 UN: kg Vl. Unit.: 12,99 Vl. Total"
"4,60"]
["SUCO REFR NATURAL XA (Código: 7100698 )"
"Qtde.:4 UN: un Vl. Unit.: 7,79 Vl. Total"
"31,16"]]
in Clojure?
You could use (partition 3 coll), but if the number of things isn't divisible by 3 you'll miss the last partition.
Try (partition 3 (range 8))
and (partition-all 3 (range 8)).
These fns also have a 2-arg arity which is very handy. (partition 2 1 coll)
for example, gives you a rolling window of size two over the collection.
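For reference, here's what those three calls print at the REPL:

```clojure
;; partition drops a trailing incomplete group; partition-all keeps it.
(partition 3 (range 8))      ;; => ((0 1 2) (3 4 5))
(partition-all 3 (range 8))  ;; => ((0 1 2) (3 4 5) (6 7))

;; with a step of 1, partition gives a rolling window over the collection
(partition 2 1 [:a :b :c :d]) ;; => ((:a :b) (:b :c) (:c :d))
```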
Hi there, I wonder about serialization of data through postgresql aggregate functions, the result currently is:
{:categorylist #object[org.postgresql.jdbc.PgArray 0x1c446e0 {27}], :dates nil, :city , :max_price nil, :subcategorylist nil, :user_id 28, :num_tickets nil, :regionlist #object[org.postgresql.jdbc.PgArray 0x680d34a7 {3}], :user_status_id 1}
And I wonder how to get a vector
instead of org.postgresql.jdbc.PgArray
https://github.com/niquola/clj-pg/blob/master/src/clj_pg/coerce.clj have a look at this library
specifically at this protocol extension - https://github.com/niquola/clj-pg/blob/master/src/clj_pg/coerce.clj#L42-L61
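In the same spirit as that protocol extension, a minimal coercion sketch: PgArray implements java.sql.Array, whose getArray method returns a plain Java array you can turn into a vector. The function names here are illustrative.

```clojure
;; Sketch: coerce java.sql.Array values (org.postgresql.jdbc.PgArray
;; implements this interface) into Clojure vectors. Names are illustrative.
(defn sql-array->vec [^java.sql.Array a]
  (when a
    (vec (.getArray a))))

;; apply it across a result row, leaving non-array values untouched
(defn coerce-row [row]
  (into {}
        (map (fn [[k v]]
               [k (if (instance? java.sql.Array v) (sql-array->vec v) v)]))
        row))
```

With next.jdbc you'd more idiomatically hang this off a protocol (e.g. ReadableColumn) rather than post-processing each row, but the coercion itself is the same.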
On naming idioms:
what would be the convention on 'getters/generators/creators', let's say a NS generates some tokens, which idiom on naming should I use?
get-jwt-token, data->jwt-token, ->jwt-token, jwt-token, get-other-token, data->other-token, etc...
I'd like to represent the 'return-type' and not the 'computation'; in other programming languages I'd go with getToken.
The most common naming convention I've seen for "creators" is of the thing they are creating, i.e. calling a function by what it returns, not by what it does. By this logic, a name like jwt-token
will be most straightforward
and your thoughts on ->jwt-token
which means the same just more 'implicit' with the 'arrow'
It's usually an indication of a record's constructor, and it can get a little confusing. Some of the examples I've seen in regards to the conventions I noticed is developers wrapping their record constructors in functions, exposing them via requiring the ns instead of needing to import the record.
Yeah, ->
is either for record constructor or for converters. For example, ->json
might be a fn
that converts various things to JSON.
My point is, the arrow ->
might have other conventions associated with it, so best avoid it for normal constructors.
That's a bit philosophical, but if you stray away from "mainstream" conventions you might find your code base speaking with a slight "accent", so while everything is consistent and makes sense internally, the back and forth might be difficult
Anyone read through https://www.amazon.com/dp/B07N7525GX and have any comments? Good? Bad?
@UQXS91RT7 Most Packt books are pretty bad so I would avoid them in general.
(I haven't read that particular book, but all the Packt ones I have read are full of errors and the editing is awful)
@U04V70XH6 I agree, that's why I'm being overly cautious
It's short, to the point, and makes the case for using the Rx libs in Clojure. I'm not an experienced dev, but I can say that there weren't any strange errors in the way the information was conveyed in the book; however, I have seen that in other Packt books
Also, I found this one to be good for a brief overview of functional data structures, and it's from packt too
Most of the examples are written in Scala though
anyone know how to iterate over AsyncIterables?
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of
I am using shadow-cljs, compiling to node. I am stuck with this part.
What are the trade-offs of serializing your data using something like https://github.com/ptaoussanis/nippy and saving it as bytes in Postgres vs encoding the data as JSON, using https://github.com/dakrone/cheshire, and saving it on a JSONB column?
with nippy you can't meaningfully query the payload with postgres
eg. doing a select WHERE some json field or nested json field
or aggregates like SUM on fields in json payloads
wow - now that I look at a doc it's pretty cool what psql can do on json fields - eg. iterate on k/v pairs https://www.postgresqltutorial.com/postgresql-json/
I suppose trade-off in the other direction would be that Nippy with compression would likely result in much smaller data. Also Nippy supports encryption iirc.
But my argument is that this data isn't something we are going to be needing to query. Plus it will save me the hassle of having to keep track of data type changes, e.g. dealing with string keys vs keyword keys.
there's a postgres extension that compresses jsonb fields
if you don't need to query the data, put it in s3 and put s3 urls in the db :D
Okay so there really isn't any benefit of using the Nippy freeze library in this case?
I can't think of one, if zson actually works https://github.com/postgrespro/zson
but having direct translation of clojure data types like keywords is nice (alternatively, you can just use string keys; it's less of a problem than you might think to use the get function)
The problem I am trying to solve is this. I have a map where the top level keys are strings but the inner maps are keywords. When I convert to a JSON string, it turns the keys to string as per JSON spec. But when I read it back into CLJ land the keys are strings. I could tell cheshire to turn the keys into keywords but I need the top level keys to remain strings.
the string/keyword thing is always lossy going through json, it's much simpler to do all keywords or all strings
sounds like you found something that works anyway
right, but as a general principle, the best case is not having to keep track of when a key is string vs. keyword
Turns out the solution is provided in Cheshire itself. You can implement a fn that allows you to encode keywords in a certain format and then provide another fn to the parser that allows you to key in on that and turn those strings with a certain format into keywords otherwise leaving it as a string
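A sketch of that approach: tag keyword keys with a prefix on the way out, and only re-keywordize keys bearing that prefix on the way back in. The "kw:" prefix and helper names are made up for illustration; the Cheshire usage assumes a version whose generate-string accepts a :key-fn option.

```clojure
(require '[clojure.string :as string])

;; Hypothetical "kw:" tagging scheme for keyword keys.
(defn encode-key [k]
  (if (keyword? k) (str "kw:" (name k)) k))

(defn decode-key [s]
  (if (string/starts-with? s "kw:")
    (keyword (subs s 3))
    s))

;; With Cheshire (assumption: generate-string supports :key-fn):
(comment
  (require '[cheshire.core :as json])
  (-> {"outer" {:inner 1}}
      (json/generate-string {:key-fn encode-key})
      (json/parse-string decode-key)))
```

String keys without the prefix survive the round trip as strings, while keyword keys come back as keywords.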
JSONB columns are dangerous! It's easy to "just" add a quick query here and there. Then 6 months later you've got what amounts to a schema-less table, and someone on the team has forgotten that you SELECT it one specific way in a different part of the app and makes a breaking change to the layout. Full-team discipline definitely required 😂
in general I agree a structured database is worth the cost. they're handy when integrating sometimes though
like an "any_other_shit_that_came_with_the_payload_that_we_dont_know_about" jsonb column
I hate it as well. I never remember how to query it and have to spend 5 mins looking online or going through my SQL history. But the powers that be decided that the easiest way to migrate over to PG from mongodb was to set up these JSONB columns. 👀
A more aggressive senior joined the ranks and designed a whole DB schema to replace the current one
But the new design, IMO, is overly complicated; maybe I am just too junior to see it 😅
Is there an (easy) way to get all keys with a specific namespace? I can do that with a reduce-kv
but I was wondering if there was a way to destructure a map, or anything shorter
oh - all keys in one map, I thought you meant literally every keyword using a given ns (since they are cached it's hypothetically possible) :D
sadly the table of all keywords is private https://github.com/clojure/clojure/blob/master/src/jvm/clojure/lang/Keyword.java#L26
user=> (def table (.get (doto (.getDeclaredField clojure.lang.Keyword "table") (.setAccessible true)) clojure.lang.Keyword))
#'user/table
user=> (def kws (map #(.get %) (vals table)))
#'user/kws
user=> (take 10 kws)
(:target :clojure.main/message :clojure.spec.alpha/unknown :datafy :clojure.core.specs.alpha/prefix :dir :clojure.core.specs.alpha/binding-form :allow :assertion-failed :method-builders)
Oh sorry, I constructed that sentence incorrectly. I meant "filtering" the map, so that only the key/value pairs of a specific namespace remain, e.g.
(keep-ns {:human/name "foo" :human/age 99 :some/random 1} :human)
;;#> {:human/name "foo" :human/age 99}
yeah - I think reduce-kv as mentioned above is your best bet, though (into {} (filter (key-ns? "human")) m)
could work, where (defn key-ns? [s] (fn [[k v]] (= (namespace k) s)))
user=> (defn key-ns? [s] (fn [[k v]] (= (namespace k) s)))
#'user/key-ns?
user=> (def n {:human/name "foo" :human/age 99 :some/random 1})
#'user/n
user=> (into {} (filter (key-ns? "human")) n)
#:human{:name "foo", :age 99}
into will out-perform reduce-kv
it uses transients to build the result
where reduce-kv is doing multiple individual assoc calls
and the filter is run as a transducer, so there's no intermediate lazy-seq created
we should compare though
my solution fits in 2 lines, as you see above
Yeah I had the same as the into version, except this was different:
(defn key-ns [n]
(comp #{n} namespace first))
But I'm just playing golf 🙂
getting access to all interned keywords would be useful for tooling I guess, autocomplete in editors for example
And it would prevent keyword garbage collection from working properly
Unless I am thinking about the reference system poorly
In any case, that would have to be considered
What would be the best way to take a json file and read through that to create a simple table view of that data in a webapp?
The json file will largely be static for now or in a situation that could easily restart to show new data (infrequent changes)
it depends on how you are doing your rendering, but with e.g. reagent you can use for to create a list of lists that translates into a table
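A sketch of that shape: a Reagent-style component that turns parsed JSON rows (assumed here to be a vector of maps sharing the same keys) into a hiccup table. The component name is illustrative; with Reagent you'd mount it via reagent.dom/render, but the function itself just builds plain data.

```clojure
;; Sketch: render a vector of uniform maps as a hiccup table.
;; ^{:key ...} metadata keeps React happy when used with Reagent.
(defn table-view [rows]
  (let [headers (keys (first rows))]
    [:table
     [:thead
      [:tr (for [h headers] ^{:key h} [:th (name h)])]]
     [:tbody
      (for [[i row] (map-indexed vector rows)]
        ^{:key i}
        [:tr (for [h headers] ^{:key h} [:td (str (get row h))])])]]))
```

For the infrequently-changing JSON file, fetching it once on load and feeding the parsed rows into a component like this would be enough; no state management needed until the data starts changing.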