#beginners
2020-03-17
fappy00:03:44

Hi everyone! Is there a way in Clojure to wrap an arbitrary kind of value such that some special function I designate gets run when the value is accessed? My situation: I have a lot of different functions that process items in batches; every time one of them is called, it's handed a collection of these items. I want to capture the timestamp for when each item in a batch gets processed, ideally without having to modify the guts of every one of these functions. If each of these functions handled a single item instead of a batch, I could wrap them in something that captured the timestamp and then did the work, but alas.

noisesmith00:03:55

it seems like you could reify IDeref with that behavior, and just ensure that the caller uses deref to get the value from the container (the same way you would with an atom)

fappy00:03:56

Is any sneakier approach possible, without having to visit each function and make sure it does anything special?

noisesmith00:03:58

user=> (defn into-container [x] (reify clojure.lang.IDeref (deref [this] (println "some arbitrary side effect") x)))
#'user/into-container
user=> (def v (into-container 31))
#'user/v
user=> (+ @v @v)
some arbitrary side effect
some arbitrary side effect
62

🍻 4
noisesmith00:03:19

AFAIK there's no container type other than vars that are implicitly accessed, and yeah you can add a watch to a var, but you aren't likely writing your functions such that they use the vars

noisesmith00:03:33

though if you passed in the var itself rather than its symbol (via var-quote) and the client always dereferenced it, that could work with add-watch - but then your client has to change as much as it would for the reify of IDeref anyway

fappy00:03:59

Do the possibilities open up at all if it鈥檚 known ahead of time these items are maps and not some arbitrary kind of value?

noisesmith00:03:17

in that case you could use reify or deftype and implement valAt such that it does some extra step when something accesses a key
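
For illustration, a minimal sketch of the valAt approach (TimestampedItem and log! are made-up names; a real version might implement more of the map interfaces):

(deftype TimestampedItem [m log!]
  clojure.lang.ILookup
  (valAt [_ k]
    (log! k)                     ; side effect on every key access
    (get m k))
  (valAt [_ k not-found]
    (log! k)
    (get m k not-found)))

(def item (->TimestampedItem {:id 1}
                             #(println "accessed" % "at" (System/currentTimeMillis))))
(:id item)
;; prints the access, then returns 1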

noisesmith00:03:56

(with def-map-type I guess you'd override get?)

fappy00:03:59

ooh interesting! thanks!

noisesmith00:03:13

along with all this I should probably say this isn't a typical style of programming in clojure, and there's likely a way to untangle your system using first class functions that gets the same property (logging accesses) without the "spookiness" of value access that has implicit side effects

💯 4
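
One way that untangling might look (a sketch; with-item-timestamps and record-timestamp! are hypothetical names): wrap each batch function once at its boundary, so per-item timestamps are recorded without touching the function's guts:

(defn record-timestamp! [item]
  (println "processed" item "at" (java.time.Instant/now)))

(defn with-item-timestamps [batch-fn]
  (fn [items]
    (run! record-timestamp! items)
    (batch-fn items)))

;; usage: (def process-batch+log (with-item-timestamps process-batch))
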
fappy00:03:42

you're right, the result of that untangling would be less surprising to any new team member working on the code

ScArcher01:03:16

If I have a set of keys and a vector of values, how might I generate a vector of maps with the appropriate keys/values?

[:name :phone]
["Joe" "555-1212" "Kim" "555-1212" "Pat" "555-1212"]
[{:name "Joe" :phone "555-1212"} {:name "Kim" :phone "555-1212"} {:name "Pat" :phone "555-1212"}]

andy.fingerhut01:03:15

Part of an answer could be zipmap, e.g.:

user=> (zipmap [:name :phone] ["Joe" "555-1212"])
{:name "Joe", :phone "555-1212"}

ScArcher01:03:47

I tried zipmap, but it only resulted in one map

andy.fingerhut01:03:59

Hence why I said "part of an answer"

andy.fingerhut01:03:56

A more complete answer:

user=> (defn repeated-zipmap [key-seq val-seq]
  (let [n (count key-seq)]
    (map zipmap (repeat key-seq) (partition-all n val-seq))))
#'user/repeated-zipmap
user=> (repeated-zipmap [:name :phone] ["Joe" "555-1212" "Kim" "555-1212" "Pat" "555-1212"])
({:name "Joe", :phone "555-1212"} {:name "Kim", :phone "555-1212"} {:name "Pat", :phone "555-1212"})

ScArcher01:03:08

Thank you, I came up with this - (map #(zipmap headers %) (partition 2 data))

andy.fingerhut01:03:40

Mine is essentially the same, except it also works for other lengths of the key sequence.

ScArcher01:03:52

Yep, yours works for the more generic case.

dpsutton02:03:18

Set of keys and vector of values seems like a mismatch. How can you know which keys go to which values?

andy.fingerhut03:03:38

ScArcher probably meant "sequence" rather than "set", given the example input described.

fabrao03:03:13

hello all, how do I transform this

["SCHWEPPES CITRUS LT (Código: 2582287 )"
 "Qtde.:1 UN: un Vl. Unit.:   1,99 Vl. Total"
 "1,99"
 "OVO GRANDE BRANCO CR (Código: 5286387 )"
 "Qtde.:1 UN: un Vl. Unit.:   11,99 Vl. Total"
 "11,99"
 "PAO FRANCES CARREF K (Código: 168076 )"
 "Qtde.:0,326 UN: kg Vl. Unit.:   12,99 Vl. Total"
 "4,23"
 "PAO FRANCES CARREF K (Código: 168076 )"
 "Qtde.:0,354 UN: kg Vl. Unit.:   12,99 Vl. Total"
 "4,60"
 "SUCO REFR NATURAL XA (Código: 7100698 )"
 "Qtde.:4 UN: un Vl. Unit.:   7,79 Vl. Total"
 "31,16"]
into
[["SCHWEPPES CITRUS LT (Código: 2582287 )"
  "Qtde.:1 UN: un Vl. Unit.:   1,99 Vl. Total"
  "1,99"]
 ["OVO GRANDE BRANCO CR (Código: 5286387 )"
  "Qtde.:1 UN: un Vl. Unit.:   11,99 Vl. Total"
  "11,99"]
 ["PAO FRANCES CARREF K (Código: 168076 )"
  "Qtde.:0,326 UN: kg Vl. Unit.:   12,99 Vl. Total"
  "4,23"]
 ["PAO FRANCES CARREF K (Código: 168076 )"
  "Qtde.:0,354 UN: kg Vl. Unit.:   12,99 Vl. Total"
  "4,60"]
 ["SUCO REFR NATURAL XA (Código: 7100698 )"
  "Qtde.:4 UN: un Vl. Unit.:   7,79 Vl. Total"
  "31,16"]]
in Clojure?

fabrao03:03:26

taking them 3 by 3

solf03:03:23

(partition 3 coll)

fabrao03:03:37

is it as simple as that?

jaihindhreddy04:03:39

If the number of things isn't divisible by 3 you'll miss the last partition. Try (partition 3 (range 8)) and (partition-all 3 (range 8)). These fns also take a step argument, which is very handy: (partition 2 1 coll), for example, gives you a rolling window of size two over the collection.
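
For example, at the REPL:

user=> (partition 3 (range 8))
((0 1 2) (3 4 5))
user=> (partition-all 3 (range 8))
((0 1 2) (3 4 5) (6 7))
user=> (partition 2 1 [:a :b :c :d])
((:a :b) (:b :c) (:c :d))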

fabrao03:03:04

@dromar56 thanks a lot, I didn't know that was so simple

agigao08:03:19

Hi there, I have a question about serialization of data coming through PostgreSQL aggregate functions. The result currently is:

{:categorylist #object[org.postgresql.jdbc.PgArray 0x1c446e0 {27}], :dates nil, :city , :max_price nil, :subcategorylist nil, :user_id 28, :num_tickets nil, :regionlist #object[org.postgresql.jdbc.PgArray 0x680d34a7 {3}], :user_status_id 1}
I wonder how to get a vector instead of org.postgresql.jdbc.PgArray.

Aviv Kotek14:03:41

On naming idioms: what would be the convention for 'getters/generators/creators'? Say a NS generates some tokens; which naming idiom should I use? get-jwt-token, data->jwt-token, ->jwt-token, jwt-token, get-other-token, data->other-token, etc. I'd like to represent the 'return type' and not the 'computation'; in other programming languages I'd go with getToken.

Ben Sless14:03:59

The most common naming convention I've seen for "creators" is to name them after the thing they are creating, i.e. calling a function by what it returns, not by what it does. By this logic, a name like jwt-token would be most straightforward

Aviv Kotek14:03:06

and your thoughts on ->jwt-token?

Aviv Kotek14:03:45

which means the same, just more 'implicit' with the 'arrow'

Ben Sless14:03:03

It's usually an indication of a record's constructor, and it can get a little confusing. One pattern I've seen around this convention is developers wrapping their record constructors in plain functions and exposing them by requiring the ns, instead of needing to import the record class.

👍 4
hindol14:03:41

Yeah, -> is either for record constructors or for converters. For example, ->json might be a fn that converts various things to JSON.

👍 8
hindol14:03:34

My point is, the arrow -> might have other conventions associated with it, so best avoid it for normal constructors.

hindol14:03:04

But if you are consistent, and you document it up front, anything can work.

Ben Sless14:03:06

That's a bit philosophical, but if you stray from "mainstream" conventions you might find your code base speaking with a slight "accent": while everything is consistent and makes sense internally, the back and forth might be difficult

Gulli14:03:24

Has anyone read through https://www.amazon.com/dp/B07N7525GX and has any comments? Good? Bad?

seancorfield17:03:55

@UQXS91RT7 Most Packt books are pretty bad so I would avoid them in general.

seancorfield17:03:26

(I haven't read that particular book, but all the Packt ones I have read are full of errors and the editing is awful)

Gulli17:03:14

@U04V70XH6 I agree, that's why I'm being overly cautious

James Good16:03:49

It's short, to the point, and makes the case for using the Rx libs in Clojure. I'm not an experienced dev, but I can say there weren't any strange errors in the way the information was conveyed within the book; I have seen that in other Packt books, however

James Good16:03:08

Also, I found this one to be good for a brief overview of functional data structures, and it's from Packt too

James Good16:03:02

Most of the examples are written in Scala though

p4ulcristian16:03:06

anyone know how to iterate over AsyncIterables? https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of I am using shadow-cljs, compiling to Node, and I am stuck on this part.
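
One possible (untested) sketch for ClojureScript on Node: drive the async iterator by hand, since for-await...of has no direct CLJS syntax (consume-async-iterable is a made-up helper name):

(defn consume-async-iterable
  "Calls f on each value yielded by a JS AsyncIterable."
  [iterable f]
  (let [iter (.call (unchecked-get iterable js/Symbol.asyncIterator) iterable)]
    (letfn [(step []
              ;; each .next call returns a promise of #js {:done ... :value ...}
              (-> (.next iter)
                  (.then (fn [res]
                           (when-not (.-done res)
                             (f (.-value res))
                             (step))))))]
      (step))))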

Mario C.17:03:30

What are the trade-offs of serializing your data using something like https://github.com/ptaoussanis/nippy and saving it as bytes in Postgres vs encoding the data as JSON, using https://github.com/dakrone/cheshire, and saving it on a JSONB column?

noisesmith17:03:32

with nippy you can't meaningfully query the payload with postgres

noisesmith17:03:00

e.g. doing a SELECT with a WHERE on some json field or nested json field

noisesmith17:03:43

or aggregates like SUM on fields in json payloads

noisesmith17:03:32

wow - now that I look at a doc it's pretty cool what psql can do on json fields - eg. iterate on k/v pairs https://www.postgresqltutorial.com/postgresql-json/
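
For instance (a hedged sketch using next.jdbc; the ds datasource and the orders table are made-up placeholders):

(require '[next.jdbc :as jdbc])

;; ->> extracts a json field as text, -> keeps it as json;
;; ds is assumed to be an existing next.jdbc datasource
(jdbc/execute! ds
  ["select id, payload ->> 'name' as name
      from orders
     where payload -> 'customer' ->> 'city' = ?"
   "Lisbon"])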

Eddie17:03:24

I suppose trade-off in the other direction would be that Nippy with compression would likely result in much smaller data. Also Nippy supports encryption iirc.

Mario C.17:03:26

That's the main pushback I was getting about using Nippy.

Mario C.17:03:08

But my argument is that this data isn't something we are going to need to query. Plus it will save me the hassle of having to keep track of data type changes, e.g. dealing with string keys vs keyword keys.

noisesmith17:03:59

there's a postgres extension that compresses jsonb fields

noisesmith17:03:07

if you don't need to query the data, put it in s3 and put s3 urls in the db :D

Mario C.17:03:32

Okay so there really isn't any benefit of using the Nippy freeze library in this case?

noisesmith17:03:16

I can't think of one, if zson actually works https://github.com/postgrespro/zson

noisesmith17:03:15

but having direct translation of clojure data types like keywords is nice (alternatively, you can just use string keys, it's less of a problem than you might think to use the get function)

Mario C.17:03:31

The problem I am trying to solve is this: I have a map where the top level keys are strings but the inner maps use keywords. When I convert to a JSON string, it turns the keys into strings as per the JSON spec. But when I read it back into CLJ land the keys are strings. I could tell cheshire to turn the keys into keywords, but I need the top level keys to remain strings.

Mario C.17:03:54

Ahh. I found the answer. Return keywords and just stringify the top level keys...

Mario C.17:03:06

😅 rubber ducky sorry, thanks though

noisesmith17:03:19

the string/keyword thing is always lossy going through json, it's much simpler to do all keywords or all strings
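
For example, mixed keys don't round-trip:

user=> (require '[cheshire.core :as json])
nil
user=> (json/parse-string (json/generate-string {"top" {:inner 1}}) true)
{:top {:inner 1}}

The top-level string key comes back as a keyword.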

noisesmith17:03:33

sounds like you found something that works anyway

Mario C.17:03:13

Well, I was hoping I could do this (zipmap (map name (keys m)) (vals m))

Mario C.17:03:26

Should solve for the top level keys needing to be strings

noisesmith18:03:13

right, but as a general principle, the best case is not having to keep track of when a key is string vs. keyword

Mario C.18:03:00

Turns out the solution is provided in Cheshire itself. You can supply a fn that encodes keywords in a certain format, and then provide another fn to the parser that keys in on that format and turns those strings back into keywords, otherwise leaving them as strings
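
A sketch of that approach (assuming cheshire's :key-fn option to generate-string and the key-fn argument to parse-string; the "kw:" prefix is an arbitrary made-up format):

(require '[cheshire.core :as json]
         '[clojure.string :as string])

(defn encode-key [k]
  ;; tag keyword keys so they can be recognized after a round-trip
  (if (keyword? k) (str "kw:" (name k)) k))

(defn decode-key [s]
  (if (string/starts-with? s "kw:") (keyword (subs s 3)) s))

(-> {"top" {:inner 1}}
    (json/generate-string {:key-fn encode-key})
    (json/parse-string decode-key))
;; => {"top" {:inner 1}}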

mloughlin18:03:07

JSONB columns are dangerous! It's easy to "just" add a quick query here and there. Then 6 months later you've got what amounts to a schema-less table, and someone on the team forgets that you SELECT it one specific way in a different part of the app and makes a breaking change to the layout. Full-team discipline definitely required 😂

bfabry18:03:36

in general I agree a structured database is worth the cost. they're handy when integrating sometimes though

bfabry18:03:42

like an "any_other_shit_that_came_with_the_payload_that_we_dont_know_about" jsonb column

Mario C.18:03:21

I hate it as well. I never remember how to query it and have to spend 5 mins looking online or going through my SQL history. But the powers that be decided that the easiest way to migrate over to PG from mongodb was to set up these JSONB columns. 👀

Mario C.18:03:45

The next step was to break it out into tables but that step was never taken lol

mloughlin18:03:20

In that case, best of luck! 😉

Mario C.18:03:28

A more aggressive senior joined the ranks and designed a whole DB schema to replace the current one

Mario C.18:03:02

But the new design, IMO, is overly complicated; maybe I am just too junior to see it 😅

Kevin19:03:26

Is there an (easy) way to get all keys with a specific namespace? I can do that with a reduce-kv but I was wondering if there was a way to destructure a map, or anything shorter

borkdude19:03:26

there isn't

Kevin19:03:59

All right, thanks 🙂

noisesmith19:03:50

oh - all keys in one map, I thought you meant literally every keyword using a given ns (since they are cached it's hypothetically possible) :D

borkdude19:03:12

(that's how I first parsed that sentence too)

borkdude19:03:59

I'm sure you can access it using some reflective hacks?

borkdude19:03:24

user=> (def table (.get (doto (.getDeclaredField clojure.lang.Keyword "table") (.setAccessible true)) clojure.lang.Keyword))
#'user/table
user=> (def kws (map #(.get %) (vals table)))
#'user/kws
user=> (take 10 kws)
(:target :clojure.main/message :clojure.spec.alpha/unknown :datafy :clojure.core.specs.alpha/prefix :dir :clojure.core.specs.alpha/binding-form :allow :assertion-failed :method-builders)

Kevin19:03:29

Oh sorry, I constructed that sentence incorrectly. I meant "filtering" the map, so that only the key/value pairs of a specific namespace remain, e.g.

(keep-ns {:human/name "foo" :human/age 99 :some/random 1} :human)
;; => {:human/name "foo" :human/age 99}

noisesmith19:03:31

yeah - I think reduce-kv as mentioned above is your best bet, though (into {} (filter (key-ns? "human")) m) could work, where (defn key-ns? [s] (fn [[k v]] (= (namespace k) s)))

noisesmith19:03:42

user=> (defn key-ns? [s] (fn [[k v]] (= (namespace k) s)))
#'user/key-ns?
user=> (def n {:human/name "foo" :human/age 99 :some/random 1})
#'user/n
user=> (into {} (filter (key-ns? "human")) n)
#:human{:name "foo", :age 99}

Kevin19:03:50

I think reduce-kv would be shorter, and possibly more performant?

noisesmith19:03:02

into will out-perform reduce-kv

Kevin19:03:39

Interesting, it sounds like more work

noisesmith19:03:42

it uses transients to build the result

Kevin19:03:52

Ok that makes sense

noisesmith19:03:54

where reduce-kv is doing multiple individual assoc calls

noisesmith19:03:16

and the filter is run as a transducer, so there's no intermediate lazy-seq created
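
For reference, the reduce-kv version being compared might look like this sketch:

(defn keep-ns [m ns-str]
  (reduce-kv (fn [acc k v]
               ;; keep only entries whose key is in the given namespace
               (if (= (namespace k) ns-str)
                 (assoc acc k v)
                 acc))
             {} m))

(keep-ns {:human/name "foo" :human/age 99 :some/random 1} "human")
;; => #:human{:name "foo", :age 99}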

noisesmith19:03:27

we should compare though

noisesmith19:03:45

my solution fits in 2 lines, as you see above

Kevin19:03:09

Yeah I had the same as the into version, except this was different:

(defn key-ns [n]
  (comp #{n} namespace first))
But I'm just playing golf 🙂

Kevin19:03:27

Thanks for the tip on into. Learned something new

ghadi19:03:49

setAccessible true is going to be banned at some point

borkdude19:03:23

getting access to all interned keywords would be useful for tooling I guess, autocomplete in editors for example

Alex Miller (Clojure team)19:03:54

And it would prevent keyword garbage collection from working properly

Alex Miller (Clojure team)19:03:34

Unless I am thinking about the reference system poorly

Alex Miller (Clojure team)19:03:57

In any case, that would have to be considered

coldbrewedbrew22:03:27

What would be the best way to take a json file and read through that to create a simple table view of that data in a webapp?

coldbrewedbrew22:03:00

The json file will be largely static for now, or in a situation where the app could easily restart to show new data (infrequent changes)

noisesmith23:03:43

it depends on how you are doing your rendering, but with e.g. reagent you can use for to create a list of lists that translates into a table
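
A minimal reagent sketch along those lines (assumes rows is a seq of uniform maps already parsed from the JSON, e.g. via cheshire on the server or js/JSON.parse in the browser):

(defn table-view [rows]
  (let [headers (keys (first rows))]
    [:table
     [:thead
      [:tr (for [h headers]
             ^{:key h} [:th (name h)])]]
     [:tbody
      ;; one table row per map, one cell per header key
      (for [[i row] (map-indexed vector rows)]
        ^{:key i} [:tr
                   (for [h headers]
                     ^{:key h} [:td (str (get row h))])])]]))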