#clojure
2024-05-20
ryan echternacht03:05:29

Are there any good libraries for uuid-v7 in clojure? my googling isn't coming up with anything and I find that surprising

p-himik03:05:37

You can use any Java library.

ryan echternacht03:05:12

Yeah, but I'm an interop nub 😞

ryan echternacht03:05:19

Maybe this is the time to figure it out

💪 1
p-himik03:05:04

Java interop is much simpler than the whole seq abstraction. :) When it's just about using fields/methods of a Java class, there's even more to the threading macros than to this part of interop.
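
For reference, three forms cover most day-to-day interop: a static call, a constructor, and an instance method call. A minimal sketch using only JDK classes:

;; static method: (ClassName/staticMethod args...)
(System/currentTimeMillis)

;; constructor: (ClassName. args...)
(java.util.Random.)

;; instance method: (.methodName instance args...)
(.toUpperCase "uuid")   ;=> "UUID"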

Alex Miller (Clojure team)03:05:46

Here's an example v7 generator from a java lib

$ clj -Sdeps '{:deps {com.fasterxml.uuid/java-uuid-generator {:mvn/version "4.1.0"}}}'
Downloading: com/fasterxml/uuid/java-uuid-generator/4.1.0/java-uuid-generator-4.1.0.pom from central
Downloading: com/fasterxml/oss-parent/49/oss-parent-49.pom from central
Downloading: com/fasterxml/uuid/java-uuid-generator/4.1.0/java-uuid-generator-4.1.0.jar from central
Clojure 1.11.1
user=> (import com.fasterxml.uuid.Generators)
com.fasterxml.uuid.Generators
user=> (def gen (Generators/timeBasedEpochGenerator))
#'user/gen
user=> (.generate gen)
#uuid "018f9414-d450-78db-a4d1-e1c2363c8344"
user=> (.generate gen)
#uuid "018f9414-d778-79eb-a54f-ca2c5e224497"
user=>

ryan echternacht13:05:11

thanks Alex. The java libs I was looking at had a bunch of factory pattern stuff, and I was feeling lazy. Thanks for the rec!

Alex Miller (Clojure team)15:05:44

not a rec! just an example :)

saidone11:06:52

2nd take: removed useless casts and found a better way to put the dashes in:

(require '[clojure.string :as str])
(import 'java.security.SecureRandom)

(defn gen-uuid-v7
  "Returns a UUIDv7 as a lowercase hex string: a 48-bit millisecond
  timestamp, version nibble 7, variant bits 10, and 74 random bits."
  []
  (let [rand-array (byte-array 10)
        _ (.nextBytes (SecureRandom.) rand-array)
        rand-seq (map byte rand-array)
        uuid-seq (map #(str/lower-case (format "%02X" %))
                      (concat
                        ;; 48-bit timestamp (ms since epoch), 6 bytes for current dates
                        (map byte (.toByteArray (biginteger (bit-and (System/currentTimeMillis) 0xFFFFFFFFFFFF))))
                        ;; version nibble -> 7
                        (list (bit-or (bit-and (first rand-seq) 0x0F) 0x70))
                        (list (second rand-seq))
                        ;; variant bits -> 10
                        (list (bit-or (bit-and (first (drop 2 rand-seq)) 2r00111111) 2r10000000))
                        (drop 3 rand-seq)))]
    ;; group the 16 hex pairs as 4-2-2-2-6 bytes and join with dashes
    (str/join "-" (map #(apply str %)
                       (let [indices (reductions + 0 [4 2 2 2 6])]
                         (map #(subvec (vec uuid-seq) %1 %2) (butlast indices) (rest indices)))))))
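
If a java.util.UUID is needed rather than a string, the generated value parses cleanly; a quick sanity check at the REPL (assuming the function above has been evaluated):

user=> (def u (java.util.UUID/fromString (gen-uuid-v7)))
#'user/u
user=> (.version u)
7
user=> (.variant u)
2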

Chris Chambers07:05:01

I notice @seancorfield said this in response to one of the beginner questions:
> Records? No, mostly Clojurians use plain hash maps.
>
> One thing Alex has said about Clojure Applied is that in the next edition, records will be played down, and domain objects will be shown as plain hash maps.
Can anyone explain to me why this is the case, or link to some authoritative resource that explains it? It seems to me that records are superior for domain objects: they have the same hash-map semantics, but (if I recollect correctly) are more performant and allow for protocol implementations, so they seem like a superset of the functionality that maps provide, with some additional perks, like nice APIs for construction, etc.

👍 1
jpmonettas12:05:54

I don't know about Alex's or Sean's opinions on it, but imho the Clojure philosophy is to represent all associative data as open maps, where data can be added, merged, removed, sub-keys selected, etc., and to stay away from closed types. In this respect records are more complex and trickier, because they are "closed types" with Clojure interfaces layered on to make them look like maps, but under the hood they are still different. You can add extra data to a record and still have a record (internally it keeps an extra hash map for those keys), but removal depends: if you remove one of those extra keys you still have the same record, but if you remove a declared field, you've lost the record. The same tricky things happen with merge, select-keys, etc.
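
For example, a small REPL sketch of that asymmetry, using a throwaway Point record just for illustration:

user=> (defrecord Point [x y])
user.Point
user=> (def p (->Point 1 2))
#'user/p
user=> (record? (assoc p :z 3))             ; extra key goes into the record's ext map
true
user=> (record? (dissoc (assoc p :z 3) :z)) ; removing the extra key keeps the record
true
user=> (record? (dissoc p :x))              ; removing a declared field loses the record
false
user=> (map? (dissoc p :x))                 ; ...and leaves a plain map
true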

jpmonettas12:05:28

A place where records are superior is around memory footprint, so I use them instead of maps when I need to pack a collection of things which contain the same keys.

Jason Bullers13:05:57

Another place records can be a problem is re-evaluating in the repl. If you make a change and have existing instances, you're gonna have some extra hoops to jump through, like the reloaded workflows
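
A minimal sketch of that gotcha, with a throwaway Thing record: instances created before re-evaluating the defrecord belong to the old class, so type checks against the new one fail.

user=> (defrecord Thing [a])
user.Thing
user=> (def t (->Thing 1))
#'user/t
user=> (instance? Thing t)
true
user=> (defrecord Thing [a])   ; re-evaluated, e.g. after editing the file
user.Thing
user=> (instance? Thing t)     ; t is still an instance of the old class
false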

Jason Bullers13:05:19

If you don't need the performance boost (yet), and they're semantically the same in terms of how you use them, it seems "better" in an ergonomic sense to just start with maps until you know you need a record

emccue14:05:10

Also compilation time in CLJS

emccue14:05:33

I've had some protocol/record nightmares there

emccue14:05:38

(don't ask)

emccue14:05:02

but this is part of why extend-via-metadata was introduced, i think

emccue14:05:18

give polymorphic dispatch without an explicit named type

emccue16:05:29

This is gonna sound weird, but I only started seeing the pros of Clojure's approach once I got deeper into Java

emccue16:05:25

With open maps you get easy handling of data that is flowing through your system

emccue16:05:49

With closed maps/nominal entities you have friction with specifically that

emccue16:05:21

There are legitimate pros to the closed world

Jason Bullers16:05:37

> With closed maps/nominal entities you have friction with specifically that
For sure. There's a lot more you have to "get right" up front, and it's not really great for exploring a problem/solution space. I had a similar epiphany trying to solve the same problem in Java and Clojure

emccue16:05:41

But they are weakest in the presence of "data evolution"

Chris Chambers17:05:10

Interesting feedback so far. Are records really "closed types", though? I'll have to check out some of the "tricky things"; so far I've only encountered dissoc, whose trickiness makes perfect logical sense to me (i.e. it converts the record to a map iff you dissoc one of the required fields).

Jason Bullers17:05:03

Maybe a way to think about it is shape versus definition. While it's true that Clojure tries to blur records and maps as much as possible, records are conceptually definitions of a type of thing with particular "slots". Maps, on the other hand, are just bags of data, and so when you want to treat a map as an entity, you care more about the shape of the data than its actual type, if that makes sense

respatialized18:05:27

The docs for https://clojure.org/about/spec discuss another part of the "open and associative" data model facilitated by maps:
> In Clojure we gain power by dynamically composing, merging and building up maps. We routinely deal with optional and partial data, data produced by unreliable external sources, dynamic queries etc. These maps represent various sets, subsets, intersections and unions of the same keys, and in general ought to have the same semantic for the same key wherever it is used. Defining specifications of every subset/union/intersection, and then redundantly stating the semantic of each key is both an antipattern and unworkable in the most dynamic cases.
Defining records for everything means falling into this antipattern. Relying on them for the core data model makes it harder to define their field/key semantics in a global and reusable way, which is what spec allows.

seancorfield18:05:44

We like to use qualified keys for things within our business domain, so that's another strike against records since they can only declare unqualified keys.
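
A minimal sketch of the difference, using a made-up order domain: a plain map carries qualified keys directly, while a record's declared fields can only be simple (unqualified) keys.

;; plain map: qualified keys are first-class
(def order {:order/id 42, :order/total 99.95M})

;; record: declared fields become unqualified keywords
(defrecord Order [id total])
(keys (->Order 42 99.95M))   ;=> (:id :total)

;; a qualified key can only ride along in the record's ext map;
;; it can't be a declared field
(:order/id (assoc (->Order 42 99.95M) :order/id 42))   ;=> 42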

phill22:05:40

Records are host interop, so they bring a complex of strong and weak points -- "complex" itself being a bad thing, on principle -- and although records look portable, the complex is not the same on all of Clojure's hosts. Records were a worthy addition to Clojure 1.2 as a way to bring crazy host-provided optimizations within reach. But when you don't need records, they don't need you either.

didibus22:05:46

I think records are totally fine. But maps are even simpler and so I think people just end up using plain maps because of that.

didibus22:05:10

If I had to guess, the reason records would be downplayed in the next edition of Clojure Applied is probably that maps are to be emphasized instead. Records can give you the impression that you need them to model entities in your application, but you don't: simple maps will do and will be more flexible.

Jason Bullers22:05:39

Possibly also because records are the place to realize protocols, and then you're kinda down the oop path instead of thinking in terms of data?

Jason Bullers22:05:49

Not really oop, but drifting from "just data"

didibus22:05:26

Polymorphism isn't discouraged though. But I agree that records are often like the first place of comfort for people from an OO background. And they try to use them as objects.

Jason Bullers22:05:20

Was thinking about "unnecessary polymorphism." I ended up there when I first looked at Clojure for exactly the reason you mentioned: records seemed comfortable to my Java brain. Personally, I found using them took me toward designs that focussed on the type of thing and a bunch of protocols (basically oop), when I should have been thinking about the shape of the data, and I really had no need of polymorphism to solve my problem

👍 1
john19:05:12

Records are like 90% of the way towards what ti-yong/transformers are. Maps that you can add an IFn to

john19:05:30

(among other things)

john19:05:39

You can also think of a record as an extensible map that you don't have to reimplement the basic object and associative protocols for

john19:05:20

Cause you can always instantiate a record with zero default fields

john19:05:48

And now you have a map you can add new behavior to, like a special deref or whatever

john19:05:16

But yeah I basically never need that lol

john19:05:54

Worth having for more systemy things imo. I just don't do that stuff much

seancorfield20:05:51

(and since you can use metadata to extend a lot of protocols these days, you need records even less for that now)
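
A minimal sketch of metadata-based extension (Clojure 1.10+), with a made-up Lifecycle protocol standing in for the Component-style case:

(defprotocol Lifecycle
  :extend-via-metadata true
  (start [this])
  (stop [this]))

;; a plain hash map plays the role a record used to play
(def db
  (with-meta {:pool nil}
    {`start (fn [m] (assoc m :pool :connected))
     `stop  (fn [m] (assoc m :pool nil))}))

(start db)   ;=> {:pool :connected}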

didibus20:05:21

Also a lot of systemy things will be stateful and probably better served by deftype

seancorfield21:05:04

I thought we were talking about web servers and database connections/pools -- which would be Component style stuff that used to need to be records but can now be hash maps with metadata...

john22:05:56

Yeah I guess I mean moreso language level systemy

john22:05:22

Lower level loop stuff

didibus22:05:26

All the above I guess. But ya I was thinking more custom types, or implementing actual components, not using them, like writing your own client.

Bailey Kocin17:05:32

How would I remove a symbol and the function attached to it in the REPL and then re-evaluate it in the same REPL? I want to do this programmatically, and I would like to re-evaluate the code because a macro is producing the function, so just removing the entire object and recreating it seems easiest....

Noah Bogart17:05:17

https://clojuredocs.org/clojure.core/ns-unmap. this won't remove any references if you've stored them in closures or maps, etc
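
A minimal sketch, assuming a my.ns namespace with a var named my-fn (both hypothetical names):

;; remove the symbol -> var mapping from the namespace
(ns-unmap 'my.ns 'my-fn)

;; then re-evaluate whatever defines it, e.g. by reloading the namespace
(require 'my.ns :reload)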

Bailey Kocin17:05:01

How do I remap it then? I have the namespace and symbol...

Bailey Kocin17:05:17

I have never reloaded a file in the REPL before I guess, or reloaded an entire namespace?

Noah Bogart17:05:01

if you want to assign the var to a new value, you can just use def. if you want a new function, use defn.

Bailey Kocin17:05:32

The macro produces a special function with metadata; I need to re-evaluate it and recreate the symbol with the var

Noah Bogart17:05:53

does reevaluating not work in some way?

Bailey Kocin17:05:00

I want to reevaluate it in a function programmatically! But I do not know how to do that. I could be confused with what you are saying though?

Noah Bogart17:05:24

lol I think I'm confused about your goal. Unless the macro is checking whether the function exists and not recreating it, the macro should work whether or not there are existing vars with the same name. For example, (defrecord Foo []) creates ->Foo and map->Foo functions; if you re-evaluate the defrecord call, the functions will also be recreated and overwrite the existing definitions.

Bailey Kocin17:05:36

I am doing something to programmatically go through a bunch of namespaces, undef some things, and reload them all in the code. I could just click save on the file and eval the buffer, but there are a lot of files, so that is not the goal here

Noah Bogart17:05:52

Maybe you should look into https://github.com/tonsky/clj-reload for reloading stale namespaces

p-himik17:05:03

If you're using a macro, just use def/`defn` in the expanded code. If you aren't using a macro, you can use intern. Either way, you don't have to undef anything.

user=> (intern 'user 'x 1)
#'user/x
user=> x
1
user=> (intern 'user 'x 2)
#'user/x
user=> x
2

p-himik17:05:31

But yeah, what you're describing sounds exactly like tools.namespace or clj-reload.
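
A minimal sketch of the tools.namespace route (assuming org.clojure/tools.namespace is on the classpath; my.system/restart! is a stand-in for whatever stops and restarts the service):

(require '[clojure.tools.namespace.repl :as repl])

;; unload and re-require every namespace whose source file changed,
;; in dependency order
(repl/refresh)

;; or run a hook after a successful reload
(repl/refresh :after 'my.system/restart!)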

Bailey Kocin17:05:08

I am doing this because the functions defined by macros in these files are picked up by a service launched in another thread via an auto-dispatch mechanism on the metadata, but only at start. I am stopping the service on that thread, resetting everything, then starting the service, which picks up everything again with auto-dispatch. Yeah, intern or clj-reload is what I was looking for! Thanks