#clojure
2023-11-30
onetom05:11:51

Is there a way to make clojure.core/conj operate on java.nio.file.Path? so instead of this:

(-> "" java.net.URI. java.nio.file.Path/of
      (as-> p
            (reduce #(.resolve ^java.nio.file.Path %1 ^String %2)
                    p
                    ["asd" "qwe/zxc"])))
; => #object[software.amazon.nio.spi.s3.S3Path 0x66f6cc75 "/asd/qwe/zxc"]
i can just write this:
(-> "" java.net.URI. java.nio.file.Path/of (conj "asd" "qwe/zxc"))

onetom05:11:23

i tried

(extend-type java.nio.file.Path
    clojure.lang.IPersistentCollection
    (cons [^java.nio.file.Path p ^String more]
      (.resolve p more)))
but got this error:
Execution error (IllegalArgumentException) at ...
interface clojure.lang.IPersistentCollection is not a protocol
which is true, but disappointing. iirc, Rich Hickey mentioned in one of his talks that if he could start over again, he would have used protocols as the basis of clojure.core instead of interfaces. (or something along those lines) i guess that would have allowed this kind of extension, right?
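(For illustration only: a minimal sketch of what such a protocol-based extension point could look like. PathConj, path-conj and base-path are hypothetical names, not part of clojure.core.)

(defprotocol PathConj
  (path-conj [p segment] "Resolve a single segment against a path-like value."))

(extend-protocol PathConj
  java.nio.file.Path
  (path-conj [^java.nio.file.Path p segment]
    (.resolve p ^String segment)))

;; reduce plays the role of multi-argument conj here:
(reduce path-conj base-path ["asd" "qwe/zxc"])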

Linus Ericsson07:11:31

I think you’re right. Maybe you could use proxy or reify to create a custom implementation of Path, but I’m not really sure. But - if we rewind a bit - why do you want to overload conj with this particular operation? This example seems like a perfect use case for a clearly named utility function. I would be surprised to see someone bolt this quite domain-specific, string-like thing onto conj or other collection operations in real-life code.

onetom08:11:33

i just felt conj would make (some) sense, because it already has the iteration built into it, and i would rather use existing core functions than introduce new ones, which feels a bit like parochialism.

onetom08:11:22

in the end i did write these utility functions:

(import 'java.nio.file.Path)

(defn- join1 [^Path p ^String segment]
  (.resolve p segment))

(defn join [^Path p & segments]
  (reduce join1 p segments))
called it join because the standard libs in other languages call it join. (resolve doesn't make a lot of sense to me)
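(For reference, a usage sketch; base-path stands for any java.nio.file.Path and is hypothetical here.)

(join base-path "asd" "qwe/zxc")
;; same as (-> base-path (.resolve "asd") (.resolve "qwe/zxc"))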

p-himik08:11:30

join sounds like it would turn a and /b into a//b or a/b. resolve makes it sound more like it turns it into /b. Though in Python both os.path.join and pathlib.Path.joinpath work as the latter, but it's not immediately intuitive from the name, and the second function doesn't document the behavior.
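(A small sketch of that distinction using Path.resolve directly, assuming a Unix-style default filesystem.)

(let [base (java.nio.file.Path/of "/a" (into-array String []))]
  [(str (.resolve base "b"))    ; => "/a/b"  a relative segment is appended
   (str (.resolve base "/b"))]) ; => "/b"    an absolute segment replaces the base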

onetom08:11:04

would my.fs.utils/conj make more sense?

p-himik08:11:16

IMO conj sounds like what I described join as sounding like.

👍 1
p-himik08:11:20

Heh, C++ didn't even add a method for it - it's just overloading the / operator (Python also does that, but there's also a method).

p-himik08:11:52

C# calls it Combine, I think it's also clearer and more intuitive than join/`conj`/"concat" (what C++ calls it in the docs)/`/`.

p-himik08:11:17

BTW if you don't need that behavior with absolute segments, any name would do. And you might already be able to use io/file for that (no clue about other protocols - don't have them):

(-> "file:///a/b" java.net.URI. java.nio.file.Path/of
    (io/file "c" "d" "e")
    .toPath)
=> #object[sun.nio.fs.UnixPath 0x51818c79 "/a/b/c/d/e"]

onetom10:11:27

yeah, the problem with mixing io/file into the picture is that it's not aware of the protocol/`java.nio.file.FileSystem` concept, which is why we are transitioning to using Paths

👍 1
onetom03:12:32

making conj work would imply that into works too, since into uses conj as its reduction function by default, so (into p ["asd" "qwe/zxc"]) should also work. combine doesn't suggest ordering, but rather something more intricate.
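(A rough illustration of that relationship with an ordinary collection; into is essentially a reduce over conj.)

(reduce conj [] ["asd" "qwe/zxc"])
;; => ["asd" "qwe/zxc"], same result as (into [] ["asd" "qwe/zxc"])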

Stefano08:11:31

Hi everyone, apologies if this has already been asked tons of times. Do you know why the primitive ^long type hint causes this reflection warning? It looks like the return type hint is ignored (it sees an Object instead of a String) and it's looking for a field called length (instead of the method)

(set! *warn-on-reflection* true)

(defn append-1 ^String [^String s ^long n] (str s "-" n))
(defn append-2 ^String [^String s ^Long n] (str s "-" n))

(.length (append-1 "some-string" 1)) ; => Reflection warning - reference to field length on java.lang.Object can't be resolved.
(.length (append-2 "some-string" 2)) ; => No warnings

p-himik09:11:44

When you use primitive type hints for a function, the compiler tries to find the matching interface in IFn. In this case, it finds this:

static public interface OLO{Object invokePrim(Object arg0, long arg1);}
Interfaces with primitives take higher priority, so the function ends up being essentially (defn append-1 [s ^long n] ...).
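(A quick way to observe this at the REPL, reusing append-1 from above; IFn$OLO is the interface quoted here.)

(instance? clojure.lang.IFn$OLO append-1)
;; => true, so calls go through invokePrim, whose return type is Object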

p-himik09:11:57

And the fix here would be to remove type hints from the arguments. Or at least from the second argument.
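(A sketch of that fix for the example above; without the ^long hint, the ^String return hint takes effect again.)

(defn append-1 ^String [^String s n] (str s "-" n))
(.length (append-1 "some-string" 1)) ; no reflection warning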

Stefano09:11:50

Thanks for the answer! I think I'll remove the type hint from the argument and move it into the body of the function

Stefano09:11:27

Did you read that in the source code? Can you please point me to it?

Stefano09:11:26

Thank you! I stripped down the example so it's not obvious here, but I need the type hint on the argument to be able to call a Java method that expects a primitive long, and I don't want to box it into a Long unnecessarily
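(In that case, one option, sketched here, is to keep the ^long hint for the interop call and type-hint the result at the call site instead.)

(defn append-1 [^String s ^long n] (str s "-" n))
(.length ^String (append-1 "some-string" 1)) ; no reflection warning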

👍 1
oyakushev11:11:45

I think this is a bug. Matching against a primitive interface shouldn't prevent type hints from propagating. Does this deserve an AskClojure thread, @U064X3EF3?

Stefano16:11:45

I can create the thread on AskClojure and post the link here, if you want (and if someone hasn't already)

oyakushev18:11:05

Would be great if you did!

Brett10:11:52

Hi everyone. Question specific to dtype-next. Fantastic library, I wish there were more tutorials / documentation. I have a simple struct:

(dt-struct/define-datatype! :vec3_t [{:name :x :datatype :float32}
                                     {:name :y :datatype :float32}
                                     {:name :z :datatype :float32}])
I have a C API expecting an array of these structs:
void some_func(vec3_t* data);
I'm building an array of structs (I'm parsing a list of vertices, parsed-content):
(def vertices-array (dt-struct/new-array-of-structs :vec3_t (count (:vertices parsed-content))))
Then I want to set all the structs in my array:
(let [xcol (dt-struct/array-of-structs->column vertices-array :x)
      ycol (dt-struct/array-of-structs->column vertices-array :y)
      zcol (dt-struct/array-of-structs->column vertices-array :z)]
  (doseq [idx (range (count vertices-array))
          [x y z] (:vertices parsed-content)]
    (dt/set-value! xcol idx (Float/parseFloat x))
    (dt/set-value! ycol idx (Float/parseFloat y))
    (dt/set-value! zcol idx (Float/parseFloat z)))
  )
Not sure this is the best way to do it. It seems to work, but I'm ending up with all the elements set to the same values!
[{:x -12.250777, :y -3.481309, :z 7.733591} {:x -12.250777, :y -3.481309, :z 7.733591} ...]
Two questions:
* What is the canonical way of building an array of structs?
* Any idea why I'm ending up with the same values for all elements?

delaguardo10:11:54

> Any idea why I'm ending up with the same values for all elements?
most likely because array-of-structs->column returns a new object, so any manipulations of it will not be reflected in vertices-array

Brett12:11:16

Ok that makes sense. I'll try something else

Brett14:11:12

Ok, it's not because there is a copy, but because I didn't use doseq correctly: with two bindings it will iterate i * j times rather than i times when binding values for idx and [x y z] (if that makes sense)
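(For reference, a sketch of the corrected loop using the same names as above; map-indexed pairs each vertex with its index, so the body runs once per vertex.)

(let [xcol (dt-struct/array-of-structs->column vertices-array :x)
      ycol (dt-struct/array-of-structs->column vertices-array :y)
      zcol (dt-struct/array-of-structs->column vertices-array :z)]
  (doseq [[idx [x y z]] (map-indexed vector (:vertices parsed-content))]
    (dt/set-value! xcol idx (Float/parseFloat x))
    (dt/set-value! ycol idx (Float/parseFloat y))
    (dt/set-value! zcol idx (Float/parseFloat z))))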

István Karaszi13:11:47

If I have a schema definition with clojure spec defined with s/keys, and I would like to transform that schema and make all of the keys optional, what is the easiest way to do it?

István Karaszi13:11:18

I mean programmatically

István Karaszi13:11:06

like I have something like this:

(s/def ::test (s/keys :opt-un [::foo] :req-un [::bar]))

István Karaszi13:11:31

and I need a way to programmatically convert that to a definition of:

(s/keys :opt-un [::foo ::bar])

vanelsas13:11:25

Perhaps if you know under which condition you need the different formats, you could test for that condition and use the correct spec? Or is that not what you are looking for?

István Karaszi13:11:03

I have a definition where certain keys are required, and in other use-cases all of the keys are optional. I don’t want to write the spec down again, to avoid mistakes.

p-himik13:11:43

You can write a macro that does that - you'd have two separate specs and you'd choose between them at run time. Maybe you can use spec-tools for that, not sure. But if you need programmatic spec transformation more than once, I'd choose a different tool, e.g. malli.

István Karaszi13:11:36

I’ve come up with this so far:

(defmacro all-keys-optional [spec]
  (let [{:keys [req req-un opt opt-un]} (apply hash-map (drop 1 (s/form spec)))
        optional {:opt (vec (concat req opt))
                  :opt-un (vec (concat req-un opt-un))}
        spec-def (into {} (filter (comp not-empty second) optional))]
    spec-def
    #_(s/keys ~@spec-def)))

mpenet13:11:45

if it's clj (not cljs) you can just use eval and skip the macro, depends how much you have to do this

mpenet13:11:59

(def keys [::foo ::bar ::baz])
(eval `(s/def ::full (s/keys :req ~keys)))

mpenet13:11:59

I guess you could also make a macro that would parse the s/form of the original spec and write the new one for opts

István Karaszi13:11:23

I have a hard time applying the map as keyword args to s/keys

p-himik13:11:21

You don't need a map in the first place. But you can use (apply concat ...).

István Karaszi13:11:03

(defn all-keys-optional [spec]
  (let [{:keys [req req-un opt opt-un]} (apply hash-map (drop 1 (s/form spec)))
        optional {:opt (vec (concat req opt))
                  :opt-un (vec (concat req-un opt-un))}
        spec-def (apply concat (filter (comp not-empty second) optional))]
    (eval `(s/keys ~@spec-def))))
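(A usage sketch based on the earlier ::test definition; the ::foo, ::bar and ::test-optional registrations here are illustrative.)

(s/def ::foo string?)
(s/def ::bar int?)
(s/def ::test (s/keys :opt-un [::foo] :req-un [::bar]))

(s/def ::test-optional (all-keys-optional ::test))

(s/valid? ::test {})          ; => false, :bar is required
(s/valid? ::test-optional {}) ; => true, every key is optional now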

István Karaszi13:11:08

so this should be working, right?

István Karaszi13:11:09

gosh, it was working, I had another problem 😄

😅 1
p-himik13:11:54

s/keys also has the :gen argument, if you care about that.

István Karaszi14:11:56

thank you very much!

👍 1
Ryan Jerue17:11:19

Hehe, there was just a Clojure shout-out from Nubank in the AWS re:Invent keynote :)

💜 17