- # aleph (8)
- # announcements (9)
- # babashka (15)
- # beginners (91)
- # calva (54)
- # chlorine-clover (3)
- # cider (25)
- # clj-kondo (9)
- # cljfx (4)
- # cljsrn (12)
- # clojure (40)
- # clojure-australia (2)
- # clojure-europe (77)
- # clojure-nl (10)
- # clojure-spec (22)
- # clojure-uk (9)
- # clojurescript (39)
- # conjure (12)
- # cursive (8)
- # datascript (17)
- # datomic (22)
- # emacs (2)
- # expound (6)
- # fulcro (25)
- # kaocha (7)
- # malli (9)
- # meander (5)
- # off-topic (13)
- # pathom (8)
- # pedestal (5)
- # portal (1)
- # rdf (58)
- # re-frame (65)
- # reagent (15)
- # sci (3)
- # shadow-cljs (50)
- # test-check (6)
- # testing (3)
- # tools-deps (1)
- # vim (7)
- # xtdb (10)
@javahippie peace and long life
@dominicm always difficult to remember pleasantries when flushed with so much adrenaline
You could also drop furniture
@dominicm people outside of JUXT are also allowed to use Crux IIRC
Only if there wasn't already a database in use 😢 I do hate having a schema for the work I do. Makes everything harder.
@otfrom tables don't exist in Crux; they're only a fiction when using SQL to query data
cloud servicing the sky today
I seem to remember a library used for testing web apps built on compojure, but my google-fu is not strong enough today. I also seem to connect this library with someone working at JUXT. Anyone know what lib I'm looking for?
I could of course just state my problem. I'd like to be able to pull all the routes out of a compojure route-definition, if that makes sense.
not sure if that's possible @slipset, short of parsing the source yourself. AFAIK compojure works by composing closures, they're opaque functions
Other solution could be wrapping compojure macros and extracting the routes yourself before it goes into compojure
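A hedged sketch of that second idea (everything here is hypothetical, not compojure's own API for route introspection, which doesn't exist since routes compile to closures): keep the route table as plain data, so the paths stay enumerable, and only then hand it to compojure. `compojure.core/make-route` is a real compojure function; the table shape and handler bodies are made up for illustration.

```clojure
(ns routes.sketch
  (:require [compojure.core :as cc]))

;; Hypothetical: routes as data instead of being buried in closures.
(def route-table
  [[:get  "/users"     (fn [_req] {:status 200 :body "all users"})]
   [:post "/users"     (fn [_req] {:status 201 :body "created"})]
   [:get  "/users/:id" (fn [_req] {:status 200 :body "one user"})]])

(defn all-paths
  "Enumerate every path -- the thing compojure's closures hide."
  []
  (map second route-table))

;; Build the actual ring handler from the same data.
(def handler
  (apply cc/routes
         (for [[method path f] route-table]
           (cc/make-route method path f))))
```

The trade-off is that you give up compojure's `GET`/`POST` macro sugar (parameter destructuring in the route form), but the route list becomes inspectable and testable as ordinary data.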
In other news: spent the weekend refactoring some rather opaque Clojure code. One could argue that the job would have been easier if I were working in a statically typed language. One could also question whether this code could have been cleanly typed without ending up in the same mess. I guess what I'm saying is that the types written for the code as it was would have been as messy/opaque as the untyped code.
An argument that I could accept is that given a statically typed language you wouldn't write as messy code, as your types would indicate that your code is messy, but then again, you didn't need types to figure that out.
@slipset not sure what point you're really circling here but it's true that you can write messy code in any language
also messy might be OK in some situations, like a code spike but not for the long term
when I write messy code it's a sign that I don't understand the problem or that I haven't yet found the correct abstraction
which is to say that a lack of some knowledge contributes to the mess
but it also feels to me that you can see it more clearly in Clojure than most languages
mess is like art, you know it when you see it 🙂
I guess what I'm circling around is that it would have been easier to do this refactoring if I understood the data being passed around. This understanding would have been easier to come by if the code had been typed.
But then I would argue that the types that would have made this stuff even compile would probably have been so obscure that any insights would have been lost.
So I'm basically arguing for and against static typing, and handwaving my way to figuring out that static typing wouldn't have helped even though it seems like it would have on the surface
thus creating peace of mind for current practices, which is an evolutionary mechanism to preserve energy
I think messy code that you actually have to maintain benefits from documentation of some kind. If you're just trying to incrementally modify that code, then types are a bonus thanks to the refactoring capabilities they provide. But if someone spent time writing types, why didn't they instead choose to simplify and/or document?
I guess because typing (or spec'ing) it doesn't break the existing code. My refactoring will have bugs in it.
true. At this point in time I guess the only requirement is that it works as it always has, including whatever Hyrum's been up to.
All hail Hyrum. https://twitter.com/hyrumwright
Just automatically qualified deps in a large deps.edn file: https://github.com/borkdude/rewrite-edn/blob/master/examples/qualify_deps.clj
hmm... apache commons BZip2CompressorInputStream seems unusably slow for a file that inflates to 459MB
I'm reading on wikipedia that bzip2 has better compression at the cost of speed and memory usage
it would be worthwhile to measure the shelled out time against the Java time. Chances are bzip2 is just slow in general?
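A sketch of that measurement, under assumptions: the file name is made up, commons-compress is on the classpath, and a `bzip2` binary is on the PATH. `BZip2CompressorInputStream` and `clojure.java.shell/sh` (including `:out-enc :bytes`) are real APIs; the timing helper names are mine.

```clojure
(ns bzip2-bench
  (:require [clojure.java.shell :refer [sh]])
  (:import (org.apache.commons.compress.compressors.bzip2
            BZip2CompressorInputStream)
           (java.io BufferedInputStream FileInputStream)))

(def file "big.bz2") ; hypothetical input

(defn java-decompress-ms
  "Decompress via commons-compress, draining the stream; returns elapsed ms."
  []
  (let [start (System/nanoTime)]
    (with-open [in (-> (FileInputStream. file)
                       (BufferedInputStream.)   ; buffering matters, see note
                       (BZip2CompressorInputStream.))]
      (let [buf (byte-array 65536)]
        (while (pos? (.read in buf)))))
    (/ (- (System/nanoTime) start) 1e6)))

(defn shell-decompress-ms
  "Decompress by shelling out to bzip2 (-c to stdout, -k keeps the input);
  returns elapsed ms."
  []
  (let [start (System/nanoTime)]
    (sh "bzip2" "-dkc" file :out-enc :bytes)
    (/ (- (System/nanoTime) start) 1e6)))
```

One thing worth checking before blaming bzip2 itself: `BZip2CompressorInputStream` over an unbuffered `FileInputStream` is a well-known performance trap, so the `BufferedInputStream` wrapper alone can change the result dramatically.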
I have a function in my current buffer for un-gzipping natively in Java without deps:
(note: I untar at the end by shelling out; that part is only necessary when dealing with multiple files)
(defn un-tgz [^java.io.File zip-file ^java.io.File destination-dir verbose?]
  ;; assumes (:require [clojure.java.shell :refer [sh]]) and a `warn` logging fn,
  ;; plus (:import (java.nio.file Files)) in the ns form
  (when verbose?
    (warn "Unzipping" (.getPath zip-file) "to" (.getPath destination-dir)))
  (let [tmp-file (java.io.File/createTempFile "glam" ".tar")
        output-path (.toPath tmp-file)]
    (with-open [fis (Files/newInputStream (.toPath zip-file)
                                          (into-array java.nio.file.OpenOption []))
                zis (java.util.zip.GZIPInputStream. fis)]
      (Files/copy ^java.io.InputStream zis
                  output-path
                  ^"[Ljava.nio.file.CopyOption;"
                  (into-array [java.nio.file.StandardCopyOption/REPLACE_EXISTING])))
    (sh "tar" "xf" (.getPath tmp-file) "--directory" (.getPath destination-dir))
    (.delete tmp-file)))