
i wasn't able to find anything on this -- how would i go about defining my own custom global vars, such as *assert* ? let's say i want to introduce a *my-assert* that's either set to true or false at compile time (configurable through lein), what would be the way?


i know i can probably use java system properties, but i was aiming for an approach that's portable between clj and cljs


(def ^:dynamic *my-thing*)


but can i then easily toggle *my-thing* through compile-time flags ?
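A quick sketch of the behavior in question (the root value `false` is an arbitrary choice here):

```clojure
;; a dynamic var with a root value; false is an arbitrary default
(def ^:dynamic *my-thing* false)

*my-thing*                      ;; => false (root value)

(binding [*my-thing* true]
  *my-thing*)                   ;; => true (thread-local binding)
```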


i'll want something like

:global-vars {*my-thing* false}
in profiles, in this specific case to enable/disable some instrumentation
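For reference, a project.clj sketch of that (the `:dev`/`:production` profile names are assumptions):

```clojure
;; project.clj fragment; profile names are assumptions
:profiles {:dev        {:global-vars {*my-thing* true}}
           :production {:global-vars {*my-thing* false}}}
```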


Something like

:injections [(set! *my-thing* false)]


:thinking_face: ok that works indeed


Leiningen also offers :global-vars precisely for that. IMO neither :injections nor :global-vars are clean solutions, because they don't transparently translate to an official Clojure feature. So instead I'd compile (or run) my code while explicitly binding the given var


how do you explicitly bind your var at compile time though? do you use a compiler option of some sort? let’s say I want to turn on/off some instrumentation that affects defn definitions, how would you approach this without any compiler flags?


let's say your app's main ns is called ... I'd create a ns called my.main. It would look like the following:

(ns my.main
  ;; important! no requires here
  )

(defn -main [& _]
  (binding [*assert* false]
    ;; compile the app's main ns with a specific binding; its name was
    ;; elided in the original message -- 'my.app.core is a hypothetical stand-in
    (require 'my.app.core)))
This way my.main would be the entrypoint in production. The technique works with AOT too


@lmergen Not portable, but CLJS offers goog-define for this. I bet you can make it portable using some .cljc though
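A sketch of that portable .cljc approach; the ns `my.config`, the flag name `INSTRUMENT`, and the system property are all hypothetical. The CLJS side is overridden via the `:closure-defines` compiler option, the CLJ side via a JVM system property:

```clojure
;; my/config.cljc -- ns and flag names are hypothetical
(ns my.config)

#?(:cljs (goog-define INSTRUMENT false)   ;; override with :closure-defines {my.config/INSTRUMENT true}
   :clj  (def INSTRUMENT                  ;; override with -Dmy.instrument=true
           (= "true" (System/getProperty "my.instrument"))))
```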


yeah I think I’ll do something like that


I'm trying to figure out when the appropriate time is to use a transducer's single arity


the completion arity?


I'm making a thing that is sort of like a stream. it has an "input" function which sets up listeners to other streams and returns the next value, and I want to accept a transducer to transform that. e.g.:

(stream (remove even?) #(inc @counter))
when the stream is initialized, it needs to run the input function and set its initial value. however, if e.g. counter starts at -1, then (remove even?) should reject it
(even? 0)
;; => true
but right now, when the stream doesn't have a value yet, I thought I should use the single-arity of the transducer. but that simply passes the first value to the reducer :thinking_face: meaning that the first value of the stream is 0
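That matches what `(remove even?)` actually returns when applied to a reduction function: the resulting rf's single arity is completion, not a filtering step, as a small sketch shows:

```clojure
;; applying a transducer to an rf yields a new rf
(def rf ((remove even?) conj))

(rf [] 0)  ;; step arity: 0 is even, so it is dropped => []
(rf [] 1)  ;; step arity: 1 passes through => [1]
(rf [1])   ;; single (completion) arity: no filtering, just returns [1]
```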


ah, it's meant for completion?


which arity do you mean: [], [x], or [x input]?


that's the completion arity

(transduce
  (map inc)
  (fn
    ([] (StringBuilder.))
    ([sb] (.toString sb))
    ([sb input] (doto sb (.append (str input)))))
  [1 2 3 4 5])
;; => "23456"


example is illustrative, it has reflective calls


user=> (transduce (map inc) (completing conj! persistent!) (transient []) [1 2 3 4 5])
[2 3 4 5 6]


completing is a higher-order function that adds a completion arity (here persistent!) to an existing reduction function (here conj!)


completion arity is useful for finalizing something after the "stepping" process is done


I guess then perhaps I should follow the tradition of reduce/transduce and allow the consumer to provide an initial value to the stream?


(stream (remove even?) #(inc @counter) 1)


not all transducible contexts take initial values, nor produce a finalized output. e.g. sequence or chan
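For example, sequence takes no user-supplied initial value and exposes no finalized output beyond the lazy seq itself:

```clojure
(sequence (remove even?) [0 1 2 3 4])
;; => (1 3)
```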


With next.jdbc and reducible result sets, there's no obvious initial value because you don't know how the rows are going to be processed. Is the user aggregating numbers? Building a new collection? So users always have to provide their own initial value.
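A sketch of that pattern; a plain vector of row maps stands in here for next.jdbc's reducible result set (jdbc/plan), which reduces the same way:

```clojure
;; stand-in for (jdbc/plan ds ["select amount from orders"])
(def rows [{:amount 10} {:amount 20} {:amount 5}])

;; the caller supplies the init: here 0 for a sum, but it could just as
;; well be a collection being built up
(reduce (fn [total row] (+ total (:amount row)))
        0
        rows)
;; => 35
```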