This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
So my understanding of AD is in terms of dual numbers, see this section https://en.wikipedia.org/wiki/Automatic_differentiation#Automatic_differentiation_using_dual_numbers There's also a good blog post on it in Haskell here http://www.danielbrice.net/blog/2015-12-01/ It seems that it's easier to implement if you can overload your operators to work on your newly defined "dual numbers" in addition to doubles and ints and everything. Could this be an argument in favour of static typing, which the Clojure community is against? In this Clojure AD stub you can see that they exclude the imports of the core operators +, -, * etc. and then define their own operators to work on dual numbers https://github.com/log0ymxm/clj-auto-diff/blob/master/src/ad/core.clj
Now your function maps dual numbers to dual numbers; if you evaluate it at (x, 1) you get (f(x), f'(x)) out, so just reading off the second number gives you the derivative
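To make the dual-number picture concrete, here is a minimal Python sketch (all names are made up for illustration, not taken from clj-auto-diff): overloading `+` and `*` on a `Dual` type is enough to make f'(x) fall out of ordinary arithmetic.

```python
# A dual number a + b*eps carries the value in `a` and the derivative
# in `b`; eps^2 = 0 makes the product rule emerge from plain arithmetic.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value   # f(x)
        self.deriv = deriv   # f'(x)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def derivative(f, x):
    # Seed the derivative slot with 1, then read off the dual part.
    result = f(Dual(x, 1.0))
    return result.value, result.deriv

# f(x) = x^2 + 3x  =>  f'(x) = 2x + 3
print(derivative(lambda x: x * x + 3 * x, 2.0))   # (10.0, 7.0)
```

This is exactly the overloading approach the clj-auto-diff stub takes by excluding and redefining the core operators.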
I seem to remember that there isn't much flexibility to change the behaviour of basic operators like + and - in Clojure
That part I get. Also, Clojure has such a small language core that a dispatch/operator table would be a fine solution
The part I get a bit confused about is the backward pass, and then the tracing through so that you can handle logic operators
Still haven’t found time to look at all the docs though, so I’m sure I’m still missing big chunks
I'm not sure how the logic operators work. The example I have in my mind is a piecewise linear function, say (defn abs [x] (if (< x 0) (- x) x)). This is not differentiable at zero
I'm not sure what AD would do here; perhaps, as was said earlier, the logical statement cannot depend on the variable itself
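As a hedged illustration of what a dual-number implementation would typically do with that branch (a Python sketch, not clj-auto-diff's actual behaviour): the comparison inspects only the primal value, so AD silently differentiates whichever branch is taken. At x = 0 it just returns the derivative of the x >= 0 branch, even though abs is not differentiable there.

```python
# Minimal Dual with only the pieces abs needs: negation and comparison.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __neg__(self):
        return Dual(-self.value, -self.deriv)

    def __lt__(self, other):
        # The branch decision depends only on the primal value;
        # the derivative component never influences control flow.
        return self.value < other

def d_abs(x):
    v = Dual(x, 1.0)
    r = -v if v < 0 else v   # mirrors (if (< x 0) (- x) x)
    return r.deriv

print(d_abs(-2.0), d_abs(3.0), d_abs(0.0))   # -1.0 1.0 1.0
```

So the piecewise function "works", but the answer at the kink is an artifact of which branch the comparison happens to pick.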
@michaellindon In the docs here, it says that it can handle conditional branches and whiles https://github.com/HIPS/autograd/blob/master/docs/tutorial.md
@michaellindon: 'not everything is differentiable everywhere' - that's the point that Dragan made a while back. Also, doing this with overloading is not optimal (or method dispatch, which would be even worse). The 'right' way is via program transformation. I believe @sophiago is working on a version which does this.
@michaellindon just that - the original code is transformed to calculate the derivatives along with the original calculation. There are quite a few papers on this. Actually, I just checked and I see it is even mentioned in that wikipedia article: https://en.wikipedia.org/wiki/Automatic_differentiation#Source_code_transformation_.28SCT.29
Doesn't really say much there though. Another reason the overloading approach can be very suboptimal is that you will likely be allocating memory at a high rate (and then either manually cleaning up or exercising the GC, either of which adds loads of operations to the basic operation you really want)
@jsa-aerial It's hard for me to imagine how code transformation is different from symbolic differentiation
I can imagine specifying a set of term-rewriting rules which accepts a form and spits out another for computing the derivative, but the lines between symbolic differentiation and source-code-transformation AD seem to blur
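A toy version of such rewriting rules, sketched in Python with tuples standing in for Clojure forms (everything here is made up for illustration): the rewriter takes code as data and emits new code for the derivative, which is why the line between symbolic differentiation and source transformation blurs.

```python
def d(expr, var):
    """Emit an expression for the derivative of expr w.r.t. var."""
    if isinstance(expr, (int, float)):
        return 0
    if isinstance(expr, str):
        return 1 if expr == var else 0
    op, a, b = expr
    if op == '+':
        return ('+', d(a, var), d(b, var))
    if op == '*':
        # product rule, emitted as code rather than evaluated
        return ('+', ('*', d(a, var), b), ('*', a, d(b, var)))
    raise ValueError(op)

def ev(expr, env):
    """Interpret an s-expression against an environment."""
    if isinstance(expr, (int, float)):
        return expr
    if isinstance(expr, str):
        return env[expr]
    op, a, b = expr
    return ev(a, env) + ev(b, env) if op == '+' else ev(a, env) * ev(b, env)

# d/dx (x*x + 3*x) at x = 2 is 2x + 3 = 7
expr = ('+', ('*', 'x', 'x'), ('*', 3, 'x'))
print(ev(d(expr, 'x'), {'x': 2.0}))   # 7.0
```

A real SCT system does much more than this (control flow, bindings, simplification), which is where the "mini compiler" framing comes in.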
'Basically' you have to write a specialized mini compiler. This will take the original source, generate an AST and symbol table, then you need some phases for attribute synthesis and then finally a phase for code generation. It is a lot more than just a few rules as for symbolic differentiation. The 'mini compiler' aspect though is an indication of how lisp languages can shine here - code is data - as for example: https://github.com/Functional-AutoDiff/STALINGRAD.
A compiler for the VLAD language? Nobody is using it. Why didn't they create something that serves a larger community, like Racket or Clojure?
Certainly somebody is using it, maybe just the authors. If they really were after the widest audience reach, they'd probably choose Java as a target, or .NET, or PHP (the papers are from 2008), I guess...
Well, there are also some C++ libs for this (from the 'Almost Anything is Possible Dept' 😱) if that helps
But even that should be ok. They seem to have had a NSF grant, they did the work, someone got the PhD, goals accomplished. I doubt they were in the game to serve Clojure and Racket community for free 🙂
Also, in 2008 Clojure was barely known and the work was probably mostly done in the 2-3 years prior
That's an overloading approach that 'works', but is not exactly at the optimal level...
It's a good example of how memory use can be a killer here. All those recs being allocated...
@gigasquid "Python control flow operations are invisible to Autograd. In fact, it greatly simplifies the implementation". Sounds like Autograd just ignores conditionals.
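To see what the "tracing" amounts to, here is a toy tape-based reverse-mode sketch in Python (hypothetical, and vastly simpler than Autograd): the `if` never reaches the tape, it only decides which primitive ops get recorded, which is one reading of control flow being "invisible".

```python
# Each multiplication records itself, and the "backwards part" replays
# the tape in reverse, accumulating adjoints via the chain rule.

class Var:
    def __init__(self, value, tape=None):
        self.value = value
        self.grad = 0.0
        self.tape = [] if tape is None else tape

    def __mul__(self, other):
        out = Var(self.value * other.value, self.tape)
        # Record the op and its local derivatives; this is all the
        # backward pass ever sees -- the `if` is never recorded.
        self.tape.append((out, self, other,
                          lambda g: (g * other.value, g * self.value)))
        return out

def grad(f, x):
    v = Var(x)
    out = f(v)
    out.grad = 1.0
    for node, a, b, back in reversed(v.tape):
        ga, gb = back(node.grad)
        a.grad += ga
        b.grad += gb
    return v.grad

# The branch only picks which ops get taped; it leaves no trace itself.
def f(x):
    return x * x if x.value > 0 else x * x * x

print(grad(f, 3.0), grad(f, -2.0))   # 6.0 12.0
```

The gradient of the *taken* branch is computed exactly; the conditional itself contributes nothing, which matches the Autograd docs' claim that control flow "greatly simplifies the implementation".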
Just that it's not something that leads to any viable solution to the topic at hand.
I did look at clj-auto-diff a fair amount and do think there is a middle ground here. One that isn't full code transformation, but which doesn't need to continually allocate memory, and which can 'resolve' method calls 'statically'. But it occurred to me that if you go that route, may as well go the Full Monty...