#data-science
2017-10-13
sb10:10:33

@fabrao do you need to learn the basics first, i.e. how a convolutional neural network works? After that, you will know the answer. There are lots of courses on the net; the best ones are in Python.

michaellindon13:10:47

Is anyone able to suggest a good automatic differentiation library for Clojure? One that can handle expressions with matrix operations?
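For context, a minimal sketch of the technique such a library automates: forward-mode automatic differentiation with dual numbers, shown here for scalar expressions only (the matrix support asked about is exactly the hard part this sketch does not cover).

```clojure
;; Forward-mode AD with dual numbers: each value carries a primal and a tangent.
;; Scalar-only sketch; a real library would extend these rules to matrix ops.
(defrecord Dual [v d])                      ; v = value, d = derivative

(defn constant [x] (->Dual x 0.0))
(defn variable [x] (->Dual x 1.0))          ; seed the variable we differentiate by

(defn d+ [{va :v da :d} {vb :v db :d}] (->Dual (+ va vb) (+ da db)))
(defn d* [{va :v da :d} {vb :v db :d}] (->Dual (* va vb) (+ (* da vb) (* va db))))
(defn dsin [{v :v d :d}] (->Dual (Math/sin v) (* d (Math/cos v))))

;; f(x) = x^2 + sin(x), so f'(3) = 2*3 + cos(3) ≈ 5.01
(let [x (variable 3.0)]
  (d+ (d* x x) (dsin x)))
;; => Dual with :v ≈ 9.141 and :d ≈ 5.010
```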

rustam.gilaztdinov13:10:45

@michaellindon thanks for your talk about Bayesian data analysis at clojure/conj, I just saw the video! I was wondering: why did you use commons.math instead of Neanderthal? http://neanderthal.uncomplicate.org

michaellindon13:10:28

@rustam.gilaztdinov thank you! I don't use commons.math for any linear algebra; I actually ported some of their distributions to Clojure and provided some idiomatic wrappers for interacting with them. Their code is well maintained and heavily used, so I trust their random number generation.
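A sketch of the kind of thin wrapping described here, using the Commons Math NormalDistribution class directly; the Clojure function names below are illustrative, not the ones in michaellindon's library.

```clojure
(import '[org.apache.commons.math3.distribution NormalDistribution])

;; A thin, idiomatic layer over a commons-math distribution object.
;; Function names here are illustrative, not from the library discussed above.
(defn normal [mu sigma] (NormalDistribution. mu sigma))

(defn pdf  [^NormalDistribution d x] (.density d x))
(defn cdf  [^NormalDistribution d x] (.cumulativeProbability d x))
(defn draw [^NormalDistribution d]   (.sample d))

(let [d (normal 0.0 1.0)]
  {:pdf-at-0     (pdf d 0.0)        ; ≈ 0.399
   :p-below-1.96 (cdf d 1.96)       ; ≈ 0.975
   :one-draw     (draw d)})
```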

michaellindon13:10:33

Incanter, for example, uses a Java library, Parallel Colt, for its random number generation, but Parallel Colt has not been maintained in a while, I think... I trust commons math more than Parallel Colt.

rustam.gilaztdinov13:10:43

Yes, but for Bayesian inference, performance is very important. Have you taken a look at Bayadera? It's built on top of Neanderthal and has its own distribution implementations. I think it's very cool.

michaellindon13:10:40

I know of Bayadera, but I don't know enough about its implementation details.

michaellindon13:10:49

I don't know what algorithms are being used for inference.

michaellindon13:10:42

If people are interested in the distributions library I talked about, it's here:

blueberry16:10:09

Note also that Bayadera's approach to RNG is a bit different. The linked distributions are only plain functions for computing interesting (scalar) values in Clojure. RNG is supposed to be done in bulk (billions of draws) directly from the (GPU) computing engine with the sampler's sample method (https://github.com/uncomplicate/bayadera/blob/master/src/clojure/uncomplicate/bayadera/core.clj#L190)

blueberry16:10:29

In that case, you are using (built-in or custom-written) OpenCL/CUDA distribution definitions. Unlike other libraries, which happily sample "easy" one-dimensional distributions or special n-dimensional ones, this is mainly beneficial when sampling wild hierarchical models. Of course, for easier stuff, you get billions of samples in the blink of an eye.
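A rough sketch of that bulk-sampling workflow as I understand it from Bayadera's README; the namespaces and names used below (with-default-bayadera, gaussian, sampler, sample, dataset, mean, sd) are assumptions, so check the linked source before relying on them.

```clojure
;; ASSUMED API: names below are pieced together from Bayadera's README and may
;; not match the actual library exactly; verify against the linked core.clj.
(require '[uncomplicate.commons.core :refer [with-release]]
         '[uncomplicate.bayadera.core :refer [gaussian sampler sample dataset mean sd]]
         '[uncomplicate.bayadera.opencl :refer [with-default-bayadera]])

(with-default-bayadera                        ; set up the GPU (OpenCL) engine
  (with-release [dist  (gaussian 100 15)      ; distribution defined on the device
                 smplr (sampler dist)         ; the sampler referred to above
                 data  (dataset (sample smplr (long 1e8)))]  ; one bulk draw of 10^8
    {:mean (mean data) :sd (sd data)}))
```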

blueberry16:10:36

The main thing to note is that the point of (MCMC) sampling is to get a sample from an unknown (posterior) distribution. Getting a sample from a known distribution is in many (most?) cases circular reasoning: if I know the distribution at hand, then I can answer questions about it directly; I don't need a sample from it.
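A concrete version of that point, sketched with the Commons Math normal distribution: when the distribution is known, the question is one direct call, and sampling it only re-estimates the same number.

```clojure
(import '[org.apache.commons.math3.distribution NormalDistribution])

;; Known distribution: P(X > 1.96) for X ~ N(0,1) is one direct call.
(def d (NormalDistribution. 0.0 1.0))
(- 1.0 (.cumulativeProbability d 1.96))       ; ≈ 0.025, exact up to numerics

;; Sampling the same known distribution merely approximates that number:
(/ (count (filter #(> % 1.96) (repeatedly 100000 (fn [] (.sample d)))))
   100000.0)                                   ; ≈ 0.025, plus Monte Carlo error
;; MCMC earns its keep when the posterior has no such closed form to call.
```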

michaellindon18:10:57

How does Bayadera generate samples from the posterior distribution?

michaellindon18:10:39

It's not clear from the slideshow I found online.

blueberry19:10:53

MCMC on the GPU

michaellindon20:10:07

you'll need to be more specific

michaellindon20:10:14

MCMC is a broad class of algorithms; there are many.
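To make "broad class" concrete, one of the simplest members is random-walk Metropolis, sketched below in plain Clojure; this says nothing about which variant Bayadera actually uses.

```clojure
;; Random-walk Metropolis: one member of the MCMC family.
;; log-target is the (unnormalized) log density of the target, e.g. a posterior.
(defn metropolis [log-target x0 step n]
  (loop [x x0, out (transient [])]
    (if (= (count out) n)
      (persistent! out)
      (let [x*      (+ x (* 2.0 step (- (rand) 0.5)))   ; symmetric uniform proposal
            accept? (< (Math/log (rand))
                       (- (log-target x*) (log-target x)))
            x'      (if accept? x* x)]
        (recur x' (conj! out x'))))))

;; Demo on a known target (standard normal), just to show the machinery:
(def draws (metropolis (fn [x] (* -0.5 x x)) 0.0 1.0 10000))
(/ (reduce + draws) (count draws))             ; ≈ 0 for a long enough chain
```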

michaellindon20:10:32

I can find no details on which MCMC algorithm they are using.