#data-science
2019-11-30
gigasquid 01:11:11

Hot off the press from AWS - a deep learning library for Java (DJL)

🎉 8
gigasquid 01:11:46

Haven’t looked at it deeply yet - but it looks like another avenue for us to leverage

gigasquid 01:11:50

From the contributor list - a lot of AWS MXNet folks are involved

viesti 10:11:27

;; deps.edn
{:deps {;; SLF4J binding so DJL's logging output is visible
        org.apache.logging.log4j/log4j-slf4j-impl {:mvn/version "2.12.1"}
        ;; DJL model zoo for MXNet
        ai.djl.mxnet/mxnet-model-zoo {:mvn/version "0.2.0"}
        ;; native MXNet (MKL build), selected via the osx-x86_64 Maven classifier
        ai.djl.mxnet/mxnet-native-mkl$osx-x86_64 {:mvn/version "1.6.0-a"}}}

;; example
(ns clj-djl.core
  (:require [clojure.java.io :as io])
  (:import (ai.djl.modality.cv.util BufferedImageUtils)
           (ai.djl.mxnet.zoo MxModelZoo)
           (ai.djl.training.util ProgressBar)
           (ai.djl.modality.cv ImageVisualization)
           (javax.imageio ImageIO)))

(defn example []
  ;; the image URL was elided in the archived log; any image URL works here
  (let [img (BufferedImageUtils/fromUrl "")]
    (with-open [model (.loadModel (MxModelZoo/SSD) (ProgressBar.))]
      (let [predict-result (-> model
                               (.newPredictor)
                               (.predict img))]
        ;; draw the detected bounding boxes onto the image and save it
        (ImageVisualization/drawBoundingBoxes img predict-result)
        (ImageIO/write img "png" (io/file "ssd.png"))))))

viesti 10:11:46

I got a picture 🙂

viesti 10:11:58

the Java API interops nicely

viesti 10:11:30

learned how to use a Maven classifier in deps.edn too 🙂
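
For reference, a minimal sketch of that classifier syntax: tools.deps selects a Maven classifier with groupId/artifactId$classifier in the lib name. The linux-x86_64 coordinate below is an assumed alternative to the osx-x86_64 one used above, not something from this thread.

;; deps.edn: the part after $ is the Maven classifier
;; (linux-x86_64 shown as an assumed alternative to osx-x86_64)
{:deps {ai.djl.mxnet/mxnet-native-mkl$linux-x86_64 {:mvn/version "1.6.0-a"}}}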

viesti 10:11:36

forgot to close the Predictor in the above; seems it implements AutoCloseable
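
Since Predictor implements AutoCloseable, a corrected sketch of the example closes it with with-open as well (with-open closes the bindings in reverse order):

(defn example []
  ;; image URL elided, as in the original example
  (let [img (BufferedImageUtils/fromUrl "")]
    (with-open [model     (.loadModel (MxModelZoo/SSD) (ProgressBar.))
                predictor (.newPredictor model)]
      (let [predict-result (.predict predictor img)]
        (ImageVisualization/drawBoundingBoxes img predict-result)
        (ImageIO/write img "png" (io/file "ssd.png"))))))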

viesti 11:11:12

not meaning to start a wrapper, but I put the above example into a repo: https://github.com/viesti/clj-djl/blob/master/src/clj_djl/core.clj

viesti 11:11:15

the Java API might not need a wrapper at all, since interop works nicely

gigasquid 13:11:33

Great! Thanks for blazing the trail and trying it out :) I plan to check it out too in the near future

David Pham 14:11:41

It seems great. (Wish it were more functional, though) xD

viesti 19:11:39

More tinkering (was looking at MXNet tutorials: https://mxnet.apache.org/api/python/docs/tutorials/getting-started/crash-course/1-ndarray.html):

clj-djl.core> (do (import (ai.djl.ndarray NDManager))
                  (import (ai.djl.ndarray.types Shape))
                  ;; NDManager owns the native memory backing NDArrays;
                  ;; with-open releases it when done
                  (with-open [nd-manager (NDManager/newBaseManager)]
                    ;; 2x2 float32 array sampled uniformly from (-1, 1),
                    ;; matching the tutorial's nd.random.uniform(-1, 1, ...)
                    (println (.randomUniform nd-manager -1 1 (Shape. [2 2])))))
#object[ai.djl.mxnet.engine.MxNDArray 0x7b62d17b ND: (2, 2) cpu(0) float32
[[ 0.1527, -0.2471],
 [-0.2918,  0.2312],
]
]

👍 4
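
Following the same crash-course thread, a small hedged sketch of element-wise operations; it assumes NDManager's ones and the NDArray add/mul methods are available in this DJL version, which is not shown in the thread itself:

(with-open [nd-manager (NDManager/newBaseManager)]
  (let [x (.ones nd-manager (Shape. [2 2]))                ;; 2x2 array of ones
        y (.randomUniform nd-manager -1 1 (Shape. [2 2]))] ;; 2x2 uniform(-1, 1)
    (println (.add x y))    ;; element-wise sum
    (println (.mul x y))))  ;; element-wise product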
viesti 19:11:19

seems that DJL has a JNA wrapper around MXNet

viesti 19:11:21

which is quite interesting, they have a tool to generate JNA mappings from the MXNet C header file: https://github.com/awslabs/djl/tree/master/mxnet/jnarator
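
As a rough illustration of what such a JNA layer does, here is a hypothetical sketch calling one MXNet C API function (MXGetVersion, from c_api.h) directly through JNA from Clojure; it assumes the native libmxnet is already on JNA's library path, which DJL normally arranges itself:

(import '(com.sun.jna Function)
        '(com.sun.jna.ptr IntByReference))

;; MXGetVersion(int *out) writes the version into the out-parameter
(let [get-version (Function/getFunction "mxnet" "MXGetVersion")
      out         (IntByReference.)]
  (.invokeInt get-version (object-array [out]))
  (.getValue out)) ;; e.g. 10600 for MXNet 1.6.0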

viesti 19:11:53

this generator tool reminds me of SWIG (http://www.swig.org/), although that generates straight JNI from C headers

viesti 19:11:02

adopting new features from MXNet might be faster via a generator

gigasquid 21:11:48

interesting - @chris441 brought up JNA for TVM / MXNet a bit ago https://discuss.tvm.ai/t/clojure-bindings-for-tvm/1127

viesti 12:12:29

This 😄
> We would also like to see more java integrations to DMLC projects that allow easier access to multiple languages; ones that force unnecessary dependencies on the scala compiler also force dependencies on the root jvm runtime (mxnet, we are looking at you).

gigasquid 21:11:03

The low-level mappings are great - but there is a lot of other work built on the higher-level inference / training / dataset stuff too

gigasquid 21:11:14

At least in the main MXNet library

gigasquid 21:11:57

That said - it definitely is a nice accessible point of integration

gigasquid 21:11:28

definitely could be useful for new stuff

gigasquid 21:11:21

Please feel free to keep experimenting and pursue any good avenues. I don’t have a lot of time just now to dive in. Probably won’t have any until after the holidays 🙂