2017-10-29
Channels
- # aws (2)
- # bangalore-clj (2)
- # beginners (36)
- # boot (10)
- # cider (9)
- # cljs-dev (19)
- # clojure (47)
- # clojure-russia (4)
- # clojure-spec (18)
- # clojure-uk (4)
- # clojurescript (71)
- # core-async (20)
- # core-logic (2)
- # css (3)
- # cursive (5)
- # data-science (15)
- # datomic (7)
- # emacs (13)
- # figwheel (4)
- # klipse (1)
- # luminus (5)
- # lumo (1)
- # off-topic (33)
- # re-frame (17)
- # shadow-cljs (1)
- # spacemacs (5)
- # specter (21)
- # unrepl (1)
- # vim (7)
Is there any support for sparse matrices in core.matrix or Neanderthal?
OK, I see. It seems like some support is there, but some is missing.
@michaellindon here is an example from when I was playing around with hypervectors https://github.com/gigasquid/hyperdimensional-playground/blob/master/src/hyperdimensional_playground/core.clj
@gigasquid I think you are right; I now see `sparse-matrix` in the documentation. That lets you represent a sparse matrix, but I wonder whether `(mmul sparse-matrix sparse-vector)` will work as I expect. I think it depends on which core.matrix implementation is being used; you would need an implementation that actually supports linear algebra operations on sparse matrices.
I fear it might just convert to a dense matrix and use dense linear algebra operations
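For reference, a minimal sketch of the question being raised here, assuming vectorz-clj is on the classpath (it registers the `:vectorz` implementation); the shapes and entries are invented for illustration:

```clojure
;; Minimal sketch, assuming vectorz-clj is on the classpath.
(require '[clojure.core.matrix :as m])
(m/set-current-implementation :vectorz)

;; A large sparse matrix and sparse vector with a few non-zero entries
;; (sizes and values are made up for illustration).
(def A (m/new-sparse-array [10000 10000]))
(m/mset! A 0 0 1.0)
(m/mset! A 42 99 2.5)

(def v (m/new-sparse-array [10000]))
(m/mset! v 0 3.0)

;; The open question: does the result stay sparse, or does the
;; implementation fall back to dense operations?
(def Av (m/mmul A v))
(m/sparse? Av)
```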
I've always been using the vectorz implementation
Googling shows vectorz has some sparse support
I'll do some timings and see what happens :)
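A rough way to do those timings with `time`; the size `n` and the single non-zero entry are arbitrary, and this assumes the `:vectorz` setup from the sketch above:

```clojure
;; Rough timing sketch: compare a dense and a sparse matrix-vector
;; product under the :vectorz implementation.
(let [n      2000
      dense  (m/mutable (m/zero-matrix n n))
      sparse (m/new-sparse-array [n n])
      v      (m/array (repeat n 1.0))]
  (m/mset! dense 0 0 1.0)
  (m/mset! sparse 0 0 1.0)
  (time (m/mmul dense v))    ; always O(n^2) work
  (time (m/mmul sparse v)))  ; much cheaper if sparsity is actually exploited
```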
Thanks for your help
It is kind of hit or miss. Some operations work, but others (outer product, for example) went totally off the rails (taking a couple of sparse matrices and trying to convert to dense => KABOOM)
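A sketch of that failure mode; the vector length here is hypothetical:

```clojure
;; The outer product of two length-n sparse vectors has n^2 entries, so a
;; dense fallback with n = 1,000,000 means a 10^12-element result => KABOOM.
(def u (m/new-sparse-array [1000000]))
(def w (m/new-sparse-array [1000000]))
(m/mset! u 3 1.0)
(m/mset! w 7 2.0)
;; (m/outer-product u w) ; only safe if the implementation keeps this sparse
```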
What are you using sparse matrices for?
I want to do some regression with very large but very sparse regression matrices
Work on a new generalized RNN design that has variadic neurons and inherent self-referential capabilities, and (so) doesn't need various special cases (such as for gates, etc.)
Actually, the need for sparseness is due to the structure of the network topology matrix. At the moment we are scratching our heads over how best to proceed on this...