#data-science
2017-10-29
michaellindon 17:10:14

Is there any support for sparse matrices in core.matrix or Neanderthal?

michaellindon 16:10:45

OK, I see. It seems like some support is there and some is missing.

gigasquid 18:10:35

I think core.matrix has sparse matrix support…

michaellindon 18:10:15

@gigasquid I think you are right; I now see sparse-matrix in the documentation. That allows one to represent a sparse matrix, but I wonder whether mmul of a sparse matrix with a sparse vector will work as I expect. I think it depends on which implementation core.matrix is using. I guess an implementation is needed that actually supports linear algebra operations on sparse matrices.
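
A minimal sketch of that experiment, assuming the :vectorz implementation (vectorz-clj) is on the classpath; new-sparse-array requests a sparse array from the current implementation and may fall back to dense:

```clojure
;; A minimal sketch, assuming the :vectorz implementation
;; (vectorz-clj) is on the classpath.
(require '[clojure.core.matrix :as m])

(m/set-current-implementation :vectorz)

;; new-sparse-array requests a sparse array of the given shape from
;; the current implementation (it may fall back to dense).
(def A (m/new-sparse-array [10000 10000]))
(m/mset! A 0 0 1.0)
(m/mset! A 9999 9999 2.0)

(def x (m/new-sparse-array [10000]))
(m/mset! x 0 3.0)

;; The open question: does mmul exploit the sparsity here, or does
;; it coerce both operands to dense first?
(m/mmul A x)
```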

michaellindon 18:10:52

I fear it might just convert to a dense matrix and use dense linear algebra operations

michaellindon 18:10:03

I've always been using the vectorz implementation.

michaellindon 18:10:15

Googling shows vectorz has some sparse support.

michaellindon 18:10:23

I'll do some timings and see what happens :)
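
One rough way to run those timings: compare a sparse multiply against an equivalent dense one using time from clojure.core (a benchmarking library such as criterium would give steadier numbers). Sketch only, under the same :vectorz assumption as above:

```clojure
;; Crude check: if the sparse multiply takes roughly as long as the
;; dense baseline, the sparsity probably isn't being exploited.
(let [n      2000
      sparse (m/new-sparse-array [n n])
      dense  (m/new-array [n n])
      v      (m/array (repeat n 1.0))]
  (m/mset! sparse 0 0 1.0)
  (time (m/mmul sparse v))   ; should be fast if sparsity is used
  (time (m/mmul dense v)))   ; dense O(n^2) baseline
```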

michaellindon 18:10:46

Thanks for your help

jsa-aerial 18:10:10

It is kind of hit or miss. Some operators work, but others (outer product, for example) went totally off the rails (taking a couple of sparse matrices and trying to move to dense => KABOOM).
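
For context on the blow-up: the outer product of two length-n vectors is an n × n matrix, so a dense fallback materializes n² entries even when the true result has almost none. A hypothetical illustration:

```clojure
;; With one nonzero in each input, the true outer product has a
;; single nonzero, but a dense fallback would try to allocate
;; n*n doubles (n = 100000 => ~80 GB): the KABOOM above.
(let [n 100000
      u (m/new-sparse-array [n])
      v (m/new-sparse-array [n])]
  (m/mset! u 0 1.0)
  (m/mset! v 1 1.0)
  (m/outer-product u v))
```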

michaellindon 16:10:53

What are you using sparse matrices for? I want to do some regression with very large but very sparse design matrices.
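
A hedged sketch of how such a regression might be set up via the normal equations (XᵀX)β = Xᵀy: the expensive products can exploit sparsity in X, while the solve runs on a small dense p × p system. It assumes clojure.core.matrix.linear/solve is supported by the active implementation; sparse-ols is an illustrative name, not a library function:

```clojure
(require '[clojure.core.matrix.linear :as lin])

(defn sparse-ols
  "Ordinary least squares for a (possibly sparse) design matrix X:
  solves (X^T X) beta = X^T y."
  [X y]
  (let [Xt  (m/transpose X)
        XtX (m/mmul Xt X)   ; p x p: small and dense when p << n
        Xty (m/mmul Xt y)]  ; length-p vector
    (lin/solve XtX Xty)))
```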

jsa-aerial 18:10:51

Work on a new generalized RNN design that has variadic neurons and inherent self-referential capabilities, and (so) doesn't need various special cases (such as for gates, etc.).

jsa-aerial 18:10:53

Actually, the need for sparseness is due to the structure of the network topology matrix. At the moment we are scratching our heads over how best to proceed on this...
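
Purely as a hypothetical illustration of where that sparsity comes from: if each neuron connects to only a few others, the n × n topology/weight matrix has O(n) nonzeros, and a propagation step is a sparse matrix-vector product. None of these names come from the actual design:

```clojure
(defn step
  "One propagation step of a hypothetical network: apply the sparse
  topology/weight matrix W to the state vector, then a nonlinearity."
  [W state]
  (m/emap #(Math/tanh %) (m/mmul W state)))
```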