#shadow-cljs
2024-04-11
Vladimir Pouzanov 13:04:04

does shadow-cljs still run things through the Google Closure Compiler even with :js-provider :shadow? I seem to get it choking on one of the npm dependencies with IllegalStateException: Expected a property access or array pattern: OBJECT_PATTERN, and I'm trying to figure out how to get around that. Is something like this https://code.thheller.com/blog/shadow-cljs/2020/05/08/how-about-webpack-now.html what I have to do for now?

thheller 13:04:31

yes, shadow-cljs only uses the closure compiler for processing npm packages

thheller 13:04:50

unfortunately the number of packages that don't work keeps growing, so I may have to change that at some point

thheller 13:04:29

:js-provider :external is the "simplest" transition if you must use packages that are breaking
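
For reference, a minimal sketch of what the :external setup can look like in shadow-cljs.edn; the build id, paths, and module names here are illustrative, not taken from this thread:

```
;; shadow-cljs.edn (sketch)
{:builds
 {:app
  {:target :browser
   :output-dir "public/js"
   :modules {:main {:init-fn app.main/init}}
   :js-options
   {:js-provider :external
    ;; instead of bundling npm code itself, shadow-cljs writes all
    ;; npm requires into this index file
    :external-index "target/external-index.js"}}}}
```

The generated index file is then bundled by a separate JS tool (webpack, esbuild, ...), and that bundle is loaded on the page before the shadow-cljs output, as described in the linked blog post.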

Vladimir Pouzanov 13:04:59

it's the duckdb-wasm that's breaking, not a trivial dependency to replace unfortunately 🙂

thheller 13:04:42

well wasm is also not supported so ...

Vladimir Pouzanov 13:04:19

I think :external did the trick for me, though, so I'll use that in the meantime
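
For anyone reading later: with :js-provider :external the ClojureScript side keeps its normal npm requires; only where the JavaScript actually comes from changes. A hypothetical namespace, just to illustrate:

```
(ns app.db
  ;; the require stays the same; with :js-provider :external the JS for
  ;; this package is provided by the separately bundled external index,
  ;; not by shadow-cljs itself
  (:require ["@duckdb/duckdb-wasm" :as duckdb]))
```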

thheller 13:04:15

I would never consider such a package for anything frontend? how do you justify a 30mb+ wasm file? 😛

Vladimir Pouzanov 13:04:38

datascript chokes and dies on my input json 🙂

Vladimir Pouzanov 13:04:49

like, I'd really prefer to use datascript, but I can't even ingest the data I have, and duckdb can (I'm writing an IRC log viewer/grep tool as a hobby project, and I basically have a 100MB JSON file)

thheller 13:04:44

I hope you are doing that in a worker 😛

Vladimir Pouzanov 13:04:36

I don't think service workers get more than 4GB of RAM either

thheller 13:04:45

normal web worker, not service worker

thheller 13:04:09

no matter what you use, loading a 100MB JSON file will probably lock your entire UI for seconds if you don't do it in a worker
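
A rough sketch of what the worker split can look like with shadow-cljs, assuming a :browser build; the module names, namespace, and output path are made up for illustration:

```
;; shadow-cljs.edn (sketch): add a web-worker module next to the main one
;; :modules {:main   {:init-fn app.main/init}
;;           :worker {:init-fn app.worker/init
;;                    :web-worker true}}

;; src/app/worker.cljs (sketch)
(ns app.worker)

(defn init []
  ;; runs inside the worker: parse the big JSON here so the page's
  ;; main thread never blocks on it
  (.addEventListener js/self "message"
    (fn [^js e]
      ;; assumes the payload is one big JSON array of log entries
      (let [parsed (js/JSON.parse (.-data e))]
        ;; post back only what the UI actually needs
        (.postMessage js/self #js {:count (alength parsed)})))))
```

The page side then creates the worker with something like (js/Worker. "/js/worker.js") and posts the raw text to it, keeping the parsing (or the datascript/duckdb ingestion) entirely off the UI thread.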

Vladimir Pouzanov 13:04:29

ah. well, that's not the problem: it's not that loading takes long, it's that datascript OOMs.

Vladimir Pouzanov 13:04:03

duckdb doesn't have to load the whole file, though, so memory requirements are much lower (with an acceptable speed tradeoff)
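
To illustrate the "doesn't have to load the whole file" point, the query side can look roughly like this; conn is assumed to be an already-opened duckdb-wasm connection with irc-logs.json registered beforehand (the wasm instantiation boilerplate is omitted), and the column names are made up:

```
(defn grep-logs [^js conn pattern]
  ;; DuckDB scans the registered JSON file on demand via read_json_auto
  ;; instead of materializing all of it in memory up front
  (-> (.query conn (str "SELECT ts, nick, message "
                        "FROM read_json_auto('irc-logs.json') "
                        "WHERE message LIKE '%" pattern "%'"))
      ;; conn.query returns a promise of an Arrow result table
      (.then (fn [^js result] (.toArray result)))))
```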