#shadow-cljs
2018-06-28
lilactown05:06:43

I think you can configure the amount of memory the JVM will use

lilactown05:06:26

there’s a known issue where the JVM inside a docker container doesn’t see the container’s memory limit, so it will try to consume more RAM than is allocated to the container

lilactown05:06:46

you should be able to manually configure the amount of memory the JVM will consume, either through shadow-cljs or through defaults of whatever java runtime you’re using in the container itself

bupkis05:06:52

is there a way to rename the default export when requiring a JS library? Looking at the "ES6 Import to CLJS Require" table in the User's Guide, but don't see that, so I was wondering whether I'm missing something obvious?

levitanong05:06:53

@thheller I’m getting an error when i open the output from build-report. It was a complaint about an odd number of map elements in a map, because node_modules/date-fns has 201 elements. I suspect this is because the date-fns library contains several sub-folders—one for each function—so that the user can cherry-pick the individual fns she wants to use.

pez06:06:56

@lilactown @richiardiandrea thanks, it was actually working for me as well. I was just tricking myself by running a separate watcher for the test target and looking there for test results. But the results were coming from my node-library target… Now I am running both targets from the same watcher and it works. Almost. I do get an error on startup:

Error: ENOENT: no such file or directory, open '/Users/pez/Projects/calva-fmt/.shadow-cljs/builds/test/dev/out/cljs-runtime/goog.debug.error.js'
Is there a way I can avoid that? Am I still going about it the wrong way?

pez06:06:07

(From there on whenever I save the file tests are run. But I would prefer they ran at startup too, w/o throwing errors in my face.)

thheller07:06:34

@lwhorton SIGKILL does indeed look like the process is getting killed, maybe by the OOM killer. you can configure the max memory by setting :jvm-opts ["-Xmx2G"] in shadow-cljs.edn.
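A minimal sketch of what that looks like in shadow-cljs.edn (the build id and paths here are hypothetical, only :jvm-opts is the point):

```clojure
;; shadow-cljs.edn — sketch; :jvm-opts caps the JVM heap so it stays
;; inside the container's memory limit
{:source-paths ["src"]
 :jvm-opts ["-Xmx2G"]
 :builds
 {:app {:target :browser
        :output-dir "public/js"}}}
```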

lwhorton14:06:56

yea, i finally figured it out. it was the 4gb-per-instance limit in circle, so if I set JAVA_TOOL_OPTIONS=-Xmx3g everything goes through fine

lwhorton14:06:15

didn’t know there was a jvm-opts config, though. thanks!

thheller07:06:12

@samuel.wagen you can choose whatever name you want for :default (eg. :default whatever). the name is completely up to you.
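For reference, a hedged sketch of what that looks like in a shadow-cljs ns form (library and local names here are made up):

```clojure
;; ES6:  import Whatever from "some-lib";
;; CLJS: the symbol after :default is entirely your choice
(ns app.demo
  (:require ["some-lib" :default whatever]))
```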

bupkis07:06:44

@thheller Oh now I see! it makes sense. thanks.

thheller07:06:32

@pez there was a race condition in :node-test. fixed in master, will make a release later

thheller07:06:01

@levitanong what is the error exactly?

levitanong07:06:04

For some reason it works now. It might be a lein clean thing. Sorry for the noise!

thheller07:06:14

I'm especially interested in those issues though. lein clean should NEVER be required, ever. If it is, it's a bug I want to fix. Please let me know if you run into this again.

levitanong07:06:16

Reproduced it!

#error {:message "The map literal starting with \"node_modules/date-fn...\" contains 201 form(s). Map literals must contain an even number of forms.", :data {:type :reader-exception, :ex-kind :reader-error}}

levitanong07:06:30

happens when i run generate from the repl

thheller07:06:16

can you open the HTML file and look for node_modules/date-fn?

thheller07:06:21

is there really a ... in there?

thheller07:06:49

wait .. is the error from the CLJ side or CLJS side?

levitanong07:06:04

oh boy i shouldn’t have opened it in emacs

levitanong07:06:39

well. more on javascript side

thheller07:06:58

hehe yeah the file can be huge

grounded_sage07:06:06

I'm looking to run some node scripts to manipulate some files. Is Nashorn the best way to go about this?

grounded_sage07:06:45

It's kind of during the clj-run step.

grounded_sage07:06:01

I want to run Critical to inline critical CSS and Subfont to do font subsetting.

thheller07:06:37

you can use clojure.java.shell/sh
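A hedged sketch of shelling out to a node script that way (the ns and script path are assumptions, not from the thread):

```clojure
(ns build.tasks
  (:require [clojure.java.shell :as sh]))

(defn run-critical!
  "Run a node script via the shell; return its stdout, throw on non-zero exit."
  []
  (let [{:keys [exit out err]} (sh/sh "node" "scripts/critical.js")]
    (if (zero? exit)
      out
      (throw (ex-info "critical.js failed" {:exit exit :err err})))))
```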

grounded_sage07:06:43

Ok I had a feeling that was the case which is why I asked before I dove in.

grounded_sage07:06:07

Ahh yea I was about to ask what's the best way to trigger node scripts. But that seems way simpler. Just jump into the shell.

thheller07:06:18

shadow-cljs uses babel internally for some of the node_modules rewriting and does so by launching a node process

thheller07:06:32

the node process just waits for input on stdin and writes replies to stdout

thheller07:06:53

it basically just communicates via EDN messages back and forth

thheller07:06:33

doesn't have to be this complicated if you just want to call something once

thheller07:06:50

but in case of the babel stuff starting a new node process for each transformed file was way too slow
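A rough sketch of that long-lived worker pattern on the Clojure side (the worker.js script and message shape are hypothetical): start node once, write one EDN message per line to its stdin, and read one EDN reply per line from its stdout.

```clojure
(ns build.node-worker
  (:require [clojure.java.io :as io]
            [clojure.edn :as edn]))

(defn start-worker []
  ;; start node once and keep it running
  (let [proc (.start (ProcessBuilder. ["node" "worker.js"]))]
    {:proc proc
     :in   (io/writer (.getOutputStream proc))
     :out  (io/reader (.getInputStream proc))}))

(defn request!
  "Send one EDN message and block for one EDN reply."
  [{:keys [in out]} msg]
  (doto in
    (.write (pr-str msg))
    (.write "\n")
    (.flush))
  (edn/read-string (.readLine out)))
```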

grounded_sage07:06:41

I want to run Critical, Subfont during the production step. Lighthouse during an Audit step and Puppeteer for a testing step.

grounded_sage07:06:35

My preference would be to stay out of vanilla JS as much as possible. Just thinking through the best way to do it 🙂

thheller07:06:38

this is the file doing the node side of things

thheller07:06:59

you can easily do the same and just communicate via EDN messages

grounded_sage07:06:14

That's what I want to do so that should help me get there

thheller07:06:43

yeah there it is right at the bottom

thheller07:06:28

no idea how the ... gets there. it's supposed to use a safeguard so emacs setting *print-length* doesn't mess it up

levitanong07:06:53

i used vim for this

levitanong07:06:03

i generally use vim for huge files haha

thheller07:06:18

but you are using emacs cider REPL to run the generate

thheller07:06:35

and that's the problem, since emacs unconditionally sets *print-length* for the REPL

thheller07:06:55

it will work if you do (binding [*print-length* nil] (generate ...))
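The truncation is easy to demo at any Clojure REPL: with *print-length* bound, printed collections get cut off with ... and stop being valid EDN, which is exactly the reader error above.

```clojure
(binding [*print-length* 3]
  (pr-str [1 2 3 4 5]))
;; => "[1 2 3 ...]"  — no longer readable EDN

(binding [*print-length* nil]
  (pr-str [1 2 3 4 5]))
;; => "[1 2 3 4 5]"
```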

thheller07:06:06

will add a fix for it

levitanong07:06:01

Yeah, that fixes it!

thheller07:06:47

cider setting *print-length* is really really bad for shadow-cljs 😛

henrygarner10:06:23

I have a project that builds in shadow 2.4.8 but fails in >= 2.4.9 with > The required JS dependency "entities/lib/decode_codepoint.js" is not available, it was required by "node_modules/htmlparser2/lib/Tokenizer.js". It looks as though the way package require handling changed in 2.4.9: https://github.com/thheller/shadow-cljs/compare/3a43f6d...a2c870c . But require in htmlparser2 looks okay to me... https://github.com/fb55/htmlparser2/blob/master/lib/Tokenizer.js#L3 (unlike the issue with clipboard I ran into last week, the required file does actually exist this time!)

thheller11:06:37

@henrygarner odd. I'll take a look shortly

thheller11:06:53

@henrygarner fixed in 2.4.11. thanks for the report.

henrygarner11:06:02

Amazing! Thanks @thheller 👌

thheller13:06:16

@smnplk ideally you want to avoid running code "globally" and instead move it into a :dev/after-load callback

thheller13:06:36

since an exception while loading code prevents other code from being loaded properly
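A sketch of that setup using the ^:dev/after-load metadata hook (ns and fn names here are hypothetical):

```clojure
(ns app.main)

(defn mount-ui! []
  ;; render/re-mount the app here
  )

;; runs after every hot reload instead of at namespace load time
(defn ^:dev/after-load start []
  (mount-ui!))

(defn init []
  ;; called once on page load (e.g. via :init-fn)
  (start))
```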

smnplk14:06:21

I have it in :builds :app :devtools :after-load --> app.main-view

smnplk14:06:36

I am using the latest version of cli and npm shadow-cljs

thheller14:06:32

ah right sorry. forgot that any error will cause the display above

smnplk14:06:34

The problem occurs only if i call (app-routes) inside main-view function.

smnplk14:06:54

but if i add it somewhere outside, all works fine

thheller14:06:55

(app-routes) probably uses goog.History which must only ever be created once

smnplk14:06:42

I am importing goog.History

thheller14:06:50

the error means that the div#wrapper element does not exist

smnplk14:06:53

Ill try defonce

thheller14:06:08

and goog.History will reset the DOM when it is constructed after the page has finished loading
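The usual guard is to construct it exactly once, so a hot reload can't create a second instance and reset the DOM again; a minimal sketch (ns name is made up):

```clojure
(ns app.history
  (:import [goog History]))

;; defonce: survives hot reloads, so goog.History is only ever created once
(defonce history
  (doto (History.)
    (.setEnabled true)))
```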

smnplk14:06:51

whoa, didn't know that it resets the dom.

thheller14:06:04

goog.History is sort of ancient and should probably not be used anymore

smnplk14:06:45

suggestion for any better routing lib that uses Html5History ?

smnplk14:06:20

Oh, just found one. Bide 🙂

smnplk14:06:51

@thheller, I just switched to Html5History and the problem went away.

justinlee19:06:42

for grins i tried running shadow-cljs on an amazon c5d.xlarge instance. i am surprised that it only runs twice as fast as my 2012-era macbook. i guess single-threaded performance just isn’t improving much now

thheller19:06:08

on my desktop I get cljs.core compile times of about 2.5sec which is substantially faster than my macbook with about 6sec

justinlee19:06:32

yea my macbook is about 6-6.5 seconds

thheller19:06:41

i have the i7-8700k which still seems to be the best in single core perf

thheller19:06:45

wonder how much compiler can still be tweaked to get more out of each core

justinlee19:06:32

i feel like there are opportunities for more sophisticated dependency analysis

thheller19:06:58

not for single-ns compiles

thheller19:06:04

can't do those in parallel

thheller19:06:41

big ns like cljs.core are the bottleneck since everything else has to wait for them to compile

justinlee19:06:00

i meant more like realizing that although a dependency i include changed, i really don’t need to recompile because the function i’m using didn’t change

justinlee19:06:38

i have tons of functions in one file that everything depends on, but in reality each file only depends on a small part. i could break it up (and eventually I will)

thheller19:06:01

yes incremental compiles could definitely be better with some more logic

thheller19:06:24

but I mean clean non-cached performance which I think can be at least 50% faster still

justinlee19:06:51

oh interesting. through parallel compile or something?

thheller19:06:57

no just by tweaking things. clojure compiles way faster but has to do a lot more work since the JVM is much less dynamic than JS

thheller19:06:16

so there are still probably some bottlenecks in the code somewhere that just need to be found and fixed

thheller19:06:29

it's not a fair comparison since most of the clojure compiler stuff is written directly in java

theeternalpulse21:06:22

I was wondering if crowdfunding a slack pro for clojurians would be worth it?

theeternalpulse21:06:51

op, just looked at pricing lol

justinlee21:06:53

it would be insanely expensive

theeternalpulse22:06:37

Yeah, I thought I saw reasonable pricing a while back, but for that price it's not even worth the history

theeternalpulse22:06:29

And I clicked the wrong forum, my bad