#shadow-cljs
2021-05-05
Roman Liutikov08:05:55

is it possible to run an advanced build in watch mode? trying it now and the output doesn't seem to be optimized (not using the :debug flag)

thheller08:05:26

no, not possible. Due to the sometimes excessive build times of :advanced, that simply isn't practical. It's much better to click one button or run one command when you are actually sure you want to build.
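A minimal sketch of such a one-off release command, assuming a hypothetical build id app:

npx shadow-cljs release app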

👍 3
Aron09:05:11

should I use a linter? and which one?

thheller09:05:43

no clue, your choice. I've never used one for long, too much noise regarding macros in the ones I tested

Aron09:05:58

that's why I am asking here, seems like an unsurmountable hurdle

thheller09:05:02

I think the most popular one currently is #clj-kondo

Aron09:05:57

yeah, ALE for vim has autoconfigure for it, also for joker. I tried both because it would help me as a beginner, I think, but it's probably not going to happen today. It's difficult to configure all the macros, and I am not even sure what to do when it's not a macro

lispers-anonymous14:05:27

I've not had too many issues with clj-kondo. Dealing with macros is pretty easy. In my re-frame project, the only macro I have a special lint rule for is this

{:lint-as {reagent.core/with-let clojure.core/let}}
Which lives in project-root/.clj-kondo/config.edn. Everything has been working great with it.

lispers-anonymous14:05:07

If you have custom macros that don't behave like any macro the linter knows, I think you can tell the linter to ignore them, or perhaps tell it to :lint-as the comment macro
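A minimal sketch of that second option in .clj-kondo/config.edn, assuming a hypothetical macro my.app/defwidget:

{:lint-as {my.app/defwidget clojure.core/comment}}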

grumplet10:05:31

Should we expect the grand total of the build report's optimised column from something like shadow-cljs run shadow.cljs.build-report app report.html to be about the same size as the target app.js, I wonder? It's falling short for me, so I'm trying to understand what I'm missing.

thheller10:05:34

within a certain percentage yes, it isn't 100% accurate but close enough

grumplet10:05:35

Thanks @thheller. I’ll make a gist to show you what I mean - I’m quite a long way short.

thheller10:05:03

build config would help. don't know what you are doing to the build 😛

thheller10:05:26

html report is also more helpful since it groups stuff

thheller10:05:53

jar | cljs/core.cljs | 750.5 KB

thheller10:05:00

looks like you are maybe using :simple? or pseudo-names?

thheller10:05:30

shouldn't be that large otherwise

grumplet10:05:13

Sorry - my mistake - I had done a Calva jack-in so was looking at a development app.js. Makes more sense after another build.

grumplet10:05:42

… and you were right - I did have optimisations set to simple.

Jakob Durstberger18:05:48

Hey, I am using shadow-cljs to run cljs on AWS Lambda. I want to use the npm package jsonwebtoken. The problem is that when I build my lambda for prod, the node_modules are not included in the output folder, so just uploading the index.js file leads to a runtime error that jsonwebtoken could not be found. I copied node_modules manually and zipped it all up, which worked but is tedious. Is this expected? I feel like I am doing something wrong and shadow-cljs would take care of that for me. shadow-cljs.edn:

{:source-paths
 ["src"]

 :dependencies []

 :builds
 {:wallet-lambda {:target :node-library
                  :output-to "./dist/wallet_lambda/index.js"
                  :exports {:handler com.ye-olde-shoppe.wallet.main/handler}
                  :compiler-options {:infer-externs :auto}}}}
package.json:
{
  "name": "yos-backend",
  "version": "0.0.1",
  "private": true,
  "devDependencies": {
    "shadow-cljs": "2.12.5"
  },
  "dependencies": {
    "jsonwebtoken": "^8.5.1"
  },
  "scripts": {
    "build-wallet": "shadow-cljs release :wallet-lambda",
    "assemble-wallet": "npm run build-wallet && cd dist/wallet_lambda && zip ../wallet_lambda.zip *"
  }
}

thheller19:05:06

you can post-process the :output-to file with something like https://github.com/vercel/ncc
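A rough sketch of what that might look like, assuming the output path from the config above (check the ncc docs for the exact flags):

npx @vercel/ncc build dist/wallet_lambda/index.js -o dist/wallet_lambda_bundled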

thheller19:05:23

but honestly it is better to just copy the node_modules you need

Jakob Durstberger19:05:29

ok, thanks. I’ll probably add a copy node_modules step to my build script

Jakob Durstberger19:05:16

just to clarify, you mean to copy the whole node_modules folder right?

thheller19:05:28

create a folder with a package.json including the packages you need

thheller19:05:33

run npm install --production

thheller19:05:42

copy the resulting node_modules

thheller19:05:10

I think the AWS docs have a section on best practices regarding this

thheller19:05:14

just check their docs
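A minimal sketch of those steps as a shell script, assuming hypothetical paths (dist/wallet_lambda as the build output and deps/ as a staging folder):

# stage a folder containing only the runtime dependencies
mkdir -p deps
cp package.json deps/
(cd deps && npm install --production)
# copy the resulting node_modules next to the compiled index.js
cp -r deps/node_modules dist/wallet_lambda/
# zip everything for upload
(cd dist/wallet_lambda && zip -r ../wallet_lambda.zip .)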

Jakob Durstberger19:05:06

ah, thank you 🙂

rberger00:05:47

Here's a https://gist.github.com/rberger/6d4c5db1ecb5ae452e1b5412a14c7e6d that shows how I recently did something similar. I was targeting lambda@edge, which has a max image size of 50MB, so besides yarn install --production I also used https://github.com/ModClean/modclean to get the zipped image with node_modules under 30MB.

👍 3
thheller07:05:08

@U07GPFFAS you can save some time by combining

npx shadow-cljs release :lambda-viewer
npx shadow-cljs release :lambda-origin
into just
npx shadow-cljs release lambda-viewer lambda-origin

thheller07:05:20

saves the second startup

rberger08:05:59

nice to know! Thanks

Jakob Durstberger15:05:09

That is very similar to what my script ended up looking like 🙂

jaime18:05:14

Hi everyone, I'm using shadow-cljs v2.12.5 for a react-native app. The react-native console is showing this error:

ERROR  shadow-cljs - remote-error {"isTrusted": false, "message": "CLEARTEXT communication to 10.212.134.68 not permitted by network security policy"}
 WARN  shadow-cljs: giving up trying to connect to  
I think the problem here is that the IP above is not my local IP (192.168.0.184). When I search for 10.212.134.68 in my project, I find matching text in the target/index.js file. Here are the first 5 lines of code:
var $CLJS = global;
var shadow$start = new Date().getTime();
var shadow$provide = global.shadow$provide = {};
var goog = global.goog = {};
global.CLOSURE_DEFINES = {"shadow.cljs.devtools.client.env.repl_pprint":false,"shadow.cljs.devtools.client.env.reload_strategy":"optimized","shadow.cljs.devtools.client.env.devtools_url":"","shadow.cljs.devtools.client.env.autoload":true,"shadow.cljs.devtools.client.env.proc_id":"b2dc811d-cc6d-467e-81bf-e8f4eae2709a","shadow.cljs.devtools.client.env.use_document_protocol":false,"goog.ENABLE_DEBUG_LOADER":false,"shadow.cljs.devtools.client.env.server_port":9630,"shadow.cljs.devtools.client.env.server_token":"d98f5299-1231-4718-8262-3736140c13f2","shadow.cljs.devtools.client.env.use_document_host":true,"shadow.cljs.devtools.client.env.module_format":"goog","goog.LOCALE":"en","shadow.cljs.devtools.client.env.build_id":"app","shadow.cljs.devtools.client.env.ignore_warnings":false,"goog.DEBUG":true,"cljs.core._STAR_target_STAR_":"react-native","shadow.cljs.devtools.client.env.log":true,"shadow.cljs.devtools.client.env.ssl":false,"shadow.cljs.devtools.client.env.enabled":true,"shadow.cljs.devtools.client.env.server_host":"10.212.134.68","shadow.cljs.devtools.client.env.worker_client_id":2,"goog.TRANSPILE":"never"};
When I change the hardcoded IP in the target/index.js file, the websocket connection works and I get these messages:
LOG  dev init time 839
 LOG  Running "Limeray" with {"rootTag":201}
 LOG  date is  null
 LOG  #6 ready!
My question is, is there a way to configure shadow-cljs to use a specific IP for the websocket? In my case 192.168.0.184 instead of 10.212.134.68. Previously when I encountered this issue, I was able to configure it like below, but now it does not seem to work:
{:deps true
 ;; local network IP in order for websocket to send repl command over-the-network
 :http {:host "192.168.0.184"}
 :builds
 {:app {:target :react-native
        :init-fn limeray.mobile.main/init
        :output-dir "target"
        :js-options {:js-package-dirs ["node_modules"]}}}}

thheller19:05:35

@jaime.sangcap in ~/.shadow-cljs/config.edn you can configure :local-ip "192.168.0.184" (or in your shadow-cljs.edn local config as well)
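For reference, a minimal sketch of that setting as the contents of ~/.shadow-cljs/config.edn, using the local IP from the question:

{:local-ip "192.168.0.184"}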

jaime20:05:43

Awesome! Thanks a lot, it works. Shame on me, it is stated in the docs https://shadow-cljs.github.io/docs/UsersGuide.html#repl-trouble-react-native 😅