This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2021-07-30
Channels
- # announcements (4)
- # babashka (8)
- # beginners (124)
- # calva (13)
- # cider (10)
- # circleci (6)
- # clj-kondo (193)
- # cljdoc (1)
- # cljs-dev (4)
- # clojure (50)
- # clojure-europe (28)
- # clojure-serbia (1)
- # clojure-spec (22)
- # clojure-uk (30)
- # clojurescript (11)
- # clojureverse-ops (3)
- # community-development (1)
- # conjure (5)
- # cursive (1)
- # datomic (11)
- # depstar (1)
- # events (2)
- # fulcro (7)
- # graalvm (2)
- # graphql (10)
- # helix (43)
- # hyperfiddle (14)
- # introduce-yourself (6)
- # jobs (2)
- # jobs-discuss (14)
- # kaocha (4)
- # luminus (2)
- # malli (24)
- # meander (6)
- # off-topic (4)
- # pathom (1)
- # polylith (13)
- # re-frame (6)
- # releases (1)
- # remote-jobs (1)
- # sci (14)
- # shadow-cljs (209)
- # tools-deps (30)
- # xtdb (26)
@thheller this is not a shadow question per se, so I understand if you want me to ask this in #clojurescript instead, just let me know. This works:
node out/tbd_main.js test.cljs
with the :esm target.
But when I install the project with npm install -g and call it from some other directory, then I get:
$ tbd test.cljs
/usr/local/bin/tbd: line 2: import: command not found
/usr/local/bin/tbd: line 3: syntax error near unexpected token `('
/usr/local/bin/tbd: line 3: `const shadow_esm_import = function(x) { return import(x) };'
Should I configure npm to make it work well with the ECMAScript module stuff somehow? The current state of the project: https://github.com/borkdude/tbd/tree/dynamic-import-esm
@borkdude look up node and esm. this is a general JS question, nothing specific to CLJS or shadow-cljs. I can't remember the exact rules for node and esm
I mean at this point it really doesn't seem worth the effort to use ESM at all, just more trouble than it's worth
no, it is not. it is complaining specifically about import. so maybe you need to pass an extra command line argument in the shebang
it just might not default to esm. the problem is that it treats the code as commonjs which doesn't have import
bash /usr/local/bin/tbd gives exactly the same errors, that's why I think it's a shebang problem
:npm-module is broken and pretty much unfixable in its current state. so don't build anything on top of that
Adding #!/usr/bin/env node fixed it, but I expect npm to do this for me so it becomes a cross-platform viable way of installing the tool
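For reference, the npm side of this is a "bin" entry in package.json: npm generates the platform-specific shims (a symlink on Unix, a .cmd wrapper on Windows), but it does not add a shebang, so the target file itself still needs the node shebang on Unix. A minimal sketch, assuming the output file name from this thread:

```json
{
  "name": "tbd",
  "bin": { "tbd": "out/tbd_main.js" }
}
```

with out/tbd_main.js starting with the line #!/usr/bin/env node.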
as I said before from the build perspective this is all trivial. there just isn't a target that does this just yet for node. it could easily be built but for that I need more details about how you actually plan on doing any of this
my suggestion is to build all of this completely on top of :node-library and then deal with the code splitting stuff later
it is an optimization after all so you don't need it from the start. if you just keep namespaces separate it'll be trivial to split out later
once I can actually see what you are doing I can make a suggestion and maybe a custom target to do what you need
great suggestion, I'll try :node-library and will see what needs to be done for code splitting.
I'm not completely sure what you mean with runner.js, I will have to look into it. I'll try to break down the problems I encounter one by one and keep it focused. The "hack" of require is done so the script user, who will call this hacked require, can load local npm modules.
(I'm looking at: https://github.com/thheller/shadow-cljs/blob/master/packages/shadow-cljs/cli/runner.js)
in your package.json you'll specify a bin. that should be the runner. as seen here https://github.com/thheller/shadow-cljs/blob/master/packages/shadow-cljs/package.json#L21
that "runner" is then responsible for locating the correct .js file and "running" it
so when you run npx shadow-cljs it'll call the runner.js and that will either locate the local node_modules/shadow-cljs install and run that, or if not found use a global install
well, SCI has access to the global object, so it can do interop on "everything" including require. So when someone writes js/require, we look up require from the global object and call it. This is why we need to put an override for require on the global object.
I must be missing something because I don't know why you need to provide a custom require
I was surprised by this too, but this problem is explained here: https://swizec.com/blog/making-a-node-cli-both-global-and-local/
the problem is that node require doesn't look in the local node_modules when calling a globally installed CLI script
the runner.js in shadow-cljs takes care of running either the LOCAL install OR the global install if that is missing
if I don't apply this hack, people will have to write (js/require "./node_modules/my_lib") or so
you are talking about node_modules, which already has special rules, so the user should NEVER type out node_modules in anything and instead they should be typing (js/require "my_lib")
like I explained with the runner.js logic: it maintains which version of YOUR tool is loaded, either the local or the global. YOUR tool, so tbd. why would you want that for the code the USER is loading? they can already load local versions by default and you specifically do NOT want it to fall back to global versions
let me put it differently, to make you understand what is not working. and then you can decide if you would solve this differently.
I think what you are looking for is https://nodejs.org/api/module.html#module_module_createrequire_filename
there are uses for a custom require but never ever mess with the "global" js/require, since that is not actually global but belongs to the module you are currently in, and you definitely do not want that to be the reference of all your future requires
so if you want to expose a require for sci, use the module.createRequire based on the file it is currently eval'ing
the hashbang worked with just node; the problem was that it didn't have any hashbang at all. But this "missing hashbang" problem isn't present in my non-ESM branch
you never know if the node people are actually going to remove support for commonjs at some point
that's kind of a conflict, right: should I continue to work with the ESM stuff to make it "future proof" (with all kinds of async stuff around import) or continue with the :node-library approach
You could also ask yourself: is node still around in 3 years or will people go all in on deno?
I'm not that familiar with the node ecosystem; it's an experiment for me at this point
I doubt deno will replace node ever but that doesn't mean ESM won't become more adopted generally
I don't think the async import stuff is a dealbreaker, I can make similar restrictions to dynamic requires as CLJS does, just support one top level ns form or require form
commonjs is certainly more convenient in some of these dynamic aspects but the focus has definitely shifted towards ESM
@thheller I've successfully got your shadow.esm/dynamic-import exposed to SCI so you can write:
(-> (js/import "csv-parse/lib/sync.js")
    (.then (fn [csv]
             (let [csv-parse (.-default csv)]
               (-> (csv-parse "foo,bar,baz\n1,2,3" #js {:columns true})
                   (js->clj :keywordize-keys true)
                   prn)))))
I think the (require '["csv-parse/lib/sync.js" :default csv-parse]) can be supported instead, but I wonder why I needed to write .js ... any idea here?
Should I wrap this function and automatically add this? That might be a bad idea. As (js/import "fs") works and (js/import "fs.js") for example doesn't. Should I just leave this alone in the ns or require form and just let the user figure it out?
why not just (js/import "csv-parse")? specifying the extension when referencing a file is node stuff. I think that's a new requirement in esm, should have been that way always tbh
@thheller Now got this working as a "tbd" script:
(ns foo
  (:require ["fs" :as fs]
            ["csv-parse/lib/sync.js" :default csv-parse]
            [reagent.core :as r]
            #_[reagent.dom.server :as rds]))

(println (str (.readFileSync fs "test.cljs")))
(prn :hello-the-end)
(prn (csv-parse "foo,bar"))
#_(prn (rds/render-to-string [:div [:p "hello"]]))
not loading reagent saves a bit of time, like 20ms or so. it's not a lot, but it shows that, architecturally, it's now possible to add more and more built-in libs, without impacting startup if you don't load all of those
If you have a local file relative to the script, e.g. foo.js, and you try to load it with:
(ns script (:require ["./foo"]))
This doesn't work
use the module.createRequire if you must have a custom require (pretty sure there is an import equivalent)
@thheller I'm trying to do the createRequire thing. For:
import { createRequire } from 'module';
I'm writing:
(:require ["module" :refer [createRequire]])
but then I get:
The required JS dependency "module" is not available, it was required by "nodashka/core.cljs".
Dependency Trace:
nodashka/core.cljs
Not sure if this is a correct message, since module is built into nodeJS, I believe? When I try:
(def createRequire (.-createRequire js/module))
(def require* (createRequire js/module.meta.url))
I get:
ReferenceError: module is not defined in ES module scope
I have removed the hack as you suggested, but when I install nodashka (current working name) globally using npm install -g nodashka and then have a local node_modules with e.g. the ink dependency, it can't find it. That is the problem I'm trying to solve.
When you npm install -g ink then it does work. Perhaps this should just be how it works...?
I guess this is perhaps just the way it should be. Mixing global deps from nodashka itself with local deps would perhaps be weird.
if you are still using the esm target you need to add "module" to the :keep-as-import set in your build config, otherwise it'll try to bundle it, but since it's a built-in node package it can't do that
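As a sketch, that could look like the following; the build name is made up, and the exact nesting of :keep-as-import (shown here under :js-options) is an assumption to check against the shadow-cljs docs for your version:

```clojure
{:builds
 {:script
  {:target :esm
   :js-options {:keep-as-import #{"module"}}}}}
```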
I think I'm able to get "require" back through this createRequire thing even in ESM. I thought ESM would prevent me from doing so and all requires are now async. But I could revert that.
On the other hand, maybe for the future, one could require from some http address and then it would have to be async anyway
I'll give it a try. I'm still confused about having nbb (yes, changed the name) as a global tool from npm and using it with node_modules in a local script dir. Do you think this is possible, or should a global tool always use the global npm deps?
btw you should maybe try to not use :default. use the official ["thing$default" :as x] instead of ["thing" :default x]
so using it dynamically requires some trickery, as seen in shadow.esm/dynamic-import
It looks like a function over here: https://github.com/thheller/shadow-cljs/blob/ba0a02aec050c6bc8db1932916009400f99d3cce/src/main/shadow/build/targets/esm.clj#L327 with js/eval:
import.meta.url
^^^^
SyntaxError: Cannot use 'import.meta' outside a module
(def require* (createRequire (js/eval "import.meta.url") #_(.. js/shadow_esm_import.meta.url -meta -url)))
you want to create a require that uses the source of the current file as a reference
it's not that complicated but it helps to understand it. saves you from trying weird hacks 😉
require in node always has a "context", i.e. the module you are currently in, so it knows how to follow relative paths
great, now my requires look like this again:
(ns script
  (:require
   ["csv-parse/lib/sync" :as csv-parse]
   ["fs" :as fs]
   ["shelljs" :as sh]))

Instead of (which was implemented via js/import):

(ns script
  (:require
   ["csv-parse/lib/sync.js" :default csv-parse]
   ["fs" :as fs]
   ["shelljs" :default sh]))
probably shouldn't adopt :default at this point since it was rejected from CLJS proper
I'll probably move this $ back to SCI proper so it can work for all CLJS applications using it
Hmm, it seems I'm still stuck in the async import stuff. I did manage to create a require which resolves according to the file, but you cannot require an ES Module with require, so I'll have to keep doing that using dynamic import.
I found this issue about it: https://github.com/nodejs/node/issues/30645
It seems for the import stuff to work, I need access to:
import.meta.resolve also accepts a second argument which is the parent module from which to resolve from:
await import.meta.resolve('./dep', import.meta.url);
I tried to apply a similar trick to yours, but it didn't work.
"globalThis.shadow_esm_resolve = function(x,y) { return import.meta.resolve(x,y); }"
But I think either the above import.meta.resolve should be used, or I should try something else than these ESM modules, or I should write my own resolver...
So, I've also created a branch node-library here: https://github.com/borkdude/nbb/tree/node-library
The compilation works nicely. It's just that I want code splitting so I can load "splits" dynamically, as needed.
So perhaps we can look into making this work. It seems a lot "easier" from the nodeJS side than this ESM stuff.
@thheller Good news. I was able to use the createRequire stuff while also loading my own ESM modules async. So now the programs look "node-ish", while still leveraging your ESM target with module support.
Once you support modules in the node-library target I'll likely move to that, but for now, we're good :-D
I will keep an eye on shadow for that feature, but I'm out of the woods now for my own needs
@thheller Btw, the way of resolving from the script file root got me thinking. I don't think the clojure CLI does this, does it?
e.g. foo/deps.edn + foo/script.clj + clojure foo/script.clj will not use foo/deps.edn but only the local deps.edn
Hello. I've got one of those newbie questions: I want to run some cljs tests, but I guess my project layout is causing some problems:
/webapp/cljs/admin/shadow-cljs.edn
/tools/src/money.cljs
/test/money_test.cljs
/output/
/customer ; Separate cljs project also using tools, but not built by this shadow-cljs process
Here are the relevant parts of the shadow-cljs.edn:
{:source-paths ["src"
                "../tools/src"
                "../tools/test"]
 :builds {:test {:target :node-test
                 :output-to "../output/node-tests.js"}}}
When I compile & run the tests (`pwd -> /webapp/cljs/admin`, I run node_modules/.bin/shadow-cljs compile test && node ../output/node-tests.js), I get this output:
shadow-cljs - config: /webapp/cljs/admin/shadow-cljs.edn
shadow-cljs - socket connect failed, server process dead?
[:test] Compiling ...
[:test] Build completed. (51 files, 1 compiled, 0 warnings, 1.92s)
no "source-map-support" (run "npm install source-map-support --save-dev" to get it)
fs.js:114
throw err;
^
Error: ENOENT: no such file or directory, open '/webapp/.shadow-cljs/builds/test/dev/out/cljs-runtime/goog.debug.error.js'
at Object.openSync (fs.js:443:3)
at Object.readFileSync (fs.js:343:35)
at global.SHADOW_IMPORT (/webapp/cljs/output/node-tests.js:52:15)
at /webapp/cljs/output/node-tests.js:1524:1
at Object.<anonymous> (/webapp/cljs/output/node-tests.js:1575:3)
at Module._compile (internal/modules/cjs/loader.js:778:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10)
at Module.load (internal/modules/cjs/loader.js:653:32)
at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
at Function.Module._load (internal/modules/cjs/loader.js:585:3)
The file exists, but in the wrong location:
ls /webapp/{,cljs/admin/}.shadow-cljs/builds/test/dev/out/cljs-runtime/goog.debug.error.js
ls: cannot access '/webapp/.shadow-cljs/builds/test/dev/out/cljs-runtime/goog.debug.error.js': No such file or directory
/webapp/cljs/admin/.shadow-cljs/builds/test/dev/out/cljs-runtime/goog.debug.error.js
My node version is v10.24.0 from the Debian Docker image I use, and shadow-cljs is 2.15.2.
@igel don't use ../ in :output-to. that is supposed to be a directory in the project.
if you want it outside the project dir, setting :output-dir "../output" in addition to :output-to might work
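Put as config, the two options look roughly like this; the "out" path in the first variant is made up, and the second variant is only a "might work" in the thread:

```clojure
;; option 1: keep the output inside the project directory
{:builds {:test {:target :node-test
                 :output-to "out/node-tests.js"}}}

;; option 2: output outside the project dir, with :output-dir set as well
{:builds {:test {:target :node-test
                 :output-to "../output/node-tests.js"
                 :output-dir "../output"}}}
```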
We are building on https://github.com/PEZ/rn-rf-shadow and have just added [cljsjs/mqtt "2.13.0-0"] as a dependency. In the code, we require [cljs.mqtt].
Problem: error from the shadow build app: "The required namespace "cljs.mqtt" is not available, it was required by "example/app.cljs"."
This all works fine in a web app we have, so I am guessing this is something to do with RN. Will shifting to interop with an RN MQTT NPM package help? We are trying: https://www.npmjs.com/package/react-native-paho-mqtt in the meantime.
@hiskennyness shadow-cljs does not support CLJSJS packages, just use the npm package directly
I'm guessing the webapp you have isn't built with shadow-cljs, otherwise you'd have the same issue there
Doh! Forgot to mention we are using figwheel on the web app.
I'm getting an error with a namespace in a test build, but not the browser build, and I don't understand why. Here are the relevant parts, I guess:
/shadow-cljs.edn:
{:source-paths ["src" "test"]
 :builds {:credit-assessment {:target :browser ;; Works!
                              :modules {:credit-assessment {:entries [credit-assessment.core]}}
                              :output-dir "../output/"
                              :asset-path "/cljs/output"}
          :test {:target :node-test ;; Doesn't work :'(
                 :output-to "../output/node-tests.js"
                 :output-dir "../output"}}}
/src/credit_assessment/core.cljs (requires credit_assessment.criteria)
/src/credit_assessment/criteria.cljs:
(ns credit-assessment.criteria)
/test/credit_assessment/criteria_test.cljs:
(ns credit-assessment.criteria-test
  (:require
   [clojure.test :refer [deftest testing is are]]
   [credit-assessment.criteria :as creteria]))
This works for the credit-assessment build, but for the test build, I get this:
[:test] Compiling ...
------ ERROR -------------------------------------------------------------------
File: /test/credit_assessment/criteria_test.cljs:1:2
--------------------------------------------------------------------------------
1 | (ns credit-assessment.criteria-test
--------^-----------------------------------------------------------------------
An error occurred while generating code for the form.
ExceptionInfo: no source by provide: credit-assessment.criteria
Hi all, I’m trying to build a shadow-cljs project in a luminus template, and then separate the build as I’ve been instructed here.
However, here is what happens:
So does this mean the luminus template has been updated to run shadow separately rather than as a lein task?