This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
I see that freenode is imploding. It had been a while since I was in #clojure on there (it was pretty quiet when I was last in). But looks like at least some people have moved over to libera now.
(Annoying that Slack won't let you type #foo without it linking to #foo, if that exists as a channel)
The downside to text-based communication - I can't tell if you're aware of what's going on, and that's a joke, or if you're unaware and are referring to a literal netsplit
I'm not in many channels on freenode anymore anyway, and some of those that I'm in are fairly moribund these days
It seemed like a lot of staff. No idea how many staff freenode has altogether these days.
Hey, are there any tutorials or anything that really walk someone through core.async or promises in clojurescript? I'm really trying to understand what's going on here and it just ends up being a confusing mess. It's really quite frustrating because I can write async code in python et al really easily, but doing so in clojurescript just doesn't seem to click at all. I invariably get a channel or a promise and I can't get the value out of it...
Actually, is there some way to drive it in the REPL better? Part of my frustration is that it's very hard to tell if things are working at the REPL. I just get back a `ManyToManyChannel`, so I feel like I have very little visibility into how it's working.
if i need to inspect something in the cljs repl i do something like `(defn pval [p] (let [a (atom ::undefined)] (p/then p #(reset! a %)) a))`
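For context, here is a usage sketch of that helper, assuming promesa is aliased as `p` (the snippet above doesn't show the requires): the returned atom holds `::undefined` until the promise settles, then the resolved value, so you can deref it at the REPL whenever you like.

```clojure
(ns example.pval
  (:require [promesa.core :as p]))

;; capture a promise's eventual value in an atom you can deref at the REPL
(defn pval [p*]
  (let [a (atom ::undefined)]
    (p/then p* #(reset! a %))
    a))

(def result (pval (p/resolved 42)))
;; @result is ::undefined until the promise settles, then 42
```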
I think my primary issue is not being clear what I'm working against and the different constructs that I've seen give me inconsistent results.
For example, using promises, I'm currently reading in a zip file and walking through the entries. I console.log the entry name and contents and they don't match. The entry name changes, but the contents seem to be permanently pointing to the first entry. Not really sure what's going on there.
I'd prefer to use core.async, but I'm not sure how to work with it in a repl in clojurescript. As I mentioned previously, I constantly get `ManyToManyChannel` as the evaluation result, so how do you evaluate functions and get the output to test your assumptions?
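As a sketch of what REPL inspection can look like (assuming core.async is already required): the `ManyToManyChannel` printout is just the channel object itself; to see values you either register a callback with `take!` or park inside a `go` block and print.

```clojure
(ns example.repl-inspect
  (:require [cljs.core.async :as a :refer [chan put! take! go <!]]))

(def ch (chan 1))
(put! ch :a)
(put! ch :b)

;; 1. fire a callback when a value arrives
(take! ch prn)        ; prints :a once it's available

;; 2. or park inside a go block and print
(go (prn (<! ch)))    ; prints :b
```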
although unless you are talking `promise-chan`s then chans are going to behave differently
if you are wanting to test the async outputs in unit-tests then `async` is your friend - https://clojurescript.org/tools/testing#async-testing
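The shape of an async test, per that guide, is roughly this (a minimal sketch; the `timeout` stands in for whatever real async work you're testing):

```clojure
(ns example.async-test
  (:require [cljs.test :refer-macros [deftest is async]]
            [cljs.core.async :as a :refer [go <! timeout]]))

(deftest channel-value-test
  (async done                ; tells cljs.test to wait until `done` is called
    (go (<! (timeout 10))    ; simulate some async work
        (is (= 4 (+ 2 2)))
        (done))))            ; signal that the test is finished
```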
so you are having to convert from a js promises/streams API into something cljs ?
Getting lost trying to work with this, and then mixing in streams and File API stuff that's not working, and having to convert to blobs etc...
I've gotten it working in the past where I get a massive string for the file contents, but that's really unhelpful when someone loads a giant file
it sounds entirely doable, but the mix of promises and streams in the js api probably presents some challenges mapping it over to cljs
we do loads of stuff like that on the backend, where we use manifold for both promises+streams, but i don't think we do anything like it in js
my default line of attack would be to use promesa for promises, write something to convert between js streams and core.async chans, and write something which reduces a core.async chan to a promise - that would give you something which looks roughly like the manifold api
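That approach might be sketched like this (assumptions: a ClojureScript build with core.async and promesa on the classpath, and a JS `ReadableStream` as input; `stream->chan` and `reduce-chan` are hypothetical names, not library functions):

```clojure
(ns example.streams
  (:require [cljs.core.async :as a :refer [chan put! close! go <!]]
            [promesa.core :as p]))

;; Pump a JS ReadableStream into a core.async chan, one read at a time.
(defn stream->chan [^js stream]
  (let [ch     (chan)
        reader (.getReader stream)]
    (letfn [(pump []
              (p/then (.read reader)
                      (fn [^js result]
                        (if (.-done result)
                          (close! ch)
                          ;; only ask for the next read once this put lands
                          (put! ch (.-value result) (fn [_] (pump)))))))]
      (pump))
    ch))

;; Reduce a chan down to a single promesa promise, manifold-style.
(defn reduce-chan [f init ch]
  (p/create
   (fn [resolve _reject]
     (go (loop [acc init]
           (if-some [v (<! ch)]
             (recur (f acc v))
             (resolve acc)))))))
```

Only re-entering `pump` from the `put!` callback is what carries backpressure from the chan back to the stream reader.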
don't know - I've never worked with js streams directly... but if you pipe them to/from core.async chans then you can just work with the chans
Finally figured it out, you're supposed to request a promise nested within the entry promise >_<...
The zip lib - I couldn't work out where the data was. I kept reading the buffer expecting it to contain the data; instead, within the promise you have to call a function that doesn't appear in the method list (had to read the source to find it), which creates another promise that, when it resolves, gives the file contents...
oh, right, so there's a first "open the zipfile" step, then repeated "fetch" steps to get at the actual data ?
It looks like opening the zipfile is more opening a table of contents? Then yes, fetching the contents... One thing I'm uncertain about is that a core async chan is supposed to have a buffer limit right? Is that going to be a problem for this kind of workflow where I could have a zip file with lots of entries and I can't queue them all up?
yeah, i think zip/tar are like a filesystem inside a file, with index/metadata pointing off to other offsets inside the file
one of the nice things about core.async (and also manifold and rx and js streams and vert.x etc) is backpressure - and core.async handles it very nicely. your `>!` operation will not "return" until buffer space is available, so your producer process will be "parked" until then... symmetrically, your consumer process will park on `<!` until something is available
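A tiny sketch of that parking behaviour on an unbuffered chan (hypothetical names; the `timeout` makes the consumer deliberately slow):

```clojure
(ns example.backpressure
  (:require [cljs.core.async :as a :refer [chan go >! <! timeout]]))

(def ch (chan))   ; no buffer: every >! parks until a consumer takes

;; producer: tries to put five values as fast as it can,
;; but each >! parks until the consumer is ready
(go (doseq [n (range 5)]
      (>! ch n)
      (println "produced" n)))

;; slow consumer: takes one value per 100ms, pacing the producer
(go (loop []
      (when-some [v (<! ch)]
        (<! (timeout 100))
        (println "consumed" v)
        (recur))))
```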
i use quotes because that's sync terminology, and it's all really async - the `go` macro transforms everything into callbacks, so `>!` gets transformed into a `put!` with a callback and `<!` gets transformed into a `take!` with a callback etc
but in summary, you don't really have to worry about filling buffers up - there aren't any buffers by default, and your producer will only go at the pace your consumers dictate. if you add some buffers in between producer and consumer then that lets your consumers lag a bit without stopping your producer. in this particular case, where you have control of both producer (whatever is parsing the zipfile and putting content onto a chan) and consumer (whatever is reading content from the chan) you may not need any buffers at all
yes, absolutely - it's not blocking, it's parking - it makes a continuation closure fn, which is registered as a callback, to be called when data is available
not a million miles from what js `async/await` does, or what the `promesa/let` macro does
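For comparison, promesa's `let` awaits each binding much like `async/await` does (a sketch; `fetch-user` is a hypothetical async function, not part of promesa):

```clojure
(ns example.plet
  (:require [promesa.core :as p]))

;; hypothetical async fn returning a promise
(defn fetch-user [id]
  (p/resolved {:id id :name "ada"}))

;; each binding is awaited in order; plain values pass through unchanged
(p/let [user (fetch-user 1)
        name (:name user)]
  (println "got user" name))
```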