This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-11-08
Channels
- # bangalore-clj (4)
- # beginners (88)
- # boot (12)
- # cljs-dev (10)
- # cljsjs (1)
- # clojure (284)
- # clojure-denmark (2)
- # clojure-dev (35)
- # clojure-italy (8)
- # clojure-russia (36)
- # clojure-spec (38)
- # clojure-uk (51)
- # clojurescript (145)
- # cursive (6)
- # data-science (1)
- # datomic (8)
- # duct (43)
- # emacs (9)
- # figwheel (2)
- # fulcro (29)
- # graphql (1)
- # immutant (3)
- # instaparse (1)
- # jobs (1)
- # jobs-discuss (1)
- # lumo (16)
- # off-topic (50)
- # onyx (90)
- # re-frame (6)
- # reagent (20)
- # remote-jobs (3)
- # ring-swagger (18)
- # schema (8)
- # shadow-cljs (141)
- # slack-help (3)
- # spacemacs (36)
- # unrepl (7)
- # vim (1)
- # yada (2)
@seancorfield the cause isn't that helpful but I'll post it:
Exception in thread "main" java.lang.RuntimeException: Problem parsing near line 1 < [taoensso.timbre :refer [info warn error] :as timbre]> original reported cause is java.lang.NoSuchMethodException: clojure.lang.LispReader.matchSymbol(java.lang.String) -- java.lang.NoSuchMethodException: clojure.lang.LispReader.matchSymbol(java.lang.String), compiling:(/tmp/form-init450743688382126384.clj:1:72)
1.9 checks ns syntax more stringently than 1.8 did
Yeah, that's what I was thinking having seen the error.
Or maybe an AOT issue?
@aaelony So it's probably specific to your code base setup rather than something in Marginalia per se...
timbre can be really bad for that, because people will write extensions to whatever java logging framework and depend on timbre for whatever reason, but they need to be aot compiled, so you can end up with an aot'ed version of timbre coming in from some other jar
Repro'd
Exception in thread "main" java.lang.RuntimeException: Problem parsing near line 1 < [taoensso.timbre :refer [info error warn] :as timbre])> original reported cause is java.lang.NoSuchMethodException: clojure.lang.LispReader.matchSymbol(java.lang.String) -- java.lang.NoSuchMethodException: clojure.lang.LispReader.matchSymbol(java.lang.String), compiling:(/private/var/folders/p1/30gnjddx6p193frh670pl8nh0000gn/T/form-init7849628237322743095.clj:1:125)
well, I can change anything. No issue though for lein marg under 1.8.0. I don't even have to use lein marg, but why not?
Marginalia https://github.com/gdeer81/marginalia/blob/master/src/marginalia/parser.clj#L160
Calls the LispReader directly -- I suspect the calling arity changed in 1.9?
https://github.com/gdeer81/marginalia/blob/master/src/marginalia/parser.clj#L117-L120
more info here: https://github.com/gdeer81/marginalia/issues/167
Ah, there we go. Looks like matchSymbol expects two arguments now? A string and a resolver...
1.8 https://github.com/clojure/clojure/blob/clojure-1.8.0/src/jvm/clojure/lang/LispReader.java#L394 takes just a string
I added a comment to that issue. Maybe @gdeer81 will be able to chime in?
I wonder if Rich would be ok with clojure 2.0 having breaking changes to remove warts from the core language, or if he wants clojure to be eternally backwards compatible regardless of whether the occasional bad design decision accumulates or not
I mean clojure is a very well designed language but there are a few ugly parts that could use a do-over
i know he mentioned that if he could do it all over again reduce would require an initial value
and rename for
to something like sequence-comprehension
(my suggestion, not anybody elses)
and I tend to think that lazy evaluation of the elements of a sequence and the actual transformation performed over a sequence are orthogonal concepts. it's unfortunate that transducers are complected with the idea of immediate evaluation.
are they? you can transduce lazily
but that’s not the only transducing context - and right, transduce is not lazy
what i'm really getting at is that the core api is more complex than it needs to be simply because it was necessary to keep it backwards compatible and not make a breaking change
what you'd really want to do is be able to define a set of transformations (probably a sequence of transformations actually)
sequence is lazy, and eduction
and defining a transducing function is exactly what you describe
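The lazy transducing contexts mentioned here can be shown in a quick sketch (the `xf` name is just illustrative):

```clojure
;; A transducer is just a recipe; sequence applies it lazily,
;; and eduction defers application until something reduces it.
(def xf (comp (map inc) (filter odd?)))

(take 3 (sequence xf (range)))        ;; => (1 3 5), realized on demand
(reduce + 0 (eduction xf (range 10))) ;; => 25, applied at reduce time
```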
then you'd want to separately define how those transformations are applied to a sequence of data (lazy, not lazy, across multiple cores or not, etc)
transducers are literally a way to define a set of transformations without worrying about representation. the transduce function is perhaps a bit misleading as it's a way of applying transducers
@bcbradley you are describing exactly how transducers are used
@bcbradley Do you know where Rich said that about reduce requiring an init val? I remember that as well, but I tried looking for it once and couldn't find it.
there is something else i find a bit puzzling though -- we use transducers to convey what is basically loop unrolling to the compiler -- we really don't NEED them to pass around the idea of doing a specific sequence of transformations, because we could just use apply to apply a sequence of transformations, or we could pass around an s-expression that represents the transformations
Hmm, I couldn't find that in the transcript https://github.com/matthiasn/talk-transcripts/blob/master/Hickey_Rich/EffectivePrograms.md
Yeah, I remember hearing/reading it longer ago than the conj.... I have a crappy memory though, so you never know ¯\_(ツ)_/¯
>>>Who knows what the semantics of reduce are when you call it with a collection and no initial value? [Audience response] No one, right. No one knows. It's a ridiculous, complex rule. It's one of the worst things I ever copied from Common Lisp was definitely the semantics of reduce. It's very complex. If there's nothing, it does one thing. If there's one thing, it does a different thing. If there's more than one thing, it does another thing. It's much more straightforward to have it be monoidal and just use f to create the initial value. That's what transduce does, so transduce says, "If you don't supply me any information, f with no arguments better give me an initial value."
Yeah, reduce is ... weird. When I first implemented reducible-query in clojure.java.jdbc I got the no-init arity of reduce wrong because I assumed the semantics of transduce. The reduce docstring is very clear -- the behavior is just a bit non-intuitive.
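The difference being described here can be seen directly at a REPL:

```clojure
;; reduce with no init value has three cases (per its docstring):
(reduce + [])      ;; => 0  (calls (+) for the empty collection)
(reduce + [5])     ;; => 5  (the single item is returned, + is never called)
(reduce + [1 2 3]) ;; => 6  (folds starting from the first two items)

;; transduce is monoidal: with no init, (f) always supplies the seed
(transduce (map inc) + [])      ;; => 0
(transduce (map inc) + [1 2 3]) ;; => 9
```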
basically i've been thinking and i'm trying to understand whether or not transducers are really ACTUALLY simple, or whether or not they are just an implementation detail
wouldn't a sufficiently intelligent compiler be able to inspect an s-expression before evaluating it and rearrange its subexpressions algebraically to do what transducers are effectively doing?
i'm just thinking that transducers, while they capture the idea of performing a computation independent of the type of the collection or data source, and in that sense are a basic fundamental thing, realistically aren't NECESSARY to extract the information about what sequence of transformations are actually taking place, since this is a lisp after all
i'm really not sure how I feel about it, it's either transducers (and the initial difficulty in learning them) or the magic of a macro
imagine that if instead of having go channels take a transducer you just had something like (magic (map f (filter g chan))), where magic is responsible for doing what transducers essentially do for chans
it would do all the unrolling itself, and ensure that you didn't have intermediate collections or channels
I think there is an option somewhere in between those. I was thinking about that the other day. Transducers are kind of ... fragile?... they have all these rules like don't touch the result so far, call the next rf in the chain in the cleanup arity, zero arity just calls (rf), etc. Could a macro kind of like fn take care of all that stuff instead of us having to write all that boilerplate?
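For reference, this is the boilerplate in question: a hand-rolled map-like transducer has to get all three arities right by convention (`my-map` is just an illustration, not anything from core):

```clojure
(defn my-map
  "Illustrative reimplementation of (map f) as a transducer."
  [f]
  (fn [rf]
    (fn
      ([] (rf))                ;; init arity: just delegate to (rf)
      ([result] (rf result))   ;; completion arity: call the next rf in the chain
      ([result input]          ;; step arity: don't touch result, only wrap input
       (rf result (f input))))))

(into [] (my-map inc) [1 2 3]) ;; => [2 3 4]
```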
Rich has made it really clear that despite some regrets, compatibility will be preserved. There will not be any attempt to "fix mistakes" and break everyone's code. This is why it's important to commit to and promise less
@madstap, there really aren't that many types of transducers. it seems like it's pretty uncommon to create a "new" transducer. out of curiosity, what kinds of transducers have you or others been creating that aren't map, filter, cat, dedupe, etc?
Wouldn't your magic macro have to know about all possible transducers though? How would it know if something can be made into a transducer?
I mean i understand that one of the really great things about transducers is that you can construct your own transducers and then use them with any transducible process. My criticism is that it seems to expose some limitation in the implementation; it's not purely about separating the sequence of transformations from the sequence (or channel or whatever) of data -- you could make that separation without creating the idea of a transducer, by just passing around a function composed of the data processing functions, for instance, then applying that function directly to the data.
@smith.adriane Also I wrote this one for someone here on the slack https://gist.github.com/madstap/a7d158ef0c3e7b5bbf5cd55c5de4c913
just for my understanding, how does this compare to (comp (partition-by consecutive-fn) (map merge-fn))?
The merge-fn is reduced over each partition instead of given a partition as an argument.
so more like (comp (partition-by consecutive-fn) (map #(reduce merge-fn %)))?
although, I think your example wouldn’t have to hold a whole partition in memory at the same time
so if you had large partitions
i’m trying to figure out if the same transducer could be created from two simpler transducers, but my brains a little fried at the moment
too much philosophizing about tranducers 😛
that’s kinda why i’m hoping you could make more complicated transducers by just composing simpler ones
because, as you mentioned, there are a bunch of rules that you have to follow
… or else
kaboom
Yes, exactly. I think that you mostly can do that, but sometimes there are just basic building blocks that aren't compositions of other transducers. And sometimes, like I did, you make a new one cause you don't have the imagination to combine two existing ones.
Take a look at source of the xforms library some time, some of those are basic building blocks. (Fair warning, trying to grok that is headache inducing.)
yea, it seems neat
thanks for the link!
@bcbradley, (map inc), (filter even?), etc. seem like fairly minimal descriptions of a “step”
which part of (map inc) doesn’t need to be part of the transducer?
@bcbradley But how would you do filter in that scheme? A function that returns nothing, as something separate from nil?
how would you like it organized? currently, you can do (comp (map inc) (filter even?) (map inc))
There is also no clear definition in an impure language around what is pure vs impure, where can I move things around, etc
basically, (comp (map inc) (filter even?) (map inc)) is no better a representation of the process from an apparent point of view than the s-expression ((map inc) (filter even?) (map inc)), or the s-expression (map inc (filter even? (map inc)))
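For comparison, a sketch of the same pipeline in both styles:

```clojure
;; transducer style: a reusable recipe, independent of the data source.
;; note that for transducers, (comp a b) runs a's step first.
(def xf (comp (map inc) (filter even?) (map inc)))
(into [] xf (range 5)) ;; => [3 5], one pass, no intermediate seqs

;; nested/threaded style: each step builds an intermediate lazy seq
(->> (range 5) (map inc) (filter even?) (map inc)) ;; => (3 5)
```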
the idea of transducers were to separate the definition of what was being done from the thing upon which it was being done
the real reason transducers are around is because clojure sees (map inc (filter even? (map inc x))) and makes three sequences
Transducers are independent of collections. They don't compare to a collection pipeline
But sure, can make a macro that does a lot of things. Transducers as they are in core hit a sweet spot with expressivity and leverage
The examples being bandied about (map inc) are nice because they capture only essential detail
There would have to be a huge benefit to incur additional complexity into the compiler or macro. Not saying that's a no, but it's unlikely to be sufficiently compelling.
my biggest gripe is that transducer's inherent value seems to be that you can describe a process to be done on the elements of a data source (channel, collection, sequence, iterable, whatever else) by "composing that process out of transducers"-- but you could already DESCRIBE that process anyway, without resorting to composing functions; you could just compose data in S-expressions that literally represent what you are doing to whatever thing you are doing it to; and if you want it to be independent of the thing you have that choice already! just omit it from the S-expression (its always going to be the most nested, rightmost element)
so i'm wracking my head trying to think, what do transducers really actually do? They don't necessarily allow me to communicate anything about the process of transformation on data moreso than i could without them, and they don't allow me to separate the process from the thing being processed moreso than i could do without them
i can only think that what transducers really do that the nested S-expressions don't is to capture the idea of doing a bunch of processes together at once in one pass on whatever the thing is
I think you could say that about functions in clojure in general
once you do (def myfn (fn [x] (+ x 1))) you don’t pass around the s-expr, you pass around the function
and you can’t really take it apart later
if you hand somebody myfn, they can’t tell if it’s made up of complicated stuff or if it just returns a constant
if the idea of transducers was to separate what is being done from the thing it is being done to, isn't that just the definition of a function?
i mean, you can take a look at the definitions of the different transducers
and they are “just functions”
I think it’s useful to say more about them and give them a name like transducers
for these types of functions: map, filter, dedupe, etc
and having the tranducers as a separate thing has already paid off
originally, core async had their own map, filter, dedupe functions
but now you can reuse these core functions in core async or use them with lazy sequences
it seems kinda obvious that it should be possible, but if you look around at other ecosystems, they do have a map for Rx, and then a map for collections
i don't mean to imply that i think we should have a bunch of different maps for different data types
think about what you are actually feeding a chan for instance when you provide a xf argument
you can't for instance, give it just a general expression that expects a value in a particular place and can use any function ("transducible" or otherwise)
IF the transducible process had the macro capability to look inside what you gave it
THEN it could decide for itself what it can do, whether it has to break up some things or rearrange others into what we are calling "transducible process"
it could even rearrange things or break things up in more particular ways that are specific to the thing being worked on
transducers were a great way to follow DRY, but i feel like it was a missed opportunity to just fix the real issue
can you give an example of what you would like the code to look like?
i feel like it’s hard to be more concise than (chan 1 (map inc))
i'd like to say (map dec (filter even? (reduce conj [] (foo bar (xan zoo WHATEVER)))))
and have clojure know how to optimally unravel it so that it doesn't create more intermediate sequences than necessary, can do what is effectively known as a "transducible process" (doing things in one pass) on either side of the reduce, and can even do things in parallel if it can use the runtime to judge that doing so doesn't alter the semantics of what you are doing (it can't if you are just using immutable values) and if doing so would result in faster computation even after taking into account the overhead involved with concurrency.
in other words, the details exposed to us in the transducer api shouldn't really have to be exposed to us-- if that is how things need to work between "transducible processes" well ok, but why should i be bothered with it?
the fact that making new transducers means they have to do this and this and that, why do i care?
if i want to express the idea of a computation without the thing being computed, i can just say (map dec (filter even? (reduce conj [] (foo bar (xan zoo)))))
i just pass it as an s-expression and someone can inject whatever they want into it and then evaluate it
Verifying that I've done these things is different from doing them automatically, though.
it would be cool if the compiler would take my clojure code and optimize/parallelize it for me, but that seems like a separate issue than the design of transducers
i guess i tend to think that ideally programs should just say what it is, semantically, that you want to do
the question of how that is done or how to make it performant is a completely separate and orthogonal concern
i don't like it when programming languages ask you to make them faster by instructing them on how to do the compiler's job
I think that's kind of not the clojure way, though. Like the difference between last and peek on a vector.
I think it's very valid to question things 🙂 Rich is not a messiah and blindly following him is not good for anyone.
What you're saying about a sufficiently smart compiler/macro really reminds me of both criticism and praise about haskell though. Where it can do some really cool optimizations, but that makes it hard to reason about perf.
honestly the jvm can be hard to reason about anyway, especially when you add clojure's runtime on top of that.
Re: writing your own transducing functions -- I wrote one that turned flat sequences of maps into sequences of threaded sequences of maps (based on certain fields acting as keys and back references). It turned out to be a very elegant solution to creating threaded conversations out of raw sequences of messages between members. It hid the complexity of the problem and it was easily composable with any other processing I wanted to do on messages (before transformation) or conversations (after transformation).
@bcbradley there's a lot of compiler tech out there that does this sort of thing. PyPy does it out-of-the-box at runtime; in fact, PyPy actually runs transducers over int arrays faster than Clojure. ClojureScript does as well. Clang can do this sort of stuff but it would require all your pipelines to be defined up-front and fully static
The beauty of the Clojure model, is that none of this took compiler support. It just works as a library.
That's the problem with a lot of these approaches. PyPy took 3-4 people 10 years to engineer. The JS JITs took less time, but larger teams. Transducers were mostly designed in about 3 weeks by one person.
And you'll see that pattern throughout Clojure, never over-engineer. If you can accomplish what you need with your existing tools, do it that way.
Not saying these other methods aren't valid, but they are much more complicated.
And for anyone here interested, here's pretty much the state-of-the-art for this sort of thing. From the Scala world, but the paper goes over Lisp and the like as well: https://infoscience.epfl.ch/record/180642/files/EPFL_TH5456.pdf
yeah the first 50 pages or so are an overview of why we need this thing
@tbaldridge: can you tell https://www.youtube.com/channel/UC6yONKYeoE2P3bsahDtsimg/videos to create a video walkthrough of the paper, step by step, implementing a minimal setup in clojure ?
Heh, most of it leans really heavily on static typing. How to adapt it to more dynamic languages is still a WIP
Were Clojure's transducers only able to be made so quickly because of the way the language is, or does putting them in the compiler just take longer? Since it's lispy i'm just curious if impossible to do it the library way in other languages.
well, transducers are a very simple, language-agnostic concept - there are (library) implementations for javascript, ruby, python, and java as well
tim's point was just that transducers were invented in a very short time, and required no compiler support at all
It would be nice if there was something like defmacro that did not evaluate its arguments but that could return a value instead of code that gets evaluated. Is such a thing possible?
@nick319 well, it is possible as long as the value can be expressed as code... Isn't this what comment does?
(defmacro comment [& body] nil)
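Aside from comment, a macro can effectively "return a value" by doing its computation at expansion time and emitting the result as a literal. This only works when the arguments are themselves literals; ct-sum is a made-up example:

```clojure
(defmacro ct-sum
  "Sums literal numbers at macro-expansion time; the call compiles to a constant."
  [& ns]
  (apply + ns))

(ct-sum 1 2 3)                ;; => 6
(macroexpand '(ct-sum 1 2 3)) ;; => 6, the literal value, not code to evaluate
```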
Could anybody help me with fuzzy multi-method dispatching? Say, I have maps with :os and :version fields. Basically, I need to dispatch them by OS, but for some specific versions I need to perform some other logic. What I’m trying to reach is:
;; let's say my dispatcher function is (juxt :os :version)
[:mac <any-version>] ==> the standard macOS algorithm
[:win <any-version>] ==> the standard Windows algorithm
;; but!
[:mac :11.22.33] ==> some Mac-specific algorithm for version 11.22.33
So once I’ve got another buggy version, I just add another (defmethod ...) definition for that case. Probably, later I’ll need to add more minor keys to dispatch on, say, browser name or something else.
I’ve tried derive and custom hierarchies a bit, but without success.
@igrishaev http://clojuredocs.org/clojure.core/defmulti#example-57558046e4b0bafd3e2a0474
yes I’ve seen that example but cannot understand it completely so it prevents me from using it.
@igrishaev What's the question? I wrote that example 🙂
Basically: 1. If there's no 100% match, find the default method via the :default fallback. 2. Add this fallback to the defmulti
@rauh makes sense. Another solution might be to derive any version over ::default
value once I’ve got that map.
@igrishaev Why not use something like this:
(derive :mac/high-sierra :mac/default)
=> nil
(isa? :mac/high-sierra :mac/default)
=> true
@chrisbetz yes, it came to me afterwards. I’m going to derive the version once I get a map to dispatch.
@igrishaev 🙂 Happy hacking.
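For the record, one way to sketch the :default-fallback approach from that thread (handle-client and the returned keywords are hypothetical names, not from the discussion):

```clojure
;; dispatch on [os version] pairs, per the (juxt :os :version) idea above
(defmulti handle-client (juxt :os :version))

;; the buggy-version special case gets an exact match:
(defmethod handle-client [:mac "11.22.33"] [_]
  :mac-11.22.33-workaround)

;; everything else falls through to :default, which re-dispatches on :os alone:
(defmethod handle-client :default [{:keys [os]}]
  (case os
    :mac :standard-mac
    :win :standard-win))

(handle-client {:os :mac :version "11.22.33"}) ;; => :mac-11.22.33-workaround
(handle-client {:os :mac :version "10.13.1"})  ;; => :standard-mac
```

Adding support for another buggy version is then just another defmethod with the exact [os version] pair.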
@alexmiller or other Cognitect folks -- I noticed that the instructions in the readme.txt in the Clojure repo are no longer complete and accurate -- I can build with ant or maven, but java -cp clojure-${VERSION}.jar clojure.main no longer works; it throws an exception because of the spec dependency. Maybe worth adding a note to the readme that spec has to be built and installed first? Note: I'm assuming that's what has to be done to fix it; I haven't actually tried at this point.
Didn't seem worth filing a ticket, since it's so minor, but a suggestion for the final 1.9 release.
There are several things in that readme that are lagging and need updating. We don’t plan to make any more changes for 1.9 though.
Just seems like it could trip up new users who are naively trying to follow the instructions in the readme.
@eggsyntax this is a known issue, and being addressed with the clj tool
it will be integrated (including the readme) when 1.9 is the stable release, is what I heard
Right, I'm aware of the new CLI tool -- just suggesting that the readme be updated.
the tool is called clj, it's clojure.core plus a dependency manager to bootstrap
since clojure.core now needs other libraries to run
Why were those other libs not merged into clojure.core to make sure it could run standalone?
they were originally in core and we split them out to allow them to be updated independently from core
https://groups.google.com/forum/#!msg/clojure/10dbF7w2IQo/ec37TzP5AQAJ
user=> (s/defn foo :- Bar
#_=> [qux :- s/Str, quux :- [s/Keyword]])
#'user/foo
user=> (doc foo)
-------------------------
user/foo
([qux quux])
Inputs: [qux :- s/Str quux :- [s/Keyword]]
Returns: Bar
How do you idiomatically spell (is (thrown-with-msg? ...)), except also binding the exception to a name so I can ex-data it?
quoting doc of is:
(is (thrown? c body)) checks that an instance of c is thrown from
body, fails if not; then returns the thing thrown.
- so at the very least, thrown? lets you do that
@U07QKGF9P - looks like it works with thrown-with-msg? too
kingfisher.core=> (clojure.test/is (thrown-with-msg? Exception #"foo" (/ 1 0)))
FAIL in () (form-init2223331148129063339.clj:1)
expected: (thrown-with-msg? Exception #"foo" (/ 1 0))
actual: #error {
:cause "Divide by zero"
:via
[{:type java.lang.ArithmeticException
:message "Divide by zero"
:at [clojure.lang.Numbers divide "Numbers.java" 158]}]
:trace
...
kingfisher.core=> (type *1)
java.lang.ArithmeticException
so the pattern would be to put the is in a let block so you can make other assertions about the ex-data
it only explicitly documents the return value of the thrown? check; I guess it’s implied that thrown-with-msg? would similarly return e
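Putting that pattern together, a small sketch (using ex-info here just so there is ex-data to inspect):

```clojure
(require '[clojure.test :refer [is]])

;; `is` returns the thrown exception on a passing thrown-with-msg? check,
;; so bind it in a let and keep asserting on its ex-data:
(let [e (is (thrown-with-msg? clojure.lang.ExceptionInfo #"boom"
              (throw (ex-info "boom" {:code 42}))))]
  (is (= 42 (:code (ex-data e)))))
```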
@bla The first prototype I've written as a comment (see: https://github.com/DomainDrivenArchitecture/dda-serverspec-crate/tree/development)
like
(def ServerTestConfig {
(optional-key :netcat-test)
{Keyword {:reachable? Bool}}, ; keyword is used to match test against fact
(optional-key :netcat-fact) ; parsed result of "nc [host] -w [timeout] && echo $?"
{Keyword {:port Num,
:host Str, ; may be ip or fqdn
:timeout Num}}, ; timeout given in seconds
(optional-key :netstat-test)
But the datatypes described here are split across several modules, so we will need a way to
1. add documentation as meta to elements like 'keyword' or 'defschema'
2. a way to render documentation on schema endpoints (sth. like explain)
Does anyone have an idea what could be wrong here?
(defn testex []
(go
(let [response (try (<! (http/get "" {:with-credentials? false}))
(catch :default e
(println "error" e)))]
(println response))))
The URL returns HTTP 500 with content-type application/json but not a JSON body, so http/get throws a parse error, but I can't figure out why I can't catch it. I tried putting the try everywhere.
you can’t catch it because channel ops are lifted into core.async’s state machine
so the fact that http/get returns a channel means that you can’t catch its error - in order to handle that error you need to use the facilities your http library defines, and if they define none the library is broken
normal clj-http at least takes a :throw-exceptions arg you can set to false - I don’t know what library http is, but with any luck they accept that arg and return the error response instead of throwing an error(?) - I have no idea why that wouldn’t be the default in an async lib frankly
I use this one: https://github.com/r0man/cljs-http . Are there alternatives out there that also use core.async?
:default likely doesn't work correctly inside clojurescript core.async go blocks anyway
@roti it can throw an error if it's non-recoverable, what would be the point? but yeah, it shouldn’t throw exceptions because there’s no reasonable block of code that can catch it
anyway looking at that code it doesn't seem like it would throw exceptions to me, it looks like it returns some sort of structured error
@roti looking at that lib it isn’t the one throwing, yeah
but really, you can just use xhr via interop, and use core.async inside the callback
that way you can attach your own error handler inside the callback
wrapper libs are severely overrated
ioc_helpers.cljs?rel=1509880107319:42 Uncaught SyntaxError: Unexpected token S in JSON at position 0
at JSON.parse (<anonymous>)
at cljs_http$util$json_decode (util.cljs?rel=1509894674732:63)
at core.cljs:4829
at Function.cljs.core.update_in.cljs$core$IFn$_invoke$arity$3 (core.cljs:4829)
at cljs$core$update_in (core.cljs:4820)
at cljs_http$client$decode_body (client.cljs?rel=1509894676467:83)
at Function.<anonymous> (client.cljs?rel=1509894676467:181)
at Function.cljs.core.apply.cljs$core$IFn$_invoke$arity$2 (core.cljs:3685)
at cljs$core$apply (core.cljs:3676)
at async.cljs?rel=1509880110795:712
right, so your server is giving your garbage that isn’t json, and cljs-http isn’t set up to catch that for you
you need to either replace wrap-json with something that handles the error nicely, or just use xhr and js/JSON yourself directly
my bet is that in the long term that last option is the least trouble
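A rough ClojureScript sketch of that interop approach (untested browser code; goog.net.XhrIo ships with the Closure library, and the {:ok ...}/{:error ...} shape is just a convention I'm assuming):

```clojure
(ns example.http
  (:require [cljs.core.async :refer [chan put! <!]]
            [goog.net.XhrIo :as xhr])
  (:require-macros [cljs.core.async.macros :refer [go]]))

(defn GET
  "Fire an XHR and return a channel that receives {:ok body} or {:error err}."
  [url]
  (let [ch (chan 1)]
    (xhr/send url
              (fn [event]
                (let [res (.-target event)]  ;; the XhrIo instance
                  (put! ch (if (.isSuccess res)
                             {:ok (.getResponseText res)}
                             {:error (.getLastError res)})))))
    ch))

;; the consuming go block branches on the tagged value instead of catching:
(go
  (let [{:keys [ok error]} (<! (GET "/api/thing"))]
    (if error
      (js/console.warn "request failed:" error)
      (js/console.log ok))))
```

Because the error is a value on the channel rather than a thrown exception inside a foreign go block, the caller's own go block can handle it directly.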
well, I was using JulianBirch/cljs-ajax before, which is ok, but I wanted a solution where my logic is not split across event handlers
for simple stuff there should be no problem, but as soon as you have something more complex, like several ajax requests in sequence it gets messy
right, you can use core.async in your callback - then you’re in clojure land
regarding "who should catch the error", my answer would be "the code inside go". core.async gives you the illusion of sequential code when it's actually asynchronous, right? so that is where errors should be handled as well. ironically, throwing an exception/error is actually a break in sequential code. i don't know how it works internally, i just recently learned that go blocks are actually not the same as continuations, so it may not be easy or possible, but that question definitely has an answer
@roti but that’s impossible - your go block can’t see errors inside another go block, and that’s what is happening in your code when you call http/get
this is a hazard of using library code that creates go blocks you can’t control
@noisesmith it may be technically impossible (though I'm not so sure about that), but not logically. my point is that asynchronous code should have a generic way of dealing with errors, like sequential code has. blaming the library for not following a convention and not having a way to deal with that is not good 🙂
OK - you would need to re-implement core.async to implement this
it’s a core structural problem, not a superficial oversight
hmm, i'm going to have to get to know core.async better to understand that. I come from JS promises, where errors are part of the "promise system", i.e. when you have a promise chain, and one of the steps throws an error it is caught by the "promise system" and the closest error handler (sort of like a catch) is executed. granted it's not perfect, for example errors while creating the promise chain are not caught.
this might be a better discussion for #core-async - and perhaps I’m wrong and there’s a way to put in a more generic exception handler for go blocks- but if that existed I’d think we would have seen it in action already
@noisesmith thank you
to be clear, I recommended using xhr and JSON via interop, but if that works and is easier that’s cool too 😄
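In lieu of a generic exception mechanism for go blocks, one convention that works today is to make errors values: catch inside the go block that owns the operation and put a tagged result on the channel (safe-op is a made-up name for illustration):

```clojure
(require '[clojure.core.async :as a])

(defn safe-op
  "Run (f) in a go block; deliver {:ok v} or {:error t} on the returned channel."
  [f]
  (let [ch (a/chan 1)]
    (a/go (a/>! ch (try {:ok (f)}
                        (catch Throwable t {:error t}))))
    ch))

;; the consumer branches on the tagged map instead of catching:
(let [{:keys [ok error]} (a/<!! (safe-op #(/ 1 0)))]
  (if error (.getMessage error) ok))
;; => "Divide by zero"
```

This is essentially the same move JS promises make: the failure travels through the channel as data, so the "sequential-looking" consumer code can handle it inline.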
@nick319 I would choose what looks best aesthetically (by which I mean what code is more understandable), but I guess performance may also matter in some cases. why do you ask?
@roti because wrapper libs are limiting, and interop is straightforward
using the js lib directly will be the most flexible option, and least likely to get you stuck