
hi guys, got a question regarding routing in ClojureScript. I'm building a website with multiple pages and I'm confused about whether I need to use Secretary for routing, or whether Compojure is enough?


robincarlo84: that really depends on if you’re building a single page app or not


oh, maybe I should read your whole sentence :simple_smile: yeah, you probably only need compojure


unless you want multiple ‘views’ in a single page. it’s always possible to combine server- and client-side routes, but it seems like it could get messy


@matthavener: wow thanks for the quick response, ok cool


right, makes sense


also one last thing, do I need figwheel too for a website with multiple pages?


robincarlo84: i’d use figwheel if you’re doing any significant clojurescript work, but it’s just a development tool. you ‘need’ it as much as you ‘need’ a repl


Hey guys, I hope this is the right place to ask this question... I have a question about re-frame. Why does the framework restrict dispatches and handlers to a 1:1 relationship? At first I found it a little weird: I can think of lots of realistic cases where "some event happens -> module a responds; module b responds". Then I thought... well, if one module has a handler that changes the app-db, every other module only needs to subscribe to the same query, and they won't need to also handle that event; it will happen reactively. But then I remembered that very frequently I need "some event happened; capture event in module a, do some async processing (fetch more data, usually), then update the db". If we only have a one-handler-per-event restriction, how can I also have "some event happened; capture event in module a, do some async processing for module a; ??? ; do some async processing for module b"? Any suggestions / patterns for solving this would be helpful.


use core async for that


some event gets triggered by the user interface, which needs to talk to the database


@matthavener: ok thanks, that really answers my question. I'm new to this language and was very confused by the different examples I found online; some used it, some didn't. Anyway, thanks for your time and answer, it helped me a lot to get started.


that event puts the info onto a core async channel which then talks to the database


then when the server responds, the response goes onto a core async channel which dispatches another event, which can in turn change client-local state
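A minimal sketch of that round trip, assuming re-frame and cljs.core.async; `send-to-server!`, the channel name, and the event names are hypothetical stand-ins for your real transport layer:

```clojure
(ns app.comms
  (:require [re-frame.core :refer [reg-event-db dispatch]]
            [cljs.core.async :refer [chan put! <!] :refer-macros [go-loop]]))

;; channel carrying requests from event handlers to the server
(def request-chan (chan))

;; placeholder transport: swap in your websocket/REST call, which
;; eventually invokes the callback with the server's response
(defn send-to-server! [req callback]
  (callback {:echo req}))

;; the handler stays quick: it only drops a request onto the channel
(reg-event-db
 :fetch-content
 (fn [db _]
   (put! request-chan {:op :latest-content})
   db))

;; a single consumer talks to the server; when the response arrives
;; it dispatches a new event, handled like any other
(go-loop []
  (send-to-server! (<! request-chan)
                   #(dispatch [:content-received %]))
  (recur))

(reg-event-db
 :content-received
 (fn [db [_ content]]
   (assoc db :content content)))
```

Note the handler itself never blocks; the long-running work lives entirely behind the channel.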


Okay @adamkowalski, am I understanding you correctly? A contrived but demonstrative case would be a produce shopping cart scenario. We have a list of produce [ apples, oranges, ... ] as a component, and another component that displays detailed information when a produce item is selected... they're not intricately related, and I want them to remain decoupled. Assuming the details for "apple" are just too large to have pre-loaded (maybe it's the entire genetic sequence), when event [:item-selected :apple] is handled by the list-component... I need to somehow capture that not only in list-component, but also in detail-component. You are suggesting that the list-component handler throws the event into a channel, does some xhr, and spits out another, distinct [:item-selected-2 :apple]?


yeah, queues (core async channels) are all about communicating between different parts of your system while keeping them completely decoupled


sounds like the perfect tools for the job


reframe is really good at keeping your user interfaces in sync with your local app db using subscriptions, and handlers can modify your app db as needed


but you want to keep those quick and snappy, so offload anything long running to a channel


two questions arise from that ... 1) so where in the 'cycle' of reframe would you recommend dumping things into the channel? [:event] -> captured by handler -> thrown into channel -> pulled by another component and placed immediately into a handler -> consumed by other component? 2) how do you manage sharing of channels? spin up 1 global chan at runtime and pass it down? a d.i. system like ?


i guess ill break that down into separate answers


let’s say you have a button which, when clicked, should hit the server database, grab the latest content, and then put it into your app database


button gets clicked -> fire on click handler -> dispatch event [:button was clicked] -> handler sees event and places request for latest content onto core async channel


now from the channels perspective


it gets a request to talk to server on channel -> talks to server (websockets/rest api/whatever) -> does nothing


the server gets the request -> gets the latest content -> talks back to client


another channel solely dedicated to getting responses from the server receives new content -> dispatches an event with the content as the payload -> a handler gets the payload and mutates the in app db to reflect the newest changes


the whole system only needs two channels and no sharing: one channel for client-to-server comms, and one channel for server-to-client comms. a handler only needs to be aware of the client-to-server channel and you’re done


This makes sense, but what about this case... let's say the click-button-get-data scenario you just described is module A. We also have module B that needs to also go fetch data when the button is clicked. It's an independent dataset, and an independent component, but it still needs to know "button clicked"? dispatch event [:button was clicked] -> module A handler sees event and puts request ... channel gets request, talks to server, does nothing ... server gets data -> responds to client... another channel gets new content -> dispatches [ what event? module A event or module B event? ] -> a handler (which handler, again mod A or mod B?) gets payload and mutates the in-app db.


do you see the issue? how can separate modules share the same "kick start" event to go get data?


The "data received" can be decoupled, a channel can say "on data for A dispatch event for A, on data for B dispatch event for B".


but how can we communicate to independent modules that it's time to go get your new data?


yeah I see what you are saying


events that are dispatched are completely arbitrary and can have whatever payload you would like


also handlers can do arbitrary things


so as long as you have your subscriptions set up properly, it doesn’t matter how many different modules you have as long as they are looking for the data they need


then your event could be something like [:pull-content [:posts :photos :my-other-thing-i-want-from-the-server]]

Alan Thompson 20:03:54

Hi - Anybody here familiar with the CLJS QuickStart on github? I think I've found an omission


what I'm trying to avoid is coupling between A and B if at all possible. Like I don't want a to dispatch [:tell-a-and-b-to-do-something ] with a special handler that knows about both A and B because that's super coupled.


then your handler for pull-content can parse the vector of things you want from the server and put all of them onto the channel
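That handler could look something like the sketch below; `request-chan` is assumed to be the hypothetical client-to-server channel from earlier:

```clojure
;; sketch: one handler fans a vector of wanted keys out onto the
;; client->server channel, so each module can ask for its own data
;; without knowing about any other module
(re-frame.core/reg-event-db
 :pull-content
 (fn [db [_ wanted]]                 ; e.g. [:posts :photos ...]
   (doseq [k wanted]
     (cljs.core.async/put! request-chan {:op :pull :key k}))
   db))
```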


hmm ... so the dispatched event might be a higher level "domain" event, I suppose. [ :pull content ] is not exactly specific to A or B.

Chris O’Donnell 20:03:49

if it's not important they happen together, you could dispatch two events


I thought about that -- have the handler for [:clicked] dispatch another event, but that means that suddenly A knows that B exists.


which doesn't scale well, and breaks any sort of fractality


like, you could no longer have a handful of "module A"'s floating around on the page with just one B.

Chris O’Donnell 20:03:21

I mean, more like (do (dispatch a) (dispatch b))


maybe we should start a private message so we don’t clog this channel up


the same argument still applies, though... A shouldn't have to know that B exists.


i have an idea about how you could do this using core async channels but it would be a little more involved haha


i wonder if the most realistic solution is to rewire reframe to actually allow multiple handlers for 1 event


then a "domain event of importance" could be listened to by any number of components, and each one could have its own processing/fetching pipeline


what about this, you have a channel which is listening for events


on click thing a puts an event onto the channel


then anything else you need to dispatch events could have a pub/sub relation with that channel


that channel could essentially message all of its subscribers saying “hey i am about to go do something, do you want to add something to my list of things to do"


thing b is subscribed to that channel so it gets a message, and says “hey go do this other thing as well"


finally once it gets the aggregate response of all of those things, it will go do all of the things it needs to do and everything is fully decoupled
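One way to sketch that decoupling is with core.async's built-in pub/sub; the topic name and the listeners' actions here are hypothetical:

```clojure
(ns app.bus
  (:require [cljs.core.async :refer [chan put! <! pub sub] :refer-macros [go-loop]]))

;; one shared bus; every message carries a :topic key
(def bus (chan))
(def bus-pub (pub bus :topic))

;; any module can subscribe without knowing who else is listening
(defn listen! [topic f]
  (let [ch (chan)]
    (sub bus-pub topic ch)
    (go-loop []
      (f (<! ch))
      (recur))))

;; module A and module B each react to the same domain event,
;; fully decoupled from one another (stub actions for illustration)
(listen! :button-clicked (fn [_] (prn "module A fetching...")))
(listen! :button-clicked (fn [_] (prn "module B fetching...")))

;; the click handler only knows about the bus
(put! bus {:topic :button-clicked})
```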


so the 'eventer channel' eventually ends up with a vec [ [:for-a stuff], [:for-b things] ... ] and runs (map dispatch vec)?


that's not a bad idea


each module that needs it could have a simple top-level defn { :on-this-event do-this, :on-that-event do-that }
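Putting those two ideas together might look roughly like this (all names hypothetical):

```clojure
;; each module declares what it cares about at the top level
(def module-a-handlers
  {:on-this-event (fn [] (prn "module A: this"))
   :on-that-event (fn [] (prn "module A: that"))})

;; once the eventer channel has aggregated its vector of events,
;; it just dispatches each one in turn
(run! re-frame.core/dispatch
      [[:for-a "stuff"] [:for-b "things"]])
```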


thanks @adamkowalski and @codonnell I'll have to think on this, but right now the dog needs a walk


hello people, I'm trying to make ClojureScript use a custom way to require files. I noticed that between the js/nodejs targets the compilation of the main file changes: on the js side it does document.write, while on nodejs it does require. How can I supply a custom require function to be used there?


@wilkerlucio: I’m afraid the :target option of the cljs compiler is hard-coded and cannot be customized, but based on our previous conversation, you are not interested in a general solution; you want to handle :optimizations :none builds only. I believe your goal is achievable by leveraging goog.base’s machinery for providing a custom implementation for require calls (see CLOSURE_IMPORT_SCRIPT)


in the extreme case, I believe you could monkey-patch goog.base to do exactly what you need it to do and keep it transparent to the cljs compiler


@darwin: the problem I'm having now is that when the chrome content script file is loaded and it runs document.write, it always overrides the entire page (I tried changing run_at, but it didn't help), so I can't have the document.write there. And CLOSURE_IMPORT_SCRIPT seems to be used only to load the later scripts; the first ones (`deps` and base) are always loaded with document.write, which breaks the page by blanking it


@wilkerlucio: you should not be using clojurescript’s output script, include goog/base.js by hand and require your root namespace via javascript calls to require


@darwin: that's a good idea, I hadn't thought of doing it that way. I'm going to try it, thanks :simple_smile:


after you include goog/base.js you are free to override its settings, or monkey-patch, before you call your own requires
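A rough sketch of such a bootstrap, written as plain JavaScript since it runs before any compiled ClojureScript is loaded; the namespace is a placeholder, and note that goog checks CLOSURE_IMPORT_SCRIPT on each import, so setting it after base.js has been included works:

```javascript
// sketch: override the script loader so goog never falls back to
// document.write (naive: injected <script> tags load asynchronously,
// so ordering of dependencies is not guaranteed here)
CLOSURE_IMPORT_SCRIPT = function (src) {
  var script = document.createElement("script");
  script.src = src;
  document.head.appendChild(script);
  return true;
};

// after goog/base.js and the deps file have been included by hand:
goog.require("my.root.namespace"); // placeholder root namespace
```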


still need to require ../compiled/tests/tests.js, but that file only contains dependency information (a bunch of calls to goog.provide / goog.addDependency). here is the relevant compiler config: