This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2023-06-09
Channels
- # announcements (1)
- # babashka (14)
- # calva (8)
- # chlorine-clover (3)
- # clerk (6)
- # clj-kondo (27)
- # cljdoc (20)
- # clojars (6)
- # clojure (53)
- # clojure-denver (8)
- # clojure-europe (17)
- # clojure-nl (1)
- # clojure-norway (270)
- # clojure-uk (5)
- # clojurescript (35)
- # community-development (7)
- # cursive (12)
- # datalevin (3)
- # datomic (26)
- # etaoin (23)
- # exercism (1)
- # hyperfiddle (3)
- # java (14)
- # nrepl (2)
- # off-topic (12)
- # pathom (3)
- # portal (44)
- # practicalli (2)
- # reagent (7)
- # releases (1)
- # shadow-cljs (13)
- # timbre (3)
- # xtdb (4)
this might be more of a java question, but is there a cost incurred for passing in a parameter to a function that goes unused? (defn foo [a b c] (+ a b))
if c is non-nil, does this have any sort of cost? what if the parameter is passed multiple calls deep and then only sometimes used?
From what I understand of how Clojure works, it creates a class with a method that takes these parameters. And since you didn't specify that a parameter has a specific non-Object type (e.g. int), it means you only pass a reference.
So whatever you pass, you only copy the pointer.
And I think it won't be optimized away.
So if you care that your app doesn't duplicate any pointers to things, and even that is already pricey for you, then it has a cost for you. But if you aren't at that level of optimization, then you shouldn't care about it.
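A minimal sketch of the point being made, not from the thread itself: the unused argument is only a reference copy, so passing even a large structure as c costs nothing beyond pushing one more reference for the call. The names foo and big-vector are made up for illustration, and criterium is assumed to be on the classpath if you actually want to measure it.
```
;; c is never read in the body
(defn foo [a b c]
  (+ a b))

;; Passing a large data structure as c does not copy it; only the
;; reference to it is passed.
(def big-vector (vec (range 1000000)))

(foo 1 2 big-vector)   ;=> 3
(foo 1 2 nil)          ;=> 3, effectively the same cost

(comment
  ;; To benchmark the (negligible) difference, assuming criterium is available:
  (require '[criterium.core :as crit])
  (crit/quick-bench (foo 1 2 nil))
  (crit/quick-bench (foo 1 2 big-vector)))
```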
yeah, i'm not really worried about it, just curious.
one more thing on the stack, but generally don't think there should be any cost beyond that
What about macros? What if I do a lot of macro expansions in real-time?
What does real-time mean?
Macro expansions will happen at compile time
Ah. I thought about the case when they also happen when I connect to a live app and actually ask it to run new code :thinking_face:
If you call eval you’re invoking the compiler
Ok. Then it means I won't have any problems if I connect via the REPL, because I'll already be running compiled chunks of code, and only the new top-level forms I send will have their macros expanded by the compiler, as deep as they need to go.
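A minimal sketch of what that means in practice, with made-up names (plus-one, f): the macro is expanded when a form is compiled, whether that happens at load time or because eval is invoked from the REPL, and a function that has already been compiled does no expansion work when you call it.
```
(defmacro plus-one [x]
  `(+ 1 ~x))

;; You can ask for the expansion explicitly:
(macroexpand-1 '(plus-one 41))   ;=> (clojure.core/+ 1 41)

;; eval invokes the compiler, so the macro is expanded here too:
(eval '(plus-one 41))            ;=> 42

;; Once a function using the macro has been compiled, calling it does
;; no further expansion; the expansion is already baked into f:
(defn f [x] (plus-one x))
(f 41)                           ;=> 42
```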
For anyone else interested in OpenAI's near-to-medium-term plans (or at least the ones they're saying out loud), this summary from an interview with Altman is extremely useful. It definitely includes some things I was unaware of (notably the intense GPU bottleneck, the scaling hypothesis still holding, and a stateful API): https://website-nm4keew22-humanloopml.vercel.app/blog/openai-plans
The folks in the #C054XC5JVDZ channel are probably very interested in this...