2020-04-27
Just finished the documentation for now. I think? So there's core docs (`:h conjure`), Fennel Aniseed (`:h conjure-client-fennel-aniseed`) and Clojure nREPL (`:h conjure-client-clojure-nrepl`).
Hey thanks for all your work! I found something in the docs:
<localleader>rr Refresh all changed namespaces.
<localleader>rr Refresh all namespaces, even unchanged.
Is that the same command?
Going to have a go at making stdout handle chunked nREPL messages now, but I think it might be close to release 😬 If anyone thinks the docs / UI / UX could be improved for an initial version, please do let me know. I just want to make this the de facto version on the master branch soon so I can continue to improve it knowing there are no more new people using the old version.
The interactive school to teach you the mappings and UX will come post-release, but I still want to do it.
Amazing!!!!
How can I produce a log to report an issue on the new Conjure? I have a case where evaluating a big project makes my nvim hang for a few seconds after I send the evaluation command. I suspect it's because the result is a really big message with the entire in-memory system map returned (this worked fine in the old Conjure).
I think GitHub issues is still the way to go, that's the best way for myself and others to keep track of it. If you're getting a HUGE result back it could definitely be the parsing and displaying of it, yeah.
or maybe a way to not parse a result or ignore the result?
I'd prefer to find a way to fix it for you. Maybe try `:ConjureConfig clojure.nrepl/eval.pretty-print? false`
and see if that makes a difference too :thinking_face:
ok, I will think of a way to reproduce this in an isolated project
It's a file, but it's a Pathom project that reloads all the maps of resolvers and mutations and returns them as the result of the eval (I don't know why they decided to do it this way).
Ah okay. Would be interesting to see if evaluating `(do (slow-thing) nil)`, and also evaluating as normal but with pretty printing off, would make a difference.
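Something like this, roughly — `load-system` here is just a hypothetical stand-in for whatever call produces the huge system map in your project:

```clojure
;; load-system is a hypothetical stand-in for whatever returns the
;; huge system map.
(defn load-system []
  (zipmap (range 5000) (repeat {:resolver :some/thing})))

;; Wrapping the call in (do ... nil) still runs it for its side
;; effects, but only nil comes back to the client, so nothing huge
;; gets pretty printed or inserted into the log buffer.
(do (load-system) nil)
```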
I will try both
I haven't evaluated anything that returns a big enough result yet, but I suspect something with a LOT of lines takes a while to pretty print, parse, and then format and insert into the log buffer.
yep, pretty print false did the trick
can we disable it when the output is too big?
but this leads to another question: what is too big?
like a full lock
I had to kill my process
without the pretty print it's super fast
I think the problem is the efficiency of inserting so many lines to be honest. I'll have a look at what could be slow at inserting lots into the buffer.
I will go with eval.pretty-print = false, since it's similar to what I had on the older Conjure
I'll get some way to reproduce it myself and try to fix it rather than just turn off pprint. It shouldn't really be a problem.
maybe parse a huge JSON from the internet into EDN and print it
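Something along these lines maybe — assuming org.clojure/data.json is on the classpath, and `big.json` is just a placeholder for any sufficiently large JSON file:

```clojure
;; Assumes org.clojure/data.json is available; the path is a
;; placeholder for any large JSON file you have lying around.
(require '[clojure.data.json :as json])

(def big-edn (json/read-str (slurp "big.json") :key-fn keyword))

;; Evaluating this on its own should produce a huge pretty printed
;; result in the log buffer.
big-edn
```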
Hmm, evaluated `(range 10000)` and it was fine, didn't lock up Neovim at all, so I wonder if it's the syntax highlighting etc.
can I somehow store the eval result to check how many lines it is?
Hmm, you could capture it with `(def x (my-code))` then `(count (clojure.string/split-lines (with-out-str (clojure.pprint/pprint x))))`
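So, put together, something like this — `my-code` is just a stand-in for whatever expression produces the huge result:

```clojure
(require '[clojure.pprint :as pp]
         '[clojure.string :as str])

;; my-code stands in for the real call that returns the big map.
(defn my-code []
  (zipmap (range 50000) (repeat {:a 1})))

(def x (my-code))

;; How many lines does the pretty printed form come to?
(count (str/split-lines (with-out-str (pp/pprint x))))
```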
One other thing you could try since I can't seem to reproduce and I have a hunch: open up the log buffer, run `:set ft=text`, and then try to reproduce the issue.
with the pretty-print true
still hangs, but maybe I have to disable the HUD
because it still shows up pprinted
still hangs
but it's faster
with syntax highlighting on I locked vim forever
now it just hangs
for some seconds
like 20 secs hahaha
Interesting! So it does look like there are some streaming / chunking options for nREPL evals. Maybe I'll make it cap at a sane level and you can request the rest if you really need it and are okay waiting for everything to catch up. I've added it to my todo list and will try to address it asap alongside some other things around stdout display.
Turning off pprint will be the best bet for now, but yeah, I'll pick something big that's slow and try to get it evaluating efficiently. The best solution for now would be to wrap anything you know has a HUGE result in `(do ... nil)`, I think.
I'll find a way to stop parsing and printing when things get too large and give you the option to request the rest. Somehow.
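The rough idea would be something like this — just a sketch of the capping, not how Conjure actually implements it:

```clojure
(require '[clojure.string :as str])

;; Sketch only: keep the first n lines of a result and note how much
;; was cut, so the rest could be requested separately later.
(defn cap-lines [s n]
  (let [lines (str/split-lines s)]
    (if (<= (count lines) n)
      s
      (str (str/join "\n" (take n lines))
           "\n... " (- (count lines) n) " more lines elided ..."))))

(cap-lines (str/join "\n" (map str (range 10))) 3)
;; => "0\n1\n2\n... 7 more lines elided ..."
```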
strange, it's not soooooooooooooooo big, it's like 65k lines
cool, I mean it's nice that it's reproducible
probably chunking is a good option
CHARACTERS 3862488 SPACE 1531184
do you think it's possible to conditionally do something? if not I'm fine with syntax off
ah, another question, is there a way to clear the buffer?
Although I'll be doing dinner and coming off soon, I'll work on this tomorrow too. I want Conjure to easily handle this kind of output. Maybe a little delay, but it should still be usable.
When I clear the buffer I just open it in a split or tab then dgg, like any old buffer 🙂
thanks
ahh ok
sure, I still hadn't realised that it's a normal buffer haha
quite cool
Updated Aniseed's string splitting algo (Lua doesn't come with a good string split so I had to write my own Clojure-like one, and it was SLOW)
dude, you're fast
I will update here
yep, looks like it's solved
I will keep testing it on my production apps haha
(I would expect something that big to slow the buffer down no matter what, it's just huge)
yep, seamlessly
but if I do some evals
after a few I have to clean the buffer
because it's huge anyway
computer problems
but it's really fine now
So the log does trim automatically when it's too long but there's an edge case that you're hitting. I don't trim as well when there's one REALLY big result.
yeah, it's a big big map
So if you want it to trim more aggressively (depends on your CPU and RAM I guess) you could set it to, like, trim at 3000 lines down to 1000?
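In spirit the trimming is just this (a sketch of the idea, not the actual implementation):

```clojure
;; Once the log goes over trim-at lines, drop lines from the top so
;; only trim-to remain. Sketch only, not Conjure's real code.
(defn trim-log [lines trim-at trim-to]
  (if (> (count lines) trim-at)
    (vec (take-last trim-to lines))
    lines))

(count (trim-log (vec (range 5000)) 3000 1000))
;; => 1000
```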
OK cool
perfect
So if you had a 50k line result in the buffer, it'll fill the buffer. Then as soon as you eval anything else it'll ALL get deleted. Or do you think just cutting the top off the big result is better :thinking_face:
I kinda think just delete the whole thing, it's too big, and you can see it again with <prefix>v2
to view the 2nd most recent result again.
yep, I think deleting it all will be easier to handle
WDYT about adding some information about Piggieback and cider-nrepl in the readme installation section?
Very subtle, like Tim Pope did in fireplace.vim
Or just saying you will have limited options without them
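For reference, the usual nREPL + cider-nrepl + Piggieback setup looks roughly like this deps.edn alias — the versions are placeholders and this is just the common pattern, not something the Conjure README prescribes:

```clojure
;; deps.edn — rough sketch of the common setup; check Clojars for
;; current versions instead of relying on RELEASE.
{:aliases
 {:nrepl
  {:extra-deps {nrepl/nrepl       {:mvn/version "RELEASE"}
                cider/cider-nrepl {:mvn/version "RELEASE"}
                cider/piggieback  {:mvn/version "RELEASE"}}
   :main-opts ["-m" "nrepl.cmdline"
               "--middleware"
               "[cider.nrepl/cider-middleware,cider.piggieback/wrap-cljs-repl]"]}}}
```

With that you'd start the REPL with something like `clj -M:nrepl` (or `-A:nrepl` on older CLI versions).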
What functions are supported when using `:ConjurePiggieback (shadow/repl :browser)`?
I tested auto completion, it worked quite well
But with go to definition I get this error
; def (word): my-radical-function
; Unsupported operation: info
; Ensure the CIDER middleware is installed and up to date
;
which is strange because the doc lookup works fine
; --------------------------------------------------------------------------------
; doc (word): my-radical-function
; -------------------------
; components/my-radical-function
; aaaaaaaaa