#clojure
2022-05-06
Patrick Brown00:05:44

Any library recommendations for simplifying getting/changing values from nested maps? My primary desire is to simplify code and increase code reuse. I've got a lot of thread-first with keywords going on, so I'd like something that doesn't require me to change a lot of code when my data structure changes. CHEERS!

devn01:05:55

get-in/update-in with a defined path not enough?

devn01:05:43

an example of what you have and what you'd like it to look like is always helpful

1
seancorfield01:05:41

Meander and Specter are the two that I see people talking about. We don't use them (at work) -- we don't find any need for that sort of thing.

Patrick Brown01:05:00

I'm greenfield, so my understanding of the problem is growing as well as my needs. I've got a large map of attributes that I'd like to grow as so...

{:attribute-name {:datomic [:short :schema :def]
                  :type :boolean
                  :migrations {:csv {:table :table-name
                                     :column :col-name}}
                  :fulcro {:reports [:names :names]
                           :allowed-components [:components :list]}
                  :alexa true
                  :aliases #{:short-name :spanish-name :etc}}}

I moved the datomic type out of the vector into its own higher-level key yesterday. Then had to deal with the cascade of tiny changes. The ultimate desire is to get this map as well defined and powerful as possible, but I'm reaching a point where I'd like the process to be less manual and more automatic. @U04V70XH6 I'll give it a look.

devn01:05:29

I'd skip adding a dependency first and see if you can formulate some helpers that just use plain get-in and update-in. Both libraries mentioned are great, but may be heavy-handed depending on what kind of read/update you need.
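For illustration, a minimal sketch of that kind of helper (the path and names here are made up, not from the thread): define the path once and reuse it, so a structure change means editing one def rather than many call sites.

;; hypothetical helpers built on plain get-in / assoc-in
(def csv-table-path [:migrations :csv :table])

(defn csv-table [attr]
  (get-in attr csv-table-path))

(defn set-csv-table [attr table]
  (assoc-in attr csv-table-path table))

;; (csv-table {:migrations {:csv {:table :table-name}}}) ;=> :table-name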

devn01:05:34

Seeing the structure is useful, but (and I don't mean to harp on it) if you have a specific input and output you want to be cleaner, it's useful to see a mock-up of “I have {…}, and I want a function that produces {…} from it”

potetm01:05:16

So, I’m not sure how much this applies to you but

potetm01:05:36

I’ve often noticed that people use nesting to ape namespaces.

☝️ 2
potetm01:05:54

Taking your example:

{:attribute-name {:datomic [:short :schema :def]
                  :type :boolean
                  :migrations.csv/table :table-name
                  :migrations.csv/column :col-name
                  :fulcro/reports [:names :names]
                  :fulcro/allowed-components [:components :list]
                  :alexa true
                  :aliases #{:short-name :spanish-name :etc}}}

potetm01:05:19

So you can take a lot (but not all) of that nesting and flatten it.

potetm01:05:51

This might get rid of a lot of update-ins and get-ins.
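To make that concrete (a small sketch based on the flattened map above), single-level qualified keys mean plain get/assoc instead of paths:

(def attr {:migrations.csv/table  :table-name
           :migrations.csv/column :col-name
           :fulcro/reports        [:names :names]})

(get attr :migrations.csv/table)          ;=> :table-name
(assoc attr :migrations.csv/table :other) ;; single-level update, no path needed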

devn01:05:45

fwiw my response was based on reading that they had a lot of thread first to access keys

Patrick Brown01:05:33

@U07S8JGF7 That is actually a thought I had. I'd had the same thought that @devn had first, but I'm leaning toward keeping the map only 1 level deep and using the naming convention you outlined. The top level keys are already doing the same, like :motor :motor/stages :motor/bit-to-bend all in the first level. I like your style: one refactor and then it's simple from there.

devn01:05:40

or at least it seemed like that's what they were saying. agree with what you're saying either way

potetm01:05:43

get-in / -> :k :k — tomato tomáto

devn01:05:36

fair but you could let out a path like [:k :k] and reuse it, which is subtly different imho

potetm01:05:06

I’m not sure it ever occurred to me to do that 😄. But it does make sense.

potetm01:05:11

fwiw I’ve become a big fan of cramming a bunch of stuff into a map over the past few years. Namespaced keys allow you to shove all the context you could ever need into one place.

seancorfield01:05:36

Qualified keys in maps can often go a long way to flattening structures out -- and they have the benefit of being both more descriptive and providing more context. It's why next.jdbc uses them as the default behavior (and why I strongly encourage people to stick with the default -- and get used to qualified keys). :table as a key means very little, :migrations.csv/table provides a lot more context.
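Roughly what next.jdbc's default builder gives you (the datasource config, table, and column names below are made up for illustration; the exact key case depends on the database):

(require '[next.jdbc :as jdbc])

(def ds (jdbc/get-datasource {:dbtype "h2" :dbname "example"}))

(jdbc/execute! ds ["select id, name from account"])
;; => [#:account{:id 1, :name "..."} ...]
;; i.e. qualified keys like :account/id rather than a bare :id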

seancorfield01:05:33

We use qualified keys in maps a lot at work to disambiguate things that would otherwise just be :id or :member-id (in what context?!?) etc.

potetm01:05:21

I thought of you (and next.jdbc ) the other day when I found a buddy of mine works at a place that does (camel -> kebab -> snake -> kebab -> camel) for every request.

😂 1
potetm01:05:54

Such insanity to rename things. So glad there are tools out there that push you toward the sane thing!

devn01:05:48

I remember a long time ago Zach Tellman showed a massive performance discrepancy between people who felt the need to keywordize string keys so they could call kws as fns

seancorfield01:05:01

Hahaha... and next.jdbc will provide kebab <-> snake builders if you have the CSK library on your classpath... but only because I got tired of people complaining about having to write those builders themselves 😐 I don't see the point, myself.

😭 1
😂 1
devn01:05:06

on the server side to be more specific

Patrick Brown01:05:10

So far, the good advice I've gotten goes towards flattening out the data structure more along many avenues; this is looking promising on the fulcro client-db and integrant fronts too. I'm still learning Clojure, but I like the philosophy. I don't know why I asked. @U07S8JGF7's advice is too simple.

Patrick Brown01:05:04

Thanks for the nudge in the right direction. I've been complicating things too much this week.

seancorfield01:05:14

@devn That's why I encourage folks to try to work with plan in next.jdbc and reduce over results: that lets you completely avoid the whole Clojure map / keyword construction stuff!

potetm01:05:36

This is how I feel about basically all of Rich’s talks.

potetm01:05:48

“Wait. You can just… not do that?”

devn01:05:53

I'm not familiar, could you say a bit more?

potetm01:05:02

About Rich’s talks?

devn01:05:21

sorry that was meant for Sean

✔️ 1
devn01:05:34

I was curious about avoiding the performance penalty

seancorfield01:05:50

Turning ResultSet into vectors of hash maps with qualified keys has an overhead -- in time and memory -- but plan produces a Reducible instead. When you reduce over it, that gets the Connection, runs the query, and then provides a thin hash-map-like wrapper around ResultSet that calls .getObject() directly using a simple label, so you can process a (large) result set without creating any hash maps (or vectors) if you can formulate your processing as a reduction.

💯 1
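A small sketch of the plan/reduce style described above (assuming the ds datasource from the earlier sketch; the query and :amount column are made up): plan returns a reducible, and reducing over it reads rows straight off the ResultSet without building hash maps, with columns accessed by simple keys.

(require '[next.jdbc :as jdbc])

(reduce (fn [total row]
          (+ total (:amount row)))
        0
        (jdbc/plan ds ["select amount from payment"]))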
devn01:05:50

ok, one vote for making people opt out of doing it this way

devn01:05:02

this is clearly better

devn01:05:13

sorry we’re fully off-topic now, but I did not know about this, and it's definitely an improvement

Matthew Hughes02:05:06

I started reading a paper called "Build Systems a la Carte" and am attempting to implement it in Clojure, and while I really like the ideas, it makes heavy use of applicatives and monads which feels a lot more Haskell than Clojure to me (it's written in part by Simon Peyton Jones, so no shock there). Here's what I have currently:

(def tasks 
  {:B1 (fn [fetch] (+ (fetch :A1) (fetch :A2)))
   :B2 (fn [fetch] (* 2 (fetch :B1)))})

(def input
  {:A1 10
   :A2 20
   :C1 1})

(defn busy [tasks key store]
  (let [store (atom store)]
    ((fn fetch [key]
       (if-let [task (get tasks key)]
         (let [val (task fetch)]
           (swap! store assoc key val)
           val)
         (get @store key))) key)))

;(busy tasks :B1 input) -> 30
;(busy tasks :B2 input) -> 60
It's supposed to be a basic model of calculating Excel formulas. The Haskell implementation in the paper has about double that in types alone. The main difference is that in the paper, a task like (fn [fetch] (+ (fetch :A1) (fetch :A2))) uses applicative syntax to string together the fetch calls and the operators to be performed on them. Something like:
(+) <$> fetch "A1" <*> fetch "A2"
with fetch being a function from your cell key to a monad or applicative wrapping the resulting value. I thought I was being slick with the implementation above, but quickly ran into something that I think is pretty neat, but not as easily achievable with my current setup. They plugged in the const monad and made a fetch function that just returns the requested key in a list, which when applied to other fetches will concat the keys, and does no computations. Effectively this creates a list of dependencies for any given key. I think it's really cool that they can get this functionality without modifying the existing build system. Am I being silly here? I'm sure this wouldn't be too tough to just do with a separate function, but I've been trying to come up with a way to achieve a similar effect with a simpler structure. Any ideas on what I could do differently, or should I just give it up and make separate functions?
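For what it's worth, one hedged way to mimic that dependency-listing idea with the busy setup above (not from the paper): hand the task a fetch that records keys and returns a dummy value instead of computing. Note this actually runs the task, so with a conditional it would only see the branch taken, which is the functor-vs-monad point discussed below.

;; sketch only: collect the (transitive) dependencies of a key
(defn deps [tasks key]
  (let [seen (atom [])]
    (when-let [task (get tasks key)]
      (task (fn fetch [k]
              (swap! seen conj k)
              (when-let [t (get tasks k)]
                (t fetch))
              0)))          ; dummy value so (+ ...) / (* ...) still evaluate
    (distinct @seen)))

;; (deps tasks :B2) ;=> (:B1 :A1 :A2)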

hiredman03:05:47

You can do this in Clojure just fine. I suggest you follow the Haskell in the paper more closely, and don't introduce mutable state like your atom

hiredman03:05:52

Monads and applicatives are perfectly doable in clojure

hiredman02:05:20

That is the difference between a functor and a monad

hiredman02:05:41

With monadic tasks the composition is opaque (bind takes a function that could do anything and then returns a monadic value)

hiredman02:05:34

So while you can get a list of keys from monadic tasks, if a task has a conditional in it, you may just be getting keys from one branch

hiredman02:05:33

A functor is less powerful but it is possible to get everything

hiredman02:05:58

The paper covers this, it is a great paper

zeitstein09:05:41

I was surprised to find workarounds are needed for a transient stack:

;; pop - works
(let [stack (transient [1 2])]
  (persistent! (pop! stack)))

;; empty? - CLJ-1872
(let [stack (transient [])]
  (zero? (count stack)))

;; peek - CLJ-2464
(let [stack (transient [1 2])]
  (nth stack (dec (count stack))))
I would have thought transient stack/queues are a fairly common need (e.g. in complicated loop-recur situations). Now I'm wondering whether it's worth doing at all 🙂

didibus17:05:10

There's a lot of missing APIs for transients that have not been implemented yet. The one I miss most is subvec.

1
didibus17:05:47

Clearly, it seems people made do without. I think it's non-trivial to implement each one on a transient, and it's rare that people need to go that way, hence why it's still not implemented.

gratitude 1
bortexz10:05:53

Having a small dilemma when writing predicate docstrings, considering these options:
• “Returns true when… false otherwise”, but having the fn return truthy/falsy vals instead of a proper boolean
• “Returns true when… false otherwise”, and wrapping the fn body in (boolean ...)
• Writing “Returns truthy val… falsy/nil otherwise” if not coercing to bool
Would like to hear some opinions about this

p-himik10:05:16

clojure.core has some examples. E.g. some has "Returns the first logical true value of (pred x) for any x in coll, else nil."

bortexz11:05:14

“logical true” sounds good; there was something I didn't like about “truthy”, and I think it's because truthy is used more in JS land than Clojure land

bortexz11:05:06

logical true sounds more clojury, I’ll go with that, thanks @U2FRKM4TW!

👍 1
Alex Miller (Clojure team)12:05:21

The official docstrings only use “logical true” and “logical false” to talk about implicit boolean conversion.

Alex Miller (Clojure team)12:05:33

Truthy/falsey is something that emerged in the community, not sure where exactly. It's in Joy of Clojure but I'm not sure if that's the origin or not

devn14:05:33

Definitely not specific to Clojure or originating from JoC. Would be interested to know the true origin though.

Alex Miller (Clojure team)14:05:03

well, the "truthy" word I think came from Colbert's old show :)

Alex Miller (Clojure team)14:05:21

not sure when it crossed over to Clojure

Joshua Suskalo14:05:51

Wasn't truthy a holdover from old lisps that js people adopted because js was based on scheme?

Drew Verlee16:05:30

Just say what it returns, trust the reader to know those values are truthy. That's my take at least.

roklenarcic13:05:02

Easiest way to update last item in a List? Everything I come up with is super awkward code.

Jon Boone13:05:48

What ideas have you explored?

roklenarcic13:05:19

something like (apply list (concat (butlast lst) [(update-fn (last lst))]))

Ben Sless13:05:32

Obvious question first: why/what's your use case?

☝️ 1
roklenarcic13:05:55

Imagine a macro where you get a list and you want to augment the data in the last item in the list

enn13:05:02

what’s special about the last item?

enn13:05:11

is the list a fixed length? like these are positional parameters?

roklenarcic13:05:08

no it’s not fixed length

roklenarcic13:05:28

last item is interpreted specially by some code down the line

Ben Sless13:05:16

Is it a user facing macro or just for you?

roklenarcic13:05:37

Possibly user facing

roklenarcic13:05:50

Ok, this doesn't seem to have a simple answer

Jon Boone13:05:35

Is this list a queue or other familiar pattern?

enn13:05:37

that’s very general, I think specifics would help here. it may be that a list is not the right data structure, if you need random access to the last item often.

roklenarcic13:05:25

Given that I expect small size there I can use recursion to update it and construct a new list

roklenarcic13:05:33

Well thanks anyway

ghadi13:05:37

macros are probably the only place I'd use butlast; lists have linear traversal costs

Ben Sless13:05:40

I had a think about this recently, unless I have to I'd rather not expose a macro API. Is it possible to do it with a function?

manutter5114:05:52

(def some-list '(1 2 3 4 5 6 7 8))
=> #'user/some-list
(defn update-last [coll f]
  (loop [result []
         [item & more] coll]
    (if (seq more)
      (recur (conj result item) more)
      (list (conj result (f item))))))
=> #'user/update-last
(update-last some-list inc)
=> ([1 2 3 4 5 6 7 9])

roklenarcic15:05:48

You’ve got a vector in a list. You need to apply list.

👍 1
Drew Verlee16:05:08

The way I think of it, the reason for a list is that the items are meant to be considered in order. Could you use a stack? A stack is designed to give you the last item added to a collection.

Drew Verlee16:05:57

Pop the item, update it, put it back on.
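A sketch of that, assuming the data were in a vector (where peek/pop work from the end); the name update-top is made up:

(defn update-top [v f]
  (conj (pop v) (f (peek v))))

;; (update-top [1 2 3] inc) ;=> [1 2 4]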

pinkfrog13:05:34

Hi. How do I convert this to use a transducer?

(->> xs
      (filter #(.isFile %))
      (map-indexed vector))

p-himik13:05:13

I'll give you one better - you can convert it into a single function, using keep-indexed as a replacement for filter + map-indexed.

👍 1
p-himik13:05:17

But if you still need a transducer version and need to produce a lazy seq, just wrap it into sequence.

p-himik13:05:48

And if you don't need a lazy seq, wrap it with (into [] ...).

Noah Bogart13:05:55

to make it explicit, (into [] (comp (filter #(.isFile %)) (map-indexed vector)) xs), right?

p-himik13:05:28

Right, although personally I'd still use keep-indexed as a transducer.

👍 1
pinkfrog13:05:09

keep-indexed might not do it because the meaning of index changes.

roklenarcic13:05:44

you can use sequence or eduction

roklenarcic13:05:50

to run the transducer

roklenarcic13:05:25

with eduction you can skip the comp
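E.g., the same pipeline sketched with eduction, which takes the xforms as separate arguments:

(eduction (filter #(.isFile %))
          (map-indexed vector)
          xs)
;; reducible/seqable; the transformation re-runs on each reduction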

p-himik14:05:27

> keep-indexed might not do it because the meaning of index changes. Ah, of course, good catch.

jpmonettas14:05:48

Hi everybody! Happy to share the latest version of FlowStorm, a Clojure (and soon ClojureScript) debugger. Here is a video showing a "one liner" to instrument and debug the ClojureScript compiler as an example https://youtu.be/YnpQMrkj4v8?t=533 Here is the github repo: https://github.com/jpmonettas/flow-storm-debugger What do people think about this way of debugging/exploring a Clojure application?

Drew Verlee16:05:36

I'll have to take a look, for cljs work you typically just have to use a var to trap state for inspection. I have been curious how it might feel to have a more traditional debugger there.

jpmonettas17:05:02

I think it can be useful for people to understand the compiler (or any clojure codebase) since you can see the values, something that is pretty hard to understand by reading code because of dyn typing

Drew Verlee23:05:18

@U4E9AHMGW ^ you could try this.

Darrick Wiebe11:05:05

I've been using this to debug a project I've been working on and it's really nice! I used to use sayid for the same purpose occasionally but it was very tricky to use. Flow storm works so well I'm starting to pop it up more and more often. Nice work @U0739PUFQ!

jpmonettas12:05:26

glad to hear you find it useful already! if you have any questions, or ideas let me know!

Darrick Wiebe12:05:36

A couple ideas so far:
• The green color indicating the currently active/selected form is very low contrast vs the black basic text, so it's hard to see which is selected. Higher contrast would be an improvement, for instance a lighter shade of green, etc.
• It'd be nice if I could filter traces somehow from within my code. I saw in your demo that you can search within the UI, but it'd be nice if I could do something like the cider debugger and attach metadata that can make that filtering more powerful and easier to control.
• Sometimes you need to right click and "show function calls", but it'd be nice if just clicking or double-clicking could perform default actions a little more smoothly... and I don't risk accidentally un-instrumenting when I meant to view.

Darrick Wiebe12:05:56

Another one: Instead of a dropdown for "Print all args", maybe checkboxes to turn on/off arg0, arg1, ... argn? It'd be nice if it refreshed the currently selected function when changed, as well.

jpmonettas18:05:09

great ideas @U01D37REZHP thanks!, will add them to my notes and then create some github issues

jpmonettas18:05:50

> It'd be nice if I could filter traces somehow from within my code. I saw in your demo that you can search within the UI, but it'd be nice if I could do something like the cider debugger and attach metadata that can make that filtering more powerful and easier to control I'm not sure what you mean by that one, can you elaborate?

folcon21:06:25

It would be nice if you could bookmark things? Each time you say come back here quickly for example.

Darrick Wiebe02:06:59

Hey @U0739PUFQ, somehow I didn't see your question before. Anyway I was looking through your commits at what changed and it seems like you nailed it! Thanks, I'm going to update and try it some more! 🙂

jpmonettas14:06:59

hey np @U01D37REZHP, https://www.youtube.com/watch?v=cnLwRzxrKDk that short video demos the features added on the last release

folcon14:06:45

@U0739PUFQ, that looks really cool. I'm wondering, does #ctrace work with exceptions? Or do I need to specifically catch those and then mark them to be traced? I'm asking as there are some cases where libs generate exceptions, but they're not super clean about it, so you just get things like null pointer exceptions with no stacktrace or anything, so this would be super useful to pin down where problems are occurring.

jpmonettas17:06:59

thanks @U0JUM502E! #ctrace doesn't do anything special about exceptions; it's just like #trace but disables tracing so you can selectively activate it with ^{:trace/when ...}. I'm not sure I follow, can you please elaborate a little more on the exception case? Maybe describe the problem and how you see a tool helping with it?

folcon17:06:14

Oh, just that as I mentioned in certain cases when an exception occurs you just get a bare exception. Which makes it very challenging to work out what's causing it. I've built up over time a series of techniques for determining what causes them, but they're still rather painful to source. So perhaps I've not thought about #ctrace in the right way, but with #trace is there a way where if something like an exception does occur, I can easily backtrack to work out where it's coming from through the trace?

jpmonettas17:06:35

oh I think I understand now. Currently it isn't doing anything, but a previous version used to have that feature. I removed it because it was slowing down execution too much. Instrumentation was basically surrounding every expression in a (try ... catch) so it could report the exact expression that was throwing it. Maybe I should add it back optionally somehow. I'll add it to my notes. Thanks!

folcon17:06:49

Ok, cool. I mean even just being able to mark certain boundaries as dangerous similar to how you mark functions for instrument might be useful? You already allow functions to be re-run.

folcon17:06:24

That reminds me, not sure if I can reproduce it, but the refresh symbol is supposed to be re-run right?

folcon17:06:55

If so, I had some cases where doing that would instead clear all of the traces collected.

jpmonettas18:06:31

yes that should re-run the original expression. Maybe you can create an issue with that specific case? sounds like a bug

folcon18:06:36

I'll have to see if I can reproduce it 😃

folcon18:06:59

It was a repeated problem, but not sure how to create a minimal case, it was happening in one of my larger projects.

jpmonettas18:06:06

oh, ok I'll also try to reproduce, I haven't used that functionality a lot, only when I instrument entire codebases from flow-storm.api/cli-run

folcon18:06:27

Yes, I've not been doing that, primarily calling api functions from the repl

folcon19:06:18

Hmm, ok, so I'm trying out the new version. Some odd behaviour (this is inside a Clojure project; the project itself is public, so happy to help you repro this stuff, but not sure if I have time to make specific repro cases):
• Calling fs-api/instrument-forms-for-namespaces, it doesn't seem to activate all the forms in a namespace, skipping some. (It does activate some, so something is working.)
• Calling fs-api/instrument-var, it can respond with Couldn't find source for symbol, even when the symbol I'm giving it is fully namespace-qualified.
• #trace on a top level form in the source appears to work fine.
• In the Browser, selecting a form to instrument doesn't necessarily instrument it.

jpmonettas13:06:09

@U0JUM502E hmm, it could be very helpful to see examples of that, there are obviously bugs there. instrument-var uses clj.repl/source-fn inside to find the source of the function. This only works with functions that were loaded from a file, so repl-defined functions will not work with instrument-var

jpmonettas13:06:59

there are a bunch of reasons why instrument-forms-for-namespaces can skip functions, I'll improve the errors and warnings there so it is easier for users to see why it can't instrument some specific form. One example happens if the form is too big, because clojure has a limit on how big a form can be to evaluate it, and instrumenting adds a lot of expressions to the form, so sometimes it ends up bigger than what clojure eval accepts.

jpmonettas13:06:26

@U0JUM502E one thing you can use for quickly finding what form fired an exception is moving to the last trace (I'll add a button so that can be done with one click)

Darrick Wiebe00:06:06

Here's another idea. The instrument-forms-for-namespaces function is handy while debugging, but having a whole bunch of namespaces instrumented quickly becomes overkill. I think it'd be useful to have a corresponding un-instrument function available.

jpmonettas01:06:29

@U01D37REZHP yeah, that would be useful, I'll be adding an option to instrument entire namespaces from the browser, where you can also temporarily disable instrumentation. But maybe I will add both. Thanks for the idea

folcon18:06:34

@U0739PUFQ btw, using this a bit more, there are definitely some issues / fiddliness in some areas. I could put together a gist or something if that would be helpful? One definitely missing thing: these are static; it would be nice if there was some way to inspect them, as at times there's fundamental info hidden within them. Also, maybe on a tooltip to def or somewhere, state that it defaults to defining the var in the user ns. Otherwise this is really something for debugging fairly fiddly stuff like datascript!

jpmonettas21:06:44

@U0JUM502E Yeah, I haven't made locals inspectable yet, I should add a context menu so we can define them on the repl also. For now one way to inspect a local binding value is to click on the code, over the form that is generating its value. I'm also aware of some issues with locals showing there even if they shouldn't be showing yet, it is on my todo list to fix. Good idea to remind the user that the var is going to be defined under the user ns.

jpmonettas21:06:32

I'm trying to improve the ergonomics to the point it is a replacement for print statements in almost every situation, not just for complex codebases debugging.

1
folcon07:06:56

Not sure if you want to support this, but being able to say, what is the path difference between these two runs, would be interesting.

jpmonettas12:06:18

@U0JUM502E hmm that is an interesting idea to think about, how do you think we can visualize it? and what do you see as a use case for it?

jpmonettas12:06:22

I just created the #flow-storm channel if you are interested in discussing these ideas, or anything debugging related

Kelvin15:05:30

Is there an "official" regex for Clojure symbols and keywords, particularly with regards to Unicode characters?

Kelvin15:05:17

I know the https://clojure.org/reference/reader page talks about what characters can be included, but it doesn't mention Unicode characters (which can be used in symbols and keywords)

Kelvin15:05:40

Plus I think that page may be outdated even for ASCII chars; for example I'm able to write symbols using $ and % even though that's not mentioned on the Reader page

Alex Miller (Clojure team)15:05:17

there are things explicitly allowed, and things explicitly disallowed, and .... other things

Alex Miller (Clojure team)15:05:03

so it depends whether you are trying to match the current implementation, or what is currently specified as legal

Kelvin15:05:47

So my goal is to create/revise regexes on symbols and keywords (or at least their string forms), and it would simplify things if you don't have to explicitly filter out chars that aren't allowed anyways

Alex Miller (Clojure team)15:05:21

This is an area that is intentionally underspecified to allow for future growth

Kelvin15:05:07

i.e. you don't box yourself in saying "these chars are allowed which means these other chars are not allowed"?

Alex Miller (Clojure team)15:05:40

yes, some things are not not allowed :)

🙃 1
Kelvin16:05:13

Seems like the (disappointing) upshot is that I should not simplify my regexes, at least if I want to keep them implementation-agnostic

Kelvin16:05:38

(Especially since there's like, what, two characters that are explicitly disallowed?)

Alex Miller (Clojure team)15:05:33

(and the restrictions on top of that as defined in the symbol match code)

Alex Miller (Clojure team)15:05:27

(also note that this regex is actually buggy - https://clojure.org/guides/faq#keyword_number for more on that)

Sameer Thajudin21:05:44

Hi, I am trying to import a Clojure record defined in one namespace into another. I followed the examples online but am unable to get it to work. Any help is greatly appreciated.

Alex Miller (Clojure team)21:05:21

you'll need to share some code, what you did, and what you get

Sameer Thajudin21:05:54

(ns nlp-ws.nlp-results.concepts
  (:require [clojure.tools.logging :as log]
            [buddy.core.codecs :as codecs]
            [buddy.core.kdf :as kdf]
            [buddy.core.nonce :as nonce]
            [cheshire.core :refer :all]
            [cheshire.generate :refer [add-encoder encode-str remove-encoder]]
            [scout-nlp-ws.db.nlp_data :as nlp]))

(defprotocol ConceptHelpers
  (describe [x]))

(defrecord Concept [concept_id concept_name domain_id vocabulary_id
                    concept_class_id standard_concept concept_code
                    valid_start_date valid_end_date invalid_reason]
  ConceptHelpers
  (describe [this] (:concept_name this)))

Sameer Thajudin21:05:25

Here is the record defined called Concept

Sameer Thajudin21:05:03

And I am trying to import this into the following namespace

Sameer Thajudin21:05:04

(ns nlp-ws.model.surgical-complications
  (:require [scout-nlp-ws.nlp-results.concepts :as concepts])
  (:import [scout-nlp-ws.nlp-results.concepts Concept]))

Sameer Thajudin21:05:03

And this is the error

hiredman21:05:25

the name of the class will have _ instead of -

hiredman21:05:07

in general importing classes created with deftype or defrecord is something I would recommend against doing (there are places where it is unavoidable though)

hiredman21:05:37

if you are just creating new instances of the type, use the factory functions ->Concept in your case
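A sketch of both options, using the ns name as declared above (the field values are made up): call the generated factory through the required alias, or, if you really must import the class, remember the package name munges - to _.

(ns nlp-ws.model.surgical-complications
  (:require [nlp-ws.nlp-results.concepts :as concepts]))

(concepts/->Concept 1 "Aspirin" 13 "RxNorm" "Ingredient" "S" "1191"
                    nil nil nil)

;; only if the class itself is needed:
;; (:import [nlp_ws.nlp_results.concepts Concept])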

devn22:05:19

I have a question regarding *read-eval* .

(binding [*read-eval* false] (read-string "#=(inc 1)"))
does what I expect. Likewise,
(binding [*read-eval* false] (read-string "[#=(inc 1)]"))
However, if I have a file containing ["#=(inc 1)"] and do:
(binding [*read-eval* false] (read-string (slurp "thefile")))
There is no exception. Why is that?

ghadi22:05:56

because you're reading a vector with a string in it @devn

ghadi22:05:20

a file containing ["#=(inc 1)"] is like read-string on "[\"#=(inc 1)\"]"

ghadi22:05:09

(read-string "just late friday afternoon stuff")

devn22:05:30

right, sorry, i must have gotten my wires crossed because i swear i saw a (binding [read-eval] (read-string (first (read-string (slurp “thefile”))))) evaluate without an exception and wondered if i was hallucinating. just tested again and it would appear that i was

devn22:05:46

ah, my confusion was a string containing yet another string of “#=” e.g. (read-string "(read-string \"#=(inc 1)\")")