This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2020-11-03
Channels
- # announcements (2)
- # asami (1)
- # babashka (32)
- # beginners (125)
- # calva (4)
- # cider (1)
- # clj-kondo (16)
- # clj-together (1)
- # cljs-dev (15)
- # clojure (30)
- # clojure-australia (3)
- # clojure-europe (41)
- # clojure-italy (1)
- # clojure-losangeles (1)
- # clojure-nl (4)
- # clojure-spec (68)
- # clojure-uk (28)
- # clojurescript (36)
- # conjure (2)
- # cryogen (1)
- # cursive (2)
- # data-science (2)
- # datascript (2)
- # datomic (70)
- # events (2)
- # fulcro (11)
- # graalvm (1)
- # jobs (4)
- # kaocha (4)
- # leiningen (4)
- # malli (52)
- # meander (21)
- # off-topic (11)
- # pathom (7)
- # pedestal (17)
- # reagent (23)
- # reitit (5)
- # remote-jobs (5)
- # reveal (7)
- # shadow-cljs (24)
- # spacemacs (36)
- # sql (21)
- # vim (18)
- # xtdb (7)
I'm trying to transform a collection of data items into XML. The items each represent one instance of one of about 500 different classes, which will be read into a Java program via JAXB (what can I say, this system was designed in the early 2000s). I have the .xsd files for these classes. Is there any way to easily import these .xsd's into Clojure, turning the types described therein into corresponding records? Similarly, I need to do the reverse process as well - reading the XML data objects into Clojure. Again, is there an easy way to do this? Alternatively, is there a simple JAXB wrapper that I can use for these operations?
I have had some success with clojure.data.zip
in the past (https://github.com/clojure/data.zip) but it seems there's this which looks newer https://github.com/clojure/data.xml and may be more what you're looking for? Writing a custom JAXB handler probably isn't much work though ...
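Note that clojure.data.xml gives you generic element maps rather than per-class records generated from the .xsd, so any schema-driven mapping would still be manual (or via JAXB). A minimal sketch of round-tripping with it, with illustrative element names that are not from the actual schemas:

```clojure
;; Sketch of parsing and emitting XML with clojure.data.xml.
;; Element names here are illustrative, not from the real .xsd files.
(require '[clojure.data.xml :as xml])

;; Parse an XML string into a tree of Element records
(def parsed
  (xml/parse-str "<person><name>Ada</name><age>36</age></person>"))

;; Each node has :tag, :attrs and :content
(:tag parsed)                 ;; => :person
(map :tag (:content parsed))  ;; => (:name :age)

;; Build a tree and emit it back out as an XML string
(xml/emit-str
  (xml/element :person {}
    (xml/element :name {} "Ada")
    (xml/element :age {} "36")))
```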
Thank you all. I'm considering the suggestions you've all given. They've been helpful.
Hi guys, I wrote a Clojure package for composing SQL statements. I have been using it in my internal project and I am happy with it. The main reason is that it has a lower learning curve and I get to write mostly plain SQL statements. Check it out here: https://github.com/ludbek/sql-compose The package has not been published yet due to the following artifact issue during deploy: https://stackoverflow.com/questions/64654868/deploy-clojure-packages-to-clojar Cheers
did you run lein jar first?
are you using up to date versions of lein and clojure?
$ lein --version
Leiningen 2.9.4 on Java 10.0.1 Java HotSpot(TM) 64-Bit Server VM
Clojure version is 1.10.1
i'm not actually sure what the issue is. from the error message it seems like it's having trouble finding files to deploy. is your project.clj file shareable?
Here is the link to it https://github.com/ludbek/sql-compose/blob/main/project.clj
The issue has been fixed. Not sure what went wrong with lein; instead of lein I used mvn to deploy the package.
The blog post below was helpful.
https://oli.me.uk/clojure-projects-from-scratch/
what channel should i use for questions regarding generating csv files?
here should be fine, are you using clojure.data.csv?
yes. i'm trying to generate a csv with a large amount of data. my function works fine with smaller datasets but gets stuck when dealing with a large dataset. was wondering if anyone ran into this issue using clojure.data.csv
it just delegates to the .write method of the writer - https://github.com/clojure/data.csv/blob/master/src/main/clojure/clojure/data/csv.clj#L105
Paste your csv writing code @jovannie.landero396
(defn maps->csv
  "Converts list of hash-maps to CSV."
  [file data]
  (let [filename (str "resources/csvs/" file "-" (time/instant) ".csv")
        headers  (->> data
                      first
                      keys
                      (map name)
                      (map str/capitalize))
        row-data (map vals data)]
    (with-open [writer (io/writer filename)]
      (csv/write-csv writer (cons headers row-data)))))
I believe such code will cause the entire row-data sequence to be realized in memory all at once, with none of its beginning elements becoming GC'able garbage, because the row-data binding holds a reference to the head of the list. If all of that data is large compared to your JVM's max heap size, it will cause the process to go slower and slower, GC'ing more frequently, until eventually it could throw an OutOfMemory error.
I also believe there are small variations of your code that should avoid "holding onto the head", e.g. do not declare row-data at all, and use the expression (map vals data) in the one place where row-data occurs now. Note that if data is itself a lazy sequence and its head is held onto outside of the function maps->csv, then it could also end up consuming lots of non-GC'able memory. (non-GC'able until some later time, when your code no longer keeps a reference to the beginning of such a sequence)
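The variation described above can be sketched like this (same hypothetical aliases - time, str, io, csv - as the original snippet):

```clojure
;; Sketch: same function without the row-data binding, so each row can be
;; garbage-collected as soon as write-csv has written it.
;; The time/str/io/csv aliases are assumed to match the original snippet.
(defn maps->csv
  "Converts a seq of hash-maps to CSV without holding onto the row seq's head."
  [file data]
  (let [filename (str "resources/csvs/" file "-" (time/instant) ".csv")
        headers  (->> data first keys (map name) (map str/capitalize))]
    (with-open [writer (io/writer filename)]
      ;; (map vals data) is consumed lazily by write-csv, row by row
      (csv/write-csv writer (cons headers (map vals data))))))
```

Whether this fully solves it still depends on the caller: if data's head is retained outside this function, the realized rows can't be collected regardless.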
I thought locals clearing was smart about escape
hm yep, data is a lazy sequence as well, and I believe its head is held onto outside of the function. but just to make sure we're on the same page: what do you mean by "its head is held onto outside of the function"?
eg. this runs forever but doesn't increase heap usage
user=> (let [r (range)] (run! identity r))
@jovannie.landero396 if there's something outside maps->csv that holds data, that means the realized values can't be garbage collected
so it's a potential heap bomb
@jovannie.landero396 anyway, this should be easy to test, with eg. visualvm or yourkit attached to your process, look for heap usage, which methods are being called etc.
ok let me test
so the amount of memory being used is 54.7 MiB without running it through that function