This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2018-04-02
Channels
- # beginners (29)
- # cider (41)
- # clara (1)
- # cljs-dev (17)
- # cljsrn (1)
- # clojure (158)
- # clojure-dev (2)
- # clojure-dusseldorf (2)
- # clojure-italy (8)
- # clojure-mexico (1)
- # clojure-russia (2)
- # clojure-spec (43)
- # clojure-uk (1)
- # clojurescript (44)
- # community-development (98)
- # cursive (9)
- # data-science (8)
- # datascript (4)
- # datomic (30)
- # emacs (6)
- # fulcro (11)
- # graphql (6)
- # jobs (1)
- # jobs-discuss (27)
- # lein-figwheel (5)
- # luminus (13)
- # lumo (4)
- # off-topic (28)
- # onyx (9)
- # parinfer (12)
- # perun (2)
- # portkey (5)
- # re-frame (48)
- # ring (2)
- # shadow-cljs (52)
- # spacemacs (29)
- # tools-deps (15)
- # unrepl (9)
- # vim (7)
- # yada (3)
How can I read only a small part of a big CSV file with Incanter's read-dataset?
@stardiviner I don't know that incanter has lazy CSV reading capabilities (though someone please correct me if I'm wrong). You could take a look at this thing I wrote: https://github.com/metasoarous/semantic-csv
👍 4
@aaelony Well, it depends on what you need: if you need the first N rows, then sure. But if, say, you need to do a filter or some aggregation, semantic-csv lets you consume large CSV files lazily.
After a filter you could even stream the rows into an Incanter dataset data structure if you like
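The lazy approach described above can be sketched as follows. This is a minimal example, not from the thread: it assumes a file named `big.csv` with a header row and a `status` column, both of which are illustrative. `clojure.data.csv/read-csv` and `semantic-csv.core/mappify` are real library functions; the predicate and row count are placeholders.

```clojure
;; Sketch: lazily filter a large CSV and realize only the first 100 matches.
;; Assumes [org.clojure/data.csv] and [semantic-csv] on the classpath.
(require '[clojure.java.io :as io]
         '[clojure.data.csv :as csv]
         '[semantic-csv.core :as sc])

(with-open [rdr (io/reader "big.csv")]            ; "big.csv" is a hypothetical file
  (->> (csv/read-csv rdr)                         ; lazy seq of row vectors
       sc/mappify                                 ; rows as maps keyed by the header
       (filter #(= (:status %) "active"))         ; illustrative predicate
       (take 100)                                 ; only this many rows are ever read
       doall))                                    ; force results before the reader closes
```

If you want an Incanter dataset at the end, the resulting sequence of maps can be passed to `incanter.core/to-dataset`, so only the filtered rows are held in memory.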