This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-05-08
Channels
- # aws (9)
- # beginners (69)
- # boot (14)
- # cider (26)
- # cljs-dev (56)
- # cljsrn (9)
- # clojars (4)
- # clojure (229)
- # clojure-brasil (1)
- # clojure-france (11)
- # clojure-greece (2)
- # clojure-italy (4)
- # clojure-mke (6)
- # clojure-serbia (6)
- # clojure-spec (83)
- # clojure-uk (38)
- # clojurescript (171)
- # core-async (3)
- # cursive (11)
- # data-science (11)
- # datomic (27)
- # emacs (113)
- # funcool (6)
- # hoplon (4)
- # jobs (1)
- # luminus (13)
- # lumo (44)
- # off-topic (148)
- # onyx (5)
- # overtone (1)
- # pedestal (4)
- # powderkeg (1)
- # proton (2)
- # re-frame (150)
- # reagent (16)
- # ring-swagger (43)
- # spacemacs (4)
- # specter (36)
- # vim (4)
- # yada (10)
I am just starting out with Spark and am lost as to why my task is failing. I have downloaded the latest version of Spark (2.1.1) and have included [org.apache.spark/spark-core_2.11 "2.1.1"] and [org.apache.spark/spark-streaming_2.11 "2.1.1"] in my project. I also started the master via ./sbin/start-master.sh and a slave via ./sbin/start-slave.sh.
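(A sketch of those setup steps for anyone reproducing this; the hostname and port are assumptions, not from the original post. Note that on Spark 2.1, start-slave.sh expects the master URL as an argument:)

```shell
# Start the standalone master; it prints its URL (spark://<host>:7077)
# in its log output.
./sbin/start-master.sh

# Start a worker, passing the master URL from the step above.
# spark://localhost:7077 is an assumed placeholder -- use the URL
# your master actually logged.
./sbin/start-slave.sh spark://localhost:7077
```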
Once I load my ns that requires powderkeg in the REPL, I get a bunch of output, none of which looks alarming. However, if I try to run one of the examples in the powderkeg README, I get the exception below:
The example:
(into [] ; no collect, plain Clojure
      (keg/rdd ["This is a first line" ; here we provide data from a Clojure collection
                "Testing spark"
                "and powderkeg"
                "Happy hacking!"]
               (filter #(.contains % "spark"))))
Exception: https://pastebin.com/Ght3ZAci
I have spent several hours googling this error and have not found any solution that fixes the issue. Any ideas on how to proceed?
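(One way to narrow this down: the transducer half of that example can be sanity-checked without Spark at all. A minimal sketch in plain Clojure, reusing the collection from the README example above; no keg/rdd and no cluster involved:)

```clojure
;; Run the same filter as a plain transducer over the local collection.
;; If this works but the keg/rdd version fails, the problem is in the
;; Spark/powderkeg connection, not in the query itself.
(into []
      (filter #(.contains ^String % "spark"))
      ["This is a first line"
       "Testing spark"
       "and powderkeg"
       "Happy hacking!"])
;; => ["Testing spark"]
```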