This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2015-11-02
So would the best thing be to put data right into Kinesis from the API endpoint, and then pull off and only mark as processed when it’s successfully written?
the cool thing is that the various consumers you may have of the Kinesis stream each maintain their own idea of where in the stream they are
and if your consumers use the kinesis consumer library to do their receiving, this is handled automatically by the library, which stores things in dynamo
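A toy sketch of the idea being described (plain Python, not the real Kinesis or KCL API, and a dict standing in for the DynamoDB checkpoint table): each consumer keeps its own position in a shared stream, only commits that position after a batch is processed, and a restarted consumer resumes from its last checkpoint.

```python
# Shared, append-only stream of records (stand-in for a Kinesis shard).
stream = ["rec-0", "rec-1", "rec-2", "rec-3", "rec-4"]

# Stand-in for the DynamoDB table the consumer library would maintain:
# consumer name -> index of the next record to read.
checkpoints = {}

def consume(name, stream, max_records):
    """Read up to max_records from this consumer's own position, then
    checkpoint only once the batch counts as successfully processed."""
    pos = checkpoints.get(name, 0)
    batch = stream[pos:pos + max_records]
    # ... process batch here; checkpoint only after processing succeeds,
    # so a crash before this line means the records are re-read, not lost.
    checkpoints[name] = pos + len(batch)
    return batch

# Two consumers advance independently through the same stream.
print(consume("billing", stream, 2))  # ['rec-0', 'rec-1']
print(consume("audit", stream, 4))    # ['rec-0', 'rec-1', 'rec-2', 'rec-3']
print(consume("billing", stream, 2))  # resumes at its own checkpoint: ['rec-2', 'rec-3']
```

The consumer names and the dict-based checkpoint store are illustrative only; the real consumer library handles leases and checkpoints per shard.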
It’s hard to tell whether it’s more or less robust than a single server with dodgy monitoring set up by a monkey (me)
If I have to worry about the data consumer going down anyway, I think I’d just put it right in Dynamo
Much as I would like to think that thousands of people will be purchasing Cursive per second, I suspect I’m unlikely to have a legitimate big data problem
I’m still fairly tempted by something bare-bones with just an embedded k/v store on my server, but Lambda seems like a pretty easy option.
one of my favorite things about lambda is it automatically does what most bad programs need done
> My main goal isn’t gigantic scale, it’s not having to maintain and monitor things
What about something like GAE?