#off-topic
2018-01-19
dimovich00:01:57

what would you suggest for live monitoring different twitter keywords?

dimovich00:01:19

maybe a saas?

noisesmith00:01:27

sadly the old firehose doesn’t exist any more, but there’s a paid service that provides something similar

noisesmith00:01:27

without something like that (or a contract deal with twitter directly) you are limited to use of an auth token with limits that are pretty easy to max out

dimovich00:01:02

@noisesmith thanks! so it seems you deal directly with twitter now

noisesmith00:01:10

oh wait, you don’t want the api, you want an app that uses the api?

dimovich00:01:20

no, I need the api 🙂

noisesmith00:01:36

for getting things figured out / proof of concept, there’s extensive docs of the twitter api - you’ll just run into their quotas on usage pretty fast https://developer.twitter.com/en/docs

noisesmith00:01:58

if you are doing the feed for a specific customer, you can get them to auth for you and do the calls on their behalf, but that still hits limits

dimovich00:01:12

will check the specific limits they impose... I think eventually we'll have to switch to the paid version

noisesmith00:01:54

one of the apis shows you exactly how many calls you have in the next 15 minutes for each endpoint in a json map, and/or how long you have until the 15 minute quota period expires
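The endpoint described above returns a nested JSON map of per-endpoint quotas. Here is a minimal sketch of reading such a map in Clojure, using a hypothetical sample shaped like Twitter's v1.1 `application/rate_limit_status` payload (no live API call; the sample values are made up):

```clojure
;; Hypothetical sample shaped like the rate_limit_status response:
;; resources -> resource family -> endpoint -> {:limit :remaining :reset}
(def sample-status
  {:resources
   {:search
    {"/search/tweets" {:limit 180 :remaining 42 :reset 1516320000}}}})

(defn remaining-calls
  "How many calls are left for an endpoint in the current 15-minute window."
  [status resource endpoint]
  (get-in status [:resources resource endpoint :remaining]))

(remaining-calls sample-status :search "/search/tweets")
;; => 42
```

The `:reset` field is the epoch time at which the 15-minute quota window expires, so the same map answers both questions noisesmith mentions.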

noisesmith00:01:11

it’s pretty straightforward to use once you get the oauth thing hooked in

dimovich00:01:59

thanks for the suggestions!

bja02:01:36

TIL that reduce doesn't zip higher arities ala map. kinda bummed me out.

seancorfield04:01:05

@bja Easy enough to do (map vector coll1 coll2 coll3) and then reduce that tho'...

bja04:01:22

sure, I was just kinda sad that reduce didn't do it for me
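The workaround above can be sketched like this: since `reduce` only walks a single collection, zip the inputs with `(map vector …)` first, then destructure each tuple inside the reducing function (collection names here are illustrative):

```clojure
(def coll1 [1 2 3])
(def coll2 [10 20 30])
(def coll3 [100 200 300])

;; reduce takes one collection, so zip the three inputs into tuples
;; first, then destructure [a b c] in the reducing fn.
(reduce (fn [acc [a b c]] (+ acc a b c))
        0
        (map vector coll1 coll2 coll3))
;; => 666
```

Because `map` stops at the shortest input, this also handles collections of unequal length the same way a variadic `map` would.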

bja07:01:54

the best time to do a db upgrade is off-peak hours for the majority of your userbase. the best time to get featured in the app store is thus clearly when doing a db upgrade.