#ldnclj
2015-12-16
afhammad08:12:33

Non-Clojure-specific question: what do you guys use for advanced reporting (medium data, not big data)? Do you build reports from scratch with each new system? Do you dump your data to a separate denormalised db? Use a 3rd-party tool and just plug in your queries?

xlevus08:12:00

@afhammad: my $client dumps all their data into Big Query

xlevus08:12:48

like, all their data.

afhammad09:12:41

@xlevus: does it have its own reporting UI?

xlevus09:12:29

nah. But they build queries to report on it daily.

xlevus09:12:01

which are run and exported into Google Docs

xlevus09:12:06

it's not the prettiest. But it allows them to do other things, like run their marketing poo off bigquery too
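A rough sketch of the kind of daily report query described above. This assumes a hypothetical `my_dataset.events` table with `event_type` and `created_at` columns (all names invented for illustration); the resulting SQL string is what you'd hand to the BigQuery client or scheduler.

```python
# Build a BigQuery-style SQL string for a daily report.
# Dataset, table, and column names here are hypothetical.
from datetime import date

def daily_report_sql(report_day: date) -> str:
    """Count events per type for a single day."""
    day = report_day.isoformat()
    return (
        "SELECT event_type, COUNT(*) AS n "
        "FROM my_dataset.events "
        f"WHERE DATE(created_at) = '{day}' "
        "GROUP BY event_type "
        "ORDER BY n DESC"
    )

sql = daily_report_sql(date(2015, 12, 15))
print(sql)
```

In the setup described above, a query like this would run on a schedule and its result set would be exported to a Google Doc or spreadsheet.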

afhammad09:12:03

interesting, will look into it, thanks

xlevus09:12:50

I've also found it helps for debugging. Shit goes wrong, you can spend a day drudging through terabytes of data working out why

xlevus09:12:17

It's probably closer to the 'big' size of data. Some of the tables are in the tens of millions of rows a month area

afhammad09:12:13

Yeh, not quite at that level, more so in the hundreds of thousands. I have a couple of clients (currently running on Rails) that are approaching the need for more advanced reporting. I'm looking for options to decouple it from their systems and kill a few birds with one stone.

mccraigmccraig10:12:01

@afhammad: i've used elasticsearch to great effect - the aggregations framework is great for building fast analytic queries - responses in <100ms over tens of millions of docs
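A minimal sketch of the kind of aggregations request body the Elasticsearch aggregations framework takes: a `terms` bucket aggregation with a nested `avg` metric sub-aggregation. The index and field names (`orders`, `country`, `total`) are invented for illustration.

```python
import json

# Elasticsearch search body: no hits returned ("size": 0),
# just aggregation buckets grouped by country with an average total.
query = {
    "size": 0,
    "aggs": {
        "by_country": {
            "terms": {"field": "country"},
            "aggs": {
                "avg_total": {"avg": {"field": "total"}}
            }
        }
    }
}

# This JSON body would be POSTed to an index's _search endpoint,
# e.g. /orders/_search
print(json.dumps(query, indent=2))
```

Because the aggregation runs inside the search engine, responses over tens of millions of docs can come back in the sub-100ms range mentioned above, depending on hardware and mappings.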

afhammad10:12:06

@mccraigmccraig: thanks. I’ve used Elasticsearch briefly and it’s one of the options I’m considering.

glenjamin22:12:03

Amazon Redshift seems pretty good