#clojure-uk
2018-08-10
mccraigmccraig07:08:36

what PGP help dyu need @yogidevbear?

yogidevbear07:08:19

Morning Craig 🙂 I came right in the end. I was trying to figure out how to encrypt using a public key that someone gave me. I was trying to use PGP instead of gpg, but was shown the error of my ways by @dominicm

mccraigmccraig07:08:24

ah, yeah, gpg all the way...

agile_geek07:08:48

Bore da pawb (good morning, everyone) :welsh_flag:

jasonbell09:08:02

The deed is done. Flights are booked, hotel is booked and talk outline and updated bio are sent off..... y'all have to chip in for an assassin to stop me speaking at ClojureX (or bribe @jr0cket quite heavily 🙂 )

bananadance 4
aw_yeah 4
practicalli-johnny07:08:03

@jasonbell It would have to be a bribe of biblical proportions for me to stop you from speaking at the conference. Then I would use the money to run a "Jason Bell is awesome" conference and have a whole day of you speaking :)

jasonbell07:08:14

LOL Oh behave yourself 🙂

agile_geek10:08:25

@jasonbell I find @jr0cket is a cheap date... I could just buy him a pint! Seriously though...looking forward to seeing the video of the talk!

jasonbell10:08:37

Looking forward to it. The last two have been a hoot.

alexlynham10:08:52

what are you talking on?

alexlynham10:08:58

or is that spoilers

jasonbell10:08:11

@alex.lynham - Learning how to develop self-training AI with Clojure, Kafka and DeepLearning4J (with possibly a bit of core.async or Factual durable queues thrown in for good measure). @otfrom told me about Factual and I really like using it.
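
A minimal core.async sketch of that "events in, training out" shape (not Jason's actual talk code); `train-on-event!` is a hypothetical placeholder for the DeepLearning4J step, and only the channel wiring is real core.async:

```clojure
(require '[clojure.core.async :as a])

(def events (a/chan 1024)) ; buffered channel carrying raw event maps

(defn train-on-event! [evt]
  (println "training on" evt)) ; stand-in for handing the event to a model

;; consumer: loop until the channel is closed
(a/go-loop []
  (when-some [evt (a/<! events)]
    (train-on-event! evt)
    (recur)))

;; producer: push an event onto the channel
(a/>!! events {:type :click :user-id 42})
```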

alexlynham10:08:09

what's kafka for in that piece?

alexlynham10:08:21

(also where did you get to do this? sounds cool)

jasonbell10:08:45

It was a proof of concept which I put together for my Strata London talk.

jasonbell10:08:41

The reason for using Kafka was to show Kafka in a more real-world context. But the same could be applied to core.async or Factual, so I'll keep the Kafka stuff fairly short.

alexlynham10:08:10

running it as like a single node for integration or something?

jasonbell10:08:32

Kafka is usually overkill for most things in terms of size. I'm processing email events with Factual now; it's a much better fit for me.
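
For anyone following along, a minimal sketch of the basic API of Factual's durable-queue library (the "Factual" being discussed), roughly as its README shows it; the queue name and payload here are made up:

```clojure
;; requires the factual/durable-queue dependency
(require '[durable-queue :refer [queues put! take! complete!]])

;; a set of on-disk queues rooted at /tmp
(def q (queues "/tmp" {}))

;; enqueue an email event
(put! q :email-events {:to "user@example.com" :template :confirm})

;; dequeue the next task, read its payload, then mark it complete
(let [task (take! q :email-events)]
  (println "got" @task)
  (complete! task))
```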

jasonbell10:08:41

But I do miss messing around with Kafka.

jasonbell11:08:59

@alex.lynham <<what's kafka for in that piece?>> Sorry I didn't answer that: basically raw data transport and events.
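
A hedged sketch of what that raw event transport can look like from Clojure, via plain Java interop with the official Kafka client; the broker address and topic name are assumptions, not Jason's setup:

```clojure
;; requires the org.apache.kafka/kafka-clients dependency
(import '(org.apache.kafka.clients.producer KafkaProducer ProducerRecord))

(def producer
  ;; a Clojure map implements java.util.Map, so it can be passed straight in
  (KafkaProducer.
   {"bootstrap.servers" "localhost:9092"
    "key.serializer"    "org.apache.kafka.common.serialization.StringSerializer"
    "value.serializer"  "org.apache.kafka.common.serialization.StringSerializer"}))

;; fire-and-forget send of a raw event string to an "events" topic
(.send producer (ProducerRecord. "events" "user-42" "{\"type\":\"click\"}"))
(.flush producer)
```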

agile_geek11:08:51

@jasonbell maybe next year you can come talk at http://bitrconf.org for practice? Might need to dumb down the Clojure bit tho!

💯 4
maleghast12:08:34

Gosh, it feels as though I've not launched Slack for days...

maleghast12:08:57

@jasonbell - I am reading back... What is this Factual of which you speak...? It sounds, for all the world, like a queue...

alexlynham12:08:34

^ I was just about to ask that 🙂

alexlynham12:08:52

I'm using kinesis for a lot of that stuff atm
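
For context, a rough sketch of putting an event onto a Kinesis stream from Clojure using the AWS Java SDK (v1) directly; the stream name, partition key and payload are invented, and credentials/region come from the default provider chain:

```clojure
;; requires the com.amazonaws/aws-java-sdk-kinesis dependency
(import '(com.amazonaws.services.kinesis AmazonKinesisClientBuilder)
        '(com.amazonaws.services.kinesis.model PutRecordRequest)
        '(java.nio ByteBuffer))

(def kinesis (AmazonKinesisClientBuilder/defaultClient))

(defn put-event!
  "Put a UTF-8 string payload onto the named stream."
  [stream-name partition-key payload]
  (.putRecord kinesis
              (-> (PutRecordRequest.)
                  (.withStreamName stream-name)
                  (.withPartitionKey partition-key)
                  (.withData (ByteBuffer/wrap (.getBytes ^String payload "UTF-8"))))))

(put-event! "email-events" "user-42" "{\"type\":\"signup\"}")
```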

alexlynham12:08:21

works well, except that whatever your upstream/operational system is, it usually tends to be some messy legacy system

alexlynham12:08:35

and then that's the pain point for getting events cleanly into the system

maleghast12:08:03

I thought that Kinesis is / was a sliding window, so there's a problem with persistence..?

maleghast12:08:18

(I may have completely misunderstood Kinesis)

alexlynham12:08:36

it's 24hrs by default, configurable up to 168, so fine for events
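
And a small sketch of bumping a stream's retention above that 24-hour default, up to the 168-hour maximum, with the same SDK; the stream name is again an assumption:

```clojure
(import '(com.amazonaws.services.kinesis AmazonKinesisClientBuilder)
        '(com.amazonaws.services.kinesis.model IncreaseStreamRetentionPeriodRequest))

(def kinesis (AmazonKinesisClientBuilder/defaultClient))

;; raise retention from the 24h default to the 168h (7 day) maximum
(.increaseStreamRetentionPeriod
 kinesis
 (-> (IncreaseStreamRetentionPeriodRequest.)
     (.withStreamName "email-events")
     (.withRetentionPeriodHours (int 168))))
```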

alexlynham12:08:54

if you need kafka-style persistence you're looking at something jury-rigged with dynamodb

alexlynham12:08:57

...or just kafka

dominicm12:08:49

can kinesis stream into s3 for replay?

maleghast12:08:34

@dominicm - I was wondering the same thing, seeing as a lot of AWS tech "talks" effortlessly to S3

alexlynham12:08:17

@dominicm iirc there's a way of using a tool called VCR for replay, yes

alexlynham12:08:37

so you're using KCL and dynamo (+s3?) essentially

alexlynham12:08:55

though I guess you could also bash everything into parquet on S3

alexlynham12:08:03

or x other format

alexlynham12:08:03

could be good to drop that into a container

jasonbell12:08:21

I'm using it for email confirmation processing: an event happens on the API and triggers an email. The queue works a treat for it.
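
A sketch of that event-triggers-email pattern with durable-queue; `send-confirmation-email!` is a hypothetical handler, not Jason's actual code:

```clojure
;; requires the factual/durable-queue dependency
(require '[durable-queue :refer [queues put! take! complete! retry!]])

(def q (queues "/tmp" {}))

(defn send-confirmation-email! [{:keys [to]}]
  (println "sending confirmation email to" to)) ; placeholder for the real send

;; API side: record the event
(put! q :email-events {:to "user@example.com" :template :confirm})

;; worker side: take a task, handle it, and only mark it complete on success
;; so a crash or exception leaves it to be retried
(future
  (loop []
    (let [task (take! q :email-events)]
      (try
        (send-confirmation-email! @task)
        (complete! task)
        (catch Exception _
          (retry! task))))
    (recur)))
```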

maleghast13:08:00

Oooh, this all looks very nice 🙂 thanks @jasonbell and @alex.lynham

jasonbell13:08:30

@maleghast To be honest, @otfrom told me, so thank him.

maleghast13:08:39

@alex.lynham - I wonder if you could get Kinesis -> Athena

maleghast13:08:49

@otfrom - Thanks, via @jasonbell for Factual 🙂

firthh13:08:05

> can kinesis stream into s3 for replay?
Kind of. There is https://aws.amazon.com/kinesis/data-firehose/ but last time I used it you had to write a connector to go from Kinesis to Kinesis Firehose

alexlynham13:08:31

@maleghast I... imagine you could? Don't know much about Athena mind you

maleghast13:08:55

Me neither, but the ability to query the data in-place while it's "resting" seems useful

alexlynham13:08:21

so is athena built on top of s3 and parquet?

alexlynham13:08:19

I was looking today and AWS Aurora Serverless now supports MySQL, and Postgres is on the way

maleghast14:08:55

That's quite exciting

maleghast14:08:13

Athena is the AWS tech that allows you to use SQL to query text stored in S3

alexlynham14:08:04

yeah that's pretty cool

alexlynham14:08:29

deffo easier than my first skunkworks thing of deploying json blobs to IPFS and then picking them up later

alexlynham14:08:42

I think I need to reduce myself to a part time job where I only do super serious proper engineering and then spend all my spare time relentlessly pursuing my silly ideas, it'd be ace

cider 4
Ben Hammond15:08:53

Next Glasgow meetup is next Wednesday, 6.30pm

jasonbell16:08:46

Have a great weekend if I don't catch any of you.

🎉 16