
Looks like it’s likely to be a bug in Onyx, but I can’t quite nail down what’s going on yet. I’ll have to get back to you tomorrow.


Hey guys, a question I might be able to get answered here (a general Kafka consumer question): what’s the best way to handle a batch of items retrieved from a Kafka consumer, where most of those items get sent off to, say, a DB? How do you handle a failure in the middle of that batch? I don’t think you want to redo the whole batch, because you might duplicate some data.


@camechis you generally need the data to be keyed so that you can make sure your writes to your DB are idempotent. If you’re writing to another Kafka topic it’s a little tougher, because Kafka can’t do transactions. Luckily this is coming in the next release.
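To illustrate the idempotent-write idea: if each record is keyed and the sink treats a repeated key as an overwrite rather than an append, replaying a whole batch after a mid-batch failure can’t duplicate data. A minimal sketch using SQLite as a stand-in DB (the table and `write_batch` helper are hypothetical, not part of Onyx or Kafka):

```python
import sqlite3

# Stand-in for the downstream DB; rows are keyed by the Kafka record key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (key TEXT PRIMARY KEY, value TEXT)")

def write_batch(records):
    # INSERT OR REPLACE makes each write idempotent: re-processing the
    # same keyed record overwrites the existing row instead of adding one.
    with conn:
        conn.executemany(
            "INSERT OR REPLACE INTO events (key, value) VALUES (?, ?)",
            records,
        )

batch = [("k1", "v1"), ("k2", "v2"), ("k3", "v3")]
write_batch(batch)
# Simulate retrying the whole batch after a mid-batch failure:
write_batch(batch)

count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
# count is 3, not 6 -- the replay did not duplicate any rows
```

With an upsert like this, the safe recovery strategy is simply to reprocess the batch from the last committed offset.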


Cool, yeah. I was starting to wrap my head around it.