
> processing billions of Kafka events per day

Except that the burden is on all clients to coordinate to avoid processing an event more than once, since Kafka is a brainless invention that just dumps data forever into a serial log.



I'm not sure what you're talking about.

Do you mean different consumers within the same consumer group? There's no technology out there that will guarantee exactly-once delivery; it's simply impossible in a world where networks aren't magically 100% reliable. SQS, RedPanda, RabbitMQ, NATS... you name it, your client will always need idempotency.
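A minimal sketch of the idempotency the client always ends up owning, regardless of broker. The event shape and `event_id` field are assumptions for illustration, and the in-memory set stands in for what would be a durable store (a database table, Redis set, etc.) in production:

```python
# Idempotent consumer sketch: deduplicate on a unique event ID
# before applying side effects. Any broker (Kafka, SQS, RabbitMQ,
# NATS, ...) can redeliver a message, so the dedup store is the
# client's responsibility.

processed_ids = set()  # in production: a durable store, not process memory

def handle_event(event: dict, apply_side_effect) -> bool:
    """Apply the side effect at most once per event_id.

    Returns True if the side effect ran, False if this delivery
    was a duplicate and was skipped.
    """
    event_id = event["event_id"]
    if event_id in processed_ids:
        return False  # redelivery: skip
    apply_side_effect(event)
    processed_ids.add(event_id)  # record only after the effect succeeds
    return True
```

Note the ordering: the ID is recorded only after the side effect succeeds, so a crash in between yields a retry (at-least-once) rather than a lost event.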


That is called a 'consumer group', which has been part of Kafka for 15 years.

The author seems to be suggesting you avoid this built-in solution and roll your own instead.
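For reference, opting into that built-in mechanism is just a matter of setting `group.id`; Kafka then divides the topic's partitions among the consumers in the group so each record goes to exactly one member. A sketch of a typical consumer configuration (broker address and group name are placeholders):

```properties
# Consumers sharing this group.id split the topic's partitions
# among themselves; Kafka rebalances when members join or leave.
bootstrap.servers=localhost:9092   # placeholder broker address
group.id=my-consumer-group         # placeholder group name
enable.auto.commit=false           # commit offsets only after processing
auto.offset.reset=earliest
```

With `enable.auto.commit=false`, the client commits offsets after it has finished processing, which gives at-least-once delivery and is why the idempotent handling above is still needed.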



