Good practice when using Kafka with JPA

zibi :

I'm currently on a project where JPA and Kafka are used. I'm trying to find a set of good practices for combining those operations.

In the existing code, the producer is used inside the same transaction as JPA; however, from what I have read, they don't seem to share a transaction.

@PostMapping
@Transactional
public XDto createX(@RequestBody XRequest request) {
    XDto dto = xService.create(request);
    kafkaProducer.putToQueue(dto, Type.CREATE);
    return dto;
}

where the Kafka producer is defined as follows:

public class KafkaProducer {
    @Autowired
    private KafkaTemplate<String, Event> template;

    public void putToQueue(Dto dto, Type eventType) {
        template.send("event", new Event(dto, eventType));
    }
}

Is this a valid way to combine JPA and Kafka? Are the transaction boundaries defined correctly?

gagan singh :

This would not work as intended when the transaction fails: the Kafka interaction is not part of the JPA transaction. If the send succeeds but the transaction later rolls back, consumers receive an event for data that was never committed; conversely, the commit can succeed while the send fails.

You may want to have a look at TransactionalEventListener and write the message to Kafka on the AFTER_COMMIT event, as sketched below. Even then, the Kafka publish may fail.
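A minimal sketch of that approach, assuming Spring's ApplicationEventPublisher; XCreatedEvent, XCreatedKafkaListener, and saveViaJpa are hypothetical names, not from your original code:

import org.springframework.context.ApplicationEventPublisher;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.event.TransactionPhase;
import org.springframework.transaction.event.TransactionalEventListener;

// In-process event carrying the DTO to forward after commit.
record XCreatedEvent(XDto dto) {}

@Service
class XService {
    private final ApplicationEventPublisher publisher;

    XService(ApplicationEventPublisher publisher) {
        this.publisher = publisher;
    }

    @Transactional
    public XDto create(XRequest request) {
        XDto dto = saveViaJpa(request);
        // Published inside the transaction, but listeners bound to a
        // transaction phase are not invoked immediately.
        publisher.publishEvent(new XCreatedEvent(dto));
        return dto;
    }

    private XDto saveViaJpa(XRequest request) {
        // hypothetical stand-in for your existing JPA persistence step
        throw new UnsupportedOperationException("elided");
    }
}

@Component
class XCreatedKafkaListener {
    private final KafkaProducer kafkaProducer;

    XCreatedKafkaListener(KafkaProducer kafkaProducer) {
        this.kafkaProducer = kafkaProducer;
    }

    // Runs only after the JPA transaction has committed, so consumers
    // never see events for rolled-back data.
    @TransactionalEventListener(phase = TransactionPhase.AFTER_COMMIT)
    public void onXCreated(XCreatedEvent event) {
        kafkaProducer.putToQueue(event.dto(), Type.CREATE);
    }
}

Note that AFTER_COMMIT only removes the rolled-back-data case, not the reverse one: the commit can succeed and the send still fail, leaving the database updated but no event published. The Debezium option below avoids this by sourcing events from the database itself.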

Another option is to write to the DB using JPA as you are doing, and let Debezium read the changed data from your database and push it to Kafka. The event will be in a different format, but far richer.
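Debezium typically runs as a Kafka Connect source connector rather than inside the application. As a rough sketch, assuming a PostgreSQL source and Debezium 2.x property names (the connector name, host, credentials, database, and table are placeholders), the connector is registered by POSTing a JSON config to the Kafka Connect REST API:

{
  "name": "x-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "db-host",
    "database.port": "5432",
    "database.user": "app",
    "database.password": "secret",
    "database.dbname": "appdb",
    "topic.prefix": "app",
    "table.include.list": "public.x"
  }
}

With this sketch, change events for the x table appear on the topic app.public.x in Debezium's change-event envelope, carrying the row's before and after state, which is what makes them richer than the hand-built DTO above.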
