Spring Boot | Apache Kafka JSON Serialization & Deserialization Example | JavaTechie
Science & Technology
In this tutorial, we will learn how to send and receive a Java object as a JSON byte[] to and from Apache Kafka using JsonSerializer and JsonDeserializer.
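The byte[] contract described above can be sketched in plain Java. This is only an illustration with a hand-rolled (and deliberately naive) JSON round trip for a hypothetical Customer record; in the actual tutorial, Spring Kafka's JsonSerializer/JsonDeserializer do this work via Jackson.

```java
// Illustrative sketch only (not from the video): a hand-rolled JSON round trip
// for a hypothetical Customer record, showing the byte[] contract that
// JsonSerializer/JsonDeserializer fulfil via Jackson in the real tutorial.
import java.nio.charset.StandardCharsets;

public class JsonRoundTrip {

    public record Customer(int id, String name) {}

    // Serialize: object -> JSON text -> UTF-8 bytes (the producer side of the contract)
    public static byte[] serialize(Customer c) {
        String json = String.format("{\"id\":%d,\"name\":\"%s\"}", c.id(), c.name());
        return json.getBytes(StandardCharsets.UTF_8);
    }

    // Deserialize: UTF-8 bytes -> JSON text -> object (the consumer side)
    public static Customer deserialize(byte[] bytes) {
        String json = new String(bytes, StandardCharsets.UTF_8);
        int id = Integer.parseInt(json.replaceAll(".*\"id\":(\\d+).*", "$1"));
        String name = json.replaceAll(".*\"name\":\"([^\"]*)\".*", "$1");
        return new Customer(id, name);
    }

    public static void main(String[] args) {
        byte[] payload = serialize(new Customer(7, "Vivek"));
        System.out.println(new String(payload, StandardCharsets.UTF_8));
        System.out.println(deserialize(payload));
    }
}
```

In the real setup, the serializer runs inside the producer before the record hits the broker, and the deserializer runs inside the consumer as records are polled.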
👉 How to configure Serializer and Deserializer using application.yml
👉 How to configure Serializer and Deserializer using the Java-based config approach
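For reference, a minimal application.yml along these lines might look like the sketch below. The group id and DTO package are placeholder assumptions, not taken from the video:

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer:
      group-id: demo-group                   # placeholder group id
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring:
          json:
            trusted:
              packages: com.example.dto      # placeholder DTO package
```

The trusted-packages property is needed on the consumer side so JsonDeserializer is allowed to instantiate your DTO class.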
#JavaTechie #SpringBoot #Kafka
Spring Boot microservice premium course launched with 70% off 🚀 🚀
Hurry up and register today!
COURSE LINK : javatechie5246.ongraphy.com/
PROMO CODE : JAVATECHIE50
GitHub:
github.com/Java-Techie-jt/kaf...
github.com/Java-Techie-jt/kaf...
Blogs:
/ javatechie
Facebook:
/ javatechie
Join this channel to get access to perks:
kzread.infojoin
Guys, if you like this video please subscribe and press the bell icon so you don't miss any updates from Java Techie.
Disclaimer/Policy:
--------------------------------
Note: All content uploaded to this channel is mine and is not copied from any community.
You are free to use the source code from the GitHub account mentioned above.
Comments: 62
I just finished your entire playlist on Kafka. Amazing content, thank you for helping me.
Dear Sir, Namaskaram. First of all, thank you for teaching us so clearly; I have been learning a lot from you. The way you said "IT IS CRYING" while changing the String message to a Customer object at 5:32 made me laugh HARD, man. Thank you so much, sir. It's a little thing, but I laughed a lot, clapping for 5 minutes. THANK YOU AGAIN, SIR.
@Javatechie
26 days ago
Hello Vivek, thank you so much for your kind words 😊 and I am so happy to hear that you enjoy the content 😀. Keep learning, buddy.
The Kafka series is well structured, with good theoretical explanations. Superb!!! Keep it up!!
For the first time I completed a tutorial. Thank you so much for this amazing content. Now I have a good idea of Kafka.
As the course progresses we see fewer and fewer views 😀😀😀. But seriously, this course is good; for those of you who complete it, it helps a lot.
Thanks so much Basant, appreciate your efforts!❤
Amazing content. I have followed you since I started my career. Thanks for your support.
as usual -- awesome.. Thanks Basant
Thanks 🙏❤
Thank you so much, I learned Apache Kafka within a few hours.
Yeah great content, keep up the good work!!!👍
Amazing❤ Big follower of your content brother😊
Amazing content!! Thank you so much ❤
Amazing series, Please continue this series with Kafka Streams.
@Javatechie
10 months ago
Yes, I will continue from the coming weekend.
Good Content 👍
Thanks for your efforts.
Thank you
Great thanks
Another great video! Can you cover the authentication and authorization used in Kafka?
So it's Crying... got me 😂😂
Hello Sir, you are doing a really good job. I am watching your videos and they are really amazing. Can you please make a video on Kafka interview questions?
@Javatechie
8 months ago
Yes buddy, I will.
Great, please continue. How do we fetch 1,000,000,000 records from one DB and save them to another DB via Kafka?
Thank you so much, sir. Can you please upload the second part of the Spring Boot annotations series?
@Javatechie
11 months ago
Yes buddy, I will, but I need some more time to prepare the PPT.
Can you please make a video on how to use a common POJO/DTO class instead of creating the same Customer class in both the producer and consumer projects, as you mentioned at 5:33?
Can you please explain why you create the ProducerFactory and KafkaTemplate beans at 19:30, and how they both work together in the configuration? I was able to understand KafkaTemplate and KafkaListener when you created the beans in the service class in the previous section of the Kafka series.
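For anyone following along, the Java-based config the question refers to typically looks something like this sketch. The bean layout and the Customer DTO are assumptions, and the broker address is the usual local default; the ProducerFactory encapsulates the connection and serializer settings, while KafkaTemplate wraps it with a convenient send API:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class KafkaProducerConfig {

    // ProducerFactory holds the connection and serializer settings;
    // it is what actually creates the underlying Kafka Producer instances.
    @Bean
    public ProducerFactory<String, Customer> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    // KafkaTemplate wraps the factory with a high-level send(topic, payload) API,
    // which is what your service code injects and calls.
    @Bean
    public KafkaTemplate<String, Customer> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```

Splitting them lets Spring manage producer lifecycle and pooling separately from the template your code uses to send messages.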
I have one doubt: suppose I am using 10 consumers in a consumer group in Kafka. Every consumer needs the producer's data-type class (DTO) for deserialization, so the code repeats in each service. For example, if I produce order data to a topic on order creation, then every consumer (invoice, email, inventory) has to create an Order DTO class for deserialization. Is there any better way? Can we use GraphQL?
Great job, on to the next topic. Can you cover Kafka Streams or ksqlDB? Thanks for everything.
@Javatechie
11 months ago
I will definitely do that.
Hello sir!! What approach can we use so consumers do not connect to Kafka directly but go through some API instead? Also, if the producer wants to move to a new message broker, how can the consumer remain unaffected by the change on the producer side?
@Javatechie
11 months ago
No, if there is a configuration change in the producer then consumers need those changes too. We can use a centralized configuration approach to manage it and avoid manual effort.
Sir, have you changed this playlist? Before it had 21 videos, but now only 13.
@Javatechie
3 months ago
No, it's 13 only. I haven't uploaded 21 videos. Please check properly.
On the consumer side, if I use a Java config class instead of the yml file, with the same configuration and classes as in the yml (the server, StringDeserializer, and JsonDeserializer), I get an org.springframework.messaging.converter.MessageConversionException saying: cannot convert from java.lang.String to com.example.dto.Customer. This would occur when the serialization and deserialization do not match, or maybe something else. What bugs me is that with the yml file everything runs smoothly, but when I switch to a config class the consumer application cannot convert the message to the object type. Is there any solution?
@Javatechie
5 months ago
Please share your GitHub link.
I have com.producer.dto.Customer in the producer application and com.consumer.dto.Customer in the consumer application. Since these paths do not match, the process does not work. And it is not possible to specify in the .yml configuration that the incoming com.producer.dto.Customer should be mapped to com.consumer.dto.Customer. So I have to rewrite one of the applications completely to match the package structure, or create a custom JSON deserializer.
@machinegunkohli
26 days ago
Could you please send the rewritten code here?
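For anyone hitting the same mismatch: Spring Kafka does support mapping a producer-side type to a different consumer-side class via the spring.json.type.mapping property, so a full rewrite shouldn't be necessary. A sketch, with package names mirroring the commenter's layout:

```yaml
# application.yml of the producer app:
# advertise the payload under a logical token instead of its class name
spring:
  kafka:
    producer:
      properties:
        spring:
          json:
            type:
              mapping: customer:com.producer.dto.Customer

---
# application.yml of the consumer app:
# resolve the same token to the local Customer class
spring:
  kafka:
    consumer:
      properties:
        spring:
          json:
            trusted:
              packages: com.consumer.dto
            type:
              mapping: customer:com.consumer.dto.Customer
```

Both sides use the same token ("customer" here), so the consumer never needs the producer's package on its classpath.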
Bro, why didn't we use KafkaTemplate in the choreography example?
@Javatechie
11 months ago
I used Spring WebFlux, so the sink is used as a consumer over there. If you implement the same with the traditional approach, then you can go with KafkaTemplate.
@jay-rathod-01
11 months ago
Got it thanks👍
Error: Can't serialize data [com.spring.kafkaproducer.example.dto.Customer@4ad91b0e] for topic [demoCust]. I am getting this error, please help... Below is the config:

spring:
  kafka:
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
@Javatechie
27 days ago
Please configure the trusted packages.
@prajulakottai1338
27 days ago
But the producer is throwing the error, even though I added the trusted packages in the consumer yml.
@Javatechie
27 days ago
@@prajulakottai1338 You need to add it in both producer and consumer, buddy.
@prajularao
26 days ago
@@Javatechie Still not working, please help.

org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'
Caused by: java.lang.IllegalStateException: java.lang.ClassNotFoundException: com.spring.kafkaproducer.example.dto.Customer

Below is the config:

producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
producerProps.put(JsonDeserializer.TRUSTED_PACKAGES, "com.spring.kafkaproducer.example.dto");

consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "group4");
consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
consumerProps.put(JsonDeserializer.TRUSTED_PACKAGES, "com.spring.kafkaproducer.example.dto");
consumerProps.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, "false");
consumerProps.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.spring.kafkaproducer.example.dto.Customer");
consumerProps.put(JsonDeserializer.TYPE_MAPPINGS, "customer:com.spring.kafkaproducer.example.dto.Customer");
@avognonlionel7767
12 hours ago
@@prajularao I faced the same issue, but solved it by defining the Customer object in the consumer application with the same package as in the producer's application.
How do we handle this, sir?

KafkaMessageListenerContainer : Consumer exception
java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer
at org.springframework.kafka.listener.DefaultErrorHandler.handleOtherException(DefaultErrorHandler.java:198)
@janyajoshi
10 months ago
Can you share your Git link? I'll try to see if I can help.
@KAMMARIRAJESH151
10 months ago
@@janyajoshi rajeshoo7/Kafka-project/tree/master
@mohamedsedahmad585
10 months ago
Did you find the solution?
@gayannanayakkara8386
10 months ago
I found the solution. It's because of the type mapping. JavaTechie doesn't hit it because his producer DTO and consumer DTO are in the same package. You should provide it like this (Customer is my DTO).

Producer (mention your DTO package name and class name as the type):
properties:
  spring:
    json:
      type:
        mapping: customer:com.gnanayakkara..dto.Customer

Consumer (mention where the mapped class is available on the consumer side):
properties:
  spring:
    json:
      trusted:
        packages: com.gnanayakkara.kafkaproducer.dto
      type:
        mapping: customer:com.gnanayakkara..dto.Customer
@Javatechie
10 months ago
Awesome, and I appreciate your findings. But just keep in mind that this is not the recommended way to send raw objects as a Kafka message. I will upload a video on it. Hint: Kafka gives us the flexibility to deal with records of any data type.
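Regarding the DefaultErrorHandler complaint quoted earlier in this thread: the fix the error message itself suggests is to wrap the real deserializers in ErrorHandlingDeserializer, so a poison record fails gracefully instead of crashing the listener container. A yml sketch, with the DTO package as a placeholder:

```yaml
spring:
  kafka:
    consumer:
      # ErrorHandlingDeserializer catches deserialization failures and hands
      # them to the container's error handler instead of killing the listener
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring:
          deserializer:
            key:
              delegate:
                class: org.apache.kafka.common.serialization.StringDeserializer
            value:
              delegate:
                class: org.springframework.kafka.support.serializer.JsonDeserializer
          json:
            trusted:
              packages: com.example.dto   # placeholder DTO package
```

The delegate properties tell the wrapper which real deserializer to invoke once the error handling is in place.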
Sorry to bother you too much. I am using the same deserializer, and I am getting an exception that it's not a trusted source, even after configuring:

properties:
  spring:
    json:
      trusted:
        packages: com.example.kafka.consumer.dto

It keeps giving me this error:

Caused by: org.apache.kafka.common.errors.RecordDeserializationException: Error deserializing key/value for partition TestTopic-1 at offset 0. If needed, please seek past the record to continue consumption.
at org.apache.kafka.clients.consumer.internals.CompletedFetch.parseRecord(CompletedFetch.java:309)
at org.apache.kafka.clients.consumer.internals.CompletedFetch.fetchRecords(CompletedFetch.java:263)
at org.apache.kafka.clients.consumer.internals.AbstractFetch.fetchRecords(AbstractFetch.java:340)
at org.apache.kafka.clients.consumer.internals.AbstractFetch.collectFetch(AbstractFetch.java:306)
Caused by: java.lang.IllegalArgumentException: The class 'com.example.kafka.producer.dto.Customer' is not in the trusted packages: [java.util, java.lang, -java.util-java.lang-com.example.kafka.consumer.dto.*]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).

Posted on Stack Overflow: stackoverflow.com/questions/77804089/unable-to-deserialize-events-from-spring-boot-consumer-getting-an-error-even-aft
gitlab.com/kishore87jetty/kafkaproducer
gitlab.com/kishore87jetty/kafkaconsumer
@Javatechie
5 months ago
Could you please share your code on GitHub? I will take a look and figure it out.
@kishore87jetty
5 months ago
@@Javatechie I added the details in my first comment itself.
@kishore87jetty
5 months ago
There is one more branch, ProducerConfig and ConsumerConfig, in the respective repositories. Those config branches are also throwing some errors.