Is there a way to prevent the JSON structure that is pulled from mongo from being destroyed? #3
Comments
Hi, we wrote this connector to work with the Confluent Schema Registry. Since the schema of the documents may change, preserving the original schema could make messages non-compliant with the schema registered in the registry.
Thanks for the quick reply. If it's easy to maintain the schema of the original message, that would be best. Many of us using Mongo prefer things to stay in JSON format so we don't need to do ETL every step of the way. Everything else I have running through Kafka is 100% JSON so we can consume it quickly. Having this one (hopefully small) change would be huge for me (and I'd suspect many others).
Actually, if you could point me to the code where the conversion of the JSON from mongo is performed, I could probably fix it up.
Hi, the conversion is here.
Ok, I see that section where the schema section and payload section are built. I'm not much of a Java person, so I'm having a difficult time seeing where "Document=" gets inserted. I don't see any reference to "Document=" in the code. toString() as far as I am aware simply ensures that the data being pulled is completely in the form of a string. |
Hi, the document is flattened using messageStruct.put("object", message.get("o").toString()); The problem is that when using a schema you need to know the schema of the object beforehand, because it is difficult to extract the schema from every Mongo document.
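For context on why toString() loses the structure: the natural fix in the connector would presumably be to serialize the BSON document with org.bson.Document's toJson() method rather than toString(). Since the MongoDB driver isn't available here, the self-contained sketch below uses plain Java maps (whose toString() output looks just like the flattened Document string quoted in this issue) to contrast the two renderings; the class name and helper are hypothetical, not part of the connector.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DocumentJson {
    // Naive JSON serializer for maps holding strings, numbers, and nested maps.
    // Illustration only -- a real connector would use a proper JSON/BSON library.
    static String toJson(Map<String, Object> doc) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, Object> e : doc.entrySet()) {
            if (!first) sb.append(", ");
            first = false;
            sb.append('"').append(e.getKey()).append("\": ");
            Object v = e.getValue();
            if (v instanceof Map) {
                @SuppressWarnings("unchecked")
                Map<String, Object> nested = (Map<String, Object>) v;
                sb.append(toJson(nested));          // recurse into sub-documents
            } else if (v instanceof Number) {
                sb.append(v);                        // numbers stay unquoted
            } else {
                sb.append('"').append(v).append('"'); // everything else quoted
            }
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        Map<String, Object> events = new LinkedHashMap<>();
        events.put("TechDump", 4);
        events.put("Reboot", 174);
        events.put("FatalError", 0);
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("type", "hrlyEvtLvls");
        doc.put("events", events);

        // Map.toString() flattens to key=value pairs, just like Document.toString():
        System.out.println(doc);
        // prints {type=hrlyEvtLvls, events={TechDump=4, Reboot=174, FatalError=0}}

        // Serializing to JSON keeps the structure parseable downstream:
        System.out.println(toJson(doc));
        // prints {"type": "hrlyEvtLvls", "events": {"TechDump": 4, "Reboot": 174, "FatalError": 0}}
    }
}
```

The downside the maintainer raises still applies: emitting raw JSON means Kafka Connect can no longer attach a fixed schema, so this only works with schemaless converters.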
@patelliandrea @grindthemall |
@Ronniexie it's using the converter you set in the connector properties (distributed or standalone). |
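To make the converter remark concrete: in Kafka Connect the converter is set in the worker configuration (for example connect-standalone.properties or the distributed worker config). A typical setup that keeps values as schemaless JSON might look like the following; these are standard Kafka Connect properties, though the exact file and values here are illustrative, not taken from this project's docs.

```properties
# Worker-level converter settings (standalone or distributed worker config)
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Disable schema envelopes so consumers see plain JSON payloads
key.converter.schemas.enable=false
value.converter.schemas.enable=false
```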
Ok, so I love the connector - it makes getting data from Mongo to Kafka easy. However, when the data gets pulled from Mongo, it loses its own JSON structure. Is there a way to prevent this from happening?
Example:
My JSON looks like this in Mongo:
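(The inline example did not survive the page capture; judging from the flattened Document output quoted just below, the source document was presumably shaped roughly like this - field values taken from that string, with the timestamp shown as an ISO date for illustration.)

```json
{
  "_id": "56feaa424f1249736af0ba4f",
  "ts": "2016-03-31T09:00:00-07:00",
  "type": "hrlyEvtLvls",
  "events": { "TechDump": 4, "Reboot": 174, "FatalError": 0 }
}
```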
However when I retrieve it from Kafka this is what it looks like in the payload object:
Document{{_id=56feaa424f1249736af0ba4f, ts=Thu Mar 31 09:00:00 PDT 2016, type=hrlyEvtLvls, events=Document{{TechDump=4, Reboot=174, FatalError=0}}}}
I don't have an easy way to reprocess the received data back into JSON, so is there some way I can prevent my JSON from being lost in the first place?