
kafkaMdm sending streams in strange encoding format to Kafka #484

Open
Megge1 opened this issue Dec 3, 2021 · 10 comments

@Megge1

Megge1 commented Dec 3, 2021

Hi all,
I hope I'm right here with this question.
Sorry, my know-how with Icinga and carbon-relay-ng is very limited :-(

I've set up carbon-relay-ng on my colleague's Icinga server and so far I've been able to send the data streams over to Kafka, but on the Kafka side they arrive in a badly encoded format.
They look like this:
Key: Value:��Id�"1.60bbd43ae111c32e6e25dade557ef919�OrgId�Name�<icinga2.test-mock08_test.host.icmp.perfdata.pl.max�Interval<�Value�@y�Unit�unknown�Time�a�5=�Mtype�gauge�Tags�
Key: Value:��Id�"1.7ef266878bd7f299c9bf4a12272fe3e9�OrgId�Name�Aicinga2.test.host.icmp.perfdata.rtmax.value�Interval<�Value�?o�˯�[!�Unit�unknown�Time�a�5=�Mtype�gauge�Tags�
Key: Value:��Id�"1.2bca5085a41bb2760ab3958cea37d17e�OrgId�Name�Aicinga2.test.host.icmp.perfdata.rtmin.value�Interval<�Value�?j��U����Unit�unknown�Time�a�5=�Mtype�gauge�Tags�
% Reached end of topic monitoring-eventmgmt.stage.icingametrics-json [0] at offset 545145

Any idea where the problem could be?
The carbon-relay-ng.conf settings that could be relevant for this part are:
[[route]]
#which compression to use. possible values: none, gzip, snappy
codec = 'none'
#possible values are: byOrg, bySeries, bySeriesWithTags, bySeriesWithTagsFnv
partitionBy = 'bySeries'
schemasFile = '/etc/carbon/storage-schemas.conf'

I've tried several different settings for partitionBy and codec, with no improvement.

I would appreciate any help.

Thanks a lot and cheers
Markus

@zerosoul13

Hey @Megge1,

The messages are posted to Kafka encoded in the MessagePack format. When a message is consumed, it must be unmarshalled into a struct so that it makes sense to you.

There's a similar discussion here: #281 (comment)

@Dieterbe
Contributor

Dieterbe commented Dec 7, 2021

It seems like the format should be documented in the carbon-relay-ng docs; it currently isn't.

@Megge1
Author

Megge1 commented Dec 7, 2021

Hi all,
thanks for your replies.
@zerosoul13 I saw the discussion you mentioned, but since it's 3 1/2 years old I thought there might already be some news about it.
Do you mean unmarshalling via a transformation within Kafka? I don't think there is a solution like that in place yet?
Thanks and cheers
Markus

@Dieterbe
Contributor

Dieterbe commented Dec 7, 2021

The messages are schema.MetricData, which is defined here: https://github.com/grafana/metrictank/blob/master/schema/metric.go#L32
That package also comes with functions to encode/decode MessagePack (see route/kafkamdm.go for how it's used to encode the data), and the metrictank code base shows how the messages can be decoded again.
But this is only for Go. If you use another language, you would need to write your own code to decode the MessagePack.
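
For reference, a minimal sketch of decoding one consumed message value back into a schema.MetricData in Go. It assumes the import path github.com/grafana/metrictank/schema (taken from the metric.go link above) and the msgp-generated UnmarshalMsg method that mirrors the MarshalMsg call in route/kafkamdm.go; treat it as a sketch under those assumptions, not a tested recipe.

package main

import (
	"fmt"
	"log"

	"github.com/grafana/metrictank/schema" // import path assumed from the metric.go link above
)

// decodeMetric unmarshals the raw value of one Kafka message (MessagePack bytes)
// into a schema.MetricData struct.
func decodeMetric(value []byte) (*schema.MetricData, error) {
	var md schema.MetricData
	// UnmarshalMsg is the msgp-generated counterpart of the MarshalMsg call
	// used in route/kafkamdm.go.
	if _, err := md.UnmarshalMsg(value); err != nil {
		return nil, err
	}
	return &md, nil
}

func main() {
	// payload stands in for the value of one message consumed from the topic.
	payload := []byte{}
	md, err := decodeMetric(payload)
	if err != nil {
		log.Fatal(err) // an empty payload will fail here; feed it real message bytes
	}
	// Name, Value and Time are the same fields visible in the dump above.
	fmt.Printf("%s value=%f time=%d\n", md.Name, md.Value, md.Time)
}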

@Megge1
Author

Megge1 commented Dec 10, 2021

Thank you for your help and the hint, @Dieterbe. Sorry also for the late reply. I'll check whether I can find a solution by looking into kafkamdm.go.
Have a good day and take care
Cheers
Markus

@Megge1
Author

Megge1 commented Dec 13, 2021

Hi all,
I was looking deeper into the code of route/kafkamdm.go.
So far it looks like one possibility could be to change the line
data, err = metric.MarshalMsg(data[:])
to
data, err = metric.MarshalJSON()
Could it be that I would then have a proper JSON format within Kafka?
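
For illustration only, here is a minimal standalone sketch of that idea, using json.Marshal from encoding/json as a stand-in (I'm not sure the generated type actually has a MarshalJSON method). The import path is assumed as above and the field values are placeholders borrowed from the dump further up.

package main

import (
	"encoding/json"
	"fmt"
	"log"

	"github.com/grafana/metrictank/schema" // import path assumed, as in the comment above
)

func main() {
	// metric is a stand-in for the *schema.MetricData that the kafkaMdm route publishes;
	// the values are placeholders based on the dump further up.
	metric := &schema.MetricData{
		Name:     "icinga2.test.host.icmp.perfdata.rtmin.value",
		Interval: 60,
		Value:    0.003,
		Unit:     "unknown",
		Time:     1639000000,
		Mtype:    "gauge",
	}
	// json.Marshal takes the place of the metric.MarshalMsg(data[:]) call.
	data, err := json.Marshal(metric)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(data)) // human-readable JSON instead of MessagePack
}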

Maybe you're thinking now that I could just try it :-) I will, but it needs some time, because first I have to figure out how to change this code and build it again; everything is new for me. Also, my colleague who owns this server and installed carbon-relay-ng is out of office.
Thanks, have all a good day and stay safe
Cheers
Markus

@zerosoul13

@Megge1

I'm not sure what the intention is behind changing the serialization at this level. The messages can be pulled from Kafka without needing to convert them to JSON. The fix could become hard to maintain if implemented as proposed.

Here's a preview of the code I've submitted to the go-carbon repository to add support for msgpack: https://github.com/go-graphite/go-carbon/blob/6ad5b88eb4c5489aad5109bc06a2b2b543031445/receiver/parse/msgpack.go

If I can be of help getting things moving on your setup, feel free to reach out to me via email. @gmail.com

@Megge1
Author

Megge1 commented Dec 14, 2021

Thanks a lot for your help, @zerosoul13.
Me too :-) Trial and error :-(
The problem is that our Kafka colleagues want the data in a proper format. So I'll give changing the code a try, and if there are issues we'll go the other way and unmarshal the data afterwards, as you've already mentioned.
Also thanks for your code example, I really appreciate it.
I'll keep you up to date on how it goes.
Thanks and have a good day

@Dieterbe
Contributor

I guess when you say "proper" maybe you mean "human readable". MessagePack is a binary format, but it's a nice, compact serialization format with libraries in many languages :)

@Megge1
Author

Megge1 commented Dec 15, 2021

Hi @Dieterbe, yes, for sure I meant "human readable" :-)
From what I've read so far, MessagePack should be really good; it's just not easy when everything is new, so I'm learning step by step :-)
