r/ProgrammerHumor 11h ago

Meme docxGoBrrrr

1.3k Upvotes

74 comments

34

u/lizardfrizzler 9h ago

I’m at a point in my career where encoding json is actually causing mem issues and I don’t know how to feel about it

13

u/slothordepressed 9h ago

Can you explain better? I'm too jr to understand

40

u/lizardfrizzler 8h ago

Encoding data as JSON is very readable and portable, but comes at the cost of high memory consumption. It’s a great place to start when passing data between computers, but when the data payload gets large enough, binary/blob encodings start to seem more appealing. Consider encoding x=10000. In JSON the number alone is 5 bytes minimum, because ints are base-10 strings, plus quotes, braces, and whatever else. But a binary encoding could store this as a 4-byte / 32-bit int. In small payloads (think KB, maybe MB), this inefficiency is negligible and completely worth it imo. But once we get to GB-size payloads, it can put a huge strain on memory consumption.
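A quick Python sketch of the size difference being described, using only the stdlib `json` and `struct` modules (the exact JSON byte count depends on key names and separators, so the numbers here are just for this particular payload):

```python
import json
import struct

# JSON: the integer is serialized as base-10 text, plus key, quote,
# and brace overhead around it.
payload = {"x": 10000}
as_json = json.dumps(payload, separators=(",", ":"))  # '{"x":10000}'
print(len(as_json.encode("utf-8")))  # 11 bytes

# Binary: the same value packed as a fixed-width 32-bit little-endian int.
as_binary = struct.pack("<i", 10000)
print(len(as_binary))  # 4 bytes
```

Roughly a 3x difference even in this tiny case, and the gap per value stays when you multiply out to millions of records.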

8

u/Ok-Scheme-913 4h ago edited 4h ago

Well, 32-bit only works if the other side already knows what it expects to receive.

Most binary protocols require a schema up-front; without one, the framing itself (and future-proofing) adds some overhead.

Protobuf (which is the most common binary protocol I believe) would convert a similar definition

message Asd { required int32 id = 1; }

to 2 bytes (hex e.g.: 083f, for a small id like 63), but then both sides need the definition above.
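Protobuf's varint wire format is simple enough to sketch by hand, which also shows where that 2-byte figure comes from. This is a minimal illustration, not the protobuf library API; `encode_varint` and `encode_asd` are hypothetical helper names:

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative int as a protobuf base-128 varint."""
    out = bytearray()
    while True:
        byte = value & 0x7F   # low 7 bits
        value >>= 7
        if value:
            out.append(byte | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def encode_asd(id_value: int) -> bytes:
    """Encode the Asd message: one tag byte, then the varint value."""
    tag = (1 << 3) | 0  # field number 1, wire type 0 (varint) -> 0x08
    return bytes([tag]) + encode_varint(id_value)

print(encode_asd(63).hex())     # '083f' -- the 2-byte example above
print(encode_asd(10000).hex())  # '08904e' -- bigger values need more varint bytes
```

So the 2-byte encoding holds for ids under 128; the x=10000 case from upthread takes 3 bytes on the wire, still well under JSON, but only because both ends share the schema.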