Serialization Performance Comparison (C#/.NET) – Formats & Frameworks (XML: DataContractSerializer & XmlSerializer; BinaryFormatter; JSON: Newtonsoft & ServiceStack.Text; Protobuf; MsgPack)

Hi big shot!

This time I’ll talk about the performance of serialization. Serialization is a common task, used mostly for communication and storage. This post will give a broad comparison of serialization performance.

Real-life scenarios: Lately, microservices architectures have become very common. In such an architecture you have to provide a way for your microservices to communicate with each other, so you’ll have to serialize your objects for that. Normally you’ll use messaging frameworks that do it for you, but it’s essential to understand what’s going on under the hood. In other cases, you’ll need to develop a framework or a tool that has to serialize its objects itself. In addition, many communication frameworks let you change the serialization method, so it’s important to understand what you’ll achieve by doing so.

In this post I won’t talk about the advantages or disadvantages of each format/framework; I’ll stick strictly to performance (space and speed). That said, when you choose a format and framework you certainly need to think about much more than that: ease of use, extensibility, flexibility, versioning, and much more.

Although my tests run in C#, this post applies to any technology. The sizes will obviously be the same; the speeds, I believe, will differ, but the ratios between the formats will be roughly the same in most cases.

I’ll test popular formats and frameworks –
XML (XmlSerializer, DataContractSerializer) and BinaryFormatter – included with the .NET Framework: http://msdn.microsoft.com/en-gb/vstudio/hh341490.aspx
JSON – Newtonsoft (http://james.newtonking.com/json) and ServiceStack.Text (https://servicestack.net/text)
MsgPack https://github.com/msgpack/msgpack-cli
Protobuf https://code.google.com/p/protobuf-net/
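For reference, here’s a rough sketch of the basic serialize call in each of these frameworks. This is not the post’s actual test code – the Beer class and beer instance are placeholders, and the non-built-in frameworks are assumed to be installed from NuGet:

    using System;
    using System.IO;
    using System.Runtime.Serialization;
    using System.Runtime.Serialization.Formatters.Binary;
    using System.Xml.Serialization;

    // Helper: run a Stream-based serializer and capture the bytes.
    static byte[] ToBytes(Action<Stream> write)
    {
        using var ms = new MemoryStream();
        write(ms);
        return ms.ToArray();
    }

    var beer = new Beer(); // placeholder instance

    // Built into the .NET Framework:
    byte[] xml = ToBytes(s => new XmlSerializer(typeof(Beer)).Serialize(s, beer));
    byte[] dcs = ToBytes(s => new DataContractSerializer(typeof(Beer)).WriteObject(s, beer));
    byte[] bin = ToBytes(s => new BinaryFormatter().Serialize(s, beer));

    // NuGet packages:
    string json = Newtonsoft.Json.JsonConvert.SerializeObject(beer);
    string ss   = ServiceStack.Text.JsonSerializer.SerializeToString(beer);
    byte[] pb   = ToBytes(s => ProtoBuf.Serializer.Serialize(s, beer));
    byte[] mp   = MsgPack.Serialization.MessagePackSerializer.Get<Beer>().PackSingleObject(beer);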

The way I built the testing code, it’s very easy to set up new tests for other frameworks: you just need to implement the abstract methods of the SerializationTester<TTestObject> class. So with the provided code you can easily test different serialization frameworks with whatever sample data fits your needs best. This is something you can do within minutes; a sketch of such a tester follows below.
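For example, a hypothetical tester for some new framework might look roughly like this – member names such as TestObject are my guesses at the shape of the repo’s base class, not its exact API:

    // A sketch only; see the GitHub repo for the real base-class signatures.
    public class MyJsonTester : SerializationTester<Beer>
    {
        private string _json;

        protected override void Init()
        {
            // One-time setup: create serializer instances, warm up, etc.
        }

        protected override object Serialize()
        {
            // TestObject is assumed to hold the sample data in the base class.
            _json = Newtonsoft.Json.JsonConvert.SerializeObject(TestObject);
            return _json;
        }

        protected override Beer Deserialize()
        {
            return Newtonsoft.Json.JsonConvert.DeserializeObject<Beer>(_json);
        }
    }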

So why would you need to change the serialization method in your application? In most applications you probably don’t.
However, it’s important to be aware of the differences, which are significant. If your application is heavy on serialization, you should consider them. It’s important to understand that we’re testing two factors –
Space, which affects storage/memory (storing objects) and network usage (communication)
Speed, which is the time it takes to serialize/deserialize the object.

So let’s begin with the comparison. To compare we need data, and I’ve chosen to use a list of Belgian beers – after all, who doesn’t like beer? According to Wikipedia there are 1,610 different beers made in Belgium, and they are all listed on this page – http://en.wikipedia.org/wiki/List_of_Belgian_beer.

Then I ran the tests and plotted the results on charts for you. Of course, the code to retrieve the list and run all the tests is available on GitHub – https://github.com/maximn/SerializationPerformanceTest_CSharp

What does a Beer look like?

[Image: the Beer class definition]

It’s pretty straightforward – a beer has a Brand, a Brewery, an Alcohol level, and a list of the Sorts that apply to it.
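In code, the class looks roughly like this. This is a sketch – the exact attributes and member names are in the GitHub repo; note that protobuf-net can reuse [DataContract]/[DataMember(Order = …)] in place of its own [ProtoContract] attributes:

    using System;
    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [Serializable] // needed by BinaryFormatter
    [DataContract] // used by DataContractSerializer and, via Order, by protobuf-net
    public class Beer
    {
        [DataMember(Order = 1)] public string Brand { get; set; }
        [DataMember(Order = 2)] public string Brewery { get; set; }
        [DataMember(Order = 3)] public double Alcohol { get; set; }
        [DataMember(Order = 4)] public List<string> Sorts { get; set; }
    }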

I wanted to do the comparison for both large and small data, so I used the list of all 1,610 beers and a single beer, respectively.

This is the summary of the results (Size in bytes, time in milliseconds).

[Table: summary of the results]

But of course I’ve added charts to make the comparison easier.

[Charts: serialized size for small data and for large data]
(* smaller is better)

We can see huge differences between the formats. It’s interesting to note that BinaryFormatter, which sits in the middle for large objects, is the worst for small objects.

Now for the speed. For easy comparison I’ve decided to normalize the results so that both charts show the time for one item (beer). For the large data (the list) I simply used the time it took to handle the list divided by the number of items (1,610). I ran each test 100 times and took the average run speed.
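In other words, the normalization is just this (a sketch; timingsMs and its source are hypothetical stand-ins for the measured values):

    using System.Collections.Generic;
    using System.Linq;

    // The 100 per-run measurements for the whole list (hypothetical helper).
    List<double> timingsMs = GetListRunTimingsMs();

    // Average over the 100 runs, then divide the list time by the item
    // count to get a per-beer figure comparable to the single-beer test.
    double avgListMs = timingsMs.Average();
    double perItemMs = avgListMs / 1610;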

[Charts: serialization/deserialization speed for small data and for large data]
(* smaller is better)

All the tests ran on my laptop – a Lenovo X1 Carbon (i7-3667U). The tests obviously gave me slightly different results on each run, but what’s really interesting is the ratio between the formats.

  • It is important to say that I used all these frameworks with their default settings. I’m sure it’s possible to tweak them a bit, but I just wanted to compare their defaults.

A bit about the testing code – all testers derive from the SerializationTester<TTestObject> class and implement the Init, Serialize, and Deserialize methods. The base tester runs each test 100 times and outputs the results to the console. All the tests measure in-memory serialization/deserialization, so hard disk speed doesn’t affect the results. A sketch of such a measurement loop follows below.
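Here is a minimal sketch of what a measurement loop like that can look like – mine differs in the details, so see the repo for the real one:

    using System;
    using System.Diagnostics;

    // Time an in-memory action, averaged over N iterations, with one
    // warm-up run so JIT compilation isn't counted.
    static TimeSpan Measure(Action action, int iterations = 100)
    {
        action(); // warm-up

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            action();
        sw.Stop();

        return TimeSpan.FromTicks(sw.Elapsed.Ticks / iterations);
    }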

So which one should you use? I don’t really have an answer for that one; it all depends on your needs, but this post is intended to help you make the right decision.

Comments

  1. Arnon

    Really great job showing the differences, but I’m not sure it’s good practice to use a different serialization framework in different environments unless you’re doing a POC. It’s important, for me at least, that while developing I’ll be as close as possible to the production system; there are enough things that make a difference between them without me creating an intentional gap. But other than that, a really great post to help consider different tools for different jobs. Great article, hope to see more.

  2. Ruffy

    It would be easier to understand your graphs if you added a quick note on how to tell which is fastest, like putting something such as “smaller is better”. You might also want to indicate that the y-axis is actually time in milliseconds. Other than that, it is a good article.


  3. codematrix

    I have a version of a serializer that I developed that is 25% faster than both Protobuf and MsgPack. However, for my version the size is 30% larger, but still better than the other serializers mentioned. The other disadvantage of my version is that you need to implement Write/Read methods. Pros and cons either way. For us, speed was paramount, as we’re dealing with High Frequency Trading.


  4. Fa Rid

    Thank you for this post, Maxim.
    Please help clear up the confusion: the title of the last two graphs cannot be “Speed for …” with the comment “(*) Smaller is better”.
    Clearly, lower speed is not better.
    It should be something like “Processing time for large data”,
    right?

  5. Matthew Whited

    Not sure your performance tests are valid. You get to the beginning of the memory stream by seeking in the tests that were reported as slow, but just change Position in the fast versions.


  6. Scott Gammans

    Interesting. I am trying to eke out as much serialization-deserialization performance as possible in a new application framework I’m designing, and came here looking for comparisons between DataContractSerializer, Newtonsoft JSON and Protobuf.

    My biggest concern (besides speed and size) is using something that’s not part of the core Microsoft mainstream, for future maintenance. Our projects, once deployed, often have a life of over 10 years. Will Google still be supporting and maintaining Protobuf in the 2020s? (But then again, the road is also littered with the cast-offs of once-popular MS technology stacks – looking at you, Silverlight!)

