Serialization of decimals does not respect precision
Problem
A C# decimal value serialized to JSON and deserialized back to decimal yields a number with a different precision.
Explanation
Decimals in .NET are tricky: besides the numeric value itself, they store a scale, i.e. the number of digits after the decimal point needed to represent it. For example, the numbers 15 and 15.0 stored in a decimal variable are represented differently in memory, even though they compare as equal. When we serialize and deserialize such numbers, it is important to keep this information.
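For illustration (this snippet is not part of the original report), the two values compare equal but carry different scales, which can be inspected through decimal.GetBits:

using System;

class ScaleDemo
{
    static void Main()
    {
        decimal a = 15m;   // stored with scale 0
        decimal b = 15.0m; // stored with scale 1

        Console.WriteLine(a == b); // True - the values compare equal

        // The scale is kept in bits 16-23 of the last element returned by GetBits,
        // so the two in-memory representations differ.
        Console.WriteLine((decimal.GetBits(a)[3] >> 16) & 0xFF); // prints 0
        Console.WriteLine((decimal.GetBits(b)[3] >> 16) & 0xFF); // prints 1
    }
}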
Steps to reproduce
using Newtonsoft.Json;
using System;

namespace JsonDecimalIssue
{
    class Program
    {
        static void Main(string[] args)
        {
            decimal before = 15;
            string serialized = JsonConvert.SerializeObject(before); // produces "15.0" <- incorrect
            decimal after = JsonConvert.DeserializeObject<decimal>(serialized);

            Console.WriteLine(before); // writes "15"
            Console.WriteLine(after);  // writes "15.0"
            Console.ReadKey();
        }
    }
}
Possible solution
The issue can be solved by keeping the necessary number of decimal digits in the JSON representation of the number, e.g. serializing the decimal 15 as the integer “15” and the decimal 15.0 as “15.0”. This is exactly how Decimal.ToString() works. The number of digits can then be respected when deserializing back to decimal, as the round-trip sketch below shows.
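A small round-trip sketch (not from the original issue) showing that the textual form produced by Decimal.ToString() carries the scale and that decimal.Parse recovers it:

using System;
using System.Globalization;

class RoundTripDemo
{
    static void Main()
    {
        // Decimal.ToString() keeps the stored scale ...
        string s1 = 15m.ToString(CultureInfo.InvariantCulture);   // "15"
        string s2 = 15.0m.ToString(CultureInfo.InvariantCulture); // "15.0"

        // ... and decimal.Parse recovers it, so a text round trip is lossless.
        Console.WriteLine(decimal.Parse(s1, CultureInfo.InvariantCulture)); // 15
        Console.WriteLine(decimal.Parse(s2, CultureInfo.InvariantCulture)); // 15.0
    }
}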
That is expected behavior. Json.NET always serializes floats and decimals with a decimal point. You could write a JsonConverter for decimal and write it without the trailing .0.
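A minimal sketch of that workaround, assuming a Json.NET version that provides the generic JsonConverter<T> (the converter and class names below are made up for illustration):

using System;
using System.Globalization;
using Newtonsoft.Json;

class RawDecimalConverter : JsonConverter<decimal>
{
    // Deserialization already preserves whatever scale is present in the JSON text,
    // so only writing needs to be customized.
    public override bool CanRead => false;

    public override void WriteJson(JsonWriter writer, decimal value, JsonSerializer serializer)
    {
        // Emit the decimal exactly as Decimal.ToString() renders it:
        // 15m is written as 15, 15.0m as 15.0.
        writer.WriteRawValue(value.ToString(CultureInfo.InvariantCulture));
    }

    public override decimal ReadJson(JsonReader reader, Type objectType, decimal existingValue,
                                     bool hasExistingValue, JsonSerializer serializer)
        => throw new NotSupportedException(); // never called because CanRead is false
}

class ConverterDemo
{
    static void Main()
    {
        var converter = new RawDecimalConverter();
        Console.WriteLine(JsonConvert.SerializeObject(15m, converter));   // 15
        Console.WriteLine(JsonConvert.SerializeObject(15.0m, converter)); // 15.0
    }
}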
I agree that this is unexpected behaviour at the very least, and imho it is also a bug. For 15, the precision is 2 and the scale is 0. For 15.0, the precision is 3 and the scale is 1. They’re two different things. I wholly understand that this could be a major breaking change, but can you please reconsider it?