Why does a C# System.Decimal remember trailing zeros?

It can be useful to represent a number including its accuracy – so 0.5m could be used to mean “anything between 0.45m and 0.55m” (with appropriate limits) and 0.50m could be used to mean “anything between 0.495m and 0.505m”.
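As a minimal illustration of what “remembering trailing zeros” looks like in practice (the class and variable names here are just for the example), the two literals compare equal but keep their own scale, which shows up in their string representations and can be inspected via decimal.GetBits:

```csharp
using System;

class TrailingZerosDemo
{
    static void Main()
    {
        decimal half = 0.5m;
        decimal halfWithZero = 0.50m;

        // The two values compare equal...
        Console.WriteLine(half == halfWithZero);   // True

        // ...but each keeps its own scale, so ToString() differs.
        Console.WriteLine(half);                   // 0.5
        Console.WriteLine(halfWithZero);           // 0.50

        // The scale is stored in bits 16–23 of the fourth element returned by GetBits.
        Console.WriteLine((decimal.GetBits(half)[3] >> 16) & 0xFF);         // 1
        Console.WriteLine((decimal.GetBits(halfWithZero)[3] >> 16) & 0xFF); // 2
    }
}
```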

I suspect that most developers don’t actually use this functionality, but I can see how it could be useful in some situations.

I believe this ability first arrived in .NET 1.1, btw – I think decimals in 1.0 were always effectively normalized.
