I'm working on converting a project from Mono to .NET Core. I'm running into a strange issue with a handful of unit tests that follow roughly this pattern:
- Save an entity to the database (Postgres) that includes a `DateTime` property.
- Read the entity back from the database.
- Assert that a bunch of properties (including the `DateTime` property) are equal.
Under Mono (ServiceStack 4.5.10) these all pass. Under .NET Core (ServiceStack 5.4.0) they all fail because Postgres seemingly drops the least significant digit off of the `DateTime` value.
Is this just an inherent precision incompatibility between .NET Core and Postgres? I don't actually care about sub-microsecond precision, but I haven't been able to figure out an elegant/global way to get these tests passing again.
Yes, it is just an inherent incompatibility between .NET (Core or Framework) and Postgres. Postgres timestamps support microsecond precision, while .NET's `DateTime` uses 100-nanosecond ticks, so one decimal digit of precision can't survive the round trip. I don't know why Mono seems to work, but I'd guess it forces the least significant digit to 0 somewhere.
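To make the mismatch concrete, here's a small sketch (the round trip is simulated locally, not through a real Postgres connection): a .NET tick is 100 ns, so a Postgres `timestamp`, which stores whole microseconds, discards the final tick digit.

```csharp
using System;

class Program
{
    static void Main()
    {
        // 1 .NET tick = 100 ns; 1 Postgres microsecond = 10 ticks.
        var dt = new DateTime(2019, 1, 1, 0, 0, 0, DateTimeKind.Utc).AddTicks(1234567);

        // Simulate the Postgres round trip: whole microseconds survive,
        // the sub-microsecond tick digit is dropped.
        var roundTripped = dt.AddTicks(-(dt.Ticks % 10));

        Console.WriteLine(dt.Ticks % 10);     // 7 — the digit Postgres drops
        Console.WriteLine(dt - roundTripped); // 00:00:00.0000007
    }
}
```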
When dealing with precision-sensitive data types in tests, I recommend using NUnit's `Within` precision modifier, which we generally use when comparing `DateTime`s, e.g:
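A minimal sketch of that pattern — `SaveAndReload` is a hypothetical stand-in for your actual save/read persistence calls (here it just simulates Postgres truncating to whole microseconds):

```csharp
using System;
using NUnit.Framework;

[TestFixture]
public class EntityRoundTripTests
{
    [Test]
    public void Timestamp_survives_postgres_round_trip()
    {
        var expected = DateTime.UtcNow;
        var actual = SaveAndReload(expected); // hypothetical round trip

        // Within() applies a tolerance instead of demanding exact equality,
        // so the dropped sub-microsecond tick no longer fails the assertion.
        Assert.That(actual, Is.EqualTo(expected).Within(TimeSpan.FromMilliseconds(1)));
    }

    // Stand-in simulating Postgres keeping only whole microseconds.
    static DateTime SaveAndReload(DateTime dt) => dt.AddTicks(-(dt.Ticks % 10));
}
```

Applying `Within` at each `DateTime` assertion is per-test rather than global, but it documents the intended tolerance explicitly and works regardless of which runtime or driver introduced the truncation.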