DumpTable fails with Oracle data types

Using Oracle as a backend causes exceptions in certain utilities that return dynamic results or results to a dictionary. If we use POCO classes, things work. However, we need ad hoc queries, which require an unknown result structure.

For example, we have an Oracle table that uses the Oracle types NUMBER(*) and DATE. Using DumpTable we get an exception in ServiceStack.Script.DefaultScripts.dateFormat(DateTime dateValue). The value is nullable in the table. When using PrintDump we see
__type: "ServiceStack.OrmLite.Oracle.OracleValue, ServiceStack.OrmLite.Oracle"

for those two Oracle data types. Finally, when using ToCsv on the results, all of those columns are shown as {} in the result set.

Is there something to be done here?

You can try registering an AutoMapping Converter that converts it to a DateTime, if that's the coercion causing the Exception.
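
A minimal sketch of that idea, assuming OrmLite's OracleValue.ToString() yields a parseable date string (which member of OracleValue to read is an assumption, not confirmed):

    using ServiceStack;
    using ServiceStack.OrmLite.Oracle;

    // Hypothetical sketch: coerce the OracleValue wrapper into a DateTime
    // before DumpTable's dateFormat sees it.
    AutoMapping.RegisterConverter((OracleValue v) =>
        DateTime.Parse(v.ToString()));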

Otherwise you'd need to pass it through your own sanitize script method which modifies a Dictionary<string,object> to replace all non-serializable values with their system types.
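
One possible shape for such a helper (the name Sanitize and the System-namespace check are illustrative assumptions, not an official API):

    using System;
    using System.Collections.Generic;

    static Dictionary<string, object> Sanitize(Dictionary<string, object> row)
    {
        var clean = new Dictionary<string, object>();
        foreach (var entry in row)
        {
            var value = entry.Value;
            // Keep nulls and BCL types as-is; stringify provider-specific
            // wrappers (e.g. OracleValue) so they serialize cleanly.
            clean[entry.Key] = value == null
                || value.GetType().Namespace?.StartsWith("System") == true
                    ? value
                    : value.ToString();
        }
        return clean;
    }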

So I looked into the script stuff… a bit much to try and wrap my head around. I was able to make a simple converter that meets my needs, and it seems to work fine. This was a unit test:

    AutoMapping.RegisterConverter((Dictionary<string, object> d) =>
    {
        // Stringify every value so OracleValue wrappers become plain strings
        var nd = new Dictionary<string, object>();
        d.ForEach((s, o) => nd[s] = o?.ToString());
        return nd;
    });

    var dbFactory = ServiceProvider.GetRequiredService<IDbConnectionFactory>();

    using var db = dbFactory.Open();
    var result = await db.SelectAsync<Dictionary<string, object>>("Select * from companies");
    var e = result.ConvertAll(input => input.ConvertTo<Dictionary<string, object>>());
    var t = e.DumpTable();