I want to introduce some automated testing to verify we haven’t introduced any breaking changes between releases of our API. There is a StackOverflow answer that suggests writing integration tests that use old DTOs for this.
However, I’m not sure how to set this up. I guess I’d want to use the tests and DTOs from the previous release, but the interfaces and DTOs from the release candidate. My problem is how to reference two different versions of the ServiceModel (DTOs) DLL. Would I need to change the assembly probing path for the integration tests project and locate the old DLLs outside of the bin folder? I could be way off track here.
Does anybody have any suggestions on how to set this up? Or any other suggestions for verifying that no breaking changes have been introduced?
There are likely many options, but of the two I’d consider, the first is keeping the same integration tests in different git branches and having your CI run the tests on each branch. So before you upgrade your integration tests to use a new version, create a branch so it captures the current Client Library + Service Model versions, and configure your CI to run the tests in the different version branches.
The other option is a data-driven integration tests approach, where you could capture the raw HTTP traffic of existing requests with their expected results and play them back by sending raw HTTP traffic, e.g. by using HttpUtils’ Send Raw String APIs + Headers, as sketched below.
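A minimal sketch of what that playback could look like, assuming captured request/response pairs from the previous release and ServiceStack.Text’s HttpUtils extension methods (`SendStringToUrl`); the `CapturedCall` type, the paths, and the staging URL are hypothetical placeholders:

```csharp
using NUnit.Framework;
using ServiceStack; // HttpUtils extension methods + MimeTypes

[TestFixture]
public class RawHttpReplayTests
{
    // Hypothetical request/response pair captured from the previous release
    class CapturedCall
    {
        public string Method;
        public string Path;
        public string RequestBody;
        public string ExpectedResponseBody;
    }

    const string BaseUrl = "http://staging.example.org"; // assumed staging instance

    static readonly CapturedCall[] CapturedCalls =
    {
        new CapturedCall {
            Method = "POST",
            Path = "/customers",
            RequestBody = "{\"Name\":\"Acme\"}",
            ExpectedResponseBody = "{\"Id\":1,\"Name\":\"Acme\"}",
        },
    };

    [Test]
    public void Replayed_raw_requests_return_the_captured_responses()
    {
        foreach (var call in CapturedCalls)
        {
            // Send the raw JSON body exactly as captured; no DTOs involved,
            // so the tests don't need any version of the ServiceModel DLL
            var actual = (BaseUrl + call.Path).SendStringToUrl(
                method: call.Method,
                requestBody: call.RequestBody,
                contentType: MimeTypes.Json,
                accept: MimeTypes.Json);

            Assert.That(actual, Is.EqualTo(call.ExpectedResponseBody));
        }
    }
}
```

Because the assertion is on the raw wire format rather than on deserialized types, the same captured calls can be replayed unchanged against every release candidate.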
I should also probably add that introspec opens up a lot of interesting possibilities for us, which are on our todo list.
The servicecop itself will run as a service, which we could use at design or build time to automatically block breaking changes from being deployed without explicit authorisation.
Another such possibility is a service that can use the spec to generate test DTOs to submit against the services, for both contract verification and load testing.
Also on the todo list is a “WhatIf” flag for our Discovery plugin that, combined with some trickery, can fake execution of not only the request method but any of its downstream Gateway calls and generate execution maps. Coupled with our correlation plugin, we can also use this for creating timing sequence diagrams (à la Chrome dev tools).
There’s a ton more we have in store, but we see a lot of value in this approach.
Oh, and I forgot the main course: the DocIt site, which is another item on our roadmap. From our spec we are going to create a documentation site (think SwaggerUI-type thing). Combining this with a custom Consul gateway, we can aggregate the specs from all services into one place.
The fact that we can convert the spec into Postman and RAML (and potentially more formats) also means that we can generate HTTP request code samples in a ton of different languages using existing tooling libraries.
And for the finale: given that the doc site is a ServiceStack site (albeit with a custom metadata plugin that hooks into our gateway), it can be used as a single ServiceStack reference URL in our projects. Combining this with the include filtering that we PR’d means we can offer consumers a basket-style way of picking DTOs to add as a service reference at design time.
Hi @mythz, the problem I’m having is that the integration tests project is self-hosting the service, so the ServiceModel is shared by the tests and the interfaces project. So when running the old tests, the old ServiceModel is used by the new interfaces project. If I make a non-breaking change such as adding a new optional field, the old tests fail with a MissingMethodException. Should I just host the service externally to the tests?
I’d just point the integration tests at an external staging/UAT instance of your current Service, so all integration test versions are calling the same URL (i.e. running your latest Service).
The tests would just reference the client libraries, so there shouldn’t be any incompatible .dll issues with your Server library.
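For example, a minimal sketch of what such a test could look like, assuming an NUnit test project that references only the client library (ServiceStack.Client) plus the previous release’s DTOs; the Hello service and staging URL are hypothetical:

```csharp
using NUnit.Framework;
using ServiceStack;

// DTOs compiled against the *previous* release's ServiceModel
// (Hello/HelloResponse are hypothetical example DTOs)
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}

[TestFixture]
public class BackwardsCompatibilityTests
{
    // The externally hosted release candidate; every test version hits the same URL
    const string BaseUrl = "http://staging.example.org";

    [Test]
    public void Old_client_DTOs_still_work_against_the_new_service()
    {
        var client = new JsonServiceClient(BaseUrl);

        // The contract here is the wire format, not a shared assembly, so a
        // new optional field added in the release candidate is simply ignored
        // by this older client instead of causing a MissingMethodException
        var response = client.Get(new Hello { Name = "World" });

        Assert.That(response.Result, Is.EqualTo("Hello, World!"));
    }
}
```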