Problem
We need a clearer distinction between testing contexts. Further, the tests should catch issues that would otherwise result in a horrifically broken build being published. The goal is not perfect test coverage so that we never ship a single bug, because we would be at this for a lifetime. The goal is to never ship a version of this tool that doesn't work at all.
Solution
I propose a list of testing steps, ordered by "Fail Fast As Possible": the time it takes for each group of tests to pass must be increasing.
Linting
Linting will catch syntax errors and various small issues. It's the fastest type of test, so we run it first.
We should not emit warnings, only errors. If a rule isn't worth being an error, don't use the rule.
Unit Tests
Unit tests mock out any kind of IO, as well as dependent modules. They test individual exported methods to ensure that their interfaces are upheld, and that any logic the function implements is documented via a use-case-formatted unit test.
Example of a good unit test case:
it("returns null if you try to divide by zero", () => { ... });
Example of a bad unit test case:
it("doesn't divide by zero", () => { ... });
Integration Tests
Integration tests are the most intensive tests, and will generally take much longer than the others. They should exercise this tool end to end, using the filesystem and other IO just as a user would.
Integration tests should accomplish the following main points.
Make a build
The integration tests should use the built code; they should not run via ts-jest, but rather as Node.js-targeted JavaScript.
Generating clients from examples
Using build/bin/cli.js, run the generator against each of the examples found in @open-rpc/examples. Write each generated client to the test/ directory.
Integration Testing the generated clients in each language
For each generated example client, we must ensure that each language's client is functioning as intended. To do this, we must start a mock server, and run the language-specific integration test script.
node_modules/.bin/open-rpc-mock-server -s '{ "openrpc": "1.0.0-rc1", ... }' -p 6969
./test/${exampleName}/${language}/bin/test.integration.sh
To make this work, we need to add a test.integration.sh script to each templates/{language}/static/bin/ directory that runs tests in that language's particular way.
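The orchestration above could be sketched as follows. The mock-server invocation mirrors the commands shown earlier; the runIntegration helper and its shape are assumptions, not existing code:

```typescript
import { spawn, execFileSync } from "child_process";

// Derive the script path from the naming convention described in this doc.
function integrationScriptPath(exampleName: string, language: string): string {
  return `./test/${exampleName}/${language}/bin/test.integration.sh`;
}

// Hypothetical orchestration: start the mock server, run the language-specific
// integration script against it, then tear the server down.
function runIntegration(exampleName: string, language: string, openrpcDoc: string): void {
  const server = spawn(
    "node_modules/.bin/open-rpc-mock-server",
    ["-s", openrpcDoc, "-p", "6969"],
  );
  try {
    // The template's static bin/test.integration.sh runs that language's tests.
    execFileSync(integrationScriptPath(exampleName, language), { stdio: "inherit" });
  } finally {
    server.kill(); // always stop the mock server, even when the tests fail
  }
}
```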
Language Specific Integration Tests
The generated client for each language should have a test suite that is generated alongside it. This means that each language-specific client implementation MUST make provisions to generate these tests. Since each language has its own tools for testing, each language implementation must have an integration test script that follows the naming convention:
./test/${exampleName}/${language}/bin/test.integration.sh
Once the script is run, the language-specific tests should verify the following:
- the client can be imported.
- for each method in the client: the method can be called, and its result matches the expected value.
Changes to generator-client's package.json
- This will add the following "scripts" to the root package.json:
{
"test": "jest --coverage",
"test:unit": "jest ./src",
"test:integration": "jest ./integration-tests"
}
- Requires generating a 'test application' alongside each client, which will template the calls to each method, asserting the result is correct.