Testing the Taxonomy

To test the Taxonomy, we use the Kaocha test runner. There are two types of tests. Unit tests exercise isolated parts of the code, should run very fast to give quick feedback during development, and are located under test/clj/unit. Integration tests test various aspects of the system as a whole and are located under test/clj/integration. The integration test setup creates a temporary database for each test, which makes it safe to perform any modifications without leaving traces behind. Test resource files can be found under test/resources. There are also some older tests that have not yet been split into unit or integration tests; these live under the base directory.
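
A rough sketch of the layout described above (only the directories mentioned in this document are shown):

test/
  clj/
    unit/         unit tests
    integration/  integration tests
    base/         older tests not yet split into unit or integration
  resources/      test resource files, e.g. config/config.edn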

clj -M:test:kaocha will run all tests against the default database backend (Datomic) in memory and exclude Datahike-specific tests.

It is possible to run the test suite with a specific database backend, selected via the environment variable DATABASE_BACKEND. The unit tests use the configuration file test/resources/config/config.edn, which lists the backends with the ids :datomic-v.kaocha and :datahike-v.kaocha. Setting the environment variable to one of these keys before invoking the command above runs the tests with that backend. In Bash, the environment variable can be set on the same line as the command:

DATABASE_BACKEND=:datomic-v.kaocha clj -M:test:kaocha
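
To run against the Datahike backend instead, use the other key listed in the config file:

DATABASE_BACKEND=:datahike-v.kaocha clj -M:test:kaocha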

Useful Kaocha Commands

Kaocha provides several flags that can help tailor your testing process. Here are a few important ones:

--reporter

By default, Kaocha doesn't show much detail about the tests being run. To get more information, you can use the --reporter option. A good choice is the documentation reporter, which shows detailed output for each test.

  • Purpose: Controls how test results are displayed.
  • Example:
    clj -M:test:kaocha --reporter kaocha.report/documentation
    
    This command provides detailed output about each test run.

--focus

  • Purpose: Runs only a specific test or group of tests by name.
  • Example:
    clj -M:test:kaocha --focus my.test.namespace/specific-test
    

--focus-meta

  • Purpose: Runs tests that have specific metadata tags.
  • Example:
    clj -M:test:kaocha --focus-meta :integration
    

--seed

  • Purpose: Sets the seed for random test order to make test runs reproducible.
  • Example:
    clj -M:test:kaocha --seed 12345
    
    This ensures that tests run in the same order every time, which is useful for debugging.

--watch

  • Purpose: Automatically re-runs tests when files change.
  • Example:
    clj -M:test:kaocha --watch
    
    This keeps Kaocha running in the background, watching for file changes and rerunning tests as needed.

--fail-fast

  • Purpose: Stops the test suite as soon as a failure is encountered.
  • Example:
    clj -M:test:kaocha --fail-fast
    
    This is useful when you want to address the first error before running the rest of the tests.

You can combine different flags and commands to suit your needs. For a full list of reporters, see the Kaocha documentation.
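
For example, during development you might keep Kaocha watching for changes while stopping at the first failure and only running tests tagged :integration:

clj -M:test:kaocha --watch --fail-fast --focus-meta :integration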

Testing profiles

Select a test profile by providing the --profile <keyword> flag, for example clojure -M:kaocha --profile ci. Config files for the different profiles can be found under env/kaocha/resources.

ci
  • Purpose: For continuous integration. Runs a comprehensive set of tests with detailed reporting.
  • Test paths: test/clj/base, test/clj/unit, test/clj/integration, test/resources
  • Plugins: :kaocha.plugin/cloverage, :kaocha.plugin/profiling, :kaocha.plugin/gc-profiling, :kaocha.plugin/junit-xml
  • Reporter: kaocha.report/documentation
  • Output options: JUnit XML report (target/junit.xml), LCOV coverage report

dev
  • Purpose: For the development environment. Runs different sets of tests with minimal output for quick feedback.
  • Test paths: test/clj/unit, test/clj/integration, test/clj/base, test/clj/utils, test/resources
  • Plugins: None
  • Reporter: kaocha.report/dots
  • Output options: Simple console output

full
  • Purpose: For full test suite execution. Provides detailed coverage and profiling data.
  • Test paths: test/clj/unit, test/clj/integration, test/clj/base, test/clj/utils, test/resources
  • Plugins: :kaocha.plugin/cloverage, :kaocha.plugin/profiling, :kaocha.plugin/gc-profiling, :kaocha.plugin/junit-xml
  • Reporter: kaocha.report/documentation
  • Output options: JUnit XML report (target/junit.xml), LCOV and HTML coverage reports

Running code coverage

Code coverage is enabled by default in the :ci and :full profiles. This means that we get a coverage report when running the CI pipeline. See --profile for more information.

The coverage report is generated by Cloverage, managed by the kaocha-cloverage plugin. The report is written to the target/coverage directory.

To run with coverage enabled from the command line:

clj -M:test:kaocha --plugin cloverage

Testing in the REPL

Start the REPL with the test alias enabled, then run the tests:

user=> (use 'kaocha.repl)
user=> (run 'jobtech-taxonomy-api.test.graphql-test/graphql-test-1)
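
kaocha.repl can also run larger selections, for example a whole namespace or everything at once (a quick sketch; see the kaocha.repl docstrings for the full API):

user=> (run 'jobtech-taxonomy-api.test.graphql-test)
user=> (run-all)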

How to write an integration test

File and namespace

The tests and test resources reside in the test directory.

The test files are separated into two categories: unit and integration. The unit tests are for testing functions directly, while the integration tests are for testing calls through the API.

Test files are stored in test/clj/unit or test/clj/integration depending on what kind of test they contain. From that root they mirror the namespace they are testing. For example, the namespace jobtech-taxonomy.api.routes.services would have its unit tests in test/clj/unit/jobtech-taxonomy/api/routes/services_test.clj and its integration tests in test/clj/integration/jobtech-taxonomy/api/routes/services_test.clj. Sometimes when a module is large it can be tested in multiple files, for example test/clj/integration/jobtech-taxonomy/api/routes/services_test.clj and test/clj/unit/jobtech-taxonomy/api/routes/services_graphql_test.clj. This is generally an indication that the module should be split up.

You need to require [jobtech-taxonomy-api.test.test-utils :as util].

Define fixtures

Place one occurrence of this line in your test file: (test/use-fixtures :each util/fixture).
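
Putting these pieces together, a minimal test file skeleton could look roughly like this (the namespace below is only an illustration; it should mirror the file's location as described above):

(ns jobtech-taxonomy.api.routes.services-test
  (:require [clojure.test :as test]
            [jobtech-taxonomy-api.test.test-utils :as util]))

(test/use-fixtures :each util/fixture)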

Define a test which calls functions directly

Here is a simple example of a test which asserts a skill concept, and then checks for its existence.

First, require:

[jobtech-taxonomy-api.db.concept :as c]

Then write a test:

(test/deftest ^:concept-test-0 concept-test-0
  (test/testing "Test concept assertion."
    (c/assert-concept "skill" "cykla" "cykla")
    (let [found-concept (first (c/find-concept-by-preferred-term "cykla"))]
      (test/is (= "cykla" (get found-concept :preferred-label))))))
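
Since the test carries the ^:concept-test-0 metadata tag, it can be run in isolation with the --focus-meta flag described above:

clj -M:test:kaocha --focus-meta :concept-test-0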