Integration Testing

(2Q19)


This article discusses integration testing of eXist-db applications. It also covers recommendations for the configuration of automated test environments, and explains the minimum testing requirements for apps that are published under the eXist-db namespace.

It assumes that you are familiar with the XQSuite framework for unit testing, and the general strategies for designing tests in eXist-db.

Introduction

Creating a minimal automated test suite is possible with relatively little effort. It pays to take the need for testing into account when you start developing your application. This enables others to extend your program with new features, knowing that their changes don't break existing functionality. It also allows test-only contributions, helping you to gradually improve your test coverage. The following sections will walk you through the three main aspects of such a minimal test setup.

Building on a clean system

Before you start designing tests, you should automate your build process. This ensures that your application does not just work on your own system, and it catches some common errors early.

The examples in this article will use Travis CI as it is the most popular continuous integration (CI) service used by the eXist-db organization on GitHub. Other popular choices include AppVeyor, Jenkins, and CircleCI.

CI services typically require a small configuration file, instructing the service how to run your code on a clean virtual machine, without the risk of local files interfering. For Travis the required name of such a file is .travis.yml, and in its simplest form it would look like this:

# Tell Travis that we want Java.
language: java
dist: bionic

# This should be the minimal Java version required by eXist-db.
jdk:
  - openjdk8

# This makes the build command explicit.
install:
  - ant

For the correct way to create such a configuration file for other CI services, please consult their documentation. In all cases, since eXist-db is written in Java, your app should be built on a system that provides the minimum Java version required by eXist-db.

Most CI services will automatically detect your build tool and run the required command even if you don't specify it. In the above example, the app to be tested uses ant as its build tool; change ant to suit your needs, e.g. maven clean package, npm install, etc.

If you have multiple build targets for production and development, you should make sure that each build target is actually run by the CI service.
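
For instance, assuming the default ant target produces the production build and ant develop the development build used later in this article, a sketch of an install step that runs both could look like this:

# Run both build targets so the CI service exercises each of them.
install:
  - ant
  - ant develop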

You can extend this basic template according to your needs, e.g.: apps written in Java might want to run the build step on multiple Java versions (by adding - openjdk11), or to test building on different operating systems. You should consult your CI service's documentation for the list of available configuration options.
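
As a sketch, a build matrix that covers two Java versions and two operating systems might look like the following; check your CI service's documentation for which combinations are actually supported:

jdk:
  - openjdk8
  - openjdk11

# Build on both Linux and macOS.
os:
  - linux
  - osx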

Add a running eXist-db instance and install your app

The next step takes the result of your automated build process and installs it in a running eXist-db instance (external tools might want to talk to a running eXist-db instance in some other way). We are going to use eXist-db's Docker images for this, since Docker is supported by all CI services, and it tends to be the quickest way of getting an instance up and running. Let's extend the file created in the previous section.

language: java
dist: bionic

jdk:
  - openjdk8
  - openjdk11

# Tell Travis that we are using Docker.
services:
  - docker

# Always test against current and upcoming releases.
env:
  - img=existdb/existdb:latest
  - img=existdb/existdb:release

# Download the eXist-db image into the test VM
before_install:
  - docker pull $img
  - docker create  --name exist-ci -p 8080:8080 $img

install:
  - ant develop

# Take the .xar created above and deploy it in a clean eXist-db instance.
before_script:
  - docker cp ./build/*-dev.xar exist-ci:exist/autodeploy
  - docker start exist-ci
  # exist needs time
  - sleep 30
  - docker ps

The :release and :latest tags are specifically designed for use in CI environments. You can also specify an exact version (e.g., :4.4.0) to guard against regressions in backwards compatibility. The two tags used above ensure that your code is tested against both the most recent stable release and upcoming changes ahead of time.
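
For example, to also test against a pinned release alongside the two moving tags, you could add a third entry to the env matrix (any exact version tag published on Docker Hub will do):

env:
  - img=existdb/existdb:latest
  - img=existdb/existdb:release
  # Pin an exact version to guard against regressions.
  - img=existdb/existdb:4.4.0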

To actually install the app, we copy it into eXist-db's autodeploy folder, which ensures that any dependencies you declared for your app are installed as well. If you require more complex installation steps, you can find more examples and links in the docker-existdb readme.

So far, we have simply automated the basic steps of building and installing your app. This already catches some basic and particularly severe errors, but it is not a very realistic test of what users actually experience when they install your application. Before we can refine the way we simulate usage patterns, we first need to add the means to run actual tests within our CI environment.

Integrating unit tests into CI

Integration testing and unit testing go hand in hand; one without the other does not work well. If you are writing an application, you should already have unit tests for the functional components of your code. By running your unit tests inside your CI service, they become immediately visible to potential contributors, and you gain immediate feedback on every code change.

Important:

Tests that are invisible to other contributors because they are hidden away and have only ever been run on the original author's system are of very limited use.

As with the previous steps, there are different test runners to do this work for you, such as JUnit for Java, Mocha for JavaScript, or XQSuite for XQuery. To run your tests, we are going to leverage our build system's support for running unit tests (e.g.: npm test, mvn test, etc.):

language: java
dist: bionic

jdk:
  - openjdk8
  - openjdk11

services:
    - docker

env:
  - img=existdb/existdb:latest
  - img=existdb/existdb:release

before_install:
  - docker pull $img
  - docker create  --name exist-ci -p 8080:8080 $img

install:
  - ant develop

before_script:
  - docker cp ./build/*-dev.xar exist-ci:exist/autodeploy
  - docker start exist-ci
  # exist needs time
  - sleep 30
  - docker ps

# This makes the test command explicit.
script:
  - ant test

Just as with the build step, many CI services will execute this command automatically. But even in a simple case, making the test command explicit helps others to understand your code and to find your tests.

If you use more than one test runner, you can simply add additional test commands to the script parameter, as sketched below. In the case of our app, eXist-db is already running in the background, so it is also possible to run your XQSuite unit tests against it. You can see how this is configured for apps using the yeoman templates.
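
As a sketch, assuming your app also ships JavaScript unit tests that are run with npm test, the script section could simply list both commands:

script:
  - ant test
  - npm test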

How to write good unit tests is beyond the scope of this article. Whenever you are struggling with your integration test, however, you should ask yourself if what you are trying to achieve might not be better served by creating unit tests. Whichever solution works best for you, you should not rely on either unit or integration tests alone, and both should be integrated into your CI pipeline.

Testing your app in a controlled context

Unlike unit tests, which excel at testing individual functions and are quick to write and execute, integration tests focus on the complex interaction between your code and the larger environment. Just search for "2 unit tests, 0 integration tests" to see what we mean.

For eXist-db applications, integration tests will typically involve a browser, as we are trying to mimic the way a user interacts with our application from their system.

While our examples focus on browser testing, you can find shell-based integration tests in the test suite for building the Docker images we used earlier.

Common tools for browser testing include Cypress, Selenium, and WebDriver. As before, the choice is up to you; whichever tool you choose, it should be clearly documented so your contributors know how to adjust test cases for new features and how to maintain your tests. We focus on Cypress here, as it does not require any additional steps to configure a browser first.

If you need to perform cross-browser testing, you can take a look at services such as Sauce Labs.

You can simply execute the Cypress test command inside your CI test script after the unit test command we added earlier.

language: java
dist: bionic

jdk:
  - openjdk8
  - openjdk11

services:
    - docker

env:
  - img=existdb/existdb:latest
  - img=existdb/existdb:release

before_install:
  - docker pull $img
  - docker create  --name exist-ci -p 8080:8080 $img

install:
  - ant develop

before_script:
  - docker cp ./build/*-dev.xar exist-ci:exist/autodeploy
  - docker start exist-ci
  # exist needs time
  - sleep 30
  - docker ps

script:
  - ant test
  - npx cypress run
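
Note that npx cypress run relies on Cypress being available on the CI machine. Assuming Cypress is declared as a dev-dependency in your app's package.json, you can make sure it is installed ahead of time by extending the install phase, for example:

install:
  - ant develop
  # Install node dev-dependencies, including Cypress (assumes a package.json in your repository).
  - npm ci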

With Cypress you write your tests in the same fashion as you would with Mocha unit tests; however, you now address the rendered document inside a browser instead of individual JavaScript functions.

describe('The dashboard', function() {
  it('should load', function() {
    // Go to Dashboard
    cy.visit('/dashboard/index.html')
  })
  // Click the login button
  describe('login', function() {
    before(function() {
      cy.get('#user_label').click()

      cy.get('#dijit_form_ValidationTextBox_0').type('admin')

      cy.get('#dijit_form_Button_0_label').click()
    })

    // check if Collection browser is there
    it('should see Collection Browser', function() {
      // click Collection Browser
      cy.contains('Collections').click()

      // more tests go here …

      // close the window
      cy.get('#inlineClose').click()
    })
  })
})

The above example opens a page (the dashboard) in the browser, logs in, and closes the window it just opened. You can do many more things, but this example is meant to provide a good starting point for creating your first integration test. If the page throws errors or the elements your test expects are not rendered, Cypress will report an error and your tests will fail. To check the syntax of these commands, and to see many more examples, please visit the Cypress documentation.

Now that we have built our app on a clean system, executed its unit tests, and opened the start page of our freshly installed app in a clean eXist-db instance, we have achieved a basic smoke test: we switched it on, and there was no smoke. Proper testing can now commence. Obviously you might want to visit multiple pages, compare screenshots to avoid visual regressions, compare rendering across multiple browsers, etc.

All of these are excellent ideas, and with the basics in place they hopefully no longer seem such a daunting task. Depending on your own specific requirements, we encourage you to browse other eXist-db repositories in addition to the documentation of your CI service and test frameworks. Chances are someone has already created a solid test similar to what you need.