SKM IT World

Just another blog about IT


Test Coverage Reports For Maven Projects In SonarQube 8.3.x

Some years ago, I wrote a blog post about how to generate test reports in SonarQube separated into a report for unit tests and one for integration tests. Since SonarQube 6.2, the test report is no longer separated into these categories (see SonarQube’s blog post). SonarQube merges all test reports into one report with an overall coverage. So how do you configure the JaCoCo Maven Plugin if you have separated your tests into unit tests (run by the Maven Surefire Plugin) and integration tests (run by the Maven Failsafe Plugin) in your Maven project?

In the following sections, a solution is presented that meets the following criteria:

  • Maven is used as build tool.
  • The project can be a multi module project.
  • Unit tests and integration tests are part of each module.
  • Test coverage is measured by JaCoCo Maven Plugin.

The road map for the next sections is as follows: first, the Maven project structure for the separation of unit and integration tests is shown. Then the Maven project configuration for separate unit test runs and integration test runs is shown. After that, we have a look at the Maven project configuration for the test report generation that covers unit and integration tests. At the end, SonarQube’s configuration for the test report visualization in SonarQube’s dashboard is shown.

Maven Project Structure

At first, let’s look at what a default Maven project structure looks like for a single-module project.

my-app
├── pom.xml
├── src
│   ├── main
│   │   └── java
│   └── test
│       └── java

The directory src/main/java contains the production source code and the directory src/test/java contains the test source code. We could put unit tests and integration tests together in this directory, but we want to separate these two types of tests into separate directories. Therefore, we add a new directory called src/it/java. The unit tests are put in the directory src/test/java and the integration tests are put in the directory src/it/java, so the new project structure looks like the following one.

my-app
├── pom.xml
├── src
│   ├── it
│   │   └── java
│   ├── main
│   │   └── java
│   └── test
│       └── java

Unit And Integration Test Runs

Fortunately, the unit test run configuration is part of Maven’s default project configuration. Maven runs these tests automatically if the following criteria are met:

  • The tests are in the directory src/test/java and
  • the test class name either starts with Test or ends with Test or TestCase.

Maven runs these tests during Maven’s build lifecycle phase test.
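To illustrate the naming convention, here is a minimal sketch of a unit test class that Surefire picks up because of its name (package, class and assertion are made up for this example and assume JUnit Jupiter is on the test class path):

package com.example;

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Lives in src/test/java and ends with "Test", so the Maven Surefire Plugin
// runs it automatically during the test phase.
class GreetingServiceTest {

    @Test
    void greeting() {
        assertEquals("Hello Alice", "Hello " + "Alice");
    }
}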

The integration test run configuration has to be done manually. There are Maven plugins that can help. We want the following criteria to be met:

  • integration tests are stored in the directory src/it/java and
  • the integration test class name either starts with IT or ends with IT or ITCase and
  • integration tests run during Maven’s build lifecycle phase integration-test.

First, Maven has to know that it should add the directory src/it/java to its test class path. Here, the Build Helper Maven Plugin can help: it adds the directory src/it/java to the test class path.

            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>build-helper-maven-plugin</artifactId>
                <version>3.1.0</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>add-test-source</goal>
                            <goal>add-test-resource</goal>
                        </goals>
                        <configuration>
                            <sources>
                                <source>src/it/java</source>
                            </sources>
                            <resources>
                                <resource>
                                    <directory>src/it/resources</directory>
                                </resource>
                            </resources>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

The above code snippet has to be inserted into the section <project><build><plugins> of the project root POM.

Maven’s build lifecycle contains a phase called integration-test. In this phase, we want to run the integration tests. Fortunately, the Maven Failsafe Plugin’s goal integration-test is bound to this phase automatically when the plugin is set up in the POM. If you want the build to fail when the integration tests fail, the goal verify also has to be added to the POM:

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-failsafe-plugin</artifactId>
                <version>3.0.0-M4</version>
                <configuration>
                    <encoding>${project.build.sourceEncoding}</encoding>
                </configuration>
                <executions>
                    <execution>
                        <goals>
                            <goal>integration-test</goal>
                            <goal>verify</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>

Again, the above code snippet has to be inserted into the section <project><build><plugins> of the project root POM. Then the Maven Failsafe Plugin runs the integration tests automatically when their class names either start with IT or end with IT or ITCase.
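The counterpart for an integration test, picked up by the Maven Failsafe Plugin because of its name, could look like this minimal sketch (again, the package and class are made up and assume JUnit Jupiter is on the test class path):

package com.example;

import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

// Lives in src/it/java and ends with "IT", so the Maven Failsafe Plugin
// runs it automatically during the integration-test phase.
class GreetingServiceIT {

    @Test
    void greetingServiceIsReachable() {
        assertTrue(true); // placeholder for a real integration check
    }
}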

Test Report Generation

We want to use the JaCoCo Maven Plugin for the test report generation. It should generate test reports for the unit tests and for the integration tests. Therefore, the plugin has two separate agents that have to be prepared; they record the coverage during the test runs. Maven’s build lifecycle contains its own preparation phases before the test phases (test and integration-test). The preparation phase for the test phase is called process-test-classes and the preparation phase for the integration-test phase is called pre-integration-test. JaCoCo binds its agents to these phases automatically when its goals prepare-agent and prepare-agent-integration are set up in the POM. But this is not enough: JaCoCo also has to create the reports, so that SonarQube can read them for the visualization. Therefore, we have to add the goals report and report-integration to the POM:

            <plugin>
                <groupId>org.jacoco</groupId>
                <artifactId>jacoco-maven-plugin</artifactId>
                <version>0.8.5</version>
                <executions>    
                    <execution>
                        <goals>  
                            <goal>prepare-agent</goal>
                            <goal>prepare-agent-integration</goal>
                            <goal>report</goal>
                            <goal>report-integration</goal>
                        </goals>  
                    </execution>
                </executions>  
            </plugin>

Again, it is part of the section <project><build><plugins>.

Now we can run mvn verify, and our project is built including the unit and integration tests and the generation of two test reports (by default, the JaCoCo reports end up under target/site/jacoco and target/site/jacoco-it).

SonarQube Test Report Visualization

Now we want to visualize our test reports in SonarQube. Therefore, we have to run the Sonar Maven Plugin (command mvn sonar:sonar) in our project after a successful build. So that the Sonar Maven Plugin knows where to upload the reports, we have to configure our SonarQube instance in ~/.m2/settings.xml:

    <profile>
      <id>sonar</id>
      <activation>
        <activeByDefault>true</activeByDefault>
      </activation>
      <properties>
        <!-- Optional URL to server. Default value is http://localhost:9000 -->
        <sonar.host.url>http://localhost:9000</sonar.host.url>
      </properties>
    </profile>

When we open our project in the SonarQube dashboard, we see the overall test coverage report.

Summary

This blog post describes how to generate test reports for a Maven build if unit tests and integration tests run separately. On GitHub, I host a sample project that demonstrates all configuration steps. As a technical environment, I use

  • Maven 3.6.3
  • Maven Plugins:
    • Maven Surefire Plugin
    • Maven Failsafe Plugin
    • Build Helper Maven Plugin
    • JaCoCo Maven Plugin
    • Sonar Maven Plugin
  • SonarQube 8.3.1
  • Java 11

Links

  1. JaCoCo Maven Plugin project site
  2. Maven Failsafe Plugin project site
  3. Build Helper Maven Plugin project site
  4. SonarQube documentation about Test Coverage in common
  5. A sample Maven project on GitHub



Using Testcontainers in Spring Boot Tests For Database Integration Tests

In this blog post, I’d like to demonstrate how I integrate Testcontainers in Spring Boot tests for running integration tests with a database. I’m not using Testcontainers’ Spring Boot modules; how it works with them, I will show in a separate blog post. All samples can be found on GitHub.

Why Testcontainers?

Testcontainers is a library that helps to integrate infrastructure components like databases into integration tests, based on Docker containers. It helps to avoid writing integrated tests, the kind of tests that pass or fail based on the correctness of another system. With Testcontainers, I have control over these dependent systems.

Introducing the domain

The following samples show different approaches for saving hero objects in a database through different repository implementations, and what the corresponding tests could look like.

package com.github.sparsick.testcontainerspringboot.hero.universum;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import java.util.Objects;

@Entity
public class Hero {

    @Id
    @GeneratedValue
    private Long id;
    private String name;
    private String city;
    private ComicUniversum universum;

    // no-argument constructor required by JPA
    protected Hero() {
    }

    public Hero(String name, String city, ComicUniversum universum) {
        this.name = name;
        this.city = city;
        this.universum = universum;
    }

    public String getName() {
        return name;
    }

    public String getCity() {
        return city;
    }

    public ComicUniversum getUniversum() {
        return universum;
    }
}
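The later tests compare Hero instances with assertThat(heros).contains(new Hero(...)), which only works if Hero has a value-based equals. The import of java.util.Objects above suggests that equals and hashCode were trimmed from the listing; a sketch of how they could look inside the Hero class (the choice of fields is an assumption):

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        Hero hero = (Hero) o;
        return Objects.equals(name, hero.name)
                && Objects.equals(city, hero.city)
                && universum == hero.universum;
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, city, universum);
    }

ComicUniversum itself is a plain enum; its constant DC_COMICS appears in the tests below, any further constants are up to the domain.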

All further repositories are part of a Spring Boot web application, so at the end of this blog post I will demonstrate how to write a test for the whole web application including a database. Let’s start with an easy sample: a repository based on JDBC.

Testing Repository Based on JDBC

Assume we have the following repository implementation based on JDBC. It has two methods: one for adding a hero to the database and one for getting all heroes from the database.

package com.github.sparsick.testcontainerspringboot.hero.universum;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

import javax.sql.DataSource;
import java.util.Collection;

@Repository
public class HeroClassicJDBCRepository {

    private final JdbcTemplate jdbcTemplate;

    public HeroClassicJDBCRepository(DataSource dataSource) {
        jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public void addHero(Hero hero) {
        jdbcTemplate.update("insert into hero (city, name, universum) values (?,?,?)",
                hero.getCity(), hero.getName(), hero.getUniversum().name());

    }

    public Collection<Hero> allHeros() {
        return jdbcTemplate.query("select * From hero",
                (resultSet, i) -> new Hero(resultSet.getString("name"),
                                            resultSet.getString("city"),
                                            ComicUniversum.valueOf(resultSet.getString("universum"))));
    }

}

For this repository, we can write a plain JUnit 5 test without loading a Spring application context. First of all, we have to set up the dependencies to the test libraries, in this case JUnit 5 and Testcontainers. As build tool, I use Maven. Both test libraries provide a so-called BOM (“bill of materials”) that helps to avoid version mismatches between the used dependencies. As database, I want to use MySQL. Therefore, I use Testcontainers’ module mysql in addition to the core module testcontainers. It provides a predefined MySQL container. To simplify the container setup in JUnit 5 test code, Testcontainers provides a JUnit 5 module, junit-jupiter.

    <dependencies>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>testcontainers</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>mysql</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.junit</groupId>
                <artifactId>junit-bom</artifactId>
                <version>${junit.jupiter.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.testcontainers</groupId>
                <artifactId>testcontainers-bom</artifactId>
                <version>${testcontainers.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

Now, we have everything to write the first test.

package com.github.sparsick.testcontainerspringboot.hero.universum;

import ...

@Testcontainers
class HeroClassicJDBCRepositoryIT {
    @Container
    private MySQLContainer database = new MySQLContainer();

    private HeroClassicJDBCRepository repositoryUnderTest;

    @Test
    void testInteractionWithDatabase() {
        ScriptUtils.runInitScript(new JdbcDatabaseDelegate(database, ""),"ddl.sql");
        repositoryUnderTest = new HeroClassicJDBCRepository(dataSource());

        repositoryUnderTest.addHero(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));

        Collection<Hero> heroes = repositoryUnderTest.allHeros();

        assertThat(heroes).hasSize(1);
    }

    @NotNull
    private DataSource dataSource() {
        MysqlDataSource dataSource = new MysqlDataSource();
        dataSource.setUrl(database.getJdbcUrl());
        dataSource.setUser(database.getUsername());
        dataSource.setPassword(database.getPassword());
        return dataSource;
    }
}

Let’s have a look at how the database is prepared for the test. First, we annotate the test class with @Testcontainers. Behind this annotation hides a JUnit 5 extension provided by Testcontainers. It checks whether Docker is installed on the machine and starts and stops the containers during the test. But how does Testcontainers know which container it should start? Here, the annotation @Container helps. It marks containers that should be managed by the Testcontainers extension, in this case a MySQLContainer provided by the Testcontainers module mysql. This class provides a MySQL Docker container and handles things like setting up the database user, recognizing when the database is ready to use, etc. As soon as the database is ready to use, the database schema has to be set up. Testcontainers can also provide support here: ScriptUtils.runInitScript(new JdbcDatabaseDelegate(database, ""),"ddl.sql"); ensures that the schema is set up as defined in the SQL script ddl.sql.

-- ddl.sql
create table hero (id bigint AUTO_INCREMENT PRIMARY KEY, city varchar(255), name varchar(255), universum varchar(255)) engine=InnoDB

Now we are ready to set up our repository under test. For that, we need the database connection information for the DataSource object. Under the hood, Testcontainers searches for an available port and binds the container to this free port. This port number is different on every container start via Testcontainers. Furthermore, it configures the database in the container with a user and password. Therefore, we have to ask the MySQLContainer object for the database credentials and the JDBC URL. With this information, we can set up the repository under test (repositoryUnderTest = new HeroClassicJDBCRepository(dataSource());) and finish the test.

If you run the test, you might get the following error message:

17:18:50.990 [ducttape-1] DEBUG com.github.dockerjava.core.command.AbstrDockerCmd - Cmd: org.testcontainers.dockerclient.transport.okhttp.OkHttpDockerCmdExecFactory$1@1adc57a8
17:18:51.492 [ducttape-1] DEBUG org.testcontainers.dockerclient.DockerClientProviderStrategy - Pinging docker daemon...
17:18:51.493 [ducttape-1] DEBUG com.github.dockerjava.core.command.AbstrDockerCmd - Cmd: org.testcontainers.dockerclient.transport.okhttp.OkHttpDockerCmdExecFactory$1@3e5b3a3b
17:18:51.838 [main] DEBUG org.testcontainers.dockerclient.DockerClientProviderStrategy - UnixSocketClientProviderStrategy: failed with exception InvalidConfigurationException (ping failed). Root cause LastErrorException ([111] Verbindungsaufbau abgelehnt)
17:18:51.851 [main] DEBUG org.rnorth.tcpunixsocketproxy.ProxyPump - Listening on localhost/127.0.0.1:41039 and proxying to /var/run/docker.sock
17:18:51.996 [ducttape-0] DEBUG org.testcontainers.dockerclient.DockerClientProviderStrategy - Pinging docker daemon...
17:18:51.997 [ducttape-1] DEBUG org.testcontainers.dockerclient.DockerClientProviderStrategy - Pinging docker daemon...
17:18:51.997 [ducttape-0] DEBUG com.github.dockerjava.core.command.AbstrDockerCmd - Cmd: org.testcontainers.dockerclient.transport.okhttp.OkHttpDockerCmdExecFactory$1@5d43d23e
17:18:51.997 [ducttape-1] DEBUG com.github.dockerjava.core.command.AbstrDockerCmd - Cmd: org.testcontainers.dockerclient.transport.okhttp.OkHttpDockerCmdExecFactory$1@7abf08d2
17:18:52.002 [tcp-unix-proxy-accept-thread] DEBUG org.rnorth.tcpunixsocketproxy.ProxyPump - Accepting incoming connection from /127.0.0.1:41998
17:19:01.866 [main] DEBUG org.testcontainers.dockerclient.DockerClientProviderStrategy - ProxiedUnixSocketClientProviderStrategy: failed with exception InvalidConfigurationException (ping failed). Root cause TimeoutException (null)
17:19:01.870 [main] ERROR org.testcontainers.dockerclient.DockerClientProviderStrategy - Could not find a valid Docker environment. Please check configuration. Attempted configurations were:
17:19:01.872 [main] ERROR org.testcontainers.dockerclient.DockerClientProviderStrategy -     EnvironmentAndSystemPropertyClientProviderStrategy: failed with exception InvalidConfigurationException (ping failed)
17:19:01.873 [main] ERROR org.testcontainers.dockerclient.DockerClientProviderStrategy -     EnvironmentAndSystemPropertyClientProviderStrategy: failed with exception InvalidConfigurationException (ping failed)
17:19:01.874 [main] ERROR org.testcontainers.dockerclient.DockerClientProviderStrategy -     UnixSocketClientProviderStrategy: failed with exception InvalidConfigurationException (ping failed). Root cause LastErrorException ([111] Verbindungsaufbau abgelehnt)
17:19:01.875 [main] ERROR org.testcontainers.dockerclient.DockerClientProviderStrategy -     ProxiedUnixSocketClientProviderStrategy: failed with exception InvalidConfigurationException (ping failed). Root cause TimeoutException (null)
17:19:01.875 [main] ERROR org.testcontainers.dockerclient.DockerClientProviderStrategy - As no valid configuration was found, execution cannot continue
17:19:01.900 [main] DEBUG 🐳 [mysql:5.7.22] - mysql:5.7.22 is not in image name cache, updating...
Mai 01, 2020 5:19:01 NACHM. org.junit.jupiter.engine.execution.JupiterEngineExecutionContext close
SEVERE: Caught exception while closing extension context: org.junit.jupiter.engine.descriptor.MethodExtensionContext@2e6a5539
org.testcontainers.containers.ContainerLaunchException: Container startup failed
	at org.testcontainers.containers.GenericContainer.doStart(GenericContainer.java:322)
	at org.testcontainers.containers.GenericContainer.start(GenericContainer.java:302)
	at org.testcontainers.junit.jupiter.TestcontainersExtension$StoreAdapter.start(TestcontainersExtension.java:173)
	at org.testcontainers.junit.jupiter.TestcontainersExtension$StoreAdapter.access$100(TestcontainersExtension.java:160)
	at org.testcontainers.junit.jupiter.TestcontainersExtension.lambda$null$3(TestcontainersExtension.java:50)
	at org.junit.jupiter.engine.execution.ExtensionValuesStore.lambda$getOrComputeIfAbsent$0(ExtensionValuesStore.java:81)
	at org.junit.jupiter.engine.execution.ExtensionValuesStore$MemoizingSupplier.get(ExtensionValuesStore.java:182)
	at org.junit.jupiter.engine.execution.ExtensionValuesStore.closeAllStoredCloseableValues(ExtensionValuesStore.java:58)
	at org.junit.jupiter.engine.descriptor.AbstractExtensionContext.close(AbstractExtensionContext.java:73)
	at org.junit.jupiter.engine.execution.JupiterEngineExecutionContext.close(JupiterEngineExecutionContext.java:53)
	at org.junit.jupiter.engine.descriptor.JupiterTestDescriptor.cleanUp(JupiterTestDescriptor.java:222)
	at org.junit.jupiter.engine.descriptor.JupiterTestDescriptor.cleanUp(JupiterTestDescriptor.java:57)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$cleanUp$9(NodeTestTask.java:151)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.cleanUp(NodeTestTask.java:151)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:83)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1540)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1540)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
	at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
	at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
	at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
	at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32)
	at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
	at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:229)
	at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$6(DefaultLauncher.java:197)
	at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:211)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:191)
	at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:128)
	at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:69)
	at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
	at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
	at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
Caused by: org.testcontainers.containers.ContainerFetchException: Can't get Docker image: RemoteDockerImage(imageNameFuture=java.util.concurrent.CompletableFuture@539d019[Completed normally], imagePullPolicy=DefaultPullPolicy(), dockerClient=LazyDockerClient.INSTANCE)
	at org.testcontainers.containers.GenericContainer.getDockerImageName(GenericContainer.java:1265)
	at org.testcontainers.containers.GenericContainer.logger(GenericContainer.java:600)
	at org.testcontainers.containers.GenericContainer.doStart(GenericContainer.java:311)
	... 47 more
Caused by: java.lang.IllegalStateException: Previous attempts to find a Docker environment failed. Will not retry. Please see logs and check configuration
	at org.testcontainers.dockerclient.DockerClientProviderStrategy.getFirstValidStrategy(DockerClientProviderStrategy.java:78)
	at org.testcontainers.DockerClientFactory.client(DockerClientFactory.java:115)
	at org.testcontainers.LazyDockerClient.getDockerClient(LazyDockerClient.java:14)
	at org.testcontainers.LazyDockerClient.inspectImageCmd(LazyDockerClient.java:12)
	at org.testcontainers.images.LocalImagesCache.refreshCache(LocalImagesCache.java:42)
	at org.testcontainers.images.AbstractImagePullPolicy.shouldPull(AbstractImagePullPolicy.java:24)
	at org.testcontainers.images.RemoteDockerImage.resolve(RemoteDockerImage.java:62)
	at org.testcontainers.images.RemoteDockerImage.resolve(RemoteDockerImage.java:25)
	at org.testcontainers.utility.LazyFuture.getResolvedValue(LazyFuture.java:20)
	at org.testcontainers.utility.LazyFuture.get(LazyFuture.java:27)
	at org.testcontainers.containers.GenericContainer.getDockerImageName(GenericContainer.java:1263)
	... 49 more



org.testcontainers.containers.ContainerLaunchException: Container startup failed

This error message means that the Docker daemon is not running. After ensuring that the Docker daemon is running, the test run succeeds.

There are a lot of debug messages in the console output. The logging output in tests can be configured via a logback.xml file in src/test/resources:

<?xml version="1.0" encoding="UTF-8" ?>
<configuration>
    <include resource="org/springframework/boot/logging/logback/base.xml"/>
    <root level="info">
        <appender-ref ref="CONSOLE" />
    </root>
</configuration>

The Spring Boot documentation about logging recommends using logback-spring.xml as the configuration file. But plain JUnit 5 tests don’t recognize it, only tests annotated with @SpringBootTest do. logback.xml is used by both kinds of tests.

Testing Repository based on JPA Entity Manager

Now we want to implement a repository based on JPA with a classic entity manager. Assume we have the following implementation with three methods: adding heroes to the database, finding heroes by search criteria and getting all heroes from the database. The entity manager is configured by Spring’s application context (@PersistenceContext is responsible for that).

package com.github.sparsick.testcontainerspringboot.hero.universum;

import ...

@Repository
public class HeroClassicJpaRepository {

    @PersistenceContext
    private EntityManager em;

    @Transactional
    public void addHero(Hero hero) {
        em.persist(hero);
    }

    public Collection<Hero> allHeros() {
        return em.createQuery("Select hero FROM Hero hero", Hero.class).getResultList();
    }

    public Collection<Hero> findHerosBySearchCriteria(String searchCriteria) {
        return em.createQuery("SELECT hero FROM Hero hero " +
                        "where hero.city LIKE :searchCriteria OR " +
                        "hero.name LIKE :searchCriteria OR " +
                        "hero.universum = :searchCriteria",
                Hero.class)
                .setParameter("searchCriteria", searchCriteria).getResultList();
    }

}

As JPA implementation, we choose Hibernate, with MySQL as the database. We have to configure which dialect Hibernate should use.

# src/main/resources/application.properties
spring.jpa.database-platform=org.hibernate.dialect.MySQL8Dialect

In application.properties you also configure the database connection etc.

To set up the entity manager correctly in a test, we have to run the test with an application context, so that the entity manager is configured by Spring.

Spring Boot brings some test support classes. Therefore, we have to add a further test dependency to the project.

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

This starter also includes the JUnit Jupiter dependency and dependencies of other test libraries, so you can remove these dependencies from your own dependency declaration if you want to.

Now, we have everything for writing the test.

package com.github.sparsick.testcontainerspringboot.hero.universum;

import ...

@SpringBootTest
@Testcontainers
class HeroClassicJpaRepositoryIT {
    @Container
    private static MySQLContainer database = new MySQLContainer();

    @Autowired
    private HeroClassicJpaRepository repositoryUnderTest;

    @Test
    void findHeroByCriteria(){
        repositoryUnderTest.addHero(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));

        Collection<Hero> heros = repositoryUnderTest.findHerosBySearchCriteria("Batman");

        assertThat(heros).contains(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));
    }
   
    @DynamicPropertySource
    static void databaseProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", database::getJdbcUrl);
        registry.add("spring.datasource.username", database::getUsername);
        registry.add("spring.datasource.password", database::getPassword);
    }
}

The test class is annotated with several annotations. The first one is @SpringBootTest, which starts the Spring application context during the test. The last one is @Testcontainers. We already know this annotation from the previous test: it is a JUnit 5 extension that manages starting and stopping the Docker containers during the test. As in the JDBC test above, we annotate the database container private static MySQLContainer database = new MySQLContainer(); with @Container. It marks that this container should be managed by Testcontainers. There is one small difference to the JDBC setup above: here, MySQLContainer database is static, whereas in the JDBC setup it is a normal instance field. It has to be static because the container has to start before the application context starts, so that we have a chance to pass the database connection configuration to the application context. The static method databaseProperties is responsible for this. It is important that this method is annotated with @DynamicPropertySource. It overrides the application context configuration during the start phase. In our case, we want to override the database connection configuration with the information that we get from the database container object managed by Testcontainers. The last step is to set up the database schema in the database. Here, JPA can help: it can create the database schema automatically. You have to configure it with

# src/test/resources/application.properties
spring.jpa.hibernate.ddl-auto=update

Now, we can inject the repository into the test (@Autowired private HeroClassicJpaRepository repositoryUnderTest). This repository is configured by Spring and ready to be tested.

Testing Repository based on Spring Data JPA

Today, it is common in a Spring Boot application to use JPA in combination with Spring Data, so we rewrite our repository to use Spring Data JPA instead of plain JPA. The result is an interface that extends Spring Data’s CrudRepository, so we get all basic operations like save, delete, update and find by id. For the search-by-criteria functionality, we have to define a method with a @Query annotation that contains a JPA query.

package com.github.sparsick.testcontainerspringboot.hero.universum;

import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.CrudRepository;
import org.springframework.data.repository.query.Param;

import java.util.List;

public interface HeroSpringDataJpaRepository extends CrudRepository<Hero, Long> {

    @Query("SELECT hero FROM Hero hero where hero.city LIKE :searchCriteria OR hero.name LIKE :searchCriteria OR hero.universum = :searchCriteria")
    List<Hero> findHerosBySearchCriteria(@Param("searchCriteria") String searchCriteria);
}

As in the classic JPA sample above, here too we have to configure which SQL dialect our chosen JPA implementation Hibernate should use and how the database schema should be set up.

The same applies to the test configuration: again we need a test with a Spring application context to configure the repository correctly for the test. But here we don’t need to start the whole application context with @SpringBootTest. Instead, we use @DataJpaTest. This annotation starts an application context containing only the beans that are needed for the persistence layer.

package com.github.sparsick.testcontainerspringboot.hero.universum;

import ...

@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Testcontainers
class HeroSpringDataJpaRepositoryIT {
    @Container
    private static MySQLContainer database = new MySQLContainer();

    @Autowired
    private HeroSpringDataJpaRepository repositoryUnderTest;

    @Test
    void findHerosBySearchCriteria() {
        repositoryUnderTest.save(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));

        Collection<Hero> heros = repositoryUnderTest.findHerosBySearchCriteria("Batman");

        assertThat(heros).hasSize(1).contains(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));
    }

    @DynamicPropertySource
    static void databaseProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url",database::getJdbcUrl);
        registry.add("spring.datasource.username", database::getUsername);
        registry.add("spring.datasource.password", database::getPassword);
    }
}

@DataJpaTest starts an in-memory database by default. But we want a containerized database, provided by Testcontainers, to be used instead. Therefore, we have to add the annotation @AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE). This disables starting an in-memory database. The remaining test configuration is the same as in the plain JPA test above.

Testing Repositories While Reusing a Database

With an increasing number of tests, it becomes noticeable that each test takes quite a long time, because a new database is started and initialized every time. One idea is to reuse the database across tests. Here the Single Container Pattern can help: a database is started and initialized once before all tests run. For that, each test that needs a database extends an abstract class that is responsible for starting and initializing the database once before all tests run.

package com.github.sparsick.testcontainerspringboot.hero.universum;

import ...


public abstract class DatabaseBaseTest {
    static final MySQLContainer DATABASE = new MySQLContainer();

    static {
        DATABASE.start();
    }

    @DynamicPropertySource
    static void databaseProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", DATABASE::getJdbcUrl);
        registry.add("spring.datasource.username", DATABASE::getUsername);
        registry.add("spring.datasource.password", DATABASE::getPassword);
    }
}

In this abstract class, we configure the database that is started once for all tests that extend this class, and the application context with that database. Please note that we don’t use Testcontainers’ annotations here, because those annotations take care of starting and stopping the container for every test class. That is what we want to avoid. Therefore, we start the database ourselves. We don’t need to take care of stopping the database: Testcontainers’ side-car container ryuk takes care of that.

Now, each test class that needs a database extends this abstract class. The only thing we have to configure is how the application context should be initialized. That means: when you need the whole application context, use @SpringBootTest; when you only need the persistence layer, use @DataJpaTest with @AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE).

@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
class HeroSpringDataJpaRepositoryReuseDatabaseIT extends DatabaseBaseTest {

    @Autowired
    private HeroSpringDataJpaRepository repositoryUnderTest;

    @Test
    void findHerosBySearchCriteria() {
        repositoryUnderTest.save(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));

        Collection<Hero> heros = repositoryUnderTest.findHerosBySearchCriteria("Batman");

        assertThat(heros).contains(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));
    }
}

Testing the whole Web Application including Database

Now we want to test our whole application, from controller to database. The controller implementation looks like this:

@RestController
public class HeroRestController {

    private final HeroSpringDataJpaRepository heroRepository;

    public HeroRestController(HeroSpringDataJpaRepository heroRepository) {
        this.heroRepository = heroRepository;
    }

    @GetMapping("heros")
    public Iterable<Hero> allHeros(String searchCriteria) {
        if (searchCriteria == null || searchCriteria.equals("")) {
            return heroRepository.findAll();

        }
        return heroRepository.findHerosBySearchCriteria(searchCriteria);
    }

    @PostMapping("hero")
    public void hero(@RequestBody Hero hero) {
        heroRepository.save(hero);
    }
}

The test class that tests the whole way from database to controller looks like this:

@SpringBootTest
@AutoConfigureMockMvc
@Testcontainers
class HeroRestControllerIT {

    @Container
    private static MySQLContainer database = new MySQLContainer();

    @Autowired
    private MockMvc mockMvc;

    @Autowired
    private HeroSpringDataJpaRepository heroRepository;

    @Test
    void allHeros() throws Exception {
        heroRepository.save(new Hero("Batman", "Gotham City", ComicUniversum.DC_COMICS));
        heroRepository.save(new Hero("Superman", "Metropolis", ComicUniversum.DC_COMICS));

        mockMvc.perform(get("/heros"))
                .andExpect(status().isOk())
                .andExpect(jsonPath("$[*].name", containsInAnyOrder("Batman", "Superman")));
    }

    @DynamicPropertySource
    static void databaseProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", database::getJdbcUrl);
        registry.add("spring.datasource.username", database::getUsername);
        registry.add("spring.datasource.password", database::getPassword);
    }
}

The test setup for the database and the application is known from the sections above. One thing is different: we add MockMvc support with @AutoConfigureMockMvc. This helps to write tests through the HTTP layer.

Of course, you can also use the single container pattern here, by extending the abstract class DatabaseBaseTest.
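A minimal sketch of that variant could look like this (the class name is made up; imports are omitted as in the listings above, and the database connection comes from DatabaseBaseTest):

@SpringBootTest
@AutoConfigureMockMvc
class HeroRestControllerReuseDatabaseIT extends DatabaseBaseTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    void allHerosEndpointAnswers() throws Exception {
        // No @Testcontainers/@Container needed here; the shared database
        // is started once by DatabaseBaseTest.
        mockMvc.perform(get("/heros"))
                .andExpect(status().isOk());
    }
}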

Conclusion and Overview

This blog post shows how we can write tests for some persistence layer implementations in Spring Boot with Testcontainers. We also saw how to reuse a database instance for several tests and how to write a test for the whole web application from controller to database. All code snippets can be found on GitHub. In a further blog post, I will show how to write tests with Testcontainers’ Spring Boot modules.

Do you have other ideas for writing tests for the persistence layer? Please let me know and write a comment.

Further Information

  1. Concept of BOM “bill of material”
  2. Testcontainers
  3. Spring Boot Documentation – Logging
  4. Spring Boot Documentation – Auto-configured Data JPA Tests
  5. Testcontainers – Single Container Pattern
  6. Spring Boot Documentation – MockMVC
  7. Full example in GitHub repository



Test Environment for Ansible on a Windows System Without Linux Subsystem Support

Some weeks ago, I gave a workshop about Ansible and was asked how to set up a local test environment for Ansible. The requirement was that this test environment can run on Windows 10 without Linux subsystem support. I don’t know why, but it was not my first customer where Windows 10 could not be used with the Linux subsystem (don’t ask me about the reasons). So I recommend the following setup.

Idea

Ansible itself requires a Linux-based system as the control machine (the so-called control node). The machine that is provisioned (the so-called managed node) can be either Linux- or Windows-based. In the following example, the managed node is Linux-based. So when I have to develop on a Windows machine, I install two Linux-based virtual machines: one for calling Ansible’s playbooks and a second one for testing the provisioning. I set up the virtual machines with VirtualBox and Vagrant. Vagrant allows me to share the playbooks easily between host and virtual machines. That means I can develop the playbooks on the Windows system and the virtual machines can run headless. The next section shows you how to set up this tool chain.

Tool Chain Setup

At first, install VirtualBox and Vagrant on your machine. I additionally use Babun, a Windows shell based on Cygwin and oh-my-zsh, for a better shell experience on Windows, but this isn’t necessary. Then, go to the directory (let’s call it ansible-workspace) where your Ansible playbooks are located. Create a Vagrant configuration file there with the command vagrant init:

$ cd ansible-workspace
$ vagrant init
$ tree  
.
├── deploy-app.yml
├── host_vars
│   └── ubuntu-server
├── inventories
│   ├── production
│   └── test
├── roles
│   ├── deploy-on-tomcat
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       ├── cleanup-webapp.yml
│   │       ├── deploy-webapp.yml
│   │       ├── main.yml
│   │       ├── start-tomcat.yml
│   │       └── stop-tomcat.yml
└── Vagrantfile

Now, we have to choose a so-called Vagrant box on Vagrant Cloud. A box is the package format for a Vagrant environment. It depends on the provider and the operating system that you choose to use. In our case, it is a VirtualBox VM image based on a minimal Ubuntu 18.04 system (the box name is bento/ubuntu-18.04). We have to configure two boxes in our Vagrantfile; both are based on bento/ubuntu-18.04.

Vagrant.configure("2") do |config|  
  config.vm.define "managed-node" do |ubuntu|
    ubuntu.vm.box = "bento/ubuntu-18.04"
  end

  config.vm.define "control-node" do |vbox|
     vbox.vm.box = "bento/ubuntu-18.04"
  end
end

The next step is to ensure that Python is installed on the managed node and that this node is reachable via an IP address. For that, we use shell provisioning and configure a private network in the Vagrantfile:

# ...  
  config.vm.define "managed-node" do |ubuntu|
    ubuntu.vm.box = "bento/ubuntu-18.04"
    ubuntu.vm.network "private_network", ip: "192.168.33.11"

    ubuntu.vm.provision "shell", inline: <<-SHELL
      sudo apt-get update
      sudo apt-get install python -y
    SHELL
  end
# ...

On the control node, we have to ensure that Ansible is installed and that an SSH connection from the control node to our managed node via SSH key is possible. For that, we again use shell provisioning in the Vagrantfile:

 # ... 
  config.vm.define "control-node" do |vbox|
     vbox.vm.box = "bento/ubuntu-18.04"

     vbox.vm.provision "shell", inline: <<-SHELL
       sudo apt-get update -y
       sudo apt-get install -y software-properties-common
       sudo apt-add-repository ppa:ansible/ansible
       sudo apt-get update -y
       sudo apt-get install -y ansible
     SHELL

     vbox.vm.provision "shell", privileged: false, inline: <<-SHELL
       ssh-keygen -t rsa -q -f "/home/vagrant/.ssh/id_rsa" -N ""
       ssh-keygen -R 192.168.33.11
       ssh-keyscan -t rsa -H 192.168.33.11 >> /home/vagrant/.ssh/known_hosts
       sshpass -p 'vagrant' ssh-copy-id -i /home/vagrant/.ssh/id_rsa.pub vagrant@192.168.33.11
     SHELL
  end
# ...

On GitHub Gist, you can find the whole Vagrantfile.

The last step is to configure your Ansible project so that the playbooks can run against the managed node. For that, you create a new inventory file called local-test with the managed node’s IP address:

$ cat inventories/local-test
[application_server] # name of the group that is mentioned in your playbook
192.168.33.11

Workflow

After setting up the tool chain, let’s have a look at how to work with it. I write my Ansible playbooks on the Windows system and run them from the control node against the managed node. First of all, we have to start our Vagrant boxes.

$ cd ansible-workspace
$ vagrant up

When both Vagrant boxes are ready to use, we can jump into the control node box with:

$ vagrant ssh control-node

You can find the Ansible playbooks inside the box in the folder /vagrant. In this folder, run Ansible:

$ cd /vagrant
$ ansible-playbook -i inventories/local-test -u vagrant deploy-app.yml

It is important that you use the local-test inventory.

Outlook

Some Docker fans would prefer a container instead of a virtual machine. But remember, Docker runs on Windows in a virtual machine, so I don’t see a benefit in using Docker instead of a virtual machine here. Of course, with Windows 10’s native container support, a setup with Docker would be a good alternative if Ansible doesn’t run on the Linux subsystem.

Do you have another idea or approach? Let me know and write a comment.

Links

  1. VirtualBox
  2. Vagrant
  3. Whole Vagrantfile on GitHub.



Maven Project Setup for Mixing Spock 1.x and JUnit 5 Tests

I created a sample Groovy project for Maven that mixes Spock tests and JUnit 5 tests in one project. In the next sections, I’ll describe how to set up such a Maven project.

Enable Groovy in the Project

First of all, you have to enable Groovy in your project. One possibility is to add the GMavenPlus Plugin to your project.

<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.gmavenplus</groupId>
            <artifactId>gmavenplus-plugin</artifactId>
            <version>1.6.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>addSources</goal>
                        <goal>addTestSources</goal>
                        <goal>compile</goal>
                        <goal>compileTests</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

The goals addSources and addTestSources add the Groovy (test) sources to Maven’s main (test) sources. The default locations are src/main/groovy (for main sources) and src/test/groovy (for test sources). The goals compile and compileTests compile the Groovy (test) code. If you don’t have Groovy main code, you can omit addSources and compile.

The above configuration always uses the latest released Groovy version. If you want to ensure that a specific Groovy version is used, you have to add that Groovy dependency to your classpath.

   <dependencies>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy</artifactId>
            <version>2.5.6</version>
        </dependency>
  </dependencies>

Enable JUnit 5 in the Project

The simplest setup for using JUnit 5 in your project is to add the JUnit Jupiter dependency to your test class path and to configure a suitable version of the Maven Surefire Plugin (at least version 2.22.0).

    <dependencies>
<!--... maybe more dependencies -->
        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.junit</groupId>
                <artifactId>junit-bom</artifactId>
                <version>${junit.jupiter.version}</version>
                <scope>import</scope>
                <type>pom</type>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <build>
        <plugins>
        <!-- other plugins -->
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.22.1</version>
            </plugin>
        </plugins>
    </build>

Enable Spock in the Project

Choosing the right Spock dependency depends on which Groovy version you are using in the project. In our case, it is Groovy 2.5, so we need Spock in version 1.x-groovy-2.5 on our test class path.

    <dependencies>
        <!-- more dependencies -->
        <dependency>
            <groupId>org.spockframework</groupId>
            <artifactId>spock-core</artifactId>
            <version>1.3-groovy-2.5</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

Now the expectation is that the Spock tests and the JUnit 5 tests are executed in the Maven build. But only the JUnit 5 tests are executed by Maven. So what happened?

I started by changing the Maven Surefire Plugin version to 2.21.0. Then the Spock tests were executed, but no JUnit 5 tests. The reason is that in version 2.22.0 of the Maven Surefire Plugin, the JUnit 4 provider was replaced by the JUnit Platform provider as the default. But Spock in version 1.x is based on JUnit 4. This will change in Spock version 2, which will be based on the JUnit 5 platform. Thus, for Spock 1.x, we have to add the JUnit Vintage dependency to our test class path.

    <dependencies>
        <!-- more dependencies -->
          <dependency>  <!--Only necessary for surefire to run spock tests during the maven build -->
            <groupId>org.junit.vintage</groupId>
            <artifactId>junit-vintage-engine</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

This allows running older JUnit (3/4) tests on the JUnit Platform. With this configuration, both Spock and JUnit 5 tests are executed in the Maven build.
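For completeness, a JUnit 5 test in such a mixed project is a plain Jupiter test under src/test/java, while the Spock specifications live under src/test/groovy. A minimal, made-up example of the JUnit 5 side:

package com.example;

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Executed by the JUnit Platform provider of the Maven Surefire Plugin.
class CalculatorJUnit5Test {

    @Test
    void addsTwoNumbers() {
        assertEquals(4, 2 + 2);
    }
}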


How to Format a Large Code Base Automatically

If you introduce code formatting rules retroactively, you have to solve the problem of how to format the existing code base according to the new formatting rules. You could check out every code repository one by one in your IDE and click on autoformat for the whole project, but this is boring and a waste of time. Fortunately, IntelliJ IDEA ships a format CLI tool in its installation. You can locate it in the path <your IntelliJ IDEA installation>/bin. It’s called format.sh.

In the next sections, I’d like to show you how to automate formatting a big code base. First, I will show the preparation steps, like exporting your code formatting rule settings from the IDE. Then, I will demonstrate how to use the CLI tool format.sh. At the end, I will show a small Groovy script that queries all repositories (in this case, Git repositories), formats the code and pushes it back to the remote SCM.

Preparations

First of all, we need the code formatting rule settings exported from IntelliJ IDEA. In IntelliJ IDEA, follow these steps:

  1. Open File -> Settings -> Editor -> Code Style
  2. Click on Export…
  3. Choose a name for the XML file (for example, Default.xml) and a location where this file should be saved (for example, /home/foo ).

Then, check out or clone your SCM repository and remember the location where you check it out / clone it (for example, /home/foo/myrepository).

Format Code Base Via format.sh  CLI Tool

Three parameters are important for format.sh:

  • -s : Set a path to Intellij IDEA code style settings .xml file (in our example: /home/foo/Default.xml).
  • -r : Set that directories should be scanned recursively.
  • path<n> : Set a path to a file or a directory that should be formatted (in our example: /home/foo/myrepository).

> ./format.sh
IntelliJ IDEA 2018.2.4, build IC-182.4505.22 Formatter
Usage: format [-h] [-r|-R] [-s|-settings settingsPath] path1 path2...
-h|-help Show a help message and exit.
-s|-settings A path to Intellij IDEA code style settings .xml file.
-r|-R Scan directories recursively.
-m|-mask A comma-separated list of file masks.
path<n> A path to a file or a directory.

> ./format.sh -r -s ~/Default.xml ~/myrepository

It’s possible that the tool cancels scanning because of a java.lang.OutOfMemoryError: Java heap space. In that case, you have to increase Java’s maximum memory size (-Xmx) in <your IntelliJ IDEA installation>/bin/idea64.vmoptions.


> nano idea64.vmoptions
-Xms128m
-Xmx750m // <- here increase the maximum memory size
-XX:ReservedCodeCacheSize=240m
-XX:+UseConcMarkSweepGC
-XX:SoftRefLRUPolicyMSPerMB=50
-ea
-Dsun.io.useCanonCaches=false
-Djava.net.preferIPv4Stack=true
-Djdk.http.auth.tunneling.disabledSchemes=""
-XX:+HeapDumpOnOutOfMemoryError
-XX:-OmitStackTraceInFastThrow
-Dawt.useSystemAAFontSettings=lcd
-Dsun.java2d.renderer=sun.java2d.marlin.MarlinRenderingEngine

Groovy Script For Formatting Many Repositories In a Row

Now, we want to bring everything together. The script should do four things:

  1. Find all repository URLs whose code has to be formatted.
  2. Check out / Clone the repositories.
  3. Format the code in all branches of each repository.
  4. Commit and push the change to the remote SCM.

I choose Git as SCM in my example. Finding the repository URLs depends on the Git management system (like Bitbucket, GitLab, SCM-Manager, etc.) that you use, but the approach is the same in all systems:

  1. Call the RESTful API of your Git Management System.
  2. Parse the JSON object in the response for the clone URLs.

For example, in Bitbucket it looks like this:



@Grab('org.codehaus.groovy.modules.http-builder:http-builder:0.7.1')
import groovy.json.JsonSlurper
import groovyx.net.http.*

import static groovyx.net.http.ContentType.*
import static groovyx.net.http.Method.*

def cloneUrlsForProject() {

    def projectUrl = "https://scm/rest/api/1.0/projects/PROJECT_KEY/repos?limit=1000"
    def token = "BITBUCKET_TOKEN"

    def projects = []
    def cloneUrls = []

    def http = new HTTPBuilder(projectUrl)
    http.request(GET) {
        headers."Accept" = "application/json"
        headers."Authorization" = "Bearer ${token}"

        response.success = { resp -> projects = new JsonSlurper().parseText(resp.entity.content.text)}

        response.failure = { resp ->
            throw new RuntimeException("Error fetching clone urls from '${projectUrl}': ${resp.statusLine}")
        }
    }

    projects.values.each { value ->
        def cloneLink = value.links.clone.find { it.name == "ssh" }
        cloneUrls.add(cloneLink.href)
    }

    return cloneUrls
}

Then, we have to clone the repositories and check out each branch. In each branch, format.sh has to be called. For the Git operations, we use the JGit library, and for the format.sh call we use a Groovy feature for process execution: in Groovy, it’s possible to define a command as a String and then call the method execute() on this String, like “ls -l”.execute(). So the Groovy script for the last three tasks looks like this:

#!/usr/bin/env groovy
@Grab('org.eclipse.jgit:org.eclipse.jgit:5.1.2.201810061102-r')
import org.eclipse.jgit.api.CreateBranchCommand
import org.eclipse.jgit.api.Git
import org.eclipse.jgit.api.ListBranchCommand
import org.eclipse.jgit.transport.UsernamePasswordCredentialsProvider

import java.nio.file.Files


def intellijHome = 'path to your idea home folder'
def codeFormatterSetting = 'path to your exported code formatter setting file'
def allRepositoriesUrls = ["http://scm/repo1","http://scm/repo2"] // for simplifying

allRepositoriesUrls.each { repository ->
    def repositoryName = repository.split('/').flatten().findAll { it != null }.last()
    File localPath = Files.createTempDirectory("${repositoryName}-").toFile()
    println "Clone ${repository} to ${localPath}"
    Git.cloneRepository()
       .setURI(repository)
       .setDirectory(localPath)
       .setNoCheckout(true)
       .setCredentialsProvider(new UsernamePasswordCredentialsProvider("user", "password")) // only needed when clone url is https / http
       .call()
       .withCloseable { git ->
        def remoteBranches = git.branchList().setListMode(ListBranchCommand.ListMode.REMOTE).call()
        def remoteBranchNames = remoteBranches.collect { it.name.replace('refs/remotes/origin/', '') }

        println "Found the following branches: ${remoteBranchNames}"

        remoteBranchNames.each { remoteBranch ->
            println "Checkout branch $remoteBranch"
            git.checkout()
               .setName(remoteBranch)
               .setCreateBranch(true)
               .setUpstreamMode(CreateBranchCommand.SetupUpstreamMode.TRACK)
               .setStartPoint("origin/" + remoteBranch)
               .call()

            def formatCommand = "$intellijHome/bin/format.sh -r -s $codeFormatterSetting $localPath"

            println formatCommand.execute().text

            git.add()
               .addFilepattern('.')
               .call()
            git.commit()
               .setAuthor("Automator", "no-reply@yourcompany.com")
               .setMessage('Format code according to IntelliJ setting.')
               .call()

            println "Commit successful!"
        }

        git.push()
           .setCredentialsProvider(new UsernamePasswordCredentialsProvider("user", "password")) // only needed when clone url is https / http
           .setPushAll()
           .call()

        println "Push is done"

    }
}
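Assuming the script is saved as, for example, formatAllRepositories.groovy (the file name is arbitrary), it can be started directly from the command line; the @Grab annotation takes care of downloading the JGit dependency on the first run:

> groovy formatAllRepositories.groovy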

Do you have another approach? Let me know and write a comment below.



Mocking SecurityContext in Jersey Tests

Jersey offers a great way to write integration tests for REST APIs that are written with Jersey. Just extend the class JerseyTest and go for it.

I ran into an issue where I had to mock a SecurityContext so that it contains a special UserPrincipal. The challenge is that in tests Jersey wraps the SecurityContext in its own class SecurityContextInjectee. So I have to add my SecurityContext mock to this Jersey wrapper class. Let me demonstrate it with an example.

Let’s say I have the following Jersey resource:


import javax.ws.rs.*;
import javax.ws.rs.core.*;

@Path("hello/world")
public class MyJerseyResource {

    @GET
    public Response helloWorld(@Context final SecurityContext context) {
        String name = context.getUserPrincipal().getName();
        return Response.ok("Hello " + name, MediaType.TEXT_PLAIN).build();
    }

}

In my test, I have to mock the SecurityContext so that a predefined user principal is used during the tests. I use Mockito as the mocking framework. My mock looks like the following one:

import java.security.Principal;
import javax.ws.rs.core.SecurityContext;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

final SecurityContext securityContextMock = mock(SecurityContext.class);
when(securityContextMock.getUserPrincipal()).thenReturn(new Principal() {
    @Override
    public String getName() {
        return "Alice";
    }
});

To add this mocked SecurityContext to the wrapper class SecurityContextInjectee, I have to configure a ResourceConfig with a ContainerRequestFilter in my Jersey test. The filter sets the mocked SecurityContext on the ContainerRequestContext, from where it is picked up by the wrapper class. The resource under test also has to be registered on this ResourceConfig, otherwise the request in the test would end in a 404:


import java.io.IOException;
import java.security.Principal;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;
import javax.ws.rs.core.Application;
import javax.ws.rs.core.SecurityContext;

import org.glassfish.jersey.server.ResourceConfig;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

@Override
public Application configure() {
    final SecurityContext securityContextMock = mock(SecurityContext.class);
    when(securityContextMock.getUserPrincipal()).thenReturn(new Principal() {
        @Override
        public String getName() {
            return "Alice";
        }
    });

    ResourceConfig config = new ResourceConfig(MyJerseyResource.class);
    config.register(new ContainerRequestFilter(){
        @Override
        public void filter(final ContainerRequestContext containerRequestContext) throws IOException {
            containerRequestContext.setSecurityContext(securityContextMock);
        }
    });
    return config;
}

Then, the whole test for my resource looks like the following one:

import java.io.IOException;
import java.security.Principal;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;
import javax.ws.rs.core.Application;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.SecurityContext;

import org.apache.commons.httpclient.HttpStatus;
import org.glassfish.jersey.server.ResourceConfig;
import org.glassfish.jersey.test.JerseyTest;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class MyJerseyResourceTest extends JerseyTest {

    @Test
    public void helloWorld() throws Exception {
        Response response = target("hello/world").request().get();

        assertThat(response.getStatus()).isEqualTo(HttpStatus.SC_OK);
        assertThat(response.readEntity(String.class)).isEqualTo("Hello Alice");
    }

    @Override
    public Application configure() {
        final SecurityContext securityContextMock = mock(SecurityContext.class);
        when(securityContextMock.getUserPrincipal()).thenReturn(new Principal() {
            @Override
            public String getName() {
                return "Alice";
            }
        });

        ResourceConfig config = new ResourceConfig(MyJerseyResource.class);
        config.register(new ContainerRequestFilter() {
            @Override
            public void filter(final ContainerRequestContext containerRequestContext) throws IOException {
                containerRequestContext.setSecurityContext(securityContextMock);
            }
        });
        return config;
    }
}

Do you have a smarter solution for this problem? Let me know and write a comment below.



QA Milestones since the Foundation of an eHealth Framework

“Quality exists when the price is long forgotten”, said Mr Royce to Mr Rolls with his petrol-smeared face, or Mr Rolls to Mr Royce – but anyway.

The question is: which price is he talking about?

Money, yes of course! But what about Pain, Features, Competition, Time, Returns, No-Family, Consumer Satisfaction, Feelgood Management, Lifecycle… I think you get the point.

The reason for the pain I feel right now is that, for the third time, I have to move my contributions for the Developer Network of ICW to another webspace.

So I was quite surprised and, more importantly, I’m very thankful for the offer from SKM (Big Hugs!) to finally be part of this environment full of motivation, positive energy and knowledge, and to link the following content for eternity to my Xing account.

To all who are magically attracted to quality, fighting against superficiality, sensitised to sense-of-self vs. awareness-of-others, and promoters of best practices for Continuous Integration and Continuous Delivery.

Let’s go…

QA Milestones since the Foundation of the eHealth Framework (2010)

Imagine that you are responsible for the quality of the underlying technical platform in the scope of an electronic health record that is being developed from scratch. The platform is the eHealth Framework (eHF). In addition, this solution is part of an integrated health suite with a software development kit (SDK) and various professional services products. On 2 December 2009 the tenth version of the eHealth Framework, version 2.9, was released. Therefore it is a good occasion to take a look back and remind ourselves of the challenges we have faced so far in QA, how we solved them and how we generated value. As the saying goes: on time, on scope, on budget! The role that quality plays in this mix is the subject of this article.

 

QA from Scratch to First Release

When developing software from scratch you first need to lay the foundations so that when it is time to roll out the software all the features and usage scenarios can be tested. Sounds easy but it isn’t!

Here is a list of the highlights:

  • Product Management produced the Software Requirement Specification (SRS) with an overview of the priorities for developing the basis of the new LifeSensor, a personal health record.
  • Based on the Software Requirement Specification, QA extracted all the relevant test cases; these were then reviewed by Product Management.
  • The tooling to carry out the necessary tests was built. To ensure the web services testing could be carried out, we built a Java-based web services test client that catered for maximum flexibility and automation. The test data was stored in an MS Excel sheet.
  • The development cycle was such that every Thursday there was a new version to be tested. Based on our testing we created new bugs as necessary.
  • The respective Product Manager created acceptance tests for the newly developed features that had to be completed for the current week. In this way everyone was involved in an operational feature. In such a situation, be careful with statements like “…I’m 90% finished…” when you are already on the 3rd day of finalizing the requested missing 10%. I don’t want to criticize anyone, because requirements can and do change, and that causes effects from the DB straight to the GUI and its usability.
  • To steer the project there was a clear focus on pragmatic and sensible reporting: tested, not-tested and accepted bugs, including the top-ten risks.

Lessons learned:

  • It was quickly proved in practice that integrative tests between WS-to-GUI and GUI-to-WS were valuable. Yes, there was a GUI too.
  • Another important point to share: feature-ready or QA-ready is NOT production-ready. So involve the operating administrators immediately to keep everything maintainable.
  • Take care of these guys! Make the operating administrators your friends. Do everything!

 

QA after Initial Roll-out

Once the software has gone into production, each new feature must fit into the existing application both conceptually and functionally without introducing any unwanted side effects. In addition, we were confronted with the mandatory requirement that every new version of a framework with existing clients has to face.

The keyword here is: backwards compatibility. It is also important to mention that LifeSensor development was then split between its own specific components and its basis components, the latter becoming the backbone of ICW’s health care platform: this was the birth of the eHealth Framework with its own development team.

Here are some of the highlights of this new team:

  • We changed our software development process to agile inside a V-Model and introduced Scrum. This means that we QAs worked in parallel with the development team and product managers. We were no longer on the receiving end of getting a piece of software thrown over the fence and having to test it somewhere in isolation.
  • The whole team was responsible for delivering the features. The whole team was responsible for quality. Not only QA was responsible for quality any more. This is the modern QA approach, like in modern soccer: if the Bavarians have the ball, Lewandowski becomes the first line of defense.
  • If you take this thought further, our approach and reason for existing is that QA has to deliver added value and support to development and product management in order to be accepted day by day. For instance, this means no longer being able to hide behind theoretical metrics discussions.
  • We refactored the architecture of our test client and separated common components like object creation from the test data in MS Excel and the test result reporting. The added value here was that other teams could easily reuse our test data and checks when using the eHealth Framework or if they wanted to build their own test client.
  • We started to build up a continuous integration environment based on CruiseControl where every test in our test client was executed during the night.
  • We developed further features in our test client in order to minimize duplication of test data based on localization needs (for instance, different test data was necessary for Austria, Switzerland, …).
  • Once again: we had already released a software version and would have to release further software versions. So please think about the attached diagram from the Software Lifecycle Management session of the ICW Developer Conference 2008 and what that means for ensuring backwards compatibility.

 

[Diagram: eHF Lifecycle]

Lessons learned:

  • Do not underestimate the Software Lifecycle. And it is good to have the checks in a continuous integration process.
  • QA is not only testing. QA is not Build Management.
  • Build Management is an independent IT topic with special in-depth know-how and provides a crucial service for the development and the QA team.
  • With software in production, be prepared that QA will take on new roles as supporter and investigator of potential bugs.

QA Between Versions 2.1 and 2.8

The first part dealt with the historical roots of QA. We now make a big jump over the released versions. Meanwhile, the amount of test data had multiplied. We also supported new platforms such as GlassFish. Now the existing tools had to be made manageable and usable with this amount of necessary test data, which means that usability is not only a topic when developing a GUI. We carried out about 50,000 tests per night on the web services level, per code line, and rising.

Here is a collection of highlights and ideas:

  • The test data in the Excel sheets was clustered together as fragments where it made sense. For example, it is no longer necessary to include the address in each Emergency Contact as test data; you can pick the one you want from a pool of addresses and reference it. This saves a lot of maintenance work when the interface changes.
  • With each negative test, we check the expected exception. However, exceptions and their messages are also subject to change. So what happens to our backwards compatibility tests? They will all fail. What have we done about this? We reused the localization feature that I talked about in “QA after Initial Roll-out” and included the newly expected exceptions there for backwards compatibility testing.
  • And how does a test now know its context, i.e. whether it is a backwards compatibility test or a normal test for its own initial version? Talk with your build management team, they will provide you with a switch.

The Continuous Integration (CI) environment itself was also continuously developed. To deal with the quantity of tests and to scale better, we moved to Hudson. Appropriate monitoring solutions were designed, and a “merge assistant” was developed for more control over this process. Additionally, the Tomcat server logs are now published by Hudson for every test that is run. In order to test auditing and encryption on the database level, we developed direct access to the DB, reusing the already written and automated test cases. Of course, everything can be run on local machines as well as in the CI environment. We also did a test tool evaluation, but at that time the support for the agile Java domain and an artifact-based approach did not look promising.

Lessons learned:

  • Automation is king. Period! But be careful, because automation can make you stupid. Systems and approaches change, and therefore the test cases have to change accordingly. Just configuring the test system and pushing one BIG button is not QA but a hit-and-run approach – even when the test results are all “executed ok” in the log files. So we have to ask ourselves every time: are we still testing the right things?
  • Greatness must be worked for hard every day and is not a default value. Establish such a likeable environment.
  • Now take a deep breath and think about what the process is worth when time gets tight. Everybody should think about it at least once more after a release is delivered.
  • Establish a weekly bug smashing session. Here it’s important that not only the QA team works to resolve the bugs, but that decision makers from PM and Dev are also represented. The whole project will benefit greatly from that.

 

QA for the Latest Release – 2.9

It will not surprise anyone if I say that the amount of work has not become less. What was not mentioned so far is that we also have an outstanding development team in Bulgaria, which means communication has to take this into account. But there are enough tools to cater for this situation; they only need to be properly implemented. Imagine that the whole team is located on the same floor, but everyone stays at their desk. Fortunately, Scrum addresses this issue. It is not complicated.
So we’re in the final stretch. Here are the main highlights and some associated ideas:

  • We achieved the goal of including the whole system documentation (based on DITA) in our CI environment (for example, part of this documentation set is included on ICW’s partner DVDs).
  • Not only bugs, but all story planning as well as version tracking is completely mapped in Jira. This means we have absolutely everything that we need to create a release in our CI environment.
  • Some bugs need to be fixed not only for future versions but also in older versions or code lines. How can I ensure that nothing is forgotten? If we determine during bug smashing that we need to provide a fix for an older version, we record this information as an additional subtask in the Jira entry. Everything is now transparent; if there are side effects, everyone can see whether and when a fix was checked in, and QA can start with retesting.

As far as our CI environment went, we had to solve the problem that in the meantime we carried out so many tests that they were not finished in the morning but rather at midday. In this regard, build management, development and QA worked together to improve the feedback time by optimizing the build time. Some intermediate goals on the way to the required solution included different load balancing and tuning of the integration tests.

How to improve further:
Have you ever experienced the situation where somebody calls you up and nervously asks: “QA! Do you have a test which checks the… (fill in some scenario you hope is covered)?” Whenever a customer claims to have found a bug, suddenly, in that second, this becomes such a clear and logical, never-to-be-forgotten test for every stakeholder from that moment on. So. Hmmmmm! Why not identify these “obvious” ones a little bit sooner? How? We have product managers, developers and QAs, and everybody brings their own perspective to every single feature. We can use these perspectives, and you will be surprised how your test cases increase. Every test case you identify in such sessions is an additional safety net and a good day for the quality of your product. Donate this to your system as much as possible. It is of great benefit.

Some of the next challenges will be adding micro-benchmarking tests to our test scenarios. This will never replace the mandatory performance test carried out by the specially skilled people in our performance lab for every release, but it gives you early information about possible side effects as your system grows.
I hope you liked this short overview of our work and that you can find something here that will help you in your own daily work. Thanks to my colleagues in technical writing and build management for their input and help with this post.

Finished.

PS: There is also a PDF version of this post available for download in the mediathek.