SKM IT World

Just another blog about IT



Strategy Pattern Revisited With Spring

This blog post shows another approach to implementing the Strategy Pattern with dependency injection. As the DI framework, I choose the Spring Framework.

From Wikipedia: “The strategy pattern is a behavioral software design pattern that enables selecting an algorithm at runtime.”

Firstly, let’s have a look at how the Strategy Pattern is implemented in the classic way.
As a starting point, we have a HeroController that should add a hero to a HeroRepository, depending on which repository was chosen by the user.
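
The repository implementations shown below share the following interface (a minimal sketch derived from the methods they implement; the Hero and NewHeroModel classes are omitted here, see the sample project on GitHub):

package com.github.sparsick.springbootexample.hero.universum;

import java.util.Collection;

public interface HeroRepository {

    String getName();

    void addHero(Hero hero);

    Collection<Hero> allHeros();
}

The controller in the classic way looks like this: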

package com.github.sparsick.springbootexample.hero.universum;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PostMapping;

@Controller
public class HeroControllerClassicWay {

    @PostMapping("/hero/new")
    public String addNewHero(@ModelAttribute("newHero") NewHeroModel newHeroModel) {
        HeroRepository heroRepository = findHeroRepository(newHeroModel.getRepository());
        heroRepository.addHero(newHeroModel.getHero());
        return "redirect:/hero";
    }

    private HeroRepository findHeroRepository(String repositoryName) {
        if (repositoryName.equals("Unique")) {
            return new UniqueHeroRepository();
        }

        if (repositoryName.equals("Duplicate")) {
            return new DuplicateHeroRepository();
        }

        throw new IllegalArgumentException(String.format("Find no repository for given repository name [%s]", repositoryName));
    }
}
package com.github.sparsick.springbootexample.hero.universum;

import java.util.Collection;
import java.util.HashSet;
import java.util.Set;

import org.springframework.stereotype.Repository;

@Repository
public class UniqueHeroRepository implements HeroRepository {

    private Set<Hero> heroes = new HashSet<>();

    @Override
    public String getName() {
        return "Unique";
    }

    @Override
    public void addHero(Hero hero) {
        heroes.add(hero);
    }

    @Override
    public Collection<Hero> allHeros() {
        return new HashSet<>(heroes);
    }

}
package com.github.sparsick.springbootexample.hero.universum;

import org.springframework.stereotype.Repository;

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

@Repository
public class DuplicateHeroRepository implements HeroRepository {

    private List<Hero> heroes = new ArrayList<>();

    @Override
    public void addHero(Hero hero) {
        heroes.add(hero);
    }

    @Override
    public Collection<Hero> allHeros() {
        return List.copyOf(heroes);
    }

    @Override
    public String getName() {
        return "Duplicate";
    }
}

This implementation has some pitfalls. The creation of the repository implementations isn’t managed by the Spring context (it breaks dependency injection / inversion of control). This becomes painful as soon as you want to expand a repository implementation with further features that need other classes injected (for example, counting the usage of this class with a MeterRegistry).

package com.github.sparsick.springbootexample.hero.universum;

import java.util.Collection;
import java.util.HashSet;
import java.util.Set;

import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.MeterRegistry;
import org.springframework.stereotype.Repository;

@Repository
public class UniqueHeroRepository implements HeroRepository {

    private Set<Hero> heroes = new HashSet<>();
    private Counter addCounter;

    public UniqueHeroRepository(MeterRegistry meterRegistry) {
        addCounter = meterRegistry.counter("hero.repository.unique");
    }

    @Override
    public String getName() {
        return "Unique";
    }

    @Override
    public void addHero(Hero hero) {
        addCounter.increment();
        heroes.add(hero);
    }

    @Override
    public Collection<Hero> allHeros() {
        return new HashSet<>(heroes);
    }

}

It also breaks the separation of concerns. When I want to test the controller class, I have no easy way to mock the repository interface. So the first idea is to move the creation of the repository implementations to the Spring context. The repository implementations are annotated with the @Repository annotation, so Spring’s component scan finds them.
The next question is how to inject them into the controller class. Here, a Spring feature can help: I define a list of HeroRepository in the controller, and this list is filled during the creation of the controller instance.

package com.github.sparsick.springbootexample.hero.universum;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PostMapping;

import java.util.List;

@Controller
public class HeroControllerRefactoringStep1 {

    private List<HeroRepository> heroRepositories;

    public HeroControllerRefactoringStep1(List<HeroRepository> heroRepositories) {
        this.heroRepositories = heroRepositories;
    }

    @PostMapping("/hero/new")
    public String addNewHero(@ModelAttribute("newHero") NewHeroModel newHeroModel) {
        HeroRepository heroRepository = findHeroRepository(newHeroModel.getRepository());
        heroRepository.addHero(newHeroModel.getHero());
        return "redirect:/hero";
    }

    private HeroRepository findHeroRepository(String repositoryName) {
        return heroRepositories.stream()
                .filter(heroRepository -> heroRepository.getName().equals(repositoryName))
                .findFirst()
                .orElseThrow(() -> new IllegalArgumentException(String.format("Find no repository for given repository name [%s]", repositoryName)));

    }
}

Spring searches its context for all implementations of the interface HeroRepository and puts them all into the list. This solution has one disadvantage: every time a hero is added, the list of HeroRepository is traversed to find the right implementation. This can be optimized by building a map in the controller constructor that has the repository name as key and the corresponding implementation as value.

package com.github.sparsick.springbootexample.hero.universum;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PostMapping;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

@Controller
public class HeroControllerRefactoringStep2 {

    private Map<String, HeroRepository> heroRepositories;

    public HeroControllerRefactoringStep2(List<HeroRepository> heroRepositories) {
        this.heroRepositories = heroRepositoryStrategies(heroRepositories);
    }

    private Map<String, HeroRepository> heroRepositoryStrategies(List<HeroRepository> heroRepositories){
        Map<String, HeroRepository> heroRepositoryStrategies = new HashMap<>();
        heroRepositories.forEach(heroRepository -> heroRepositoryStrategies.put(heroRepository.getName(), heroRepository));
        return heroRepositoryStrategies;
    }

    @PostMapping("/hero/new")
    public String addNewHero(@ModelAttribute("newHero") NewHeroModel newHeroModel) {
        HeroRepository heroRepository = findHeroRepository(newHeroModel.getRepository());
        heroRepository.addHero(newHeroModel.getHero());
        return "redirect:/hero";
    }

    private HeroRepository findHeroRepository(String repositoryName) {
        HeroRepository heroRepository = heroRepositories.get(repositoryName);
        if (heroRepository != null) {
            return heroRepository;
        }
        throw new IllegalArgumentException(String.format("Find no repository for given repository name [%s]", repositoryName));
    }
}

The final question is: what if other classes in the application need the possibility to choose a repository implementation at runtime? I could copy and paste the private method into each class that has this need, or I could move the creation of the map to the Spring context and inject the map into each class.

package com.github.sparsick.springbootexample.hero;

import com.github.sparsick.springbootexample.hero.universum.HeroRepository;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

@SpringBootApplication
public class HeroApplicationRefactoringStep3 {

	public static void main(String[] args) {
		SpringApplication.run(HeroApplicationRefactoringStep3.class, args);
	}

	@Bean
    Map<String, HeroRepository> heroRepositoryStrategy(List<HeroRepository> heroRepositories){
        Map<String, HeroRepository> heroRepositoryStrategy = new HashMap<>();
        heroRepositories.forEach(heroRepository -> heroRepositoryStrategy.put(heroRepository.getName(), heroRepository));
        return heroRepositoryStrategy;
    }
}

package com.github.sparsick.springbootexample.hero.universum;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PostMapping;

import java.util.Map;

@Controller
public class HeroControllerRefactoringStep3 {

    private Map<String, HeroRepository> heroRepositoryStrategy;

    public HeroControllerRefactoringStep3(Map<String, HeroRepository> heroRepositoryStrategy) {
        this.heroRepositoryStrategy = heroRepositoryStrategy;
    }

    @PostMapping("/hero/new")
    public String addNewHero(@ModelAttribute("newHero") NewHeroModel newHeroModel) {
        HeroRepository heroRepository = findHeroRepository(newHeroModel.getRepository());
        heroRepository.addHero(newHeroModel.getHero());
        return "redirect:/hero";
    }

    private HeroRepository findHeroRepository(String repositoryName) {
        return heroRepositoryStrategy.get(repositoryName);
    }

}

This solution is a little bit ugly, because it isn’t obvious that the Strategy Pattern is used. So the next refactoring step is moving the map of hero repositories into a component class of its own. The bean definition heroRepositoryStrategy in the application configuration can then be removed.

package com.github.sparsick.springbootexample.hero.universum;

import org.springframework.stereotype.Component;

import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

@Component
public class HeroRepositoryStrategy {

    private Map<String, HeroRepository> heroRepositoryStrategies;

    public HeroRepositoryStrategy(Set<HeroRepository> heroRepositories) {
        heroRepositoryStrategies = createStrategies(heroRepositories);
    }

    HeroRepository findHeroRepository(String repositoryName) {
        return heroRepositoryStrategies.get(repositoryName);
    }

    Set<String> findAllHeroRepositoryStrategyNames() {
        return heroRepositoryStrategies.keySet();
    }

    Collection<HeroRepository> findAllHeroRepositories(){
        return heroRepositoryStrategies.values();
    }


    private Map<String, HeroRepository> createStrategies(Set<HeroRepository> heroRepositories){
        Map<String, HeroRepository> heroRepositoryStrategies = new HashMap<>();
        heroRepositories.forEach(heroRepository -> heroRepositoryStrategies.put(heroRepository.getName(), heroRepository));
        return heroRepositoryStrategies;
    }

}

package com.github.sparsick.springbootexample.hero.universum;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PostMapping;

@Controller
public class HeroController {

    private HeroRepositoryStrategy heroRepositoryStrategy;

    public HeroController(HeroRepositoryStrategy heroRepositoryStrategy) {
        this.heroRepositoryStrategy = heroRepositoryStrategy;
    }

    @PostMapping("/hero/new")
    public String addNewHero(@ModelAttribute("newHero") NewHeroModel newHeroModel) {
        HeroRepository heroRepository = heroRepositoryStrategy.findHeroRepository(newHeroModel.getRepository());
        heroRepository.addHero(newHeroModel.getHero());
        return "redirect:/hero";
    }
}
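
A nice side effect of this final design is testability: the controller can now be unit tested with a mocked strategy component. A minimal sketch with Mockito (the construction of Hero and NewHeroModel is an assumption for this sketch; see the sample project for the real classes):

import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

HeroRepositoryStrategy strategyMock = mock(HeroRepositoryStrategy.class);
HeroRepository repositoryMock = mock(HeroRepository.class);
when(strategyMock.findHeroRepository("Unique")).thenReturn(repositoryMock);

HeroController controller = new HeroController(strategyMock);

NewHeroModel newHeroModel = new NewHeroModel();   // assumed setters
newHeroModel.setRepository("Unique");
newHeroModel.setHero(new Hero("Luke Skywalker")); // assumed constructor

String view = controller.addNewHero(newHeroModel);

verify(repositoryMock).addHero(newHeroModel.getHero());
assertThat(view).isEqualTo("redirect:/hero");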

The whole sample is hosted on GitHub.


Maven Project Setup for Mixing Spock 1.x and JUnit 5 Tests

I created a sample Groovy project for Maven that mixes Spock tests and JUnit 5 tests in one project. In the next sections I’ll describe how to set up such a Maven project.

Enable Groovy in the Project

First of all, you have to enable Groovy in your project. One possibility is to add the GMavenPlus Plugin to your project.

<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.gmavenplus</groupId>
            <artifactId>gmavenplus-plugin</artifactId>
            <version>1.6.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>addSources</goal>
                        <goal>addTestSources</goal>
                        <goal>compile</goal>
                        <goal>compileTests</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

The goals addSources and addTestSources add the Groovy (test) sources to Maven’s main (test) sources. The default locations are src/main/groovy (for main sources) and src/test/groovy (for test sources). The goals compile and compileTests compile the Groovy (test) code. If you don’t have Groovy main code, you can omit addSources and compile.
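
With these defaults, the layout of a mixed project can look like the following structure (Java sources stay in their usual locations):

my-app
├── pom.xml
└── src
    ├── main
    │   ├── groovy
    │   └── java
    └── test
        ├── groovy
        └── java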

The above configuration always uses the latest released Groovy version. If you want to ensure that a specific Groovy version is used, you have to add that Groovy dependency to your classpath.

    <dependencies>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy</artifactId>
            <version>2.5.6</version>
        </dependency>
    </dependencies>

Enable JUnit 5 in the Project

The simplest setup for using JUnit 5 in your project is to add the JUnit Jupiter dependency to your test classpath and to configure the correct version of the Maven Surefire Plugin (at least version 2.22.0).

    <dependencies>
        <!-- ... maybe more dependencies -->
        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.junit</groupId>
                <artifactId>junit-bom</artifactId>
                <version>${junit.jupiter.version}</version>
                <scope>import</scope>
                <type>pom</type>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <build>
        <plugins>
            <!-- other plugins -->
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.22.1</version>
            </plugin>
        </plugins>
    </build>

Enable Spock in the Project

Choosing the right Spock dependency depends on which Groovy version you are using in the project. In our case, it is Groovy 2.5, so we need Spock in version 1.3-groovy-2.5 on our test classpath.

    <dependencies>
        <!-- more dependencies -->
        <dependency>
            <groupId>org.spockframework</groupId>
            <artifactId>spock-core</artifactId>
            <version>1.3-groovy-2.5</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

Now the expectation is that both the Spock tests and the JUnit 5 tests are executed in the Maven build. But only the JUnit 5 tests are executed by Maven. So what happened?

I changed the Maven Surefire Plugin version to 2.21.0. Then the Spock tests were executed, but no JUnit 5 tests. The reason is that in version 2.22.0 of the Maven Surefire Plugin the JUnit 4 provider was replaced by the JUnit Platform provider as the default. But Spock in version 1.x is based on JUnit 4. This will change in Spock version 2, which will be based on the JUnit 5 platform. Thus, for Spock 1.x, we have to add the JUnit Vintage dependency to our test classpath.

    <dependencies>
        <!-- more dependencies -->
        <dependency> <!-- only necessary so that surefire runs the Spock tests during the Maven build -->
            <groupId>org.junit.vintage</groupId>
            <artifactId>junit-vintage-engine</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

This allows running older JUnit (3/4) tests on the JUnit Platform. With this configuration, both Spock and JUnit 5 tests are executed in the Maven build.


Using JUnit 5 In Pre-Java 8 Projects

This post demonstrates how JUnit 5 can be used in pre-Java 8 projects and explains why it could be a good idea.

JUnit 5 requires at least Java 8 as runtime environment, so you would want to update your whole project to Java 8. But sometimes there are reasons why you can’t update your project to Java 8 immediately. For example, the version of your application server in production only supports Java 7, and an update can’t be done quickly because of issues in your production code.

Now, the question is: how can you use JUnit 5 without updating your production code to Java 8?

You can set up the Java version separately for production code and for test code.

<!-- in Maven -->
<build>
    <plugins>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>7</source>
                <target>7</target>
                <testSource>8</testSource>
                <testTarget>8</testTarget>
            </configuration>
        </plugin>
    </plugins>
</build>
// in Gradle
sourceCompatibility = '7'
targetCompatibility = '7'

compileTestJava {
    sourceCompatibility = '8'
    targetCompatibility = '8'
}

The precondition is that you use a Java 8 JDK for your build.

If you try to use a Java 8 feature in your Java 7 production code, Maven and Gradle will fail the build.

// Maven
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:compile (default-compile) on project junit5-in-pre-java8-projects: Compilation failure
[ERROR] /home/sparsick/dev/workspace/junit5-example/junit5-in-pre-java8-projects/src/main/java/Java7Class.java:[8,58] lambda expressions are not supported in -source 7
[ERROR]   (use -source 8 or higher to enable lambda expressions)
// Gradle
> Task :compileJava FAILED
/home/sparsick/dev/workspace/junit5-example/junit5-in-pre-java8-projects/src/main/java/Java7Class.java:8: error: lambda expressions are not supported in -source 7
Function<String, String > java8Feature = (input) -> input;
^
(use -source 8 or higher to enable lambda expressions)
1 error

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':compileJava'.
> Compilation failed; see the compiler error output for details.

Now you can introduce JUnit 5 in your project and start writing tests with JUnit 5.

<!-- in Maven -->
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-api</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-engine</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-params</artifactId>
    <scope>test</scope>
</dependency>
<!-- junit-vintage-engine is needed for running older JUnit 4 tests with JUnit 5 -->
<dependency>
    <groupId>org.junit.vintage</groupId>
    <artifactId>junit-vintage-engine</artifactId>
    <scope>test</scope>
</dependency>

// in Gradle
dependencies {
    testCompile 'org.junit.jupiter:junit-jupiter-api:5.3.2'
    testCompile 'org.junit.jupiter:junit-jupiter-engine:5.3.2'
    testCompile 'org.junit.jupiter:junit-jupiter-params:5.3.2'
    testCompile 'org.junit.vintage:junit-vintage-engine:5.3.2'
}

Your old JUnit 4 tests don’t need to be migrated, because JUnit 5 has a test engine (JUnit Vintage) that can run JUnit 4 tests on the JUnit 5 platform. So use JUnit 5 for new tests and only migrate JUnit 4 tests if you have to touch them anyway.

Although you can’t update your production code to a newer Java version, updating your test code to a newer one has some benefits.

The biggest benefit is that you can start learning new language features during your daily work while writing tests, so you avoid making beginner’s mistakes in the production code. You also get access to new tools that can help improve your tests. For example, in JUnit 5 it’s more comfortable to write parameterized tests than in JUnit 4. In my experience, developers are more likely to write parameterized tests with JUnit 5 than with JUnit 4 in situations where parameterized tests make sense.
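
For example, a minimal JUnit 5 parameterized test could look like this sketch (the Palindrome class under test is a hypothetical example):

import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;

class PalindromeTest {

    // the test method runs once per value in @ValueSource
    @ParameterizedTest
    @ValueSource(strings = {"otto", "anna", "racecar"})
    void isPalindrome(String candidate) {
        assertTrue(Palindrome.isPalindrome(candidate)); // Palindrome is a hypothetical class
    }
}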

The technique described above also works for other Java versions. For example, your production code is on Java 11 and you want to use Java 12 features in your test code. Another use case for this technique could be learning another JVM language like Groovy, Kotlin or Clojure in your daily work. Then use the new language in your test code.

For Maven projects, this approach has one little pitfall. IntelliJ IDEA ignores the Java version configuration for test code. It uses the Java version configured for the production code for the whole project. An issue has already been opened, and a workaround is also described in that issue. So only the Maven build gives you feedback on whether your production code uses the correct Java version. IntelliJ doesn’t have this problem with Gradle projects. Here, it uses the Java versions just as they are configured in the Gradle build file.

The situation in NetBeans looks better for Maven projects. NetBeans loads the Java configuration for Maven projects correctly. For Gradle projects I couldn’t check it, because no Gradle plugin exists yet for NetBeans 10 (status: January 2019; there is one for NetBeans 9, so maybe something will come).


Mocking SecurityContext in Jersey Tests

Jersey offers a great way to write integration tests for REST APIs written with Jersey: just extend the class JerseyTest and go for it.

I ran into an issue where I had to mock a SecurityContext, so that the SecurityContext includes a special UserPrincipal. The challenge is that in tests Jersey wraps the SecurityContext in a class of its own, SecurityContextInjectee. So I have to add my SecurityContext mock to this Jersey wrapper class. Let me demonstrate it in an example.

Let’s say I have the following Jersey resource:


import javax.ws.rs.*;
import javax.ws.rs.core.*;

@Path("hello/world")
public class MyJerseyResource {

    @GET
    public Response helloWorld(@Context final SecurityContext context) {
        String name = context.getUserPrincipal().getName();
        return Response.ok("Hello " + name, MediaType.TEXT_PLAIN).build();
    }

}

In my test, I have to mock the SecurityContext so that a predefined user principal can be used during the tests. I use Mockito as the mocking framework. My mock looks like the following one:

import java.security.Principal;
import javax.ws.rs.core.SecurityContext;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

final SecurityContext securityContextMock = mock(SecurityContext.class);
when(securityContextMock.getUserPrincipal()).thenReturn(new Principal() {
    @Override
    public String getName() {
        return "Alice";
    }
});

To add this mocked SecurityContext to the wrapper class SecurityContextInjectee, I have to configure a ResourceConfig with a modified ContainerRequestContext in my Jersey test. The mocked SecurityContext can be set in this modified ContainerRequestContext, and then it will be used in the wrapper class:


import java.io.IOException;
import java.security.Principal;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;
import javax.ws.rs.core.SecurityContext;

import org.glassfish.jersey.server.ResourceConfig;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

@Override
public Application configure() {
    final SecurityContext securityContextMock = mock(SecurityContext.class);
    when(securityContextMock.getUserPrincipal()).thenReturn(new Principal() {
        @Override
        public String getName() {
            return "Alice";
        }
    });

    ResourceConfig config = new ResourceConfig();
    config.register(new ContainerRequestFilter(){
        @Override
        public void filter(final ContainerRequestContext containerRequestContext) throws IOException {
            containerRequestContext.setSecurityContext(securityContextMock);
        }
    });
    return config;
}

Then, the whole test for my resource looks like the following one:

import java.io.IOException;
import java.security.Principal;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;
import javax.ws.rs.core.Application;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.SecurityContext;

import org.apache.commons.httpclient.HttpStatus;
import org.glassfish.jersey.server.ResourceConfig;
import org.glassfish.jersey.test.JerseyTest;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class MyJerseyResourceTest extends JerseyTest {

    @Test
    public void helloWorld() throws Exception {
        Response response = target("hello/world").request().get();

        assertThat(response.getStatus()).isEqualTo(HttpStatus.SC_OK);
        assertThat(response.readEntity(String.class)).isEqualTo("Hello Alice");
    }

    @Override
    public Application configure() {
        final SecurityContext securityContextMock = mock(SecurityContext.class);
        when(securityContextMock.getUserPrincipal()).thenReturn(new Principal() {
            @Override
            public String getName() {
                return "Alice";
            }
        });

        ResourceConfig config = new ResourceConfig();
        config.register(new ContainerRequestFilter() {
            @Override
            public void filter(final ContainerRequestContext containerRequestContext) throws IOException {
                containerRequestContext.setSecurityContext(securityContextMock);
            }
        });
        return config;
    }
}

Do you have a smarter solution for this problem? Let me know and write a comment below.



My Lesson Learned From Doing Gilded Rose Kata

I’d like to share some of my thoughts about my approach to solving the Gilded Rose Refactoring Kata by Emily Bache. If you don’t know this kata, read its description for a better understanding. I have published my whole solution on GitHub. I tried to make a commit after every step, so you can follow my steps in the Git log. The chosen programming language is Java.

Solving Gilded Rose Step-By-Step

Let’s have a look at what I have done step-by-step.

Before adding the new feature, I wanted to refactor the given code base. Therefore, I started writing tests until I had 100% line and branch coverage. While writing the tests, I had the idea that the quality calculation depends on the name of the item. Hence, the idea arose to use something similar to the Strategy Pattern. When I reached 100% coverage, I wanted to start with the implementation of the first strategy (“Aged Brie”). But I was unsure what the limit values for this first strategy were. My problem was that I had no tests for the limit values. So my first lesson learned was that 100% line or branch coverage doesn’t mean all test cases are covered. I added tests for the limit values, finished implementing the “Aged Brie” strategy, added it to the original updateQuality method (see the code snippet below) and ran the tests. All tests were green.


ItemStrategy itemStrategy = new ItemStrategy();
...
for (int i = 0; i < items.length; i++) {
   if("Aged Brie".equals(items[i].name)) {
      items[i] = itemStrategy.updateQualityForAgedBrieItem(items[i]);
      continue;
   }

// original code follows
}

I repeated this cycle four times: find missing test cases (mostly for limit values); add new tests for these cases; implement a further strategy; add this new strategy to the original updateQuality method and run the tests. When the tests were green, the next cycle with a new strategy began. In the end, the extended updateQuality method looked like the following code snippet.

ItemStrategy itemStrategy = new ItemStrategy();

...
for (int i = 0; i < items.length; i++) {
   if("Aged Brie".equals(items[i].name)) {
      items[i] = itemStrategy.updateQualityForAgedBrieItem(items[i]);
      continue;
   } else if ("Sulfuras, Hand of Ragnaros".equals(items[i].name)) {
      items[i] = itemStrategy.updateQualityForSulfurasItem(items[i]);
      continue;
   } else if("Backstage passes to a TAFKAL80ETC concert".equals(items[i].name)) {
      items[i] = itemStrategy.updateQualityForBackstagePassItem(items[i]);
      continue;
   } else {
      items[i] = itemStrategy.updateQualityForNormalItem(items[i]);
      continue;
   }

// commented out original code
}

My second lesson learned was that refactoring needs time, and the refactoring wasn’t finished. The next steps were cleaning up unnecessary code and refactoring the strategy implementations, like replacing if-else constructs with the ternary operator and extracting if-conditions into private methods.

After that I implemented the new feature “conjured item” following the workflow described above. After this step I could say “ready”, but I was unhappy with the if-else-if-else chain. Therefore, I decided to extract each strategy implementation into a class of its own (following the “classic” Strategy Pattern). That made it possible to replace the if-else-if-else chain with an itemStrategyMap, as shown in the snippet below. So the next lesson learned was “the status ‘Ready’ depends on the definition”.
The last step was cleaning up and choosing better names for the interface and its method.
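
The renamed strategy interface itself is small. A sketch of how it might look (the exact names are in the GitHub repository):

public interface ItemStrategy {

    Item updateItem(Item item);
}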


static Map<String, ItemStrategy> itemStrategyMap = new HashMap<>();

static {
   itemStrategyMap.put("Aged Brie", new AgedBrieItemStrategy());
   itemStrategyMap.put("Sulfuras, Hand of Ragnaros", new SulfurasItemStrategy());
   itemStrategyMap.put("Backstage passes to a TAFKAL80ETC concert", new BackstagePassItemStrategy());
   itemStrategyMap.put("Conjured", new ConjuredItemStrategy());
}

public void updateQuality() {
   for (int i = 0; i < items.length; i++) {
      ItemStrategy itemStrategy = itemStrategyMap.getOrDefault(items[i].name, new NormalItemStrategy());
      items[i] = itemStrategy.updateItem(items[i]);
   }
}

Let’s summarize the lessons learned:
1) 100% line or branch coverage doesn’t mean all test cases are covered.
2) Refactoring needs time.
3) The status ‘Ready’ depends on the definition.
These insights aren’t really new to me; I often observe them in my daily work. Nevertheless, it was good to have these insights again, following the rule “learning through repetition” ☺

What I forgot

I stopped after that step. Thinking about it some days later, I realized that there are more possible improvements. For example, the tests in the GildedRoseTest class could be extracted into separate test classes corresponding to the specific strategy classes.



Commons VFS, SSHJ and JSch in Comparison

Some weeks ago I evaluated some SSH libraries for Java. The main requirements for them are file transfers and file operations on a remote machine. For this, there exists a network protocol based on SSH, the SSH File Transfer Protocol (SFTP). So I needed an SSH library that supports SFTP.

Some research shows that many SSH libraries exist for Java. I reduced the number of libraries to three for the comparison. I chose JSch, SSHJ and Apache’s Commons VFS for a deeper look. All of them support SFTP. JSch seems to be the de-facto standard for Java. SSHJ is a newer library; its goal is to have a clear Java API for SSH. The goal of Commons VFS is to have a clear API for virtual file systems, and SFTP is one of the supported protocols. Under the hood it uses JSch for the SFTP protocol. The libraries should cover the following requirements:

  • client authentication over password
  • client authentication over public key
  • server authentication
  • upload files from local host over SFTP
  • download files to local host over SFTP
  • file operations on the remote host like move, delete, list all children of a given folder (filtering by type, like file or folder) over SFTP
  • execute plain shell commands

Let’s have a deeper look at how the three libraries cover the requirements.

Client Authentication

All three libraries support both required authentication methods. SSHJ has the clearest API for authentication (SSHClient.authPassword(), SSHClient.authPublickey()).


SSHClient sshClient= new SSHClient();
sshClient.connect(host);

// only for public key authentication
sshClient.authPublickey("user", "location to private key file");

// only for password authentication
sshClient.authPassword("user", "password");

In Commons VFS the authentication configuration depends on which kind of authentication is used. For public key authentication, the private key has to be set in the FileSystemOptions, and the user name is part of the connection URL. For password authentication, user name and password are part of the connection URL.


StandardFileSystemManager fileSystemManager = new StandardFileSystemManager();
fileSystemManager.init();

// only for public key authentication
SftpFileSystemConfigBuilder sftpConfigBuilder = SftpFileSystemConfigBuilder.getInstance();
FileSystemOptions opts = new FileSystemOptions();
sftpConfigBuilder.setIdentities(opts, new File[]{privateKey.toFile()});
String connectionUrl = String.format("sftp://%s@%s", user, host);

// only for password authentication
String connectionUrl = String.format("sftp://%s:%s@%s", user, password, host);

// Connection set-up
FileObject remoteRootDirectory = fileSystemManager.resolveFile(connectionUrl, opts);

The authentication configuration in JSch is similar to Commons VFS: it depends on which kind of authentication is used. The private key for public key authentication has to be configured in the JSch object, and the password for password authentication has to be set in the Session object. For both, the user name is set when the Session object is obtained from the JSch object.


JSch sshClient = new JSch();

// only for public key authentication
sshClient.addIdentity("location to private key file");

session = sshClient.getSession(user, host);

// only for password authentication
session.setPassword(password);

session.connect();

Server Authentication

All three libraries support server authentication. In SSHJ the server authentication can be enabled with SSHClient.loadKnownHosts(). It is possible to pass a custom location of the known_hosts file, or the platform-specific default location is used.


SSHClient sshClient = new SSHClient();
sshClient.loadKnownHosts(); // or sshClient.loadKnownHosts(knownHosts.toFile());
sshClient.connect(host);

In Commons VFS the server authentication configuration is, like the public key authentication, part of the FileSystemOptions. There, the location of the known_hosts file can be set.


SftpFileSystemConfigBuilder sftpConfigBuilder = SftpFileSystemConfigBuilder.getInstance();
FileSystemOptions opts = new FileSystemOptions();
sftpConfigBuilder.setKnownHosts(opts, new File("location of the known_hosts file"));

In JSch there are two possibilities to configure the server authentication. One possibility is to use the OpenSSHConfig (see the JSch example for OpenSSHConfig). The other possibility is easier: the location of the known_hosts file can be set directly in the JSch object.


JSch sshClient = new JSch();
sshClient.setKnownHosts("location of known-hosts file");

Upload/download Files Over SFTP

All three libraries support uploading and downloading files over SFTP. SSHJ has a very clear API for these operations. The SSHClient object creates an SFTPClient object. This object is responsible for the upload (SFTPClient.put) and for the download (SFTPClient.get).


SSHClient sshClient = new SSHClient();
// ... connection

try (SFTPClient sftpClient = sshClient.newSFTPClient()) {
  // download
  sftpClient.get(remotePath, new FileSystemFile(local.toFile()));
  // upload
  sftpClient.put(new FileSystemFile(local.toFile()), remotePath);
}

In Commons VFS, uploading and downloading files are abstracted as operations on a file system. So both are represented by the copyFrom method of a FileObject: upload is a copyFrom operation on a RemoteFile object and download is a copyFrom operation on a LocalFile object.


StandardFileSystemManager fileSystemManager = new StandardFileSystemManager();
// ... configuration
remoteRootDirectory = fileSystemManager.resolveFile(connectionUrl, connectionOptions);

LocalFile localFileObject = (LocalFile) fileSystemManager.resolveFile(local.toUri().toString());
FileObject remoteFileObject = remoteRootDirectory.resolveFile(remotePath);
try {
  // download
  localFileObject.copyFrom(remoteFileObject, new AllFileSelector());

  // upload
  remoteFileObject.copyFrom(localFileObject, new AllFileSelector());
} finally {
  localFileObject.close();
  remoteFileObject.close();
}

JSch also supports an SFTP client; in JSch it is called ChannelSftp. It has two methods, one for download (ChannelSftp.get) and one for upload (ChannelSftp.put).


// here: creation and configuration of session

ChannelSftp sftpChannel = null;
try {
  sftpChannel = (ChannelSftp) session.openChannel("sftp");
  sftpChannel.connect();

  // download
  InputStream inputStream = sftpChannel.get(remotePath);
  Files.copy(inputStream, localPath);

  // upload
  OutputStream outputStream = sftpChannel.put(remotePath);
  Files.copy(localPath, outputStream);
} catch (SftpException | JSchException ex) {
  throw new IOException(ex);
} finally {
  if (sftpChannel != null) {
    sftpChannel.disconnect();
  }
}

Execute Shell Commands

Only Commons VFS doesn’t support executing plain shell commands. In SSHJ it is a two-liner: the SSHClient starts a new Session object, and this object executes the shell command. It is very intuitive.


// creation and configuration of sshClient

try (Session session = sshClient.startSession()) {
  session.exec("ls");
}

In JSch the ChannelExec is responsible for executing shell commands over SSH. First the command is set on the channel, then the channel is connected. It isn’t as intuitive as in SSHJ.


// here: creation and configuration of session object

ChannelExec execChannel = null;
try {
  execChannel = (ChannelExec) session.openChannel("exec");
  execChannel.setCommand(command);
  execChannel.connect();
} catch (JSchException ex) {
  throw new IOException(ex);
} finally {
  if (execChannel != null) {
    execChannel.disconnect();
  }
}

File Operations On the Remote Hosts

All three libraries support file operations over SFTP on remote machines more or less completely. In SSHJ the SFTPClient also has methods for file operations. The names of the methods are the same as the corresponding file commands on a Linux system. The following code snippet shows how to delete a file.


//here: creation and configuration of sshClient

try (SFTPClient sftpClient = sshClient.newSFTPClient()) {
  sftpClient.rm(remotePath);
}
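
Listing the children of a remote folder and filtering them by type is also covered by the SFTPClient. A minimal sketch (assuming sshClient is already connected and authenticated, and remotePath exists):

// here: creation and configuration of sshClient

try (SFTPClient sftpClient = sshClient.newSFTPClient()) {
  // list all children of the remote folder and keep only the directories
  for (RemoteResourceInfo child : sftpClient.ls(remotePath)) {
    if (child.isDirectory()) {
      System.out.println(child.getName());
    }
  }
}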

Commons VFS’s core functionality is file operations. The usage takes some getting used to: a file object has to be resolved, and then the file operations can be performed on it.


// here: creation and configuration of remoteRootDirectory

FileObject remoteFileObject = remoteRootDirectory.resolveFile(remotePath);
try {
  remoteFileObject.delete();
} finally {
  remoteFileObject.close();
}

JSch’s SFTP client ChannelSftp also has methods for file operations, and most file operations are supported by this channel. But, for example, copying a file on the remote machine has to be done with plain shell commands over the ChannelExec.

// here: creation and configuration of session
ChannelSftp sftpChannel = null;
try {
  sftpChannel = (ChannelSftp) session.openChannel("sftp");
  sftpChannel.connect();
  sftpChannel.rm(remotePath);
} catch (SftpException | JSchException ex) {
  throw new IOException(ex);
} finally {
  if (sftpChannel != null) {
    sftpChannel.disconnect();
  }
}

Conclusion

After this comparison I have two favourites, SSHJ and Commons VFS. SSHJ has a very clear API, and I would choose it if I need a common SSH client or if file operation support over SFTP is sufficient. I would choose Commons VFS if I do file operations over many file system protocols or if a common SSH client is not needed. In the case that I need both, I could use JSch directly to execute commands over SSH. The API of Commons VFS takes some getting used to, but after understanding the concept behind it, its usage is straightforward.

The whole source code of this comparison is hosted on GitHub.

Useful Links

  1. SSHJ homepage
  2. JSch homepage
  3. Commons VFS homepage
  4. Wikipedia page about SFTP
  5. Source Code of this comparison on Github



Unit And Integration Test Reports For Maven Projects In SonarQube

Since SonarQube 4.2, the test report isn’t generated by the Sonar Maven Plugin during a Maven build (see SonarQube’s blog post). Therefore, the test report has to be generated by another plugin before the Sonar Maven Plugin collects the information for the SonarQube server. Here, the Jacoco Maven Plugin can help. It can generate test reports that SonarQube understands. The Jacoco Maven Plugin goes one step further: it can also generate a test report for integration tests.

In the following sections, a solution is presented that meets following criteria:

  • Maven is used as build tool.
  • The project can be a multi module project.
  • Unit tests and integration tests are parts of each module. Here, integration tests are tests that test the integration between classes in a module.
  • Test reports are separated into a unit test report and an integration test report.

The roadmap for the next sections: first, the Maven project structure for separating unit and integration tests is shown. Then the Maven project configuration for running unit tests and integration tests separately follows. After that, we have a look at the Maven project configuration for generating test reports, separated into unit tests and integration tests. At the end, SonarQube’s configuration for the test report visualization in the SonarQube dashboard is shown.

Maven Project Structure

At first, we look at what a default Maven project structure looks like for a single-module project.

my-app
├── pom.xml
├── src
│   ├── main
│   │   └── java
│   │       └──
│   └── test
│       └── java
│           └──

The directory src/main/java contains the production source code and the directory src/test/java contains the test source code. We could put unit tests and integration tests together in this directory, but we want to separate these two types of tests into separate directories. Therefore, we add a new directory called src/it/java. Unit tests are put in the directory src/test/java and integration tests are put in the directory src/it/java, so the new project structure looks like the following one.

my-app
├── pom.xml
├── src
│   ├── it
│   │   └── java
│   │       └──
│   ├── main
│   │   └── java
│   │       └──
│   └── test
│       └── java
│           └──

Unit And Integration Test Runs

Fortunately, the unit test run configuration is part of the default Maven project configuration. Maven runs these tests automatically if the following criteria are met:

  • The tests are in the directory src/test/java and
  • the test class name either starts with Test or ends with Test or TestCase.

Maven runs these tests during the Maven build lifecycle phase test.

The integration test run configuration has to be done manually. There are Maven plugins that can help. We want the following criteria to be met:

  • integration tests are stored in the directory src/it/java and
  • the integration test class name either starts with IT or ends with IT or ITCase and
  • integration tests run during the Maven build lifecycle phase integration-test.

Firstly, Maven has to know that it has to include the directory src/it/java in its test classpath. Here, the Build Helper Maven Plugin can help. It adds the directory src/it/java to the test classpath.


<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <id>add-test-source</id>
      <phase>process-test-sources</phase>
      <goals>
        <goal>add-test-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>src/it/java</source>
        </sources>
      </configuration>
     </execution>
     <execution>
       <id>add-test-resources</id>
       <phase>generate-test-resources</phase>
       <goals>
         <goal>add-test-resource</goal>
       </goals>
       <configuration>
          <resources>
            <resource>
              <directory>src/it/resources</directory>
            </resource>
          </resources>
       </configuration>
     </execution>
   </executions>
 </plugin>

The above code snippet has to be inserted into the section <project><build><plugins> in the project root pom.

Maven’s build lifecycle contains a phase called integration-test. In this phase, we want to run the integration tests. Therefore, we bind the Maven Failsafe Plugin to the phase integration-test:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.13</version>
  <configuration>
    <encoding>${project.build.sourceEncoding}</encoding>
  </configuration>
  <executions>
    <execution>
      <id>failsafe-integration-tests</id>
      <phase>integration-test</phase>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Again, the above code snippet has to be inserted into the section <project><build><plugins> in the project root pom. The Maven Failsafe Plugin then runs the integration tests automatically if their class names either start with IT or end with IT or ITCase.

Test Report Generation

We want to use the Jacoco Maven Plugin for the test report generation. It should generate two test reports, one for the unit tests and one for the integration tests. Therefore, the plugin needs two separate agents, which have to be prepared; they then generate the reports during the test runs. Maven’s build lifecycle contains its own preparation phases before the test phases (test and integration-test). The preparation phase for the test phase is called process-test-classes and the preparation phase for the integration-test phase is called pre-integration-test. We bind the Jacoco Maven Plugin to these two phases, so the configuration of this plugin looks like the following code snippet (again, it is part of the section <project><build><plugins>):

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.7.2.201409121644</version>
  <executions>
    <execution>
      <configuration>
        <destFile>${sonar.jacoco.reportPath}</destFile>
      </configuration>
      <id>pre-test</id>
      <phase>process-test-classes</phase>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
<!-- we want to execute jacoco:prepare-agent-integration before the maven failsafe plugin executes the integration tests -->
    <execution>
      <configuration>
        <destFile>${sonar.jacoco.itReportPath}</destFile>
      </configuration>
      <id>pre-itest</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>prepare-agent-integration</goal>
      </goals>
    </execution>
  </executions>
</plugin>

The configuration element destFile is the path to the location where the test report should be stored. It is important to use the properties ${sonar.jacoco.reportPath} and ${sonar.jacoco.itReportPath}, because SonarQube uses these properties to find the test reports for the visualization.
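
These properties are not predefined by Maven, so they have to be defined in the pom, for example in the properties section of the root pom. One possible definition could look like this sketch (the report locations are assumptions; adjust them to your project):

<properties>
    <!-- assumed report locations, read by JaCoCo during the build and by SonarQube during the analysis -->
    <sonar.jacoco.reportPath>${project.build.directory}/jacoco.exec</sonar.jacoco.reportPath>
    <sonar.jacoco.itReportPath>${project.build.directory}/jacoco-it.exec</sonar.jacoco.itReportPath>
</properties>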

Now we can run mvn install, and our project is built including unit and integration tests and including the generation of the two test reports.

SonarQube Test Report Visualization

Now we want to visualize our test reports in SonarQube. Therefore, we have to run the Sonar Maven 3 Plugin (command mvn sonar:sonar) in our project after a successful build.

When we open our project in the SonarQube dashboard, we only see the report for the unit tests per module. The reason is that the report visualization for the integration tests has to be configured in SonarQube separately. These configuration steps are described very well in the SonarQube documentation.

Summary

This blog post describes how to generate test reports for unit and integration tests during a Maven build. On GitHub, I host a sample project that demonstrates all configuration steps. As the technical environment I use:

  • Maven 3.2.5
  • Maven Plugins:
    • Maven Surefire Plugin
    • Maven Failsafe Plugin
    • Build Helper Maven Plugin
    • Jacoco Maven Plugin
    • Sonar Maven 3 Plugin
  • SonarQube 4.5.1
  • Java 7

Links

  1. SonarQube’s blog post Unit Test Execution in SonarQube
  2. Jacoco Maven plugin project site
  3. Introduction to Maven’s build lifecycle
  4. Maven Failsafe Plugin  project site
  5. Build Helper Maven Plugin project site
  6. SonarQube documentation about Code Coverage by Integration Tests for Java Project
  7. A sample Maven project on GitHub