SKM IT World

Just another blog about IT



Maven Project Setup for Mixing Spock 1.x and JUnit 5 Tests

I created a sample Groovy project for Maven that mixes Spock tests and JUnit 5 tests in one project. In the following sections I'll describe how to set up such a Maven project.

Enable Groovy in the Project

First of all, you have to enable Groovy in your project. One possibility is to add the GMavenPlus Plugin to your project.

<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.gmavenplus</groupId>
            <artifactId>gmavenplus-plugin</artifactId>
            <version>1.6.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>addSources</goal>
                        <goal>addTestSources</goal>
                        <goal>compile</goal>
                        <goal>compileTests</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

The goals addSources and addTestSources add the Groovy main and test sources to Maven's main and test sources. The default locations are src/main/groovy (for main sources) and src/test/groovy (for test sources). The goals compile and compileTests compile the Groovy main and test code. If you don't have Groovy main code, you can omit addSources and compile.
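In such a mixed project, the test sources could be laid out like the following tree (the module name is only illustrative; JUnit 5 tests written in Groovy can also live under src/test/groovy):

my-groovy-app
├── pom.xml
└── src
    ├── main
    │   └── groovy
    └── test
        ├── groovy   (Spock specifications)
        └── java     (JUnit 5 tests written in Java)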

The plugin configuration above always uses the latest released Groovy version. If you want to ensure that a specific Groovy version is used, you have to add that specific Groovy dependency to your classpath.

    <dependencies>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy</artifactId>
            <version>2.5.6</version>
        </dependency>
    </dependencies>

Enable JUnit 5 in the Project

The simplest setup for using JUnit 5 in your project is to add the JUnit Jupiter dependency to your test classpath and to configure the correct version of the Maven Surefire Plugin (at least version 2.22.0).

    <dependencies>
        <!-- ... maybe more dependencies -->
        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.junit</groupId>
                <artifactId>junit-bom</artifactId>
                <version>${junit.jupiter.version}</version>
                <scope>import</scope>
                <type>pom</type>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <build>
        <plugins>
        <!-- other plugins -->
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.22.1</version>
            </plugin>
        </plugins>
    </build>
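The property ${junit.jupiter.version} used above still has to be defined, for example in the <properties> section of the POM. The version number below is only an example; note that the aggregate junit-jupiter artifact and the junit-bom are available from JUnit 5.4.0 on.

    <properties>
        <junit.jupiter.version>5.4.0</junit.jupiter.version>
    </properties>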

Enable Spock in the Project

Choosing the right Spock dependency depends on which Groovy version you are using in the project. In our case, it is Groovy 2.5, so we need Spock in version 1.x-groovy-2.5 on our test classpath.

    <dependencies>
        <!-- more dependencies -->
        <dependency>
            <groupId>org.spockframework</groupId>
            <artifactId>spock-core</artifactId>
            <version>1.3-groovy-2.5</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

Now the expectation is that both the Spock tests and the JUnit 5 tests are executed in the Maven build. But only the JUnit 5 tests are executed by Maven. So what happened?

I started by changing the Maven Surefire Plugin version to 2.21.0. Then the Spock tests were executed, but no JUnit 5 tests. The reason is that as of version 2.22.0 of the Maven Surefire Plugin, the JUnit 4 provider is replaced by the JUnit Platform provider as the default. But Spock in version 1.x is based on JUnit 4. This will change in Spock version 2, which will be based on the JUnit 5 platform. Thus, for Spock 1.x, we have to add the JUnit Vintage dependency to our test classpath.

    <dependencies>
        <!-- more dependencies -->
        <dependency>
            <!-- only necessary for Surefire to run Spock tests during the Maven build -->
            <groupId>org.junit.vintage</groupId>
            <artifactId>junit-vintage-engine</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

This allows running older JUnit (3/4) tests on the JUnit Platform. With this configuration, both Spock and JUnit 5 tests are executed in the Maven build.




Using JUnit 5 In Pre-Java 8 Projects

This post demonstrates how JUnit 5 can be used in pre-Java 8 projects and explains why it could be a good idea.

JUnit 5 requires at least Java 8 as a runtime environment, so you would like to update your whole project to Java 8. But sometimes there are reasons why you can't update your project to Java 8 immediately. For example, the version of your application server in production only supports Java 7, and that update can't be done quickly because of some issues in your production code.

Now the question is: how can you use JUnit 5 without updating your production code to Java 8?

You can set up the Java version separately for production code and for test code.

<!-- in Maven -->
<build>
  <plugins>
    <plugin>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>7</source>
        <target>7</target>
        <testSource>8</testSource>
        <testTarget>8</testTarget>
      </configuration>
    </plugin>
  </plugins>
</build>
// in Gradle
sourceCompatibility = '7'
targetCompatibility = '7'

compileTestJava {
  sourceCompatibility = '8'
  targetCompatibility = '8'
}

The precondition is that you use a Java 8 JDK for your build.

If you try to use a Java 8 feature in your Java 7 production code, Maven and Gradle will fail the build.

// Maven
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:compile (default-compile) on project junit5-in-pre-java8-projects: Compilation failure
[ERROR] /home/sparsick/dev/workspace/junit5-example/junit5-in-pre-java8-projects/src/main/java/Java7Class.java:[8,58] lambda expressions are not supported in -source 7
[ERROR]   (use -source 8 or higher to enable lambda expressions)
// Gradle
> Task :compileJava FAILED
/home/sparsick/dev/workspace/junit5-example/junit5-in-pre-java8-projects/src/main/java/Java7Class.java:8: error: lambda expressions are not supported in -source 7
Function<String, String > java8Feature = (input) -> input;
^
(use -source 8 or higher to enable lambda expressions)
1 error

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':compileJava'.
> Compilation failed; see the compiler error output for details.

Now you can introduce JUnit 5 in your project and start writing tests with JUnit 5.

<!-- in Maven -->
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-api</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-engine</artifactId>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-params</artifactId>
  <scope>test</scope>
</dependency>
<!-- junit-vintage-engine is needed for running older JUnit 4 tests with JUnit 5 -->
<dependency>
  <groupId>org.junit.vintage</groupId>
  <artifactId>junit-vintage-engine</artifactId>
  <scope>test</scope>
</dependency>
// in Gradle
dependencies {
  testCompile 'org.junit.jupiter:junit-jupiter-api:5.3.2'
  testCompile 'org.junit.jupiter:junit-jupiter-engine:5.3.2'
  testCompile 'org.junit.jupiter:junit-jupiter-params:5.3.2'
  testCompile 'org.junit.vintage:junit-vintage-engine:5.3.2'
}

Your old JUnit 4 tests need not be migrated, because JUnit 5 has a test engine that can run JUnit 4 tests on the JUnit 5 platform. So use JUnit 5 for new tests and only migrate JUnit 4 tests if you have to touch them anyway.

Although you can't update your production code to a newer Java version, there are some benefits to updating your test code to a newer one.

The biggest benefit is that you can start learning new language features during your daily work when you write tests, so you don't make beginner's mistakes in production code. You also get access to new tools that can help improve your tests. For example, in JUnit 5 it's more comfortable to write parameterized tests than in JUnit 4. In my experience, developers are more likely to write parameterized tests with JUnit 5 than with JUnit 4 in situations where parameterized tests make sense.
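As an illustration, a parameterized test in JUnit 5 needs only a few annotations, and the test code may already use Java 8 features such as lambdas while the production code stays on Java 7 (the class and method names below are made up):

import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.function.Function;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.ValueSource;

class GreetingTest {

    // a Java 8 lambda, allowed here because the test code is compiled with -source 8
    private final Function<String, String> greeting = name -> "Hello " + name;

    @ParameterizedTest
    @ValueSource(strings = {"World", "JUnit 5"})
    void greetsEveryName(String name) {
        assertEquals("Hello " + name, greeting.apply(name));
    }
}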

The technique described above also works for other Java versions. For example, your production code is on Java 11 and you want to use Java 12 features in your test code. Another use case for this technique could be learning another JVM language like Groovy, Kotlin or Clojure in your daily work. Then use the new language in your test code.

For Maven projects, this approach has one little pitfall. IntelliJ IDEA ignores the Java version configuration for test code. It uses the Java version configured in the production code section for the whole project. An issue is already opened, and a workaround is also described in this issue. So only the Maven build gives you feedback on whether your production code uses the correct Java version. IntelliJ doesn't have this problem for Gradle projects. Here, it uses the Java versions just as they are configured in the Gradle build file.

The situation in NetBeans looks better for Maven projects. NetBeans loads the Java configuration for Maven projects correctly. For Gradle projects, I couldn't check it, because for NetBeans 10 there is no Gradle plugin yet (status: January 2019; one exists for NetBeans 9, so maybe something will come).




How to Mark a Jenkins Job Red When Tests Fail In A Maven Build

The default setting in Jenkins is to mark a job yellow when a Maven build fails because of failing tests. If you don't want to have three statuses for your jobs, you can configure Jenkins so that jobs are also marked red regardless of why a Maven build fails.

For this you will need administration rights on your Jenkins instance. The following steps have to be done:

  1. Go to Manage Jenkins -> Configure System.
  2. Add -Dmaven.test.failure.ignore=false to Maven Project Configuration -> Global MAVEN_OPTS.
  3. Save this change and that’s it.

Your next job run will take this configuration into account. Unfortunately, this configuration only has an effect on Maven jobs. Freestyle jobs ignore this configuration (see also this bug).

But a workaround exists:

  1. Install the Text Finder plugin via Manage Jenkins -> Manage Plugins.
  2. Open the configuration of the Freestyle job that should be marked red when Maven tests fail.
  3. Click on Add a post-build action (in section Post-build Action) and select Jenkins Text Finder.
  4. Activate the check box Also search the console output.
  5. Add the value There are test failures to Regular expression.
  6. Save this change.

 

 

 



Unit And Integration Test Reports For Maven Projects In SonarQube

Since SonarQube 4.2, the test report isn't generated by the Sonar Maven Plugin during a Maven build anymore (see SonarQube's blog post). Therefore, the test report has to be generated by another plugin before the Sonar Maven Plugin collects the information for the SonarQube server. Here, the Jacoco Maven Plugin can help. It can generate test reports that SonarQube understands. The Jacoco Maven Plugin goes one step further: it can also generate a test report for integration tests.

In the following sections, a solution is presented that meets following criteria:

  • Maven is used as build tool.
  • The project can be a multi module project.
  • Unit tests and integration tests are parts of each module. Here, integration tests are tests that test the integration between classes in a module.
  • Test reports are separated into a unit test report and an integration test report.

The road map for the next sections is as follows: first, the Maven project structure for the separation of unit and integration tests is shown. Then the Maven project configuration for separate unit test runs and integration test runs is shown. After that, we have a look at the Maven project configuration for the test report generation, separated into unit tests and integration tests. At the end, SonarQube's configuration for the test report visualization in the SonarQube dashboard is shown.

Maven Project Structure

At first, we look at what a default Maven project structure looks like for a single module project.

my-app
├── pom.xml
├── src
│   ├── main
│   │   └── java
│   │       └──
│   └── test
│       └── java
│           └──

The directory src/main/java contains the production source code and the directory src/test/java contains the test source code. We could put unit tests and integration tests together in this directory. But we want to separate these two types of tests in separate directories. Therefore, we add a new directory called src/it/java. Then unit tests are put in the directory src/test/java and the integration tests are put in the directory src/it/java, so the new project structure looks like the following one.

my-app
├── pom.xml
├── src
│   ├── it
│   │   └── java
│   │       └──
│   ├── main
│   │   └── java
│   │       └──
│   └── test
│       └── java
│           └──

Unit And Integration Test Runs

Fortunately, the unit test run configuration is part of the Maven default project configuration. Maven runs these tests automatically if the following criteria are met:

  • The tests are in the directory src/test/java and
  • the test class name either starts with Test or ends with Test or TestCase.

Maven runs these tests during Maven's build lifecycle phase test.

The integration test run configuration has to be done manually. There are Maven plugins that can help. We want the following criteria to be met:

  • integration tests are stored in the directory src/it/java and
  • the integration test class name either starts with IT or ends with IT or ITCase and
  • integration tests run during Maven's build lifecycle phase integration-test.

Firstly, Maven has to know that it has to include the directory src/it/java in its test classpath. Here, the Build Helper Maven Plugin can help. It adds the directory src/it/java to the test classpath.


<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <id>add-test-source</id>
      <phase>process-test-sources</phase>
      <goals>
        <goal>add-test-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>src/it/java</source>
        </sources>
      </configuration>
     </execution>
     <execution>
       <id>add-test-resources</id>
       <phase>generate-test-resources</phase>
       <goals>
         <goal>add-test-resource</goal>
       </goals>
       <configuration>
          <resources>
            <resource>
              <directory>src/it/resources</directory>
            </resource>
          </resources>
       </configuration>
     </execution>
   </executions>
 </plugin>

The above code snippet has to be inserted into the section <project><build><plugins> in the project root pom.

Maven's build lifecycle contains a phase called integration-test. In this phase, we want to run the integration tests. Therefore, we bind the Maven Failsafe Plugin to the phase integration-test:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.13</version>
  <configuration>
    <encoding>${project.build.sourceEncoding}</encoding>
  </configuration>
  <executions>
    <execution>
      <id>failsafe-integration-tests</id>
      <phase>integration-test</phase>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Again, the above code snippet also has to be inserted into the section <project><build><plugins> in the project root POM. Then the Maven Failsafe Plugin runs the integration tests automatically when their class name either starts with IT or ends with IT or ITCase.
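For illustration, a minimal integration test class in src/it/java could look like this (the class name GreetingIT is made up; only the IT suffix and the location matter for the Failsafe Plugin):

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class GreetingIT {

    @Test
    public void greetingContainsName() {
        // in a real project this would wire several classes of the module together
        String greeting = "Hello " + "World";
        assertTrue(greeting.contains("World"));
    }
}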

Test Report Generation

We want to use the Jacoco Maven Plugin for the test report generation. It should generate two test reports, one for the unit tests and one for the integration tests. Therefore, two separate agents have to be prepared; they then generate the reports during the test runs. Maven's build lifecycle contains its own phases for preparation before the test phases (test and integration-test). The preparation phase for the test phase is called process-test-classes and the preparation phase for the integration-test phase is called pre-integration-test. We bind the Jacoco Maven Plugin to these two phases, so the configuration of this plugin looks like the following code snippet (again, it is a part of the section <project><build><plugins>):

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.7.2.201409121644</version>
  <executions>
    <execution>
      <configuration>
        <destFile>${sonar.jacoco.reportPath}</destFile>
      </configuration>
      <id>pre-test</id>
      <phase>process-test-classes</phase>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <!-- we want to execute jacoco:prepare-agent-integration in the test phase,
         but before executing the maven failsafe plugin -->
    <execution>
      <configuration>
        <destFile>${sonar.jacoco.itReportPath}</destFile>
      </configuration>
      <id>pre-itest</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>prepare-agent-integration</goal>
      </goals>
    </execution>
  </executions>
</plugin>

The configuration element destFile is the path to the location where the test reports should be stored. It is important to use the properties ${sonar.jacoco.reportPath} and ${sonar.jacoco.itReportPath}. These properties are used by SonarQube to find the test reports for the visualization.
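These properties have to be defined somewhere, for example in the <properties> section of the root POM. The paths below are only an example; the sample project on GitHub shows the values actually used there.

<properties>
  <sonar.jacoco.reportPath>${project.build.directory}/jacoco.exec</sonar.jacoco.reportPath>
  <sonar.jacoco.itReportPath>${project.build.directory}/jacoco-it.exec</sonar.jacoco.itReportPath>
</properties>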

Now we can run mvn install and our project is built, including the unit and integration tests and the generation of the two test reports.

SonarQube Test Report Visualization

Now we want to visualize our test reports in SonarQube. Therefore, we have to run the Sonar Maven 3 Plugin (command mvn sonar:sonar) in our project after a successful build.

When we open our project in the SonarQube dashboard, we only see the report for the unit tests per module. The reason is that the report visualization of the integration tests has to be configured separately in SonarQube. These configuration steps are described very well in the SonarQube documentation.

Summary

This blog post describes how to generate test reports for unit and integration tests during a Maven build. On GitHub, I host a sample project that demonstrates all configuration steps. As the technical environment I use

  • Maven 3.2.5
  • Maven Plugins:
    • Maven Surefire Plugin
    • Maven Failsafe Plugin
    • Build Helper Maven Plugin
    • Jacoco Maven Plugin
    • Sonar Maven 3 Plugin
  • SonarQube 4.5.1
  • Java 7

Links

  1. SonarQube’s blog post Unit Test Execution in SonarQube
  2. Jacoco Maven plugin project site
  3. Introduction to Maven’s build lifecycle
  4. Maven Failsafe Plugin  project site
  5. Build Helper Maven Plugin project site
  6. SonarQube documentation about Code Coverage by Integration Tests for Java Project
  7. A sample Maven project on GitHub



Generate P2 Repository From Maven Dependencies

A new version of this blog post is published under the title "Generate P2 Repository From Maven Artifacts in 2017".

Motivation

If you use Maven Tycho to build your Eclipse RCP application, you can use Maven dependencies as you know them from Maven (for more information see [1]). I noticed that Tycho takes a long time to compute the target platform from the Maven dependencies during the build. I expect a speed-up of the build time if I use a P2 repository instead of the Maven dependencies directly. Therefore, the task is how to get the Maven dependencies into a P2 repository.

Procedure for generating a P2 repository from Maven dependencies

We generate the P2 repository with the help of some Maven plugins and some Maven Tycho plugins:

  • Build Helper Maven Plugin
  • Maven Dependency Plugin
  • Maven Tycho’s P2 Extra Plugin
  • Maven Tycho’s P2 Plugin
  • Maven Tycho’s P2 Repository Plugin

Maven Tycho's plugins use Eclipse standard tools internally. How the Eclipse standard tools work is described well in this blog post [2]. Further useful information about Tycho's plugins can be found in [3] and [4].

We create a POM project and configure the plugins mentioned above, so that the following procedure can work:

  1. Define the Maven dependencies that should be added to the P2 repository in the <dependencies> section.
  2. Copy these defined dependencies to the source location of the Feature and Bundle Publisher with the Maven Dependency Plugin (a sketch of this step is shown below).
  3. Generate the P2 repository with the P2 Extra Plugin.
  4. Add categories to the P2 metadata with the P2 Plugin, so that you can see your P2 repository in the Eclipse Target Platform wizard.
  5. Zip the P2 repository with the P2 Repository Plugin.
  6. Attach the zipped P2 repository with the Build Helper Maven Plugin, so that it is installed and deployed to the Maven repository during the deploy phase.

You can find the whole POM configuration in [5].
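To illustrate step 2, the copy step could roughly look like the following snippet. The output directory is only an assumption here; the exact source location the publisher expects can be looked up in [3] and in the whole POM in [5].

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-dependencies-for-p2</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- assumed folder from which the Feature and Bundle Publisher reads the bundles -->
        <outputDirectory>${project.build.directory}/source/plugins</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>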

How to use zipped P2 Repository from Maven Repository

If you use Nexus as your Maven repository, you can use the Nexus Unzip Plugin ([6]).

Links

[1] Maven Tycho How to – Dependency on pom-first artifacts
[2] Blog Post about Generation P2 Repository with Eclipse Standard Tool
[3] P2 Extra Plugin Project Site
[4] Sonatype Wiki Page about P2 Extra Plugin
[5] POM Project on GitHub
[6] Nexus Unzip Plugin Documentation



Spring Web Application With Hessian Services As a Maven Project

This post describes how to set up a Maven project for a Spring web application with Hessian services. It also shows how to set up the deployment for exposing a Hessian service and how to set up a client to consume the Hessian service. The Spring Framework version is 3.1.1.RELEASE and the Hessian version is 4.0.7.

Maven Set Up For Server

Our Maven project has three modules

  • hello-world-api
  • hello-world-impl
  • hello-world-war

<modules>
  <module>hello-world-api</module>
  <module>hello-world-impl</module>
  <module>hello-world-war</module>
</modules>

The module hello-world-api contains the interfaces of the services that the Hessian server and the Hessian client need for the communication. The module hello-world-impl contains the implementation of the services that are deployed on the server side. The module hello-world-war contains the configuration for the servlet container and the configuration of which services should be exported as Hessian services.

The Service Interface Definition

The module hello-world-api should become a JAR artifact, so the packaging of this Maven module is jar (the default):


<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">

  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.github.skosmalla.spring.hessian</groupId>
    <artifactId>hello-world-spring-hessian</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </parent>

  <artifactId>hello-world-api</artifactId>

  <name>Hello World Api</name>

</project>

The JAR contains the interfaces of the services. An example:


package com.github.skosmalla.hello.world.spring.hessian;

public interface HelloWorld {

  public String welcome();

}

The Service Implementation

The module hello-world-impl also becomes a JAR artifact. This JAR contains the service implementation for the server. The service implementation could look like the following code:


package com.github.skosmalla.hello.world.spring.hessian;

public class HelloWorldImpl implements HelloWorld {

  @Override
  public String welcome() {
    return "Hello World";
  }

}

So that this implementation can be used as a Hessian service, it has to be defined as a Spring bean. Therefore, we need a Spring configuration file hello-world-service-config.xml. The location for this file is src/main/resources/META-INF/spring. The bean configuration looks like the following code:


<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

  <bean id="helloWorldService" class="com.github.skosmalla.hello.world.spring.hessian.HelloWorldImpl"/>

</beans>

In this module we have two dependencies, one to the API module and one to the spring-beans module.


<dependencies>
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-beans</artifactId>
  </dependency>
  <dependency>
    <groupId>com.github.skosmalla.spring.hessian</groupId>
    <artifactId>hello-world-api</artifactId>
  </dependency>
</dependencies>
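The versions of these dependencies are not declared here; they are managed centrally. A sketch of how the dependencyManagement section in the parent POM hello-world-spring-hessian could look, using the versions mentioned in the introduction (the other Spring modules used in the WAR and client modules would be managed the same way; the complete parent POM is part of the GitHub project):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-beans</artifactId>
      <version>3.1.1.RELEASE</version>
    </dependency>
    <dependency>
      <groupId>com.caucho</groupId>
      <artifactId>hessian</artifactId>
      <version>4.0.7</version>
    </dependency>
    <dependency>
      <groupId>com.github.skosmalla.spring.hessian</groupId>
      <artifactId>hello-world-api</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>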

The WAR Deployment

The module hello-world-war describes the configuration for the server deployment. The artifact of this module becomes a WAR file. Therefore, the packaging of this Maven module is war.


<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">

  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.github.skosmalla.spring.hessian</groupId>
    <artifactId>hello-world-spring-hessian</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </parent>

  <artifactId>hello-world-war</artifactId>

  <name>Hello World WAR </name>

  <packaging>war</packaging>

  <build>
    <defaultGoal>install</defaultGoal>
  </build>
</project>

Now, we have to do two things for running our server application on a servlet container:

  1. Add configuration of Spring application context with our implementation to the servlet container.
  2. Add configuration to dispatch requests to our Hessian service.

To create an ApplicationContext instance in a web application, we have to configure a ContextLoaderListener in our Web Application Deployment Descriptor (location: src/main/webapp/WEB-INF/web.xml):

<context-param>
  <param-name>contextConfigLocation</param-name>
  <param-value>classpath:META-INF/spring/*.xml</param-value>
</context-param>

<listener>
  <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>

The ContextLoaderListener builds the root application context from all Spring configuration files located in the classpath under META-INF/spring (that pattern matches our hello-world-service-config.xml). But no requests can be processed yet. Therefore, we have to configure a servlet that dispatches the requests to the service. Here, the Spring Framework supports us with the DispatcherServlet. To use it, we have to add that servlet to our Web Application Deployment Descriptor (location: src/main/webapp/WEB-INF/web.xml):

<servlet>
  <servlet-name>hessian</servlet-name>
  <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
  <load-on-startup>1</load-on-startup>
</servlet>

<servlet-mapping>
  <servlet-name>hessian</servlet-name>
  <url-pattern>/hessian/*</url-pattern>
</servlet-mapping>

That configuration means that the servlet is named hessian and is responsible for all requests to the URL http://<URL to the Tomcat instance>/<webapp-context>/hessian/*. Now we have to configure the Hessian service interfaces that are dispatched by the servlet hessian. For this, we have to add a Spring configuration file in the same location as the Web Application Deployment Descriptor (src/main/webapp/WEB-INF). The name of that file has to be hessian-servlet.xml (the pattern is <servlet name>-servlet.xml). Here, we configure the Hessian service interface:


<bean name="/HelloWorldService" class="org.springframework.remoting.caucho.HessianServiceExporter">
  <property name="service" ref="helloWorldService" />
  <property name="serviceInterface" value="com.github.skosmalla.hello.world.spring.hessian.HelloWorld" />
</bean>

That Spring configuration file defines a new application context. It is a child application context of the root application context loaded by the ContextLoaderListener. A child application context can see every bean of the root application context, but the root application context cannot see the beans of its child application contexts (for more information, have a look at the Spring Framework reference). The HessianServiceExporter has a reference to the service implementation defined in the root application context.
The URL of that Hessian service interface is http://<URL to the Tomcat instance>/<webapp-context>/hessian/HelloWorldService (the pattern is http://<URL to the Tomcat instance>/<webapp-context>/hessian/<bean name of the HessianServiceExporter>).

So that these configurations can run in a servlet container, we have to add the dependencies that contain the HessianServiceExporter, the DispatcherServlet and the ContextLoaderListener to the pom.xml:


<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-web</artifactId>
</dependency>
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-webmvc</artifactId>
</dependency>
<dependency>
  <groupId>com.github.skosmalla.spring.hessian</groupId>
  <artifactId>hello-world-impl</artifactId>
</dependency>
<dependency>
  <groupId>com.caucho</groupId>
  <artifactId>hessian</artifactId>
</dependency>

The Hessian dependency is needed at runtime and the hello-world-impl contains our business logic and the spring configuration file for the root application context.

With a mvn clean install Maven builds a WAR file in the project’s target folder. This WAR file can be deployed on a Tomcat.

The Hessian Test Client With Spring

Now we write a client to test the Hessian service. Therefore, we create a new Maven module.


<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">

  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>com.github.skosmalla.spring.hessian</groupId>
    <artifactId>hello-world-spring-hessian</artifactId>
    <version>1.0.0-SNAPSHOT</version>
  </parent>

  <artifactId>hello-world-client</artifactId>

  <name>Hello World Client</name>

</project>

Spring Framework offers a HessianProxyFactoryBean for calling the remote HelloWorld service. The configuration for this HessianProxyFactoryBean could look like the following code snippet:


<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">

<bean id="helloWorldService"
  class="org.springframework.remoting.caucho.HessianProxyFactoryBean">
  <property name="serviceUrl"
    value="http:/localhost:8080/hello-world/hessian/HelloWorldService" />
  <property name="serviceInterface"
    value="com.github.skosmalla.hello.world.spring.hessian.HelloWorld" />
</bean>
</beans>

In the property serviceInterface we define the interface of the Hessian service, here com.github.skosmalla.hello.world.spring.hessian.HelloWorld. In the property serviceUrl we define the URL to the Hessian Service deployed on the Tomcat. In our sample the Tomcat is on localhost with port number 8080 and the web application is hello-world.

Now, this factory bean creates a Hessian service proxy for us:


package com.github.skosmalla.hello.world.spring.hessian;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class HessianClient {

  public static void main(String[] args) {
    ApplicationContext appContext = new ClassPathXmlApplicationContext("META-INF/spring/hessian-config.xml");

    HelloWorld service = (HelloWorld) appContext.getBean("helloWorldService");

    String welcomeMessage = service.welcome();

    System.out.println(welcomeMessage);

  }
}

The dependencies for the client are the following ones:

<dependencies>
  <dependency>
    <groupId>com.github.skosmalla.spring.hessian</groupId>
    <artifactId>hello-world-api</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-web</artifactId>
  </dependency>
  <dependency>
    <groupId>com.caucho</groupId>
    <artifactId>hessian</artifactId>
    <scope>runtime</scope>
  </dependency>
</dependencies>

If we start this client we will get “Hello World” on our command line.

Now we have seen a full example of how to set up a Hessian service in a Spring web application and how to call such a service remotely. The full code can be found on GitHub.




Release Preparing of a Maven Project

This post describes how to prepare a release for a Maven project with a set of Maven plugins.

Background

The release preparation phase consists of the following common steps:

  1. Build the artifact to verify that the build is successful.
  2. Run the unit tests of the artifact.
  3. Update the version of the artifact to the release version.
  4. Commit the artifact to the SCM.
  5. Create a tag of the artifact.
  6. Update the version of the artifact to the next development version.
  7. Commit the artifact to the SCM.

These common steps must be extended when the artifact has snapshot dependencies. Snapshot dependencies appear in the following forms:

  • The parent POM has a snapshot version.
  • The snapshot version of a group of dependencies is abstracted by a Maven property.
  • The Maven project is a multi module project (a special case of "The parent POM has a snapshot version.").

Note that the above forms can appear in combination.

Precondition

An SCM command line tool must be installed.
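In addition, the POM of the project must contain an <scm> section, because the Maven Release Plugin and the Maven SCM Plugin read the repository location from it. The URLs below are only placeholders:

<scm>
  <connection>scm:svn:https://example.com/svn/my-project/trunk</connection>
  <developerConnection>scm:svn:https://example.com/svn/my-project/trunk</developerConnection>
</scm>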

Approaches to the Release Preparation

The Maven project has no dependencies with snapshot version

This case corresponds to the common steps described above. Here, the Maven Release Plugin helps:

  1. Check out the project from SCM.
  2. Call the goal prepare of the Maven Release Plugin:
    mvn release:prepare -Dusername=scm.username -Dpassword=scm.password
    
  3. The Maven Release Plugin starts in an interactive mode with the following questions (a non-interactive variant is sketched below):
    1. Which release version should be used?
    2. What should the name of the tag in the SCM be?
    3. Which development version should be used after the tag?
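If you want to avoid the interactive questions, for example on a CI server, the answers can also be given directly on the command line. The version numbers and the tag name below are only examples:

    mvn --batch-mode release:prepare -Dusername=<scm.username> -Dpassword=<scm.password> -DreleaseVersion=1.0.0 -Dtag=my-project-1.0.0 -DdevelopmentVersion=1.1.0-SNAPSHOT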

Maven multi module project

The approach is similar to the case "The Maven project has no dependencies with snapshot version", except that the sub modules must also be released. In this case the Maven Release Plugin helps:

  1. Check out the project from SCM.
  2. Call the goal prepare of the Maven Release Plugin:
    mvn release:prepare -Dusername=scm.username -Dpassword=scm.password -DautoVersionSubmodules=true
    

    The option autoVersionSubmodules=true means that all sub modules get the same release version and the same development version as the root project.

  3. The Maven Release Plugin starts in an interactive mode with the following questions:
    1. Which release version should be used?
    2. What should the name of the tag in the SCM be?
    3. Which development version should be used after the tag?

The Maven project has a parent POM with a snapshot version

Note that this case doesn't mean the parent POM of the sub modules in a multi module Maven project, but an external parent POM with a snapshot version. The common steps described above are extended by the following steps:

  1. Check, whether the parent POM project must be released.
  2. Update the parent pom version to the current release version.
  3. Follow the common release preparation steps.
  4. After the tag, update the parent POM version to the new development version.

For these steps, the Maven Release Plugin, Versions Maven Plugin and the Maven SCM Plugin help:

  1. Check out the project from SCM.
  2. To update the parent POM version, the goal update-parent of the Versions Maven Plugin helps:
    mvn versions:update-parent -DgenerateBackupPoms=false
    

    The Versions Maven Plugin looks for the current release version of the parent POM in the repository and replaces the snapshot version by the found release version. The option generateBackupPoms=false means that the plugin should not generate a backup of the old POM file.

  3. Commit this change to the SCM with the goal checkin of the Maven SCM Plugin:
    mvn scm:checkin -Dusername=<scm.username> -Dpassword=<scm.password> -Dmessage="[maven-scm-plugin] checkin parent pom release version as a part of the release preparation" -Dincludes=pom.xml
    

    The commit is important because the Maven Release Plugin aborts the prepare goal when it finds a diff between the working copy and the SCM.

  4. Then the above described common steps can be done with the well-known Maven Release Plugin:
    mvn release:prepare -Dusername=<scm.username> -Dpassword=<scm.password>
    
  5. The last step is to update the parent POM to the current development version and to commit this change:
    mvn versions:update-parent -DallowSnapshots=true
    mvn scm:checkin -Dusername=<scm.username> -Dpassword=<scm.password> -Dmessage="[maven-scm-plugin] checkin parent pom development version as a part of the release preparation" -Dincludes=pom.xml
    

The Maven project has dependencies with snapshot version grouped in a Maven property

In this case, the common steps described above are extended with the following steps:

  1. Check, whether the dependencies must be released.
  2. Update the version defined in properties to the current release version.
  3. Execute the common release preparation steps.
  4. Update the version defined in properties to the current development version.

With our well-known Maven plugins it looks like this:

mvn versions:update-properties -DgenerateBackupPoms=false
mvn scm:checkin -Dusername=<scm.username> -Dpassword=<scm.password> -Dmessage="[maven-scm-plugin] checkin DU dependencies release version as a part of the release preparation" -Dincludes=pom.xml
mvn versions:update-properties -DallowSnapshots=true
mvn scm:checkin -Dusername=<scm.username> -Dpassword=<scm.password> -Dmessage="[maven-scm-plugin] checkin DU dependencies development version as a part of the release preparation" -Dincludes=pom.xml

Links

  1. Maven Release Plugin Home Page
  2. Versions Maven Plugin Home Page
  3. Maven SCM Plugin Home Page