SKM IT World

Just another blog about IT



Running Ansible on a Windows System

At my last conference talk (it was about Ansible and Docker at DevOpsCon in Berlin), I was asked what the best way is to run Ansible on a Windows system. Ansible itself requires a Linux-based system as the control machine. When I have to develop on a Windows machine, I install a Linux-based virtual machine and run the Ansible playbooks inside it. I set up the virtual machine with VirtualBox and Vagrant. These tools allow me to share the playbooks easily between the host and the virtual machine, so I can develop the playbooks on the Windows system while the virtual machine keeps a headless setup. The next section shows you how to set up this tool chain.

 Tool Chain Setup

First, install VirtualBox and Vagrant on your machine. I additionally use Babun, a Windows shell based on Cygwin and oh-my-zsh, for a better shell experience on Windows, but this isn't necessary. Then, go to the directory where your Ansible playbooks are located (let's call it ansible-workspace) and create a Vagrant configuration file there with the command vagrant init. The workspace then looks like this:
ansible-workspace
├── demo-app-ansible-deploy-1.0-SNAPSHOT.war
├── deploy-demo.yml
├── inventories
│   ├── production
│   └── test
├── README.md
├── roles
│   ├── deploy-on-tomcat
│   │   ├── defaults
│   │   │   └── main.yml
│   │   └── tasks
│   │       ├── cleanup-webapp.yml
│   │       ├── deploy-webapp.yml
│   │       ├── main.yml
│   │       ├── start-tomcat.yml
│   │       └── stop-tomcat.yml
│   ├── jdk
│   │   └── tasks
│   │       └── main.yml
│   └── tomcat8
│       ├── defaults
│       │   └── main.yml
│       ├── files
│       │   └── init.d
│       │       └── tomcat
│       ├── tasks
│       │   └── main.yml
│       └── templates
│           └── setenv.sh.j2
├── setup-app-roles.yml
├── setup-app.yml
└── Vagrantfile
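For reference, creating the Vagrantfile in that workspace might look like this (a minimal sketch; the box itself is configured in the next step):

> cd ansible-workspace
> vagrant init   # generates a skeleton Vagrantfile in the current directory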


Now, we have to choose a so-called Vagrant box on Vagrant Cloud. A box is the package format for a Vagrant environment. It depends on the provider and the operating system that you choose to use. In our case, it is a VirtualBox VM image based on a minimal Ubuntu 18.04 system (the box name is bento/ubuntu-18.04). This box is configured in our Vagrantfile:

Vagrant.configure("2") do |config|
  config.vm.box = "bento/ubuntu-18.04"
end

The next step is to ensure that Ansible will be installed in the box. For this, we use the shell provisioner of Vagrant. The Vagrantfile is extended by the provisioning code:

Vagrant.configure("2") do |config|
  # ... other Vagrant configuration
  config.vm.provision "shell", inline: <<-SHELL
    sudo apt-get update -y
    sudo apt-get install -y software-properties-common
    sudo apt-add-repository ppa:ansible/ansible
    sudo apt-get update -y
    sudo apt-get install -y ansible
    # ... other Vagrant provision steps
  SHELL
end

The last step is to copy the SSH credentials into the Vagrant box. For this, we mount the SSH credentials folder of the host system as a shared folder, so that we can copy the credentials to the SSH config folder inside the box.
Vagrant.configure("2") do |config|
 
  # ... other Vagrant configuration
  config.vm.synced_folder ".", "/vagrant"
  config.vm.synced_folder "path to your ssh config", "/home/vagrant/ssh-host"
  # ... other Vagrant configuration

  config.vm.provision "shell", inline: <<-SHELL
    # ... other Vagrant provision steps
    cp /home/vagrant/ssh-host/* /home/vagrant/.ssh/.
  SHELL
end

The whole Vagrantfile can be found on GitHub Gist.

Workflow

After setting up the tool chain, let's have a look at how to work with it. I write my Ansible playbooks on the Windows system and run them from the Linux guest system against the remote hosts. For running the Ansible playbooks, we have to start the Vagrant box.
> cd ansible-workspace
> vagrant up

When the Vagrant box is ready to use, we can jump into the box with:
 
> vagrant ssh 

You can find the Ansible playbooks inside the box in the folder /vagrant. In this folder, run Ansible:
 
> cd /vagrant
> ansible-playbook -i inventories/test -u tekkie setup-db.yml

Outlook

On Windows 10 it may be possible to use Ansible natively because of the Windows Subsystem for Linux, but I haven't tried it out. Some Docker fans would prefer a container instead of a virtual machine. But remember, before Windows 10, Docker ran on Windows inside a virtual machine, so I don't see a benefit in using Docker instead of a virtual machine there. With Windows 10's native container support, however, a Docker-based setup is a good alternative if Ansible doesn't run on the Linux subsystem.
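If Ansible does run on the Linux subsystem, the installation there would presumably mirror the provisioning steps above; an untested sketch for a WSL Ubuntu shell:

# untested sketch: the same steps as in the Vagrant provisioner, run in a WSL Ubuntu shell
sudo apt-get update -y
sudo apt-get install -y software-properties-common
sudo apt-add-repository ppa:ansible/ansible
sudo apt-get update -y
sudo apt-get install -y ansible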
Do you have another idea or approach? Let me know and write a comment.

Links

  1. VirtualBox
  2. Vagrant
  3. Whole Vagrantfile on Github.

 




Apache2 as Reverse Proxy for NPM Registry Proxies in Sonatype Nexus 3

I use an NPM registry proxy in Sonatype Nexus 3 behind Apache2 acting as a reverse proxy. With the "standard" Apache2 VirtualHost configuration


<VirtualHost *:80>

  ProxyRequests Off
  <Proxy *>
    Order deny,allow
    Allow from all
  </Proxy>
  ProxyPass / http://localhost:8081/
  ProxyPassReverse / http://localhost:8081/

</VirtualHost>

I got the following failure when I tried to install the dependency @sinonjs/formatio:

$ yarn add @sinonjs/formatio --verbose
yarn add v1.3.2
warning package.json: No license field
verbose 0.337 Checking for configuration file "/home/sparsick/dev/workspace/yarn-test-module/.npmrc".
verbose 0.337 Checking for configuration file "/home/sparsick/.npmrc".
verbose 0.337 Found configuration file "/home/sparsick/.npmrc".
verbose 0.337 Checking for configuration file "/usr/etc/npmrc".
verbose 0.338 Found configuration file "/usr/etc/npmrc".
verbose 0.338 Checking for configuration file "/home/sparsick/dev/workspace/yarn-test-module/.npmrc".
verbose 0.338 Checking for configuration file "/home/sparsick/dev/workspace/.npmrc".
verbose 0.338 Checking for configuration file "/home/sparsick/dev/.npmrc".
verbose 0.338 Checking for configuration file "/home/sparsick/.npmrc".
verbose 0.338 Found configuration file "/home/sparsick/.npmrc".
verbose 0.338 Checking for configuration file "/home/.npmrc".
verbose 0.341 Checking for configuration file "/home/sparsick/dev/workspace/yarn-test-module/.yarnrc".
verbose 0.342 Found configuration file "/home/sparsick/dev/workspace/yarn-test-module/.yarnrc".
verbose 0.343 Checking for configuration file "/home/sparsick/.yarnrc".
verbose 0.344 Found configuration file "/home/sparsick/.yarnrc".
verbose 0.344 Checking for configuration file "/usr/etc/yarnrc".
verbose 0.344 Checking for configuration file "/home/sparsick/dev/workspace/yarn-test-module/.yarnrc".
verbose 0.345 Found configuration file "/home/sparsick/dev/workspace/yarn-test-module/.yarnrc".
verbose 0.345 Checking for configuration file "/home/sparsick/dev/workspace/.yarnrc".
verbose 0.345 Checking for configuration file "/home/sparsick/dev/.yarnrc".
verbose 0.345 Checking for configuration file "/home/sparsick/.yarnrc".
verbose 0.345 Found configuration file "/home/sparsick/.yarnrc".
verbose 0.345 Checking for configuration file "/home/.yarnrc".
verbose 0.347 current time: 2018-02-27T08:04:43.357Z
warning yarn-test-module: No license field
[1/4] Resolving packages...
verbose 0.45 Performing "GET" request to "http://mycompany/repository/npm-public/@sinonjs%2fformatio".
verbose 0.55 Request "http://mycompany/repository/npm-public/@sinonjs%2fformatio" finished with status code 404.
verbose 0.551 Error: Couldn't find package "@sinonjs/formatio" on the "npm" registry.
at /usr/lib/node_modules/yarn/lib/cli.js:49061:15
at Generator.next (<anonymous>)
at step (/usr/lib/node_modules/yarn/lib/cli.js:92:30)
at /usr/lib/node_modules/yarn/lib/cli.js:103:13
at <anonymous>
at process._tickCallback (internal/process/next_tick.js:188:7)
error Couldn't find package "@sinonjs/formatio" on the "npm" registry.
info Visit https://yarnpkg.com/en/docs/cli/add for documentation about this command.

The problem is that Apache2 canonicalizes URLs by default. So I have to configure Apache2 not to canonicalize URLs and, additionally, to allow encoded slashes:


<VirtualHost *:80>

  ProxyRequests Off
  <Proxy *>
    Order deny,allow
    Allow from all
  </Proxy>
  ProxyPass / http://localhost:8081/ nocanon
  ProxyPassReverse / http://localhost:8081/

  AllowEncodedSlashes NoDecode

</VirtualHost>
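After changing the VirtualHost configuration, Apache has to pick up the change. On a Debian-based system, checking and reloading might look like this (a sketch, assuming the default Apache 2.4 layout and that mod_proxy/mod_proxy_http are the modules in use):

sudo a2enmod proxy proxy_http   # proxy modules (usually already enabled for this setup)
sudo apachectl configtest       # verify the changed VirtualHost configuration
sudo systemctl reload apache2   # reload so nocanon and AllowEncodedSlashes take effect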

With the above Apache2 VirtualHost configuration, I could install my dependency via the NPM registry proxy.


$ yarn add @sinonjs/formatio

yarn add v1.3.2
warning package.json: No license field
info No lockfile found.
warning yarn-test-module: No license field
[1/4] Resolving packages...
[2/4] Fetching packages...
[3/4] Linking dependencies...
[4/4] Building fresh packages...
success Saved lockfile.
success Saved 2 new dependencies.
├─ @sinonjs/formatio@2.0.0
└─ samsam@1.3.0
warning yarn-test-module: No license field
Done in 0.60s.

Big thanks to the Sonatype support team that gave me this advice.



Pimp My Git – Generate Content for .gitignore From Scratch

When I start a new Git repository, I lose a lot of time setting up my .gitignore file, and normally I don't catch everything on the first shot. Fortunately, there are some tools that help to bootstrap it. I'd like to show two of them: one is a website that can also be used on the command line, and the other is a plugin for the IDE IntelliJ IDEA.

Website gitignore.io

There is a website, http://gitignore.io, that lists the common ignore patterns for your specific programming language, tool, IDE etc.
The usage is very simple: fill the search with the names of the tools, frameworks, programming languages etc. that you want to use in your Git project, and the website generates the content for your .gitignore file.

You can also run gitignore.io from your command line. For that, you need an active internet connection and a small shell function. I'll demonstrate the integration of gitignore.io in zsh. For the integration in other shells or clients, please look into the documentation.

Firstly, we have to create a function gi in our ~/.zshrc:


echo "function gi() { curl -L -s https://www.gitignore.io/api/\$@ ;}" >> ~/.zshrc && source ~/.zshrc

Now, we can use it on the command line.


$ gi java,maven # Preview of the content for .gitignore

# Created by https://www.gitignore.io/api/java,maven

### Java ###
# Compiled class file
*.class

# Log file
*.log

# BlueJ files
*.ctxt

# Mobile Tools for Java (J2ME)
.mtj.tmp/

# Package Files #
*.jar
*.war
*.ear
*.zip
*.tar.gz
*.rar

# virtual machine crash logs, see http://www.java.com/en/download/help/error_hotspot.xml
hs_err_pid*

### Maven ###
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
pom.xml.next
release.properties
dependency-reduced-pom.xml
buildNumber.properties
.mvn/timing.properties

# Avoid ignoring Maven wrapper jar file (.jar files are usually ignored)
!/.mvn/wrapper/maven-wrapper.jar

# End of https://www.gitignore.io/api/java,maven

$ gi list # list currently available templates
1c-bitrix,a-frame,actionscript,ada,adobe
advancedinstaller,agda,alteraquartusii,altium,android
androidstudio,angular,anjuta,ansible,apachecordova
apachehadoop,appbuilder,appceleratortitanium,appcode,appcode+all
appcode+iml,appengine,aptanastudio,arcanist,archive
archives,archlinuxpackages,aspnetcore,assembler,atmelstudio
ats,audio,automationstudio,autotools,backup
basercms,basic,batch,bazaar,bazel
bitrix,bittorrent,blackbox,bluej,bower
bricxcc,buck,c,c++,cake
.... furthermore

$ gi java,maven >> .gitignore # append the content in your project's .gitignore

IntelliJ IDEA Plugin – .ignore

There is a plugin for IntelliJ IDEA that helps to create a .gitignore file with content for your selected tools, programming languages etc. First, you have to install the plugin .ignore (go to File -> Settings -> Plugins and search for .ignore).

You can now create a .gitignore file via the .ignore plugin. By the way, the plugin can also create ignore files for other tools like Docker or Mercurial. A file generator opens and you can choose templates for the tools, programming languages etc. that you will use in the Git project. A preview shows you the resulting content. A click on Generate and you are ready.

Do you have other tips and tricks to boost the initialization time of a Git project? Share them and write a comment below.

Links

  1. gitignore.io
  2. Website of .ignore



How to Mark a Jenkins Job Red When Tests Fail In A Maven Build

The default setting in Jenkins is to mark a job yellow when a Maven build fails because of failing tests. If you don't want to have three statuses for your jobs, you can configure Jenkins so that jobs are also marked red regardless of why a Maven build fails.

For this you will need administration rights on your Jenkins instance. The following steps have to be done:

  1. Go to Manage Jenkins -> Configure System.
  2. Add -Dmaven.test.failure.ignore=false to Maven Project Configuration -> Global MAVEN_OPTS (see the command-line sketch below).
  3. Save this change and that’s it.
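For reference, the same property can be passed to a plain Maven build on the command line; with maven.test.failure.ignore=false (the Surefire default), the build fails as soon as a test fails, which is what makes the Jenkins job turn red (a minimal sketch):

mvn clean install -Dmaven.test.failure.ignore=false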

Your next job run will take this configuration into account. Unfortunately, this configuration only affects Maven jobs. Freestyle jobs ignore it (see also this bug).

But a workaround exists:

  1. Install the TextFinder plugin via Manage Jenkins -> Manage Plugins.
  2. Open the Freestyle job’s configuration that should be marked red, when Maven tests fail.
  3. Click on Add a post-build action (in section Post-build Action) and select Jenkins Text Finder.
  4. Activate the check box Also search the console output.
  5. Add the value There are test failures to Regular expression.
  6. Save this change.

 

 

 



Generate P2 Repository From Maven Artifacts In 2017

Some years ago, I wrote a blog post about how to generate a P2 repository based on Maven artifacts. The approach described there is obsolete nowadays, and I'd like to show a new approach based on the p2-maven-plugin, which was created to solve exactly this problem.

P2-Maven-Plugin Integration in Maven Build Life Cycle

First of all, we bind the p2-maven-plugin's goal site to Maven's life cycle phase package. This goal is responsible for the generation of the P2 repository.

<plugin>
  <groupId>org.reficio</groupId>
  <artifactId>p2-maven-plugin</artifactId>
  <version>1.3.0</version>
  <executions>
    <execution>
      <id>default-cli</id>
      <phase>package</phase>
      <goals>
        <goal>site</goal>
      </goals>
      <!--... -->
    </execution>
  </executions>
</plugin>

Generating P2 Repository

Now, we can define which Maven artifacts should be part of the new P2 repository. It is irrelevant for the p2-maven-plugin whether the defined artifacts already have an OSGi manifest or not. If no OSGi manifest exists, the plugin will generate one.


<execution>
<!-- ... -->
<configuration>
  <artifacts>
    <!-- specify your dependencies here -->
    <!-- groupId:artifactId:version -->
    <artifact>
      <id>com.google.guava:guava:jar:23.0</id>
      <!-- Artifact with existing OSGi-Manifest-->
    </artifact>
    <artifact>
      <id>commons-io:commons-io:1.3</id>
      <!-- Artifact without existing OSGi-Manifest-->
    </artifact>
  </artifacts>
</configuration>
</execution>

The artifacts are specified by the pattern groupId:artifactId:version. If you want to save some typing, use the Buildr tab on the MVN repository website to copy the right dependency declaration format.

This sample configuration creates a P2 repository that looks like the following one:


target/repository
├── artifacts.jar
├── category.xml
├── content.jar
└── plugins
    ├── com.google.code.findbugs.jsr305_1.3.9.jar
    ├── com.google.errorprone.error_prone_annotations_2.0.18.jar
    ├── com.google.guava_23.0.0.jar
    ├── com.google.j2objc.annotations_1.1.0.jar
    ├── commons-io_1.3.0.jar
    └── org.codehaus.mojo.animal-sniffer-annotations_1.14.0.jar

1 directory, 9 files

 

The default behavior of the plugin is that all transitive dependencies of the defined artifacts are also downloaded and packed into the P2 repository. If you don't want that, you have to set the option transitive to false in the corresponding artifact declaration. If you need the sources of a defined artifact (if they exist in the Maven repository) in the P2 repository, then you have to set the option source to true in the corresponding artifact declaration.

<!-- ... -->
<artifact>
  <id>com.google.guava:guava:jar:23.0</id>
  <transitive>false</transitive>
  <source>true</source>
</artifact>
<!-- ... -->

Then the generated P2 repository looks like the following one:


target/repository
├── artifacts.jar
├── category.xml
├── content.jar
└── plugins
    ├── com.google.guava.source_23.0.0.jar
    ├── com.google.guava_23.0.0.jar
    └── commons-io_1.3.0.jar

1 directory, 6 files

Generating P2 Repository With Grouped Artifacts

In some situations, you want to group artifacts into so-called features. p2-maven-plugin provides an option that allows grouping the Maven artifacts directly into features. The definition of the artifacts is the same as above; the difference is that it has to be inside the corresponding feature element. In addition, the feature definition needs some metadata like feature ID, feature version, description etc.


<!-- ...-->
<configuration>
  <featureDefinitions>
    <feature>
      <!-- Generate a feature including artifacts that are listed below inside the feature element-->
      <id>spring.feature</id>
      <version>4.3.11</version>
      <label>Spring Framework 4.3.11 Feature</label>
      <providerName>A provider</providerName>
      <description>${project.description}</description>
      <copyright>A copyright</copyright>
      <license>A licence</license>
      <artifacts>
        <artifact>
          <id>org.springframework:spring-core:jar:4.3.11.RELEASE</id>
        </artifact>
        <artifact>
          <id>org.springframework:spring-context:jar:4.3.11.RELEASE</id>
          <source>true</source>
        </artifact>
      </artifacts>
    </feature>
    <!--...-->
  </featureDefinitions>
  <!-- ... -->
</configuration>

Then the generated P2 repository looks like the following one:


target/repository
├── artifacts.jar
├── category.xml
├── content.jar
├── features
│   └── spring.feature_4.3.11.jar
└── plugins
    ├── org.apache.commons.logging_1.2.0.jar
    ├── org.springframework.spring-aop.source_4.3.11.RELEASE.jar
    ├── org.springframework.spring-aop_4.3.11.RELEASE.jar
    ├── org.springframework.spring-beans.source_4.3.11.RELEASE.jar
    ├── org.springframework.spring-beans_4.3.11.RELEASE.jar
    ├── org.springframework.spring-context.source_4.3.11.RELEASE.jar
    ├── org.springframework.spring-context_4.3.11.RELEASE.jar
    ├── org.springframework.spring-core_4.3.11.RELEASE.jar
    ├── org.springframework.spring-expression.source_4.3.11.RELEASE.jar
    └── org.springframework.spring-expression_4.3.11.RELEASE.jar

2 directories, 14 files

Of course, both options (generating a P2 repository with features and with plugins only) can be mixed.

p2-maven-plugin provides more options, like excluding specific transitive dependencies, referencing other Eclipse features and so on. For more information, please look at the p2-maven-plugin homepage.

Now, we can generate P2 repositories from Maven artifacts. What is still missing is how to deploy this P2 repository to a repository manager like Artifactory or Sonatype Nexus. Both repository managers support P2 repositories, Artifactory in the Professional edition (costs money) and Sonatype Nexus in the OSS edition (free). For Nexus, it's important that you use version 2.x; the newest version, 3.x, doesn't support P2 repositories yet.

Deploying P2 Repository to a Repository Manager

First of all, we want our generated P2 repository to be packed into a zip file. Therefore, we add the tycho-p2-repository-plugin to the Maven build life cycle:


<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>tycho-p2-repository-plugin</artifactId>
  <version>1.0.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>archive-repository</goal>
      </goals>
    </execution>
  </executions>
</plugin>


Then, we have to mark this zip file so that Maven recognizes that it has to deploy it to a repository manager during the deploy phase. For this, we add the build-helper-maven-plugin to the Maven build life cycle.

<!-- Attach zipped P2 repository to be installed and deployed in the Maven repository during the deploy phase. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <file>target/${project.artifactId}-${project.version}.zip</file>
            <type>zip</type>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
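With both plugins in place, the usual Maven deploy step uploads the zipped P2 repository together with the project's other artifacts. A minimal sketch, assuming the pom.xml contains a distributionManagement section (not shown here) that points at your repository manager:

mvn clean deploy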

Now, the generated P2 repository can be addressed by other projects. For more information about how to address the P2 repository, please have a look at the documentation of your repository manager.

A whole pom.xml sample can be found on Github.




Pimp My Git – Manage Different Git Identities

I usually work on different Git projects that need different Git identities. My workflow for new repositories was:

  1. Clone new repository.
  2. Go to cloned repository.
  3. If it is necessary to change the Git identity, call a shell script that runs `git config user.name “Sandra Parsick”; git config user.email sparsick@web.de`

I was never happy with this solution, but it works. Fortunately, a tweet by @BenediktRitter and one by @wosc suggested two alternatives to my method.

The first method is based on the Git feature "conditional includes" (requires Git version 2.13 or later). The idea is that you define a default Git identity and separate Git identities per specific directory. That means every repository that is cloned beneath one of these specific directories automatically gets its specified Git identity.

The second method is based on a Python script that is inspired by the Mercurial extension hg-persona. The idea is that you can set a Git identity individually per Git repository. It is a one-command replacement for the series of git config user.* commands.

In the next two sections, I'd like to summarize how to set up and use these two methods. I have tested them on a Debian-based system. Let's start with the first one.

Summarize Git identity for several Git repositories

As described above, this method is based on the Git feature "conditional includes". Therefore, ensure that your Git client is installed in at least version 2.13. Assume we want to have two Git identities, one for GitHub and one for work. First, create two .gitconfig files in your home folder.


touch ~/.gitconfig_github
touch ~/.gitconfig_work

Then add the specific Git identity to the respective .gitconfig file.


~/.gitconfig_github

[user]
   name = YourNameForGithub
   email = name@forgithub.com

~/.gitconfig_work

[user]
   name = YourNameForWork
   email = name@forwork.com

The next step is to add these two .gitconfig files to our global one and to specify when to use them.

~/.gitconfig

[user]
   name = defaultName
   email = default@email.com

[includeIf "gitdir:~/workspace_work/"]
   path = .gitconfig_work

[includeIf "gitdir:~/workspace_github/"]
   path = .gitconfig_github

Now, every repository that is cloned beneath ~/workspace_work/ automatically gets the Git identity for work (.gitconfig_work), and every repository that is cloned beneath ~/workspace_github/ automatically gets the Git identity for GitHub (.gitconfig_github). Otherwise, the default Git identity is used.
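A quick way to check which identity is in effect is to ask Git inside a cloned repository; a small sketch (some-repo is a placeholder for any repository cloned beneath the work directory):

> cd ~/workspace_work/some-repo
> git config user.name    # prints YourNameForWork
> git config user.email   # prints name@forwork.com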

Setting Git identity individually per Git repository

For the second method, you have to install ws.git-persona from PyPI.


sudo apt-get install python-pip # if pip isn't installed yet
pip install ws.git-persona

Then, open your global ~/.gitconfig and add your personas. In our case, we add two personas, one for GitHub and one for work.


~/.gitconfig

[persona]
  github = YourNameForGithub <name@forgithub.com>
  work = YourNameForWork <name@forwork.com>

In the next step, we want to switch our Git identity in a Git repository. This is now possible with the command git-persona. In the following example, we switch to the identity for GitHub and then to the identity for work.


> git-persona -n github
Setting user.name="YourNameForGithub", user.email="name@forgithub.com"
> git config user.name
YourNameForGithub
> git config user.email
name@forgithub.com
> git-persona -n work
Setting user.name="YourNameForWork", user.email="name@forwork.com"
> git config user.email
name@forwork.com
> git config user.name
YourNameForWork

If you have other methods to manage different Git identities, let me know and write a comment.

 

Links

  1. Blog post about Git feature “Conditional Includes”.
  2. Github repository of git-personas.



Pimp My Git – Git Mergetool

I like to work with Git on the command line, but in some cases I prefer UI support. Solving merge conflicts is such a case. Git has a command mergetool, which can open a graphical tool to solve merge conflicts. But before you can use this command, you have to configure it. In this blog post, I'd like to show you how to configure mergetool and how to use it.

Configuration

First of all, open a shell on Linux. On Windows, open Git Bash. Then choose a graphical tool that should support you in solving merge conflicts. git mergetool --tool-help shows a list of which tools are supported on your machine:


 sparsick@sparsick-ThinkPad-T430s > git mergetool --tool-help
'git mergetool --tool=<tool>' may be set to one of the following:
                araxis
                kdiff3
                meld

The following tools are valid, but not currently available:
                bc
                bc3
                codecompare
                deltawalker
                diffmerge
                diffuse
                ecmerge
                emerge
                gvimdiff
                gvimdiff2
                gvimdiff3
                opendiff
                p4merge
                tkdiff
                tortoisemerge
                vimdiff
                vimdiff2
                vimdiff3
                winmerge
                xxdiff

Some of the tools listed above only work in a windowed
environment. If run in a terminal-only session, they will fail.

This command shows two lists. The first list shows all tools that are supported by Git and that are available on your machine (in the sample, these are araxis, kdiff3 and meld). The second list shows tools that are also supported by Git but aren't installed on your machine.

I use meld as the graphical tool. It runs on Windows and Linux. If you haven't installed meld on your machine, now is the right time to do it, or choose another tool.

We want to set mergetool globally for all our repositories.


sparsick@sparsick-ThinkPad-T430s > git config --global merge.tool meld
sparsick@sparsick-ThinkPad-T430s > git mergetool
No files need merging

If git mergetool returns more than No files need merging, then the path to your graphical tool isn't set in your $PATH system variable (the normal case on Windows systems). It's possible to set the path to the graphical tool directly in Git.

sparsick@sparsick-ThinkPad-T430s > git config --global mergetool.meld.path /c/Program\ Files\ \(x86\)/Meld/Meld.exe

Bear two important things in mind: mergetool is written without a dot between merge and tool, and meld is a placeholder for the name of the graphical tool in the above sample. If you use another tool such as vimdiff, then the config key is called mergetool.vimdiff.path.

Now git mergetool is ready to use.

Usage

Now I'd like to demonstrate how to use git mergetool. It is used when we have merge conflicts during a merge action. Let's say we want to merge branch branch1 into master and this merge will have some merge conflicts.


sparsick@sparsick-ThinkPad-T430s > git merge branch1

Auto-merging test
CONFLICT (content): Merge conflict in test
Automatic merge failed; fix conflicts and then commit the result.

Now, we want to solve these conflicts with a graphical tool (in the example, it's meld). git mergetool on the command line opens the graphical tool of our choice.

sparsick@sparsick-ThinkPad-T430s > git mergetool

Merging:
test

Normal merge conflict for 'test':
{local}: modified file
{remote}: modified file

After solving the merge conflicts, the change has to be committed.

sparsick@sparsick-ThinkPad-T430s > git status

On branch master
All conflicts fixed but you are still merging.
(use "git commit" to conclude merge)

Changes to be committed:

modified:   test

Untracked files:
(use "git add <file>..." to include in what will be committed)

test.orig
sparsick@sparsick-ThinkPad-T430s > git commit

You can see that we have a new untracked file, test.orig. This is a backup of the merged file created by mergetool. You can configure that this backup is removed after a successful merge.

sparsick@sparsick-ThinkPad-T430s > git config --global mergetool.keepBackup false

Further files are created when using git mergetool:

sparsick@sparsick-ThinkPad-T430s > git status

On branch master
Untracked files:
(use "git add ..." to include in what will be committed)

test.BACKUP.7344
test.BASE.7344
test.LOCAL.7344
test.REMOTE.7344

If only these files are currently untracked, then git clean can help. Otherwise, they have to be removed manually.

sparsick@sparsick-ThinkPad-T430s > git clean -f

Removing test.BACKUP.7344
Removing test.BASE.7344
Removing test.LOCAL.7344
Removing test.REMOTE.734

Links

  1. Meld Homepage
  2. git mergetool Documentation