On software quality, collaborative development, wikis and Open Source...

Last modified by Vincent Massol on 2013/10/22 14:37

Jan 27 2015

A strategy for maintaining backward compatibility

I'd like to explain what we do on the XWiki open source project to preserve backward compatibility of our Java APIs. We've gone through several iterations and we now have a process that works quite well, even though a few issues remain that are unavoidable with Java for the time being.

So here it goes:

  • We start with a special package named internal: everything we put there is considered an implementation detail and should never be used by users of our APIs. We're allowed to modify anything at any time in the internal package. For example: org.xwiki.instance.internal.*
  • We also use a Component-based approach, which forces us to separate interfaces (which are public) from implementation (which go in the internal package).
  • We have defined an advanced deprecation mechanism which allows us to move deprecated code that we no longer use into legacy Maven modules (using Aspects with AspectJ) that can be optionally installed at runtime and that we never use at build time. This prevents us from using any deprecated legacy code and lets us sweep the cruft under the carpet emoticon_wink (without breaking backward compatibility!)
  • We have the notion of "Young API" (a.k.a Unstable APIs) and when a new API is introduced, the developer can (and should) use the @Unstable annotation to signify to users that this API is unstable for the moment and can change at any time. We only allow an API to remain unstable for a full development cycle. We've recently introduced a custom Checkstyle check that enforces this!
  • Last, we use the Maven CLIRR plugin to make sure we never break an API unintentionally (it fails our build). This allows us to only break our APIs carefully and intentionally. It also allows us to mention what we break in our Release Notes (example).
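As a rough illustration of the @Unstable idea above, here's a hypothetical, self-contained sketch; the annotation is stubbed locally for the example, and the real XWiki annotation (and its package) may differ:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class UnstableSketch {
    // Local stand-in for XWiki's @Unstable annotation (illustrative only).
    @Retention(RetentionPolicy.RUNTIME)
    @interface Unstable { }

    // A newly introduced API method, flagged as unstable for one dev cycle.
    interface InstanceManager {
        @Unstable
        String getInstanceId();
    }

    public static void main(String[] args) throws Exception {
        // A Checkstyle-like tool can detect the marker reflectively or via source parsing.
        boolean unstable = InstanceManager.class
            .getMethod("getInstanceId")
            .isAnnotationPresent(Unstable.class);
        System.out.println("getInstanceId unstable: " + unstable);
    }
}
```

Once the development cycle ends, removing the annotation (or failing the build if it lingers) is exactly the kind of check a custom Checkstyle rule can automate.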

The important point IMO is that we have automated tools to ensure that our strategy is applied, namely:

  • Our Unstable Annotation Checker (a custom checkstyle rule)
  • The Maven CLIRR plugin (fails the build on unintentional API breakages)

This works pretty well for us, even though we do break backward compatibility from time to time, when we judge that the code touched is unlikely to be used and working around the breakage would be too complex and time-consuming (for example, adding a method to an interface requires writing a new interface and modifying all code accepting that interface). Luckily this will become somewhat simpler with Default Methods, introduced in Java 8. It won't fit all cases though.
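A minimal sketch of how a Java 8 default method lets an interface grow without breaking existing implementations (the names here are purely illustrative):

```java
// An interface that later gains a new method.
interface Greeter {
    String name();

    // New API added later as a default method: existing implementations
    // compile and run unchanged, no breaking change.
    default String greet() {
        return "Hello, " + name() + "!";
    }
}

public class DefaultMethodDemo {
    public static void main(String[] args) {
        // An "old" implementation that only knows about name().
        Greeter g = () -> "XWiki";
        System.out.println(g.greet());
    }
}
```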

Another aspect of backward compatibility that we don't handle is when someone changes what an API returns or what an API does. A simple example: a method returns an object of type X, but the implementation changes to return another implementation of type X that behaves differently... This is a tough one to detect and prevent.

WDYT? Are you doing something else in your projects? (I'd be curious of ways to improve even further what we do!).

Sep 11 2014

Atlassian Summit 2014

I've noticed that Atlassian was holding a Summit yesterday. Apparently it was a success with 2100 attendees. That's great to see and I wanted to congratulate them on that!

As a developer on the XWiki project, what sparked my interest were the announcements related to Confluence. I haven't watched the video yet (it wasn't posted when I checked), but I've gathered the following from tweets and blog posts:

  • New File Collaboration:

     "You’ll be able to share files through Confluence and collaborate on them in much the same way you already do with pages, using @mentions and in-line comments."

  • Real time collaborative editing:

     And it’s not uncommon for two or more people to need to make updates to the same page at the same time. That’s why we’re bringing real-time collaborative editing to Confluence. (yes!) Timeframe for delivery is a bit nebulous at this point, but the dev team is already plugging away on it

  • Inline comments on highlighted text:

     Oh, and in-line commenting? Yeah, that’s coming to Confluence pages in 5.7, too.

These are all very nice features. Actually, they're so nice that the XWiki project has been working on them for some time too, some having been there for years and some being more recent. So here's how XWiki compares on those features:

  • File Collaboration. We have a very nice File Manager that lets users easily import lots of files into XWiki and manage them inside the wiki. You can comment on files, tag them, add wiki content related to the files, and more.


  • Real time collaboration. We have been working for years on real-time collaboration in partnership with academia (INRIA, to be precise). We wanted to have real time for both the wiki editor and the WYSIWYG editor. We've tested various solutions and in the end found some algorithms that seem to work well. Thus we now have 2 extensions for real time:
  • Inline comments. XWiki has had support for Annotations for years now (4 years to be precise), allowing users to select text and put an annotation on it. You can even reply to annotations, and they are resilient to reasonable changes to the highlighted content!


IMO this shows several things:

  • The open source XWiki project is keeping up its innovation rate and is still ahead in terms of features
  • A small team can do miracles (the XWiki dev team is much smaller than the Confluence one)

Go XWiki, go! emoticon_smile

May 15 2014

Internals of an open source company: XWiki SAS

I had the pleasure to present at Human Talks. I presented how an open source company works internally and the challenge of handling the potentially conflicting business and open source interests. My experience is based on the XWiki SAS company and the XWiki open source project.

It was a fun event, held at Mozilla Paris in a very nice room (fun fact: in this picture, which was taken on the day Mozilla opened the office, you can see Ludovic Dubost - with the blue polo -, creator of the XWiki open source project and founder of the XWiki SAS company emoticon_wink).

Thanks Human Talks for the invite!

Feb 04 2014


Another year @ FOSDEM with the XWiki gang: Ludovic, Marius, Anca, Oana and Fabio!

This year though we succeeded in getting a dev room (yeah!), a wiki dev room, which I co-organized with Quim Gil from Wikimedia and Jean-Marc Libs from tiki.org.

The XWiki project was lucky to have 6 talks:

It was a nice FOSDEM. We enjoyed Belgian waffles and French fries as usual (although I heard some enjoyed it a bit less than usual since they were on a low-carb diet emoticon_wink).

At the content level, the conference was slightly too low-tech for me as a Java developer. Lots of C/C++ guys and lots of stuff close to the hardware emoticon_wink Not that I think it isn't nice to do that, but rather that I can't participate much. There were some other tracks of more interest to me, like the Java dev room (but I was stuck in the wiki dev room at the same time so couldn't join) and the Javascript dev room, but that one was so full that it was nearly impossible to get in...

I'd personally love to see some more room/space given to open source in businesses for the future editions of FOSDEM.

With over 8000 participants it seems it was, once again, a very successful FOSDEM.

See you next year maybe!

Jan 27 2014

Calling a Maven plugin from another plugin

Normally you're not supposed to call a Maven plugin's mojo from another mojo, but there are times when you need to. In my case I had developed a plugin and needed to call the Maven License Plugin from within my mojo.

I succeeded in achieving this through the Mojo Executor library.

Without further ado here's the code that worked for me:

/**
 * @requiresProject
 * @requiresDependencyResolution compile
 * @threadSafe
 */
public class FormatMojo extends AbstractVerifyMojo
{
    /**
     * The project currently being built.
     *
     * @parameter expression="${project}"
     * @required
     * @readonly
     */
    private MavenProject mavenProject;

    /**
     * The current Maven session.
     *
     * @parameter expression="${session}"
     * @required
     * @readonly
     */
    private MavenSession mavenSession;

    /**
     * The Maven BuildPluginManager component.
     *
     * @component
     * @required
     */
    private BuildPluginManager pluginManager;

    @Override
    public void execute() throws MojoExecutionException, MojoFailureException
    {
        // I needed to add a plugin dependency where my license.txt file is located.
        Dependency dep = new Dependency();
        // ... (dependency coordinates elided)

        Plugin licensePlugin = plugin(
            // ... (license plugin coordinates elided)
        );

        // Invoke the License plugin's goal through Mojo Executor, passing it
        // the configuration it expects (the goal name is an assumption; it was
        // elided in the original excerpt).
        executeMojo(licensePlugin, goal("format"),
            configuration(
                element(name("header"), "license.txt"),
                element(name("strictCheck"), "true"),
                element(name("headerDefinitions"),
                    element(name("headerDefinition"), "license-xml-definition.xml")),
                element(name("includes"),
                    element(name("include"), "src/main/resources/**/*.xml"))),
            executionEnvironment(mavenProject, mavenSession, pluginManager));
    }
}

And boom it worked! emoticon_smile

Hope this can be useful to someone.

Sep 25 2013

Debugging the Maven Deploy Plugin, sort of

You may have been hit by some error while calling mvn deploy and found yourself unable to debug the issue. For example, a common error is "Cannot connect. Reason: Auth fail".

The problem is that the Deploy plugin doesn't offer any debugging option. Researching the topic, you'll find that it actually uses the JSCH library, and researching how to debug that library you'll discover that it provides its own logging facade interface but no logging implementation, leaving it to the user to implement the interface to get any logs.
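The shape of such a bridge looks roughly like this; the JSch Logger interface is stubbed locally so the sketch is self-contained (the real one is com.jcraft.jsch.Logger, registered via JSch.setLogger):

```java
public class JschLoggerSketch {
    // Local stand-in mirroring the shape of com.jcraft.jsch.Logger.
    interface Logger {
        boolean isEnabled(int level);
        void log(int level, String message);
    }

    // A trivial implementation that writes everything to stdout; a real
    // bridge would delegate to SLF4J instead.
    static class StdoutLogger implements Logger {
        public boolean isEnabled(int level) { return true; }
        public void log(int level, String message) {
            System.out.println("[jsch:" + level + "] " + message);
        }
    }

    public static void main(String[] args) {
        // In real code you would do: JSch.setLogger(new StdoutLogger());
        new StdoutLogger().log(1, "Connecting to host port 22");
    }
}
```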

Thanks to this blog post, I found an SLF4J implementation of the JSCH logging interface and added it to the wagon-ssh artifact.

In short to get debug logs, do the following:

  • Install this version of wagon-ssh in your local repository using this POM file, as follows:
    mvn install:install-file -Dfile=wagon-ssh-2.6-20130924-debug.jar -DpomFile=pom.xml
  • Force using this Wagon SSH modified version by modifying your project's POM to add:
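A minimal sketch of what that POM addition could look like, assuming the standard wagon-ssh coordinates and the version taken from the jar name above:

```xml
<build>
  <extensions>
    <extension>
      <groupId>org.apache.maven.wagon</groupId>
      <artifactId>wagon-ssh</artifactId>
      <version>2.6-20130924-debug</version>
    </extension>
  </extensions>
</build>
```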

Now when you run mvn deploy you'll get debug logs such as:

[INFO] --- maven-deploy-plugin:2.7:deploy (default-deploy) @ xxx ---
Downloading: scp://xxx
19:03:00.666 [pool-3-thread-1] INFO  com.jcraft.jsch - Connecting to xxx port 22
19:03:00.717 [pool-3-thread-1] INFO  com.jcraft.jsch - Connection established
19:03:00.760 [pool-3-thread-1] INFO  com.jcraft.jsch - Remote version string: SSH-2.0-OpenSSH_5.6
19:03:00.760 [pool-3-thread-1] INFO  com.jcraft.jsch - Local version string: SSH-2.0-JSCH-0.1.44
19:03:00.760 [pool-3-thread-1] INFO  com.jcraft.jsch - CheckCiphers: aes256-xxx
19:03:00.873 [pool-3-thread-1] INFO  com.jcraft.jsch - SSH_MSG_KEXINIT sent
19:03:00.874 [pool-3-thread-1] INFO  com.jcraft.jsch - SSH_MSG_KEXINIT received
19:03:00.874 [pool-3-thread-1] INFO  com.jcraft.jsch - kex: server->client aes128-xxx
19:03:00.874 [pool-3-thread-1] INFO  com.jcraft.jsch - kex: client->server aes128-xxx
19:03:00.884 [pool-3-thread-1] INFO  com.jcraft.jsch - SSH_MSG_KEXDH_INIT sent
19:03:00.884 [pool-3-thread-1] INFO  com.jcraft.jsch - expecting SSH_MSG_KEXDH_REPLY
19:03:00.938 [pool-3-thread-1] INFO  com.jcraft.jsch - ssh_rsa_verify: signature true
19:03:00.939 [pool-3-thread-1] INFO  com.jcraft.jsch - Host 'xxx' is known and mathces the RSA host key
19:03:00.939 [pool-3-thread-1] INFO  com.jcraft.jsch - SSH_MSG_NEWKEYS sent
19:03:00.939 [pool-3-thread-1] INFO  com.jcraft.jsch - SSH_MSG_NEWKEYS received
19:03:00.940 [pool-3-thread-1] INFO  com.jcraft.jsch - SSH_MSG_SERVICE_REQUEST sent
19:03:00.999 [pool-3-thread-1] INFO  com.jcraft.jsch - SSH_MSG_SERVICE_ACCEPT received
19:03:01.055 [pool-3-thread-1] INFO  com.jcraft.jsch - Authentications that can continue: publickey,keyboard-interactive,password
19:03:01.055 [pool-3-thread-1] INFO  com.jcraft.jsch - Next authentication method: publickey
19:03:01.107 [pool-3-thread-1] INFO  com.jcraft.jsch - Disconnecting from xxx port 22
[WARNING] Could not transfer metadata xxx from/to xxx): Cannot connect. Reason: Auth fail

Unfortunately, as you can see, it doesn't really add a lot of information to help you debug the real issue. In this case I had simply changed my username to a wrong one, and I don't see how I could guess that from this log...

I'm just reporting this in case it helps someone emoticon_smile

Note that a better solution is to check the SSH logs on the server side. In my case when I had the problem I got the following:

Sep 25 14:55:54 xxx sshd[801]: reverse mapping checking getaddrinfo for xxx [xxxx] failed - POSSIBLE BREAK-IN ATTEMPT!

This means that the PTR record of that IP must match the hostname.

Strangely, it worked fine when using SSH directly. I'm assuming this is because the message is a warning and the standard SSH client continues, whereas JSCH chokes on it and stops.

Jun 29 2013

Issue: Jenkins and large Maven projects

This is a call to Jenkins developers and experts. I'm looking for a solution to the following problem.

Statement of the problem:

  • On the XWiki project we have a Git repository with a lot of top level Maven modules (about 85 as of today)
  • Each of these top level modules has several sub-modules, including modules that run functional tests (which are long to execute)
  • Right now we build the full platform in a single job on Jenkins and this takes too long: about 1 hour

Note that we do not want to create many Git repositories like one repo per top level module since that makes it a lot harder to release all modules at once (we release them together with the same version) and it would mean creating lots of jobs manually (85!).

The ideal solution I can think of would be:

  • Give to Jenkins the ability to create "virtual" jobs (one per top level module)
  • Each such job would automatically have its dependencies on other jobs defined based on the Maven dependencies so that jobs wait automatically for jobs that need to be run before them
  • This would allow dispatching the full build across all the available agents

Doing this manually would be a real pain, since it would mean creating 85 jobs and recreating the Maven dependencies with job triggers. It should be possible to use the Scriptler plugin to automate this, but parsing the Maven POMs to create the matching job hierarchy is not trivial...
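The easy half of that parsing, reading the <module> list from a top-level POM, can be sketched with the JDK's DOM API (the hard part, rebuilding the inter-module dependency graph from each module's dependencies, is left out; the module names here are illustrative):

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class PomModules {
    // Extract the <module> entries from a POM document.
    static List<String> modules(String pomXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new InputSource(new StringReader(pomXml)));
        NodeList nodes = doc.getElementsByTagName("module");
        List<String> result = new ArrayList<>();
        for (int i = 0; i < nodes.getLength(); i++) {
            result.add(nodes.item(i).getTextContent().trim());
        }
        return result;
    }

    public static void main(String[] args) throws Exception {
        String pom = "<project><modules>"
            + "<module>xwiki-platform-oldcore</module>"
            + "<module>xwiki-platform-rest</module>"
            + "</modules></project>";
        System.out.println(modules(pom));
    }
}
```

From that list, a Scriptler/Groovy script could then create one Jenkins job per module and wire the upstream triggers.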

Does anyone have a solution for this? Do you agree it's a legitimate use case?


Apr 20 2013

CodeCamp Iasi 2013

I'm in Iasi, Romania for a week, visiting the XWiki SAS Romanian office, and I was invited to talk at CodeCamp 2013. It was the first time I spoke in Romania and it was fun!

The conference is nice (about 300 participants I'd say); the only issue for me being that all sessions are in ... Romanian... emoticon_smile

I gave the talk I did at Devoxx France 2013 last month. It's a technical talk and I wasn't sure how it would be received, since it requires some prerequisite knowledge of Maven, Jenkins and software factories in general... People seemed to like it overall.

Here are the slides:

Mar 31 2013

Devoxx France 2013

I was very happy to present at Devoxx France again! Big kudos to the organizers, it was a great event. As usual it was great to meet all my Java friends.

This year I was happy to be on 4 events:

Here are the slides of my talk on quality on Java projects:

In the slides I mention a Scriptler script to automate modifying all jobs to prevent false-positive emails.

A big thank you to Frédéric Bouquet, who volunteered during the Hackergarten to work on contributing to XWiki. We ended up coding a new CRaSH integration. It's still in progress, but I hope we can release a 1.0 version real soon.

See you next year!

Jan 11 2013

Tip: Find out max Clover TPC for Maven modules

On the XWiki project we have a Jenkins job that runs every night and checks that the quality of Maven modules has not decreased.

This is done by using the Maven Clover plugin and failing the build if the TPC is under a specified threshold for each module.

It's defined in a quality profile since it takes some time to execute (this is why we execute it once per night ATM).

Here's the setup:

   <!-- Profile for QA verifications that takes time -->
         <!-- Fail the build if the test coverage is below a given value. -->
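A hedged sketch of what such a quality profile could look like, assuming the Clover2 plugin's check goal and its targetPercentage parameter (the real XWiki profile may differ):

```xml
<profile>
  <id>quality</id>
  <build>
    <plugins>
      <plugin>
        <groupId>com.atlassian.maven.plugins</groupId>
        <artifactId>maven-clover2-plugin</artifactId>
        <configuration>
          <!-- Fail the build if the test coverage is below this value. -->
          <targetPercentage>${xwiki.clover.targetPercentage}</targetPercentage>
        </configuration>
        <executions>
          <execution>
            <goals>
              <goal>instrument-test</goal>
              <goal>check</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
```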

Then in each Maven module, we have (for example):
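Presumably a property like the one used in the command below, for example (the value is illustrative):

```xml
<properties>
  <xwiki.clover.targetPercentage>74.6%</xwiki.clover.targetPercentage>
</properties>
```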


Now that's fine, and it lets us find out when someone adds code to an existing module without adding enough unit tests to keep the TPC above the current threshold.

There remains one issue. Imagine that I add some code with unit tests: I also need to remember to update the TPC value in the pom.xml file.

So here's a quick command-line tip to find the current max TPC threshold for all modules located under the directory where you run it:

mvn clean install -Pquality -Dxwiki.clover.targetPercentage=100%
-Dmaven.clover.failOnViolation=false 2>&1 |
awk '/Checking for coverage of/ { module = $9; }
/^Total/ { split(module, array, "/"); print array[length(array)-3],$4 }'

For example when run in xwiki-commons/xwiki-commons-core, it gives:

xwiki-commons-test-simple 0%
xwiki-commons-text 93.5%
xwiki-commons-component-api 22.7%
xwiki-commons-classloader-api 0%
xwiki-commons-classloader-protocol-jar 0%
xwiki-commons-observation-api 15.9%
xwiki-commons-component-observation 76.2%
xwiki-commons-component-default 74.6%
xwiki-commons-context 76.7%
xwiki-commons-script 0%
xwiki-commons-configuration-api 0%
xwiki-commons-test-component 0%
xwiki-commons-environment-api -100%
xwiki-commons-environment-common 0%
xwiki-commons-environment-standard 67.3%
xwiki-commons-environment-servlet 84.6%
xwiki-commons-properties 76.6%
xwiki-commons-logging-api 29.5%
xwiki-commons-observation-local 90.8%
xwiki-commons-job 36.1%
xwiki-commons-logging-logback 91.8%

Now the next step is to write a script that will automatically change the pom.xml files with the max TPC threshold values.

UPDATE 2013-01-31: To do the same with Jacoco you would use:

mvn clean install -Pquality -Dxwiki.jacoco.instructionRatio=100
-Djacoco.haltOnFailure=false 2>&1 |
awk '/jacoco-check/ { module = $6 } /Insufficient/ { print module, $7 }'

UPDATE 2013-07-09: Starting with Jacoco 0.6.4 we need to use:

mvn clean install -Pquality -Dxwiki.jacoco.instructionRatio=2.00
-Djacoco.haltOnFailure=false 2>&1 |
awk '/jacoco-check/ { module = $6 } /instructions covered ratio is/ { print module, $(NF-5) }'
Created by Admin on 2013/10/22 14:34