On software quality, collaborative development, wikis and Open Source...


Mar 20 2018

Onboarding Brainstorming

I had the honor of being invited to a seminar on "Automatic Quality Assurance and Release" at Dagstuhl by Benoit Baudry (we collaborate on the STAMP research project). Our seminar was organized as an unconference, and one session I proposed and led was the "Onboarding" one described below. The following persons participated in the discussion: V. Massol, D. Gagliardi, B. Danglot, H. Wright, B. Baudry.

Onboarding Discussions

When you're developing a project (be it an internal project or an open source one), one key element is how easy it is to onboard new users to your project. For open source projects this is essential to attract more contributors and have a lively community. For internal projects, it's useful so that new employees, or newcomers in general, can get up to speed rapidly on your project.

This brainstorming session was about tools and practices that can ease onboarding.

Here's the list of ideas we had (in no specific order):

  • 1 - Tag issues in your issue tracker as onboarding issues to make it easy for newcomers to get started on something easy and experience success quickly. This also validates that they're able to use your software.
  • 2 - Have a complete package of your software that can be installed and used as easily as possible. It should just work out of the box without having to perform any configuration or additional steps. A good strategy for applications is to provide a Docker image (or a Virtual Machine) with everything set up.
  • 3 - Similarly, provide a packaged development environment. For example you can provide a VM with a preinstalled and configured IDE (with plugins installed and configured according to the project's rules). One downside of such an approach is the time it takes to download the VM (which could be several GB in size).
  • 4 - A similar and possibly better approach would be to use an online IDE (e.g. Eclipse Che) to provide a complete prebuilt dev environment that wouldn't even require any downloading. This provides the fastest dev experience you can get. The downside is that if you need to onboard a potentially large number of developers, you'll need significant infrastructure (disk space/CPU) on the server(s) hosting the online IDE, for hosting all the dev workspaces. This makes the option difficult to implement for open source projects, for example, but it's viable and interesting in a company environment.
  • 5 - Obviously, having good documentation is a given. However, too many projects still don't provide it, or only provide good user documentation but not good developer documentation, with project practices not documented at all or only partially documented. Specific ideas:
    • Document the code structure
    • Document the practices for development
    • Develop a tool that supports newcomers by letting them know when they follow / don't follow the rules
    • Good documentation should make its assumptions explicit (e.g. "when you read this piece of documentation, I assume that you know X and Y")
    • Have a good system to contribute to the documentation of the project (e.g. a wiki)
    • Different documentation for users and for developers
  • 6 - Have homogeneous practices and tools inside a project. This is especially important in a company environment where you may have various projects, each using its own tools and practices, making it harder to move between projects.
  • 7 - Use standard tools that are well known (e.g. Maven or Docker). That increases the likelihood that a newcomer will already know the tool and be able to develop for your project.
  • 8 - It's good to have documentation about best practices, but it's even better if the important "must" rules are enforced automatically by a checking tool (which can be part of the build, for example, or part of your IDE setup). For example, instead of saying "this @Unstable annotation should be removed after one development cycle", you could write a Maven Enforcer rule (or a Checkstyle rule, or a Spoon rule) to break the build when it happens, with a message explaining the reason and what is to be done (see the sketch after this list). People usually prefer a tool telling them this over another person telling them that they haven't been following the best practices documented at such and such a location...
  • 9 - Have a bot to help you discover documentation pages about a topic. For example, a chat bot located in the project's chat that, when asked about a topic, will give you the link to the relevant documentation.
  • 10 - Projects must have a medium for asking questions and getting fast answers (such as a chat tool). Forums or mailing lists are good but less suitable for onboarding, when the newcomer has a lot of questions in the initial phase and needs a conversation.
  • 11 - Have an answer strategy: when someone asks a question, the doc is updated (a new FAQ entry for example) so that the next person who comes along can find the answer or be given a link to the doc.
  • 12 - Mentoring (the human aspect of onboarding): have a dedicated colleague whom you're not afraid to ask questions and who acts as your reference person.
  • 13 - Supporting a variety of platforms for your software will make it simpler for newcomers to contribute to your project.
  • 14 - Split your projects into smaller parts. Contributing to the core code of a project is hard and daunting; if the project has a core that is as small as possible and the rest is made of plugins/extensions, then it becomes simpler to start by contributing to those extensions.
  • 15 - Have some interactive tutorial to learn about your software or about its development. A good example of a nice tutorial can be found at www.katacoda.com (for example for Docker, https://www.katacoda.com/courses/docker).
  • 16 - Human aspect: have an environment that makes you feel welcome. Work on and discuss how to best answer Pull Requests, how to communicate when someone joins the project, etc. Think of the newcomer as you would a child: somebody who will occasionally stumble and need encouragement. Try to have as much empathy as possible.
  • 17 - Make sure that people asking questions always get an answer quickly, perhaps by establishing a role on the team to ensure answers are provided.
  • 18 - Last but not least, an interesting thought experiment to verify that you have some good onboarding processes: imagine that 1000 developers join your project / company on the same day. How do you handle this?
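
To make idea 8 above more concrete, here's a minimal sketch of what an automatically enforced rule could look like, written as a plain JUnit 4 test that fails the build when a stale @Unstable annotation is found in the sources. The since attribute, the CURRENT_CYCLE constant and the source layout are assumptions made for the example only; a real project would more likely implement this as a Maven Enforcer, Checkstyle or Spoon rule as mentioned above.

// Hypothetical check for idea #8: fail the build when a "must" rule is violated.
// Everything project-specific here (annotation attribute, current cycle, paths) is assumed.
import static org.junit.Assert.assertTrue;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.junit.Test;

public class UnstableAnnotationRuleTest
{
    // Hypothetical: the development cycle currently in progress.
    private static final String CURRENT_CYCLE = "12.0";

    // Hypothetical: matches @Unstable(since = "11.x") as it would appear in the sources.
    private static final Pattern UNSTABLE =
        Pattern.compile("@Unstable\\(since\\s*=\\s*\"([^\"]+)\"\\)");

    @Test
    public void noStaleUnstableAnnotations() throws IOException
    {
        try (Stream<Path> sources = Files.walk(Paths.get("src/main/java"))) {
            List<String> violations = sources
                .filter(path -> path.toString().endsWith(".java"))
                .flatMap(this::staleAnnotationsIn)
                .collect(Collectors.toList());
            // The failure message explains the rule and what is to be done.
            assertTrue("Stale @Unstable annotations found, please remove them: " + violations,
                violations.isEmpty());
        }
    }

    private Stream<String> staleAnnotationsIn(Path file)
    {
        try {
            Matcher matcher = UNSTABLE.matcher(new String(Files.readAllBytes(file)));
            Stream.Builder<String> stale = Stream.builder();
            while (matcher.find()) {
                // The annotation must be removed once a full development cycle has elapsed.
                if (!matcher.group(1).equals(CURRENT_CYCLE)) {
                    stale.add(file + ": @Unstable(since = \"" + matcher.group(1) + "\")");
                }
            }
            return stale.build();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}

The point is that the rule fails with a message explaining what to do, instead of relying on the newcomer having found and read the right documentation page.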

Onboarding on XWiki

I was also curious to see how those ideas apply to the XWiki open source project and which of them we implement.

Ideas | Implemented on XWiki?
1 - Tag simple issues | Yes: Onboarding issues
2 - Complete install package | Yes: Debian apt-get, Docker images
3 - Dev packaged environment | Yes: We have a Developer VM
4 - Online IDE onboarding | No: Hard to provide for an OSS project in terms of infra resources, but we would love to offer this
5 - Good documentation | Yes: User guide, Admin guide, Dev guide, plus a wiki dedicated to development practices and tools
6 - Have homogeneous practices and tools inside a project | Yes: See http://dev.xwiki.org
7 - Use standard tools that are well known | Yes: Maven, Jenkins, Java, JUnit, Mockito, Selenium
8 - Automatically enforced important rules | Yes: See Automatic checks in build
9 - Have a bot to help you discover documentation pages about a topic | Yes: IRC bot (used here, though it has been broken since 2017-05-09)
10 - Projects must have a medium to ask questions and get fast answers | Yes: XWiki Chat
11 - Have an answer strategy | Yes: XWiki answer strategy and FAQ
12 - Mentoring (human aspect of onboarding) | Partially: Done to some extent by employees of XWiki SAS who are committers on the open source project, but not a generic open source project practice
13 - Supporting a variety of platforms for your software | Yes: Windows, Linux, Mac, multiple DBs, multiple browsers, multiple Servlet containers
14 - Split your projects into smaller parts | Yes: The core is getting smaller and there are more and more Extensions
15 - Have some interactive tutorial to learn about your software | No: Would be nice to have
16 - Human aspect: have an environment that makes you feel welcome | Yes: This is subjective. Sometimes we may be a bit abrupt when answering (especially me! Sorry guys if I've been abrupt, it's more a consequence of doing too many things. I need to improve.) I think we're globally a welcoming community, WDYT?
17 - Make sure that people asking questions always get an answer quickly | Yes: I think we're very good at answering fast. See the Forum for example. We also answer fast on Matrix/IRC (we try).
18 - 1000 devs joining at once experiment | Yes: We actually participated in Google Code-in 2017 and this is exactly what we experienced: 756 students interacting with us.

So globally I'd say XWiki is pretty good at onboarding. I'd love to hear about things that we could improve on for onboarding. Any ideas?

If you own a project, we would be interested in hearing about your ideas and how you perform onboarding. You could also use the list above to measure your project's level of onboarding and find out how you could improve it further.

Feb 05 2018

FOSDEM 2018

Once more I was happy to go to FOSDEM. This year XWiki SAS, the company I work for, had 12 employees going there and about 8 talks accepted, plus a stand for the XWiki open source project that we shared with our friends at Tiki and Foswiki.

Here were the highlights for me:

  • I talked about what's next in Java testing and covered test coverage, backward compatibility enforcement, mutation testing and environment testing. My experience with the last two types of tests comes directly from my participation in the STAMP research project, where we develop and use tools to amplify existing tests.
  • I did another talk about "Addressing the Long Tail of (web) applications", explaining how an advanced structured wiki such as XWiki can be used to quickly create ad-hoc applications in wiki pages.
  • Since we had a stand shared between 3 wiki communities (Tiki, Foswiki and XWiki), I was also interested in comparing our features, and how our communities work.
    • I met the nice folks of Tiki at their TikiHouse and had long discussions about how similarly and how differently we do things. Maybe the topic for a future blog post? :)
    • Then I had Michael Daum give me a demo of the nice features of Foswiki. I was quite impressed and found a lot of similarities in features.
    • Funnily enough, our 3 wiki solutions are written in 3 different technologies: Tiki in PHP, Foswiki in Perl and XWiki in Java. Nice diversity!
  • I met a lot of people of course (FOSDEM is really THE place to be to meet people from the FOSS communities) but I'd like to especially thank Damien Duportal, who took the time to sit with me and go over several questions I had about Jenkins pipelines and Docker. I'll most likely blog about some of those solutions in the near future.

All in all, an excellent FOSDEM again, with lots of talks and waffles ;)

Dec 15 2017

POSS 2017

My company (XWiki SAS) had a booth at the Paris Open Source Summit 2017 (POSS) and we also organized a track about "One job, one solution!", promoting open source solutions.

I was asked to talk about using XWiki as an alternative to Confluence or SharePoint.

Here are the slides of the talk:

There were about 30 people in the room. I focused on showing what I believe are the major differences, especially with Confluence, which I know better than SharePoint.

If you're interested in more details, you can find a comparison between XWiki and Confluence on xwiki.org.

Nov 17 2017

Controlling Test Quality

We already know how to control code quality by writing automated tests. We also know how to ensure that code quality doesn't go down by using a tool to measure the code covered by tests and fail the build automatically when coverage goes under a given threshold (and it seems to be working).

Wouldn't it be nice to also be able to verify the quality of the tests themselves? :)

I'm proposing the following strategy for this:

  • Integrate PIT/Descartes in your Maven build
  • PIT/Descartes generates a Mutation Score metric. The idea is to monitor this metric and ensure that it keeps going in the right direction and doesn't go down, similar to watching the Clover TPC metric and ensuring it always goes up.
  • Thus the idea would be, for each Maven module, to set up a Mutation Score threshold (you'd run it once to get the current value and set that value as the initial threshold) and have the PIT/Descartes Maven plugin fail the build if the computed mutation score goes below this threshold. In effect this would tell us that the latest changes have introduced tests of lower quality than the existing ones (on average) and that the new tests need to be brought up to the level of the others (see the sketch after this list).
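
As an illustration of the kind of gate this strategy implies, here's a minimal sketch (not the actual XWiki build setup, nor the plugin's own mechanism) of a check that parses PIT's mutations.xml report and fails when the proportion of detected mutants drops below a per-module threshold. The report path, the default threshold value and the way the check would be bound to the build (e.g. via exec-maven-plugin) are assumptions made for the example.

// Hypothetical per-module mutation score gate; it assumes PIT/Descartes has already run
// and written its XML report to target/pit-reports/mutations.xml (an assumed, stable path).
import java.nio.file.Paths;

import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class MutationScoreGate
{
    public static void main(String[] args) throws Exception
    {
        // Per-module threshold, seeded with the current score when the gate is introduced.
        double threshold = Double.parseDouble(args.length > 0 ? args[0] : "70.0");

        Document report = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(Paths.get("target/pit-reports/mutations.xml").toFile());

        NodeList mutations = report.getElementsByTagName("mutation");
        int detected = 0;
        for (int i = 0; i < mutations.getLength(); i++) {
            Element mutation = (Element) mutations.item(i);
            // Each <mutation> element carries a detected="true|false" attribute.
            if (Boolean.parseBoolean(mutation.getAttribute("detected"))) {
                detected++;
            }
        }

        double score = mutations.getLength() == 0
            ? 100.0 : 100.0 * detected / mutations.getLength();
        System.out.printf("Mutation score: %.2f%% (threshold: %.2f%%)%n", score, threshold);
        if (score < threshold) {
            // A non-zero exit code breaks the Maven build when this check is bound to a phase.
            System.err.println("Mutation score went below the configured threshold!");
            System.exit(1);
        }
    }
}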

In order for this strategy to be implementable, we need PIT/Descartes to implement the following enhancement requests first:

I'm eagerly waiting for these issues to be fixed in order to try this strategy on the XWiki project and verify that it can work in practice. There are some reasons why it might not work, such as being too painful or not making it easy enough to identify test problems and fix them.

WDYT? Do you see this as possibly working?

Oct 29 2017

Softshake 2017

I had the chance to participate in Softshake (2017 edition) for the first time. It's a small but very nice conference held in Geneva, Switzerland.

From what I gathered, this year there were fewer attendees than in previous years (about 150 vs 300 before). However, the organization was very nice:

  • Located inside the Hepia school with plenty of rooms available
  • 6 tracks in parallel, which is incredible for a small conference
  • Breakfast, lunch and snacks organized with good food
  • Speaker dinner with fondue and all ;)

I got to present 2 talks:

I was also very happy to see my friend and ex-OCTO Technology colleague Philippe Kernevez, and to meet new OCTO consultants. It reminded me of the good times at OCTO :)

Sep 17 2017

Using Docker + Jenkins to test configurations

On the XWiki project, we currently have automated functional tests that use Selenium and Jenkins. However they exercise only a single configuration: HSQLDB, Jetty and Firefox (and all at fixed versions).

XWiki SAS is part of the STAMP research project and one domain of this research is improving configuration testing.

As a first step I've worked on providing official XWiki Docker images, but I've only provided 2 configurations (XWiki on Tomcat + MySQL and on Tomcat + PostgreSQL) and they're not currently exercised by our functional tests.

Thus I'm proposing below an architecture that should allow XWiki to be tested on various configurations:

Here's what I think it would mean in terms of a Jenkins Pipeline (note that at this stage this is pseudo code and should not be taken literally):

pipeline {
  agent {
    // Agent provisioned from a custom image containing Maven, VNC and a browser.
    docker {
      image 'xwiki-maven-firefox'
      args '-v $HOME/.m2:/root/.m2'
    }
  }
  stages {
    stage('Test') {
      steps {
        // Scripted Docker steps need to be wrapped in a script block inside a declarative pipeline.
        script {
          // Start the database container.
          docker.image('mysql:5').withRun('-e "MYSQL_ROOT_PASSWORD=my-secret-pw"') { c ->
            // Start the Servlet container, mounting the exploded XWiki WAR and linking it to the DB.
            docker.image('tomcat:8').withRun("-v ${XWIKIDIR}:/usr/local/tomcat/webapps/xwiki --link ${c.id}:db") { tc ->
              [...]
              // Run the functional tests on the agent, inside an Xvnc session for the browser.
              wrap([$class: 'Xvnc']) {
                withMaven(maven: mavenTool, mavenOpts: mavenOpts) {
                  [...]
                  sh "mvn ..."
                }
              }
            }
          }
        }
      }
    }
  }
}

Some explanations:

  • We would set up a custom Docker Registry so that we can prepare images that would be used by the Jenkins pipeline to create containers
  • Those images could themselves be refreshed regularly based on another pipeline that would use the docker.build() construct
  • We would use a Jenkins Agent dynamically provisioned from an image that would contain: sshd and a Jenkins user (so that the Jenkins Master can communicate with it), Maven, a VNC Server and a browser (Firefox for example). We would have several such images, one per browser we want to test with.
    • Note that since we want to support only the latest browser versions for FF/Chrome/Safari, we could use apt to update (and commit) the browser version in the container prior to starting it, from the pipeline script.
  • Then the pipeline would spawn two containers: one for the DB and one for the Servlet container. Importantly, for the Servlet container, I think we should mount a volume that points to a local directory on the agent containing the XWiki exploded WAR (produced as a pre-step by the Maven build). This would save time by not having to recreate a new image every time there's a commit on the XWiki codebase!
  • The build that contains the tests would be started by the Agent (and we would mount the Maven local repository as a volume in order to speed up build times across runs).
  • Right now the XWiki build already knows how to run the functional tests by fetching/exploding the XWiki WAR in the target directory and then starting XWiki directly from the tests, so all we would need to do is make sure we map this directory into the Servlet container (e.g. for Tomcat it would be mapped to [TOMCATHOME]/webapps/xwiki).

This is just an architecture at this stage. Now we need to put it into practice and find the gotchas (there always are some ;)).

WDYT? Could this work? Are you doing this yourself?

Stay tuned, I should be able to report on how it went in the coming weeks/months.
