07 Feb

Selenide tests – The Maven way

Selenide is a wrapper over Selenium that greatly simplifies its use. For instance, to wait for an AJAX event in plain Selenium, the code would be something like this:

FluentWait<By> fluentWait = new FluentWait<By>(By.tagName("TEXTAREA"));
fluentWait.pollingEvery(100, TimeUnit.MILLISECONDS);
fluentWait.withTimeout(1000, TimeUnit.MILLISECONDS);
fluentWait.until(new Predicate<By>() {
    public boolean apply(By by) {
        try {
            return browser.findElement(by).isDisplayed();
        } catch (NoSuchElementException ex) {
            return false;
        }
    }
});
assertEquals("John", browser.findElement(By.tagName("TEXTAREA")).getAttribute("value"));

Whereas with Selenide it would be something like:

$("TEXTAREA").shouldHave(value("John"));

and it is basically this way with everything (there are plenty of examples on a dedicated page at Selenide’s GitHub project).


Moreover, in the functional testing world there is a well-known pattern, Page Objects, which boils down to creating a dedicated class (a Page Object) for each page to be simulated. These Page Object classes then implement the access to each page element, in order to reduce code duplication throughout the entire test suite. With Selenide, this pattern looks more or less like this:

package com.sweftt.swdm.pages;

import org.openqa.selenium.By;

import static com.codeborne.selenide.Selenide.$;
import static com.codeborne.selenide.Selenide.page;


public class GooglePage {
    
    public SearchResultsPage searchFor( String text ) {
        $( By.name( "q" ) ).val( text ).pressEnter();
        return page( SearchResultsPage.class );
    }
    
}

package com.sweftt.swdm.pages;

import com.codeborne.selenide.ElementsCollection;
import com.codeborne.selenide.SelenideElement;

import static com.codeborne.selenide.Selenide.$;
import static com.codeborne.selenide.Selenide.$$;


public class SearchResultsPage {

    public ElementsCollection getResults() {
        return $$( "#ires .g" );
    }

    public SelenideElement getResult( int index ) {
        return $( "#ires .g", index );
    }
    
}

    @Test
    public void userCanSearch() {
        GooglePage page = open( "http://google.com", GooglePage.class );
        SearchResultsPage results = page.searchFor( "selenide" );
        results.getResults().shouldHave( size( 10 ) );
        results.getResult( 0 ).shouldHave( text( "Selenide: concise UI tests in Java" ) );
    }

(by default, the test will use Firefox to perform the navigation, although this can be changed either through system properties, e.g. -Dselenide.browser=ie, or through the com.codeborne.selenide.Configuration class)
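Under the hood, that browser switch is essentially a JVM system property lookup. A simplified, stdlib-only sketch of the idea (illustrative only; Selenide's real resolution logic lives in com.codeborne.selenide.Configuration and is more involved):

```java
// Simplified sketch of how a -Dselenide.browser flag could be resolved,
// falling back to Firefox when the property is absent.
public class BrowserChoice {

    static String browser() {
        // the -D flag wins; otherwise use the built-in default
        return System.getProperty( "selenide.browser", "firefox" );
    }

    public static void main( String[] args ) {
        System.out.println( browser() );                 // no flag set yet -> default
        System.setProperty( "selenide.browser", "ie" );  // same effect as -Dselenide.browser=ie
        System.out.println( browser() );
    }
}
```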

The only problem is that, when running this test on a clean environment, something like this will show up:

org.openqa.selenium.firefox.NotConnectedException: Unable to connect to host 127.0.0.1 on port 7055 after 45000 ms. Firefox console output:
pi	DEBUG	Calling bootstrap method startup on aushelper@mozilla.org version 1.0
1485199741216	addons.xpi	DEBUG	Registering manifest for XXXXXXXXXX\Mozilla Firefox\browser\features\e10srollout@mozilla.org.xpi
1485199741216	addons.xpi	DEBUG	Calling bootstrap method startup on e10srollout@mozilla.org version 1.5
1485199741217	addons.xpi	DEBUG	Registering manifest for XXXXXXXXXX\Mozilla Firefox\browser\features\firefox@getpocket.com.xpi
[~40 lines of stacktrace omitted]
1485199759009	addons.manager	DEBUG	Calling shutdown blocker for LightweightThemeManager
1485199759009	addons.manager	DEBUG	Calling shutdown blocker for GMPProvider
1485199759012	addons.manager	DEBUG	Calling shutdown blocker for PluginProvider
1485199759013	addons.manager	DEBUG	Calling shutdown blocker for <unnamed-provider>
1485199759015	addons.manager	DEBUG	Calling shutdown blocker for PreviousExperimentProvider
1485199759018	addons.xpi	DEBUG	Notifying XPI shutdown observers
1485199759023	addons.manager	DEBUG	Async provider shutdown done

	at org.openqa.selenium.firefox.internal.NewProfileExtensionConnection.start(NewProfileExtensionConnection.java:113)
	at org.openqa.selenium.firefox.FirefoxDriver.startClient(FirefoxDriver.java:347)
	at org.openqa.selenium.remote.RemoteWebDriver.<init>(RemoteWebDriver.java:116)
	at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:259)
	at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:247)
	at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:242)
	at org.openqa.selenium.firefox.FirefoxDriver.<init>(FirefoxDriver.java:135)
	at com.codeborne.selenide.webdriver.WebDriverFactory.createFirefoxDriver(WebDriverFactory.java:123)
	at com.codeborne.selenide.webdriver.WebDriverFactory.createWebDriver(WebDriverFactory.java:47)
	at com.codeborne.selenide.impl.WebDriverThreadLocalContainer.createDriver(WebDriverThreadLocalContainer.java:240)
	at com.codeborne.selenide.impl.WebDriverThreadLocalContainer.getAndCheckWebDriver(WebDriverThreadLocalContainer.java:113)
	at com.codeborne.selenide.WebDriverRunner.getAndCheckWebDriver(WebDriverRunner.java:136)
	at com.codeborne.selenide.impl.Navigator.navigateToAbsoluteUrl(Navigator.java:68)
	at com.codeborne.selenide.impl.Navigator.open(Navigator.java:31)
	at com.codeborne.selenide.Selenide.open(Selenide.java:81)
	at com.codeborne.selenide.Selenide.open(Selenide.java:150)
	at com.codeborne.selenide.Selenide.open(Selenide.java:131)
	at com.sweftt.swdm.GoogleSearchTest.userCanSearch(GoogleSearchTest.java:20)

The underlying cause is that, in order to execute these tests (or Selenium-based ones, for that matter), it is mandatory to download a binary which allows the WebDriver to manage the browser. Furthermore, it is also necessary to set the absolute path to these binaries as Java system properties:

System.setProperty( "webdriver.ie.driver", "C:/absolute/path/to/binary/IEDriverServer.exe" );
System.setProperty( "webdriver.edge.driver", "C:/absolute/path/to/binary/MicrosoftWebDriver.exe" );
System.setProperty( "phantomjs.binary.path", "/absolute/path/to/binary/phantomjs" );

From a Maven / CI-CD environment execution standpoint, this is problematic, as it makes these tests environment-dependent. Fortunately, this can be avoided through an additional library, WebDriverManager; so, let's see how it integrates with Selenide. The source code is available at GitHub.
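In essence, what WebDriverManager automates is downloading the right binary for the installed browser and then exporting its location through those very same system properties, so Selenium finds it without any hard-coded path. A stdlib-only illustration of that end state (the path below is hypothetical; the real library computes it per version and OS):

```java
// Illustration only: WebDriverManager downloads and caches the driver binary,
// then sets the corresponding system property, roughly like this.
public class DriverPathDemo {

    public static void main( String[] args ) {
        // hypothetical cache location, for illustration purposes
        System.setProperty( "webdriver.chrome.driver", "/home/user/.cache/webdriver/chromedriver" );
        System.out.println( System.getProperty( "webdriver.chrome.driver" ) );
    }
}
```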

The first thing to do is to add the appropriate dependency, which in Maven's land means adding something like this to your pom:

<dependency>
  <groupId>io.github.bonigarcia</groupId>
  <artifactId>webdrivermanager</artifactId>
  <version>1.5.1</version>
</dependency>

Second, set up the library in the test itself:

package com.sweftt.swdm;

import static com.codeborne.selenide.CollectionCondition.size;
import static com.codeborne.selenide.Condition.text;
import static com.codeborne.selenide.Selenide.open;
import static com.codeborne.selenide.WebDriverRunner.setWebDriver;

import org.junit.After;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import com.sweftt.swdm.pages.GooglePage;
import com.sweftt.swdm.pages.SearchResultsPage;

import io.github.bonigarcia.wdm.ChromeDriverManager;


/**
 * Google search test with Selenide.
 */
public class GoogleSearchTest {
    
    WebDriver driver;
    
    @BeforeClass
    public static void setupClass() {
        ChromeDriverManager.getInstance().setup();
    }
    
    @Before
    public void setupTest() {
        driver = new ChromeDriver();
        setWebDriver( driver );
    }

    // as the driver hasn't been created by Selenide, closing it is mandatory
    @After
    public void teardown() {
        if( driver != null ) {
            driver.quit();
        }
    }

    @Test
    public void userCanSearch() {
        GooglePage page = open( "http://google.com", GooglePage.class );
        SearchResultsPage results = page.searchFor( "selenide" );
        results.getResults().shouldHave( size( 10 ) );
        results.getResult( 0 ).shouldHave( text( "Selenide: concise UI tests in Java" ) );
    }

}

And that's all it takes to solve the problem. However, this can be done better: there is too much ceremony, a lot of configuration that will get repeated in each test. JUnit rules to the rescue! First of all, an enum is defined to bind the driver configuration classes from WebDriverManager to their corresponding Selenium WebDrivers:

package com.sweftt.swdm.rules;


import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.edge.EdgeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.ie.InternetExplorerDriver;
import org.openqa.selenium.opera.OperaDriver;
import org.openqa.selenium.phantomjs.PhantomJSDriver;

import io.github.bonigarcia.wdm.BrowserManager;
import io.github.bonigarcia.wdm.ChromeDriverManager;
import io.github.bonigarcia.wdm.EdgeDriverManager;
import io.github.bonigarcia.wdm.FirefoxDriverManager;
import io.github.bonigarcia.wdm.InternetExplorerDriverManager;
import io.github.bonigarcia.wdm.OperaDriverManager;
import io.github.bonigarcia.wdm.PhantomJsDriverManager;

public enum BrowserManagerEnum {
    
    CHROME( ChromeDriverManager.getInstance() ),
    FIREFOX( FirefoxDriverManager.getInstance() ),
    EDGE( EdgeDriverManager.getInstance() ),
    IE( InternetExplorerDriverManager.getInstance() ),
    MARIONETTE( FirefoxDriverManager.getInstance() ),
    OPERA( OperaDriverManager.getInstance() ),
    PHANTOMJS( PhantomJsDriverManager.getInstance() );
    
    private final BrowserManager bm;

    private BrowserManagerEnum( final BrowserManager bm ) {
        this.bm = bm;
    }
    
    public BrowserManager getBrowserManager() {
        return bm;
    }

    public WebDriver getDriver() {
        switch( this ) {
        case CHROME: return new ChromeDriver();
        case FIREFOX: return new FirefoxDriver();
        case EDGE: return new EdgeDriver();
        case IE: return new InternetExplorerDriver();
        case MARIONETTE: return new FirefoxDriver();
        case OPERA: return new OperaDriver();
        case PHANTOMJS: return new PhantomJSDriver();
        default: return null;
        }
    }

}

That done, all that remains is the JUnit rule itself:

package com.sweftt.swdm.rules;

import static com.codeborne.selenide.WebDriverRunner.setWebDriver;

import org.junit.rules.TestRule;
import org.junit.runner.Description;
import org.junit.runners.model.Statement;
import org.openqa.selenium.WebDriver;


public class WebDriverManagerRule implements TestRule {

    protected final BrowserManagerEnum browser;
    protected WebDriver driver = null;

    /**
     * Rule constructor.
     *
     * @param browser see {@link BrowserManagerEnum} for available values.
     */
    public WebDriverManagerRule( BrowserManagerEnum browser ) {
        this.browser = browser;
    }

    /** {@inheritDoc} */
    @Override
    public Statement apply( Statement base, Description description ) {
        return statement( base );
    }

    Statement statement( final Statement base ) {
        return new Statement() {

            /** {@inheritDoc} */
            @Override
            public void evaluate() throws Throwable {
                if( driver == null ) {
                    beforeClass();
                    base.evaluate();
                    afterClass();
                } else {
                    before();
                    try {
                        base.evaluate();
                    } finally {
                        after();
                    }
                }
            }
        };
    }

    protected void beforeClass() throws Throwable {
        browser.getBrowserManager().setup();
        driver = browser.getDriver();
    }

    protected void before() throws Throwable {
        setWebDriver( driver );
    }

    protected void after() {
        driver.quit();
    }

    protected void afterClass() throws Throwable {
    }

}

The caveat with this rule is that it has to be used both as a @Rule and as a @ClassRule, as it has to do different things at different stages of the test execution: configure WebDriverManager before the actual tests run, and set up and terminate the WebDriver used by Selenide at the beginning and end of each test. Last but not least, in order to annotate a rule with both @Rule and @ClassRule at the same time, the JUnit version must be at least 4.12.

All in all, the unit test now looks like this:

package com.sweftt.swdm;

import static com.codeborne.selenide.CollectionCondition.size;
import static com.codeborne.selenide.Condition.text;
import static com.codeborne.selenide.Selenide.open;

import org.junit.ClassRule;
import org.junit.Rule;
import org.junit.Test;

import com.sweftt.swdm.pages.GooglePage;
import com.sweftt.swdm.pages.SearchResultsPage;
import com.sweftt.swdm.rules.BrowserManagerEnum;
import com.sweftt.swdm.rules.WebDriverManagerRule;


/**
 * Google search test with Selenide.
 */
public class GoogleSearchWithRuleTest {
    
    @ClassRule @Rule
    public static WebDriverManagerRule testRule = new WebDriverManagerRule( BrowserManagerEnum.FIREFOX );
    
    @Test
    public void userCanSearch() {
        GooglePage page = open( "http://google.com", GooglePage.class );
        SearchResultsPage results = page.searchFor( "selenide" );
        results.getResults().shouldHave( size( 10 ) );
        results.getResult( 0 ).shouldHave( text( "Selenide: concise UI tests in Java" ) );
    }

}

Much better than the previous version of the test.

19 Nov

Spring Boot extension for Stripes framework

Stripes and Spring Boot

We’ve recently had the opportunity of bringing Spring Boot into our technology stack, and we’ve witnessed first-hand why it was awarded the most innovative contribution to the Java ecosystem at the 2016 JAX Innovation Awards.

Additionally, we also use Stripes, an MVC web framework that is a lot less known, yet far from lacking top-notch quality: it relies heavily on convention over configuration and is, above all, extremely easy and fun to work with. It doesn’t have the myriad of possibilities that other frameworks (like, for example, Spring MVC) offer, but it certainly covers 99.9% of the functionality that a typical web application is going to need, and it covers it pretty well. For us, that is more than enough. Another of its great strengths is how “natural” it feels, how everything works as expected, without nasty surprises along the way, and how easy it is to extend.

Precisely because it is a framework which isn’t widely known, it doesn’t have a Spring Boot starter module, which was an excellent excuse for us to develop one and, while we were at it, learn about Spring Boot’s internals. The end result is available on GitHub.

Developing a custom Spring Boot starter module

Spring Boot’s reference guide is an excellent starting point when developing a new starter module. In our case, we are going to develop three Maven modules:

  • starter: an (almost) empty jar, basically used to provide the default module dependencies. Other than that, it contains a ./META-INF/spring.provides file, which references the next module.
  • autoconfigure: contains the Spring Boot magic; this module is responsible for instantiating and setting up whatever Spring beans are needed and, optionally, registering a namespace through which to provide configuration values. Dependencies of this module should be marked as optional, so they aren’t pulled in transitively when importing it.
  • sample: not strictly necessary, but a working example which showcases the new starter module is always worth looking at. In this case, we will turn the calculator from Stripes’ quick start guide into a Spring Boot application.

Contents of autoconfigure module

The entry point is the ./META-INF/spring.factories file, which references a Spring configuration class through the org.springframework.boot.autoconfigure.EnableAutoConfiguration entry:

# Auto Configure
org.springframework.boot.autoconfigure.EnableAutoConfiguration=net.sourceforge.stripes.springboot.autoconfigure.StripesAutoConfiguration

The StripesAutoConfiguration class is a normal Spring configuration class (and, because of this, annotated with @Configuration), which is responsible for setting up a couple of FilterRegistrationBeans containing both StripesFilter and DynamicMappingFilter. In this concrete case, some other Spring beans are also defined, so that Spring MVC can be deactivated.

@Configuration
@ConditionalOnClass( name="net.sourceforge.stripes.springboot.autoconfigure.SpringBootVFS" ) // @see http://stackoverflow.com/a/25790672
@ConditionalOnProperty( name = "stripes.enabled", matchIfMissing = true )
@EnableConfigurationProperties( StripesProperties.class )
public class StripesAutoConfiguration {

    private static final Log LOG = Log.getInstance( StripesAutoConfiguration.class );
[...]

    @Bean( name = "stripesDynamicFilter" )
    @ConditionalOnMissingBean( name = "stripesDynamicFilter" )
    public FilterRegistrationBean stripesDynamicFilter() {
        final DynamicMappingFilter filter = new DynamicMappingFilter();

        final List< String > urlPatterns = new ArrayList< String >();
        urlPatterns.add( "/*" );

        final FilterRegistrationBean registration = new FilterRegistrationBean();
        registration.setFilter( filter );
        registration.setUrlPatterns( urlPatterns );
        registration.setDispatcherTypes( DispatcherType.REQUEST, DispatcherType.INCLUDE, DispatcherType.FORWARD, DispatcherType.ERROR );
        registration.setOrder( Ordered.LOWEST_PRECEDENCE );
        return registration;
    }

[...]

It can be observed that this class is annotated at both type and method level with several @ConditionalOnXXX annotations. These annotations allow the configuration beans to be registered optionally, based on environment properties, the existence of other beans, etc. They sit at the very center of Spring Boot’s “magic”. It’s an extremely simple, clever and powerful mechanism which, together with the choice of very well-thought-out default values (which can easily be overridden), is the main reason for this framework’s success. Kudos to the bright minds behind these ideas.
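The gist of, say, @ConditionalOnProperty with matchIfMissing = true can be captured in a toy, stdlib-only check (this is not Spring's actual implementation, just the idea):

```java
import java.util.Properties;

// Toy version of the @ConditionalOnProperty( matchIfMissing = true ) idea:
// the bean is registered when the property is absent or explicitly "true".
public class ConditionalDemo {

    static boolean shouldRegister( Properties env, String name ) {
        String value = env.getProperty( name );
        return value == null || Boolean.parseBoolean( value );
    }

    public static void main( String[] args ) {
        Properties env = new Properties();
        System.out.println( shouldRegister( env, "stripes.enabled" ) );  // property missing
        env.setProperty( "stripes.enabled", "false" );
        System.out.println( shouldRegister( env, "stripes.enabled" ) );  // explicitly disabled
    }
}
```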

In addition to all of this, the class is also annotated with @EnableConfigurationProperties( StripesProperties.class ), which designates the class used to reserve the stripes namespace in the application.properties file. This class is a normal POJO whose members map to their equivalents in the application.properties file. Its only particularity is that it is annotated with @ConfigurationProperties, which states the reserved namespace:

@ConfigurationProperties(prefix = StripesProperties.STRIPES_PREFIX )
public class StripesProperties {

    public static final String STRIPES_PREFIX = "stripes";

    /** value for: {@code ActionBeanPropertyBinder.Class} */
    private String actionBeanPropertyBinder;

    /** value for: {@code ActionBeanContextFactory.Class} */
    private String actionBeanContextFactory;

    /** value for: {@code ActionBeanContext.Class} */
    private String actionBeanContext;

[...]
    /** placeholder for custom configuration */
    private Map< String, String > customConf = new HashMap< String, String >();

[...]

// getter & setters

[...]
}

As a curiosity, this class also includes a Map instance, used to gather all kinds of custom values, which get added to the Stripes filter configuration. This way, an entry like stripes.custom-conf.MY_CUSTOM_KEY=MY_CUSTOM_VALUE in the application.properties file gets added to the Stripes filter as a parameter with key MY_CUSTOM_KEY and value MY_CUSTOM_VALUE.
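The mechanics are straightforward: every entry bound under the custom-conf map simply becomes an init parameter of the filter. A minimal stdlib-only sketch of that copy step (names hypothetical; not the starter's actual code):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: entries bound under stripes.custom-conf.* are copied
// verbatim into the filter's init parameters.
public class CustomConfDemo {

    public static void main( String[] args ) {
        Map< String, String > customConf = new HashMap<>();
        customConf.put( "MY_CUSTOM_KEY", "MY_CUSTOM_VALUE" );

        Map< String, String > filterParams = new HashMap<>();
        customConf.forEach( filterParams::put );  // each custom entry becomes a filter init-param

        System.out.println( filterParams.get( "MY_CUSTOM_KEY" ) );
    }
}
```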

Contents of sample module

As said before, this module is not strictly necessary, to the extent that it is not even mentioned in the Spring Boot reference guide, but having a working example is the best way to get your extension used. This time, the example is taken from the calculator sample in Stripes’ reference guide. This module is a normal Maven module whose pom.xml includes the org.springframework.boot:spring-boot-maven-plugin plugin execution, and which includes both the Java class and the JSP from the reference guide exactly as they are there. Finally, an application.properties file is added with the required Stripes configuration (note: the starter module is smart enough to configure the framework with reasonable default values, so the application can start up without setting any specific value).

Once all the modules have been compiled, the application can be started from this module either by executing mvn spring-boot:run or java -jar target/stripes-spring-boot-sample-1.0.0.jar, and it will be ready to go at http://localhost:8080/index.jsp.

After doing all this, an announcement was sent to the Stripes mailing list, so there is a chance it could get into the main distribution and be available for the 1.7.0 release.

29 Oct

Industrializing the CI/CD environment (1): Jenkins

Software industrialization implies, amongst other things, that software must be buildable from a script which automates the whole process. In the case of a CI/CD environment, it is also interesting to industrialize it, as this allows a better understanding of the system configuration, the ability to introduce changes in a controlled environment, revert to a well-known state, etc. The alternative to this is what Martin Fowler calls a “Snowflake Server”.

As for the concrete case of Jenkins, not having a mechanism to rebuild the whole system can be somewhat risky, especially if you are working with weekly releases: the upgraded version might have problems, or there might be incompatible plugins or unwanted side effects… In these situations, it is complicated to roll back to a previous version. Furthermore, manually installing a previous version, with its plugins, jobs, users, nodes, etc., and leaving it as it was in the first place can be quite a challenge; for instance, it is not completely unreasonable that, when upgrading a plugin, it decides to also migrate its configuration in every job that uses it. If, for whatever reason, it is necessary to downgrade after a situation like this, well, that’s going to hurt. A lot.

Also, it doesn’t help that there isn’t a more or less standard approach to this task. As of today, the official documentation on this topic is practically nonexistent. It is true that there are some Jenkins plugins that try to solve it, namely Backup Plugin, thinBackup and SCM Sync Backup, but they are either discontinued, or they can’t be scheduled, or they also save workspace data, or they do not save all the configuration (e.g., if you use the promotions plugin, none of them will save your promotion data).

The way we have solved this at work is with a couple of scripts. The first one is meant to save all the configuration in the SCM tool (in our case, Apache Subversion):

#!/bin/bash
echo '*****************************'
echo '0.- Change into $JENKINS_HOME'
echo '*****************************'
cd /path/to/your/jenkins_home


echo '************************'
echo '1.- update plugins list '
echo '************************'
curl -sSL "http://JENKINS_HOST:JENKINS_PORT/jenkins/pluginManager/api/xml?depth=1&xpath=/*/*/shortName|/*/*/version&wrapper=plugins" | perl -pe 's/.*?<shortName>([\w-]+).*?<version>([^<]+)()(<\/\w+>)+/\1 \2\n/g'|sed 's/ /:/' > plugins.txt


echo '*****************************************************************************'
echo '2.- Template saves wipe out promotions folder, so restore it with local data '
echo '*****************************************************************************'
# IFS weirdness to cd into directories whose names contain spaces, ref. http://www.cyberciti.biz/tips/handling-filenames-with-spaces-in-bash.html
cd jobs
SAVEIFS=$IFS
IFS=$(echo -en "\n\b")
for dir in `find ./* -type d -prune`;
do
  echo "svn update promotions on $dir" && cd $dir && svn update promotions --force --accept=mine-full && cd ..
done
IFS=$SAVEIFS
cd ..


echo '********************************************************************************************************************'
echo '3.- Add any new conf files, jobs with promotions, users, userContent, secrets, nodes and list of installed plugins. '
echo '********************************************************************************************************************'
svn add --force --parents *.xml jobs/*/config.xml jobs/*/promotions/*/config.xml users/*/config.xml userContent/* nodes/* secret.key secret.key.not-so-secret secrets/* plugins.txt


echo '***************************************************'
echo "4.- Ignore things in the root we don't care about. "
echo '***************************************************'
echo -e "warnlog\n*.log\n*.tmp\n*.old\n*.bak\n*.jar\n*.json" > jenkins_ignores && svn propset svn:ignore -F jenkins_ignores . && rm jenkins_ignores 


echo '***************************************************'
echo "5.- Ignore things inside jobs we don't care about. "
echo '***************************************************'
echo -e "builds\nlast*\nnext*\n*.txt\n*.log\nworkspace*\ncobertura\njavadoc\nhtmlreports\nncover\ndoclinks" > jenkins_ignores && svn propset svn:ignore -F jenkins_ignores jobs/* && rm jenkins_ignores 


echo '***************************************************************'
echo '6.- Remove anything from SVN that no longer exists in Jenkins. '
echo '***************************************************************'
svn status | grep '!' | awk '{print $2;}' | xargs -r svn rm 


echo '**********************************************************************************'
echo '7.- And finally, check in of course, showing status before and after for logging. '
echo '**********************************************************************************'
svn st && svn ci --non-interactive --username=SVN_USER --password=SVN_PASSWORD -m "automated commit of Jenkins configuration" && svn st

exit 0

The second step won’t always apply, as it is meant to handle interactions between the EZ Templates and promotions plugins; if you are not using both of them, this step won’t do anything.

We use Jenkins to run this script on a weekly basis, so we are able to keep all the configuration in a safe place. To recover a Jenkins installation from version control, we perform these steps:

  • check out the configuration into a directory, which will become the new JENKINS_HOME.
  • download, through wget or whatever, the appropriate Jenkins war.
  • execute the following script, adapted from the official Jenkins Docker image:
#! /bin/bash

# Parse a support-core plugin -style txt file as specification for jenkins plugins to be installed
# in the reference directory, so user can define a derived Docker image with just :
#
# FROM jenkins
# COPY plugins.txt /plugins.txt
# RUN /usr/local/bin/plugins.sh /plugins.txt
#
# Note: Plugins already installed are skipped
#

set -e

JENKINS_WAR=../jenkins.war
JENKINS_HOME=.
JENKINS_PLUGINS_DIR=$JENKINS_HOME/plugins
REF=$JENKINS_HOME/ref
# jenkins update center
JENKINS_UC=${JENKINS_UC:-https://updates.jenkins-ci.org}

if [ -z "$1" ]
then
    echo "
USAGE:
  Parse a support-core plugin -style txt file as specification for jenkins plugins to be installed
  in the reference directory, so user can define a derived Docker image with just :

  FROM jenkins
  COPY plugins.txt /plugins.txt
  RUN /usr/local/bin/plugins.sh /plugins.txt

  Note: Plugins already installed are skipped

"
    exit 1
else
    JENKINS_INPUT_JOB_LIST=$1
    if [ ! -f $JENKINS_INPUT_JOB_LIST ]
    then
        echo "ERROR File not found: $JENKINS_INPUT_JOB_LIST"
        exit 1
    fi
fi

# the war includes a # of plugins, to make the build efficient filter out
# the plugins so we dont install 2x - there about 17!
if [ -d $JENKINS_HOME ]
then
    TEMP_ALREADY_INSTALLED=$JENKINS_HOME/preinstalled.plugins.$$.txt
else
    echo "ERROR $JENKINS_HOME not found"
    exit 1
fi

if [ -d $JENKINS_PLUGINS_DIR ]
then
    echo "Analyzing: $JENKINS_PLUGINS_DIR"
    for i in `ls -pd1 $JENKINS_PLUGINS_DIR/*|egrep '\/$'`
    do
        JENKINS_PLUGIN=`basename $i`
        JENKINS_PLUGIN_VER=`egrep -i Plugin-Version "$i/META-INF/MANIFEST.MF"|cut -d\: -f2|sed 's/ //'`
        echo "$JENKINS_PLUGIN:$JENKINS_PLUGIN_VER"
    done > $TEMP_ALREADY_INSTALLED
else
    #JENKINS_WAR=/usr/share/jenkins/jenkins.war
    if [ -f $JENKINS_WAR ]
    then
        echo "Analyzing war: $JENKINS_WAR"
        TEMP_PLUGIN_DIR=/tmp/plugintemp.$$
        echo "TEMP_PLUGIN_DIR: $TEMP_PLUGIN_DIR"
        for i in `jar tf $JENKINS_WAR|egrep 'plugins'|egrep -v '\/$'|sort`
        do
            echo "TEMP_PLUGIN_DIR: $TEMP_PLUGIN_DIR"
            rm -fr $TEMP_PLUGIN_DIR
            mkdir -p $TEMP_PLUGIN_DIR
            PLUGIN=`basename $i|cut -f1 -d'.'`
            echo "$PLUGIN"
            (cd $TEMP_PLUGIN_DIR;jar xf $JENKINS_WAR "$i";jar xvf $TEMP_PLUGIN_DIR/$i META-INF/MANIFEST.MF >/dev/null 2>&1)
            VER=`egrep -i Plugin-Version "$TEMP_PLUGIN_DIR/META-INF/MANIFEST.MF"|cut -d\: -f2|sed 's/ //'`
            echo "$PLUGIN:$VER"
        done > $TEMP_ALREADY_INSTALLED
        rm -fr $TEMP_PLUGIN_DIR
    else
        rm -f $TEMP_ALREADY_INSTALLED
        echo "ERROR file not found: $JENKINS_WAR"
        exit 1
    fi
fi

#REF=/usr/share/jenkins/ref/plugins
mkdir -p $REF
COUNT_PLUGINS_INSTALLED=0
while read spec || [ -n "$spec" ]; do

    plugin=(${spec//:/ });
    [[ ${plugin[0]} =~ ^# ]] && continue
    [[ ${plugin[0]} =~ ^\s*$ ]] && continue
    [[ -z ${plugin[1]} ]] && plugin[1]="latest"

    if [ -z "$JENKINS_UC_DOWNLOAD" ]; then
      JENKINS_UC_DOWNLOAD=$JENKINS_UC/download
    fi

    if ! grep -q "${plugin[0]}:${plugin[1]}" $TEMP_ALREADY_INSTALLED
    then
        echo "Downloading ${plugin[0]}:${plugin[1]}"
        curl --retry 3 --retry-delay 5 -sSL -f ${JENKINS_UC_DOWNLOAD}/plugins/${plugin[0]}/${plugin[1]}/${plugin[0]}.hpi -o $REF/${plugin[0]}.jpi
        unzip -qqt $REF/${plugin[0]}.jpi
        COUNT_PLUGINS_INSTALLED=`expr $COUNT_PLUGINS_INSTALLED + 1`
    else
        echo "  ... skipping already installed:  ${plugin[0]}:${plugin[1]}"
    fi
done  < $JENKINS_INPUT_JOB_LIST

echo "---------------------------------------------------"
if [ $COUNT_PLUGINS_INSTALLED -gt 0 ]
then
    echo "INFO: Successfully installed $COUNT_PLUGINS_INSTALLED plugins."

    if [ -d $JENKINS_PLUGINS_DIR ]
    then
        echo "INFO: Please restart the container for changes to take effect!"
    fi
else
    echo "INFO: No changes, all plugins previously installed."

fi
echo "---------------------------------------------------"

#cleanup
rm $TEMP_ALREADY_INSTALLED
exit 0

and profit!

16 Oct

Software complexity explained to non IT people

There are occasions, when starting with a new customer, especially one without an IT background, where the news that the requested task is going to be a little more complicated than expected isn’t well received. Obviously, this isn’t always the case, but plenty of times I have had the need to explain why software development is a complex activity to someone who has neither the time nor the desire to read the No Silver Bullet paper, as their day-to-day job is enough for them and, on top of it, they have to deal with those pesky IT people who won’t stop complaining about whatever they are told to do.

One way of explaining to non-IT people why software development is a complex activity, of which I’m very fond, is to use an analogy with the Raiders of the Lost Ark movie. Not in the sense that developing software is about swapping an idol for a small sandbag and leaving at a run before poisoned darts, giant boulders, pits and a thousand other traps end the poor developer’s life, but by proposing a challenge: try to guess which is probably the most expensive scene in the movie. Take your time.

Done? The Nazi submarine? That enormous rolling boulder which is about to crush Indy? Nonsense, you are surely thinking of this scene, right?:

Raiders of the Lost Ark – Cairo scenes

(image taken from http://indianajones.wikia.com/wiki/Cairo)

Sallah’s terrace in the Cairo scenes? Are you kidding me? Two or three absolutely calm people, totally actionless, a performing monkey… I mean, what? That’s more expensive than a Nazi submarine? How much does it take to train a monkey?

As it turns out, when the movie was shot, special effects were craftsmanship, as opposed to nowadays, when computers do their magic 99% of the time. With that in mind, have you spotted all the TV antennas on the roofs in that shot? In the 1930s, TV hadn’t been widely introduced, so all the roofs looked just like in the picture. But in 1981, when the movie was filmed, all those roofs had TV antennas, which had to be taken down in order to shoot these scenes.

The good thing is that, at this point, the relationship with the customer changes: they no longer think you are trying to deceive them, and what they usually do is try to help, or ease things as much as they can, taking down all those antennas.