Java: InputStream generating random characters

A quick and dirty implementation of Java’s InputStream that generates a desired number of random characters without buffering them in memory. It may be useful for unit testing.

import org.apache.commons.lang3.RandomStringUtils;
import java.io.IOException;
import java.io.InputStream;

public class RandomCharsInputStream extends InputStream {

    private static final int ALPHABETIC = 1;

    private static final int ALPHANUMERIC = 2;

    private static final int ASCII = 3;

    private static final int NUMERIC = 4;

    private static final int CHARS = 5;

    private int type = ALPHABETIC;

    private long size = 0;

    private long charsRead = 0;

    private char[] chars;

    public static final RandomCharsInputStream newAlphabetic(long size) {
        return new RandomCharsInputStream(size, ALPHABETIC);
    }

    public static final RandomCharsInputStream newAlphanumeric(long size) {
        return new RandomCharsInputStream(size, ALPHANUMERIC);
    }

    public static final RandomCharsInputStream newAscii(long size) {
        return new RandomCharsInputStream(size, ASCII);
    }

    public static final RandomCharsInputStream newNumeric(long size) {
        return new RandomCharsInputStream(size, NUMERIC);
    }

    public static final RandomCharsInputStream newWithChars(long size, char... chars) {
        return new RandomCharsInputStream(size, chars);
    }

    public static final RandomCharsInputStream newWithChars(long size, String chars) {
        return new RandomCharsInputStream(size, chars.toCharArray());
    }

    private RandomCharsInputStream(long size, int type) {
        this.size = size;
        this.type = type;
    }

    private RandomCharsInputStream(long size, char... chars) {
        this.size = size;
        this.type = CHARS;
        this.chars = chars;
    }

    @Override
    public int read() throws IOException {
        if (charsRead >= size)
            return -1;

        char c;
        switch (type) {
            case ALPHABETIC:
                c = RandomStringUtils.randomAlphabetic(1).charAt(0);
                break;

            case ALPHANUMERIC:
                c = RandomStringUtils.randomAlphanumeric(1).charAt(0);
                break;

            case ASCII:
                c = RandomStringUtils.randomAscii(1).charAt(0);
                break;

            case NUMERIC:
                c = RandomStringUtils.randomNumeric(1).charAt(0);
                break;

            case CHARS:
                c = RandomStringUtils.random(1, chars).charAt(0);
                break;

            default:
                throw new IllegalArgumentException("Unknown random type: " + type);
        }

        charsRead++;
        return c;
    }
}    

Usage examples:

RandomCharsInputStream in = RandomCharsInputStream.newAscii(10000);

RandomCharsInputStream in = RandomCharsInputStream.newNumeric(100);

RandomCharsInputStream in = RandomCharsInputStream.newWithChars(30, "xyz");
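
Since the stream never buffers its content, it can also be consumed incrementally, e.g. to simulate large payloads in a test. Here is a minimal consumption sketch (the class name is just illustrative):

import java.io.IOException;
import java.io.InputStream;

public class RandomCharsInputStreamExample {

    public static void main(String[] args) throws IOException {
        // stream 1 MB of random alphanumeric characters without holding them in memory
        InputStream in = RandomCharsInputStream.newAlphanumeric(1024 * 1024);

        long count = 0;
        while (in.read() != -1) {
            count++;
        }
        System.out.println("Characters read: " + count);
    }
}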

Spring AOP: Intercepting methods with a custom annotation

Almost every Spring developer has already seen and used the @Transactional annotation, which is responsible for database transaction demarcation.
It creates an AOP “around” advice that starts a database transaction before the method execution and commits it (or rolls it back) after the execution ends.

This post demonstrates how to create a custom annotation that works like @Transactional and is responsible for profiling method execution. It uses the method interception capabilities provided by Spring AOP, without utilizing AspectJ.

1. We’ll start with the custom annotation @ProfileExecution, which marks the bean methods we want to profile:

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@Inherited
@Documented
public @interface ProfileExecution {

}
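
The interception itself can be done with plain Spring AOP infrastructure beans. Below is a minimal sketch of how such an interceptor and its advisor could look (class and method names are my own illustration, not necessarily those used later in the post); register the advisor as a bean together with a DefaultAdvisorAutoProxyCreator so that annotated bean methods get proxied:

import java.lang.reflect.Method;

import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
import org.springframework.aop.Pointcut;
import org.springframework.aop.support.DefaultPointcutAdvisor;
import org.springframework.aop.support.annotation.AnnotationMatchingPointcut;

public class ProfileExecutionInterceptor implements MethodInterceptor {

    @Override
    public Object invoke(MethodInvocation invocation) throws Throwable {
        long start = System.currentTimeMillis();
        try {
            return invocation.proceed();
        } finally {
            Method method = invocation.getMethod();
            System.out.println(method + " took " + (System.currentTimeMillis() - start) + " ms");
        }
    }

    // Advisor binding the interceptor to all bean methods annotated with @ProfileExecution.
    public static DefaultPointcutAdvisor profileExecutionAdvisor() {
        Pointcut pointcut = AnnotationMatchingPointcut.forMethodAnnotation(ProfileExecution.class);
        return new DefaultPointcutAdvisor(pointcut, new ProfileExecutionInterceptor());
    }
}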

Read More

Embedded SSH daemon and remote shell for your Java application

Wouldn’t it be nice to be able to connect to your application securely via SSH, using a secret user/password, and communicate with it through an application-specific shell? That was my first thought when I discovered the Apache SSHD project.

Apache SSHD is a 100% pure java library to support the SSH protocols on both the client and server side. This library is based on Apache MINA, a scalable and high performance asynchronous IO library.

This blog post demonstrates how to create an application shell using the Apache SSHD and JLine projects.
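
As a rough preview, an embedded SSH daemon with password authentication can be started with just a few lines. The sketch below assumes the SSHD 0.x API (package names differ in newer versions), and the hard-coded user/password check is obviously only a placeholder:

import org.apache.sshd.SshServer;
import org.apache.sshd.server.PasswordAuthenticator;
import org.apache.sshd.server.keyprovider.SimpleGeneratorHostKeyProvider;
import org.apache.sshd.server.session.ServerSession;

public class EmbeddedSshDaemon {

    public static void main(String[] args) throws Exception {
        SshServer sshd = SshServer.setUpDefaultServer();
        sshd.setPort(2222);
        // generates a host key on the first start and reuses it afterwards
        sshd.setKeyPairProvider(new SimpleGeneratorHostKeyProvider("hostkey.ser"));
        sshd.setPasswordAuthenticator(new PasswordAuthenticator() {
            @Override
            public boolean authenticate(String username, String password, ServerSession session) {
                return "admin".equals(username) && "secret".equals(password);
            }
        });
        // the application-specific shell (e.g. JLine based) is plugged in via sshd.setShellFactory(...)
        sshd.start();
        Thread.sleep(Long.MAX_VALUE);
    }
}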

Read More

Processing dependent files with Apache Camel

Imagine you have the following scenario:

  1. Some folder has to be polled for files (I will call them “header” files) that contain only the meta-information (e.g. the filesystem location) of the real “data” file(s) that should be processed.
  2. From the incoming “header” file you have to read the filesystem location of the “data” file(s) and process them from there.
  3. Both the “header” and “data” file(s) should be moved to a particular “done” folder after processing. However, the “header” file is only allowed to be moved if the corresponding “data” file(s) was/were successfully processed.
  4. In case the corresponding “data” file does not exist, the “header” file message should be rolled back and the “header” file must be processed again after the specified timeout.

Read More

Creating custom Apache Camel components

Some time ago I wrote an article about Apache Camel and how to send iOS push notifications using it. Now it’s time to demonstrate how one can create custom Apache Camel components.

Before we start, some words about Apache Camel itself. What exactly is Apache Camel? Wikipedia says:

Apache Camel is a rule-based routing and mediation engine which provides a Java object-based implementation
of the Enterprise Integration Patterns using an API (or declarative Java Domain Specific Language) to configure routing and mediation rules.

But it seems that many people still don’t understand the purpose of this framework. For those I would recommend reading the StackOverflow discussion on this topic: http://stackoverflow.com/questions/8845186/what-exactly-is-apache-camel

Components are one of the fundamental building blocks of Apache Camel. Out of the box, Camel provides a rich set of pre-built components for nearly every common task an enterprise developer may need when implementing Enterprise Integration Patterns (EIP). In rare cases, however, you may have to develop a custom component, either to implement a not-yet-covered task/use case or just to encapsulate a complex route definition.

In this blog post I’ll demonstrate how to create a simple custom component that can repeatedly generate random character sequences.

Read More

Parallelizing execution with xargs on Unix-like operating systems

In my current project I have to build multiple Java projects with Ant. The xargs command-line utility can accelerate the build process by running the necessary build scripts in parallel.

Below is a simple shell script, build-all.sh, that finds all build.xml files within “build” subfolders and starts Ant for each of them with the specified command-line parameters.

#!/bin/sh

if [ -z "$1" ]; then
    echo "Usage: build-all.sh <ant-target> [ant-options]"
    exit 1
fi

# -0 handles paths with special characters, -n 1 passes one build.xml per ant invocation,
# -P 10 runs up to 10 builds in parallel
find */build -name 'build.xml' -print0 | xargs -0 -n 1 -P 10 ant "$@" -f

Creating class proxies with Javassist

Javassist is a really easy-to-use bytecode manipulation library.

Creating class proxies with Javassist is straightforward: just create an instance of ProxyFactory, a corresponding MethodHandler that handles the method invocations, and the class proxy itself.

Find below a small proxy creation example that traces the execution of all class methods:

import java.io.Serializable;
import java.lang.reflect.Method;

import javassist.util.proxy.MethodFilter;
import javassist.util.proxy.MethodHandler;
import javassist.util.proxy.ProxyFactory;

public class ProxyFactoryExample {

	public void foo() {
		System.out.println("Foo method executed.");
	}

	public void bar() {
		try {
			Thread.sleep(500);
		} catch (InterruptedException e) {
			// ignore
		}

		System.out.println("Bar method executed.");
	}

	public static void main(String[] args) throws Throwable {
		ProxyFactory proxy = new ProxyFactory();
		proxy.setSuperclass(ProxyFactoryExample.class);
		proxy.setInterfaces(new Class[] { Serializable.class });
		proxy.setFilter(new MethodFilter() {
			@Override
			public boolean isHandled(Method m) {
				// skip finalize methods
				return !(m.getParameterTypes().length == 0 && m.getName()
						.equals("finalize"));
			}
		});

		MethodHandler tracingMethodHandler = new MethodHandler() {

			@Override
			public Object invoke(Object self, Method thisMethod,
					Method proceed, Object[] args) throws Throwable {

				long start = System.currentTimeMillis();
				try {

					return proceed.invoke(self, args);

				} finally {

					long end = System.currentTimeMillis();
					System.out.println("Execution time: " + (end - start)
							+ " ms, method: " + proceed);
				}

			}
		};

		ProxyFactoryExample obj = (ProxyFactoryExample) proxy.create(
				new Class[0], new Object[0], tracingMethodHandler);

		obj.foo();
		obj.bar();
	}
}

Simple embedded web server on Mac OS X

Since Python is pre-installed on every Mac OS X system, starting a simple embedded web server that serves a local folder is pretty easy:

  • Open terminal ( /Applications/Utilities/Terminal.app )
  • Switch to the directory you want to make available via http:
    cd /path/to/some/folder
  • Start the web server by passing it a port number as a command-line parameter:
    python -m SimpleHTTPServer 8080

Now you can serve your folder via HTTP. Open a web browser and point it to http://localhost:8080 (or http://<your.machine.ip.address>:8080).

Enjoy.

Git branching model and strategy

A good and useful read about a Git branching model and release management strategy.

http://nvie.com/posts/a-successful-git-branching-model/

Rendering docbook documents with docbook4j library

In my spare time I’ve created a small embeddable Java library that can render DocBook documents to well-known output formats like PDF, HTML or RTF.

It can be downloaded here: http://code.google.com/p/docbook4j

Maven users: please add the following repository and dependency declarations to your POM file:

<repositories> 
 <repository> 
  <id>googlecode</id> 
  <url>http://docbook4j.googlecode.com/svn/m2-repo/releases/</url> 
 </repository> 
</repositories> 

<dependency> 
	<groupId>com.google.code.docbook4j</groupId>
	<artifactId>docbook4j</artifactId>
	<version>1.0.0</version>
</dependency> 

This post gives a brief overview of how it can be used.

Read More

Load/Drop Jars to/from Oracle database using Ant

The Oracle Client package provides the loadjava and dropjava tools for loading/dropping Java classes, jars and resources to/from an Oracle database.

Sometimes, however, it is necessary to run this functionality on a machine that doesn’t have the Oracle Client package installed.

This post describes how to achieve this using Ant.

Note! These instructions are for Oracle 11g.

Prerequisites

From a machine with the Oracle Client installed, copy ojdbc5.jar (typically located in $ORACLE_HOME/product/11.x/client_1/jdbc/lib) and aurora.zip (typically located in $ORACLE_HOME/product/11.x/client_1/javavm/lib) to a folder accessible by your Ant script.

Below I’ll assume that these two files are located in the same folder as the Ant script.

Load Java Target

<target name="loadjava" description="target for deploying jars to the database">
	<java classname="oracle.aurora.server.tools.loadjava.LoadJavaMain" fork="true">
		<jvmarg value="-Xint" />
		<classpath>
			<pathelement location="aurora.zip" />
			<pathelement location="ojdbc5.jar" />
		</classpath>
		<arg line="-thin -user SCOTT/TIGER@DBHOST:1551:DBSID -resolve my-jar-to-upload.jar" />
	</java>
</target> 

This target deploys the my-jar-to-upload.jar file to the Oracle database identified by the SCOTT/TIGER@DBHOST:1551:DBSID connection string.

Drop Java Target

<target name="dropjava" description="target for dropping jars from the database">
	<java classname="oracle.aurora.server.tools.loadjava.DropJavaMain" fork="true">
		<jvmarg value="-Xint" />
		<classpath>
			<pathelement location="aurora.zip" />
			<pathelement location="ojdbc5.jar" />
		</classpath>
		<arg line="-thin -user SCOTT/TIGER@DBHOST:1551:DBSID my-jar-to-upload.jar" />
	</java>
</target> 

This target drops the my-jar-to-upload.jar file from the Oracle database identified by the SCOTT/TIGER@DBHOST:1551:DBSID connection string.

Database-driven unit tests with Liquibase

In the previous article, “Database change management with Liquibase”, I demonstrated the standard Liquibase usage for managing database changes.

This post describes how to construct an infrastructure for executing database-driven unit tests, a rather untypical task for Liquibase.

To be able to execute database-driven tests we have to put our database into a known state between test runs. This is where Liquibase will help us.

First, let’s create a sample set of changes that populates the database with the test data:

<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">
	<changeSet id="testdata" author="bar">
		<loadData tableName="t_role" file="db/testdata/roles.csv">
			<column name="name" type="STRING" />
			<column name="description" type="STRING" />
		</loadData>
		<loadData tableName="t_user" file="db/testdata/users.csv">
			<column name="id" type="NUMERIC" />
			<column name="username" type="STRING" />
			<column name="role" type="STRING" />
		</loadData>
		<loadData tableName="t_address" file="db/testdata/addresses.csv" />
		<rollback>
			<delete tableName="t_address" />
			<delete tableName="t_user" />
			<delete tableName="t_role" />
		</rollback>
	</changeSet>
</databaseChangeLog> 

In the changelog above we make use of the <loadData> tag, which loads data from a CSV file and inserts it into the database (alternatively you may use the <insert>, <update> and <delete> tags to manipulate the database contents). Furthermore, the <rollback> block describes how to remove the inserted changes from the database.

As a brief overview, here is an example of the roles.csv file:

name,description
USER,A simple user
ADMIN,Administrator user
ANONYMOUS,NULL 

The first row in the CSV file specifies the column names to populate. All subsequent rows contain the test data.

Please consult GitHub for other CSV files used in this post: https://github.com/bigpuritz/javaforge-blog/tree/master/liquibase-sample/db/testdata

Now, let’s make our project ready to use Liquibase together with JUnit.
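
As a preview, the test setup could look roughly like the sketch below. It assumes Liquibase 2.x, an H2 in-memory database and JUnit 4; the class and changelog names are purely illustrative:

import java.sql.Connection;
import java.sql.DriverManager;

import liquibase.Liquibase;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;

import org.junit.Before;
import org.junit.Test;

public class UserDaoTest {

    private Connection connection;

    @Before
    public void setUp() throws Exception {
        connection = DriverManager.getConnection("jdbc:h2:mem:testdb", "sa", "");

        // apply the schema and test data changelogs, bringing the database into a known state
        Liquibase liquibase = new Liquibase("db/changelog.xml",
                new ClassLoaderResourceAccessor(), new JdbcConnection(connection));
        liquibase.update("test");
    }

    @Test
    public void findsUsersByRole() throws Exception {
        // ... run assertions against the populated database
    }
}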

Read More

Database change management with Liquibase

Consult https://github.com/bigpuritz/javaforge-blog/tree/master/liquibase-sample for the sample project sources.

Quote:

Liquibase is an open source (Apache 2.0 Licensed), database-independent library for tracking, managing and applying database changes. 

It is built on a simple premise: All database changes are stored in a human readable yet trackable form and checked into source control.

This post is a simple tutorial demonstrating how to use Liquibase in a real world project. We’ll assume that our sample project lives through multiple phases, each of which adds diverse changes to the database.

Let’s first prepare our sample project to use Liquibase within the Maven build. We need to define the liquibase-maven-plugin within the <plugins>…</plugins> block and point it to the liquibase.properties file, which contains all properties required by Liquibase at runtime. Both are demonstrated below.

Read More

Using XSDs or any other custom resources as Maven dependencies

Often it is required to put some custom resources (e.g. XSDs, WSDLs etc.) under dependency management control.

Since Maven is the de facto standard dependency management mechanism in the Java world, this post demonstrates how to use it to manage untypical custom resources as dependencies within the pom.xml.

Let’s assume we have an XSD file (test.xsd in this example) that we want to use as a dependency in our POM.

First, we’ll install it into the local repository (alternatively you can deploy it to your company’s internal Maven repository):

mvn install:install-file -Dfile=test.xsd -DgroupId=com.mycompany.app \
           -DartifactId=app-xsd -Dversion=1.0.0 -Dpackaging=xsd

After installation you’ll find the XSD in your local Maven repository under com/mycompany/app/app-xsd/1.0.0/ as app-xsd-1.0.0.xsd.

Next, referencing this file as a dependency in your POM is quite simple. Just write:

<dependency>
	<groupId>com.mycompany.app</groupId>
	<artifactId>app-xsd</artifactId>
	<version>1.0.0</version>
	<type>xsd</type>
</dependency>	 

Furthermore, if you want to add this file as a resource to the final artifact generated by Maven, you have to configure the maven-dependency-plugin in your POM like this:

<build>
	<plugins>
		...
		<plugin>
			<groupId>org.apache.maven.plugins</groupId>
			<artifactId>maven-dependency-plugin</artifactId>
			<version>2.5.1</version>
			<executions>
				<execution>
					<id>copy-dependencies</id>
					<phase>generate-resources</phase>
					<goals>
						<goal>copy-dependencies</goal>
					</goals>
					<configuration>
						<outputDirectory>${project.build.directory}/${project.build.finalName}/META-INF</outputDirectory>
						<includeArtifactIds>app-xsd</includeArtifactIds>
						<includeTypes>xsd</includeTypes>
					</configuration>
				</execution>
			</executions>
		</plugin>
		...
	</plugins>		
</build> 

After executing the Maven build, the app-xsd-1.0.0.xsd file will land in the artifact’s META-INF folder.

Releasing Maven artifacts using Ant and Maven Ant Tasks

In my current project, one of my tasks was to simplify the Maven release process. It should be a simple one-click solution that satisfies the following requirements:

  • The release has to be executable either from the IDE or from the command line.
  • Everyone (not only developers) should be able to execute a release.
  • It should hide the pain of executing multiple Maven goals (release:prepare, release:perform) one after another.
  • After the release, artifacts for all profiles specified in the pom.xml should be deployed to the company’s Maven repository.

I decided to write a simple Ant script that internally makes use of Maven Ant Tasks.

First, a target asking the user for all required release information was created:

<?xml version="1.0" encoding="UTF-8"?>
<project name="myprj" default="release" xmlns:artifact="antlib:org.apache.maven.artifact.ant">
	<property environment="env" />
	<path id="maven-ant-tasks.classpath" path="maven-ant-tasks-2.1.3.jar" />
	<typedef resource="org/apache/maven/artifact/ant/antlib.xml"
               uri="antlib:org.apache.maven.artifact.ant" classpathref="maven-ant-tasks.classpath" />
 
	<!-- - - - - - - - - - - - - - - - - - 
          target: prerequisites                      
         - - - - - - - - - - - - - - - - - -->
	<target name="prerequisites">
		<fail unless="env.MAVEN_HOME" 
		    		message="Environment variable MAVEN_HOME was not found on your system!" />
	</target>

	<!-- - - - - - - - - - - - - - - - - - 
          target: input                      
         - - - - - - - - - - - - - - - - - -->
	<target name="input">
		<artifact:pom id="local.pom" file="pom.xml" />
		<script language="javascript">
		<![CDATA[	
		    var before = project.getProperty("local.pom.version");
		    project.setProperty("tmp.release.version", before.replaceAll("-SNAPSHOT", ""));
			
		    var tmpVersion = project.getProperty("tmp.release.version");
		    var nextVersionNo = tmpVersion.substr( tmpVersion.lastIndexOf(".") + 1 );
		    project.setProperty("tmp.next.version", 
			      tmpVersion.substr(0, tmpVersion.lastIndexOf(".") + 1) + 
			      (parseInt(nextVersionNo)+1) + 
			      "-SNAPSHOT"
		    );
		]]>
		</script>

		<input message="Please enter the project release version?" 
			defaultvalue="${tmp.release.version}" addproperty="prj.release.version" />
		<input message="Please enter the svn tag name?" 
			defaultvalue="${local.pom.artifactId}-${prj.release.version}" addproperty="prj.release.tag" />
		<input message="Please enter the project next version?" 
			defaultvalue="${tmp.next.version}" addproperty="prj.next.version" />

		<fail unless="prj.release.version" message="Property 'prj.release.version' was not defined!" />
		<fail unless="prj.release.tag" message="Property 'prj.release.tag' was not defined!" />
		<fail unless="prj.next.version" message="Property 'prj.next.version' was not defined!" />
	</target>
</project> 

The “input” target uses the <artifact:pom … /> task, which exposes the POM structure to the Ant script.

Furthermore, it makes use of Ant’s ability to embed JavaScript, which reads the current artifact version from the POM and pre-sets some default properties like the release version, the release tag and the next development version.

Finally the second target specifies the release procedure:

<target name="release" depends="prerequisites, input" description="project release ant script">
	<artifact:mvn mavenHome="${env.MAVEN_HOME}" fork="true">
		<arg value="--batch-mode" />
		<arg value="-Dtag=${prj.release.tag}" />
		<arg value="-DreleaseVersion=${prj.release.version}" />
		<arg value="-DdevelopmentVersion=${prj.next.version}" />
		<arg value="clean" />
		<arg value="release:prepare" />
		<arg value="release:perform" />
	</artifact:mvn>

	<available file="target/checkout" type="dir" property="checkout.dir.present" />
	<fail unless="checkout.dir.present" message="Checkout directory 'target/checkout' is not present!" />

	<!-- deploy 'test1' profile -->
	<artifact:mvn mavenHome="${env.MAVEN_HOME}" pom="target/checkout/pom.xml" fork="true">
		<arg value="-Ptest1" />
		<arg value="clean" />
		<arg value="deploy" />
	</artifact:mvn>

	<!-- deploy 'test2' profile -->
	<artifact:mvn mavenHome="${env.MAVEN_HOME}" pom="target/checkout/pom.xml" fork="true">
		<arg value="-Ptest2" />
		<arg value="clean" />
		<arg value="deploy" />
	</artifact:mvn>		
	
	<!-- deploy 'prod' profile -->
	<artifact:mvn mavenHome="${env.MAVEN_HOME}" pom="target/checkout/pom.xml" fork="true">
		<arg value="-Pprod" />
		<arg value="clean" />
		<arg value="deploy" />
	</artifact:mvn>
</target> 

Because it is just a simple Ant script, it can be executed either directly from your favorite IDE or from the command line by calling: ant -f release.xml


Note! The maven-ant-tasks-2.1.3.jar file has to be committed to the SCM together with release.xml.