Saturday, September 27, 2008

Configuring applications with Spring

If you’ve used Spring before, you’ve almost certainly used a PropertyPlaceholderConfigurer to inject settings from external sources — most likely properties files — into your application context. The most common use cases involve JDBC and Hibernate settings, but it’s not uncommon to configure Lucene index, temp file, or image cache directories as well. The simplest case looks something like this:

<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="location" value="classpath:application.properties"/>
</bean>

<!-- A sample bean that needs some settings. -->
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="${jdbc.driver}"/>
    <property name="url" value="${jdbc.url}"/>
    <property name="username" value="${jdbc.username}"/>
    <property name="password" value="${jdbc.password}"/>
</bean>

And application.properties might look like this:

jdbc.driver=org.h2.Driver
jdbc.url=jdbc:h2:mem:example
jdbc.username=sa
jdbc.password=

Note: you can achieve the same simple configuration using the new Spring 2.x style schema configuration, but it doesn’t allow for any further customization, so we’re going to use the old style.

<!-- Example of the new Spring 2.x style -->
<context:property-placeholder location="classpath:application.properties"/>

This handles the simple case of replacing placeholders (e.g. ${jdbc.url}) with values found in a properties file (e.g. jdbc.url=jdbc:h2:mem:example). In a real-world application, we not only need to collect settings, but also override them in different environments. Many of our applications are deployed in 4 or more environments (developer machine, build server, staging server, and production), each requiring different databases at the very least.

There are a few ways to enable overriding of properties. Let’s take a look at them in turn:

1. Setting the system properties mode to override (default is fallback)
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE"/>
    <property name="location" value="classpath:application.properties"/>
</bean>

When configured in this mode, any value specified as a system property to the JVM will override any values set in properties files. For example, adding -Djdbc.url=jdbc:h2:mem:cheesewhiz to the JVM arguments would override the value in the file (jdbc:h2:mem:example). On a Java 1.5 or newer platform, Spring will also look for an environment variable called jdbc.url if no system property is found.
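The override semantics can be pictured in a few lines of plain Java — a sketch of the mode's behavior, not Spring's actual code:

```java
import java.util.Properties;

// A sketch of SYSTEM_PROPERTIES_MODE_OVERRIDE semantics (not Spring's actual code):
// a JVM system property, when present, wins over the value loaded from the file.
public class OverrideModeSketch {
    static String resolve(String key, Properties fileProps) {
        String fromJvm = System.getProperty(key);
        if (fromJvm != null) {
            return fromJvm;                  // -Dkey=value on the command line wins
        }
        return fileProps.getProperty(key);   // otherwise fall back to the file value
    }
}
```

In the default fallback mode the two branches are simply reversed: the file value is consulted first and the system property only fills in missing keys.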

2. Specifying an optional properties file
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="ignoreResourceNotFound" value="true"/>
    <property name="locations">
        <list>
            <value>classpath:application.properties</value>
            <value>classpath:local.properties</value>
        </list>
    </property>
</bean>

When ignoreResourceNotFound is set to true, Spring will ignore resources that don’t exist. You can imagine application.properties, containing all of the default settings, versioned in your SCM system. Developers then have the option of creating a properties file called local.properties to override any settings that differ in their environment. This file should be unversioned and ignored by your SCM system. This works because properties are loaded in order, with later values replacing earlier ones.

3. Web Application overrides

In a web application environment, Spring also supports specifying values in web.xml as context params or in your application server’s specific metadata as servlet context attributes. For example, if you’re using Tomcat you can specify one or more parameter elements in your context.xml, and Spring can inject those values into placeholders.

<bean class="org.springframework.web.context.support.ServletContextPropertyPlaceholderConfigurer">
    <property name="location" value="classpath:application.properties"/>
</bean>

The ServletContextPropertyPlaceholderConfigurer conveniently works in non-servlet environments by falling back to the behavior of a PropertyPlaceholderConfigurer, which is great when running unit tests.

4. Combining techniques

There’s no reason why these techniques can’t be combined. Technique #1 is great for overriding a few values while #2 is better for overriding many. #3 just expands the field of view when Spring goes to resolve placeholders. When combined, system properties override those in files. When using technique #3, there are some settings available for adjusting the override behavior (see contextOverride). Test the resolution order when combining to ensure it’s behaving as expected.

Optional External Properties

There’s another use case that applies to some projects. Often in non-developer environments, system admins want to keep properties for the environment outside of the deployable archive or the application server, and they don’t want to deal with keeping those files in a Tomcat context file; they prefer a simple properties file. They also don’t want to have to place the file in a hard-coded location (e.g. /var/acmeapp/), or they may keep configuration for multiple servers in the same network directory, each file named after the server. With a little trickery, it’s easy to support an optional external properties file that isn’t in a hard-coded location. The location of the file is passed as a single system property to the JVM, for example -Dconfig=file://var/acmeapp/. Here’s the configuration to make it happen:

<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="ignoreUnresolvablePlaceholders" value="true"/>
</bean>

<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="ignoreResourceNotFound" value="true"/>
    <property name="location" value="${config}"/>
</bean>

The first definition enables basic property resolution through system properties (in fallback mode). The second bean loads the resource from the location resolved from the system property -Dconfig. All Spring resource URLs are supported, making this very flexible.

Putting it all together

Here’s a configuration that does more than most people would need, but allows for ultimate flexibility:

<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="ignoreUnresolvablePlaceholders" value="true"/>
</bean>

<bean class="org.springframework.web.context.support.ServletContextPropertyPlaceholderConfigurer">
    <property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE"/>
    <property name="searchContextAttributes" value="true"/>
    <property name="contextOverride" value="true"/>
    <property name="ignoreResourceNotFound" value="true"/>
    <property name="locations">
        <list>
            <value>classpath:application.properties</value>
            <value>classpath:local.properties</value>
            <value>${config}</value>
        </list>
    </property>
</bean>

Every placeholder goes through the following resolution process. Once a value is found it’s set and the next placeholder is resolved:

  1. (optional) Property value specified as a system or environment property; useful for overriding specific placeholders (e.g. -Djdbc.username=carbon5)
  2. (optional) Context parameters located in web.xml or context attributes specified in application server meta-data (e.g. a Tomcat context.xml).
  3. (optional) Properties file located by the system/environment variable called “config”; useful for externalizing configuration. All URL types are supported (e.g. -Dconfig=c://...)
  4. (optional) Properties file identified by classpath:local.properties; useful for specific developer overrides.
  5. (required) Properties file identified by classpath:application.properties, which contains the default settings for our application.
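The resolution order above can be sketched in plain Java. This is a hypothetical helper that illustrates the ordering only, not Spring’s implementation:

```java
import java.util.Map;
import java.util.Properties;

// A hypothetical helper illustrating the combined lookup order (not Spring code):
// system property first, then servlet context parameter, then the properties files
// in reverse declaration order, since later files override earlier ones.
public class ResolutionChainSketch {
    static String resolve(String key, Map<String, String> contextParams, Properties... files) {
        String fromJvm = System.getProperty(key);
        if (fromJvm != null) return fromJvm;            // step 1: -Dkey=value
        String fromContext = contextParams.get(key);
        if (fromContext != null) return fromContext;    // step 2: web.xml / context.xml
        for (int i = files.length - 1; i >= 0; i--) {   // steps 3-5: last file wins
            String fromFile = files[i].getProperty(key);
            if (fromFile != null) return fromFile;
        }
        return null;                                    // unresolved placeholder
    }
}
```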

Best Practices
  • Deploy the exact same artifact (e.g. war, ear, etc.) across all environments by externalizing configuration. This may seem daunting, but the emergent benefits in simplicity are huge.
  • Only make things that can safely change across environments configurable. Likewise, only things that need to be configurable should be configurable; it’s easy to go overboard.
  • Configure the minimal properties search path that meets your requirements.
  • When looking for properties files in the project tree, use classpath resources whenever possible. This makes finding those files easy, consistent, and insensitive to the working directory, which is great when running tests from your IDE and the command line.
  • Aim for a zero-configuration check-out, build, run-tests cycle for the environment where it happens most: development.

What other interesting configuration scenarios have you seen?

by christian

Wednesday, September 10, 2008

Lazy loading vs. pre-loading beans with Spring Framework

The Spring framework can instantiate and bind (load) related Java objects (beans) according to a given configuration, and an XML file is commonly used to define these bindings. Spring supports two loading strategies, lazy loading and pre-loading, managed respectively by the BeanFactory and ApplicationContext containers.

Lazy Loading

A bean is loaded only when an instance of that Java class is requested by another method or class. The org.springframework.beans.factory.BeanFactory container (and subclasses) loads beans lazily. The following code snippet demonstrates lazy loading; note how the "beans.xml" Spring configuration file is loaded by the BeanFactory container class.
BeanFactory factory = new XmlBeanFactory(
        new InputStreamResource(
                new FileInputStream("beans.xml"))); // 1
Employee emp = (Employee) factory.getBean("employeeBean"); // 2

Even though the "beans.xml" configuration file is loaded by the BeanFactory container at line 1, none of the beans is instantiated. Instantiation takes place only at line 2, where the bean called "employeeBean" is requested from the container. Since the class is instantiated at the getBean() method call, the time this call takes to return will vary depending on the object being instantiated.


Pre-loading

All beans are instantiated as soon as the Spring configuration is loaded by the container. The org.springframework.context.ApplicationContext container follows the pre-loading methodology.
ApplicationContext context =
        new ClassPathXmlApplicationContext("beans.xml"); // 1
Employee emp = (Employee) context.getBean("employeeBean"); // 2

As all singleton beans are instantiated by the container at line 1, this line can take considerable time to complete. Line 2, however, returns the bean instance immediately, since the instances are already available inside the container.
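The difference between the two containers can be modeled in a few lines of plain Java. This is only an illustration of the two strategies, not Spring’s implementation:

```java
import java.util.ArrayList;
import java.util.List;

// A plain-Java model of lazy vs. eager bean loading (an illustration, not Spring itself).
public class LoadingSketch {
    final List<String> instantiated = new ArrayList<>();

    LoadingSketch(List<String> beanDefinitions, boolean preload) {
        if (preload) {
            // ApplicationContext-style: every singleton is built when the config is loaded
            beanDefinitions.forEach(this::instantiate);
        }
        // BeanFactory-style: do nothing now; beans are built on the first getBean() call
    }

    String getBean(String name) {
        if (!instantiated.contains(name)) {
            instantiate(name); // lazy: pay the construction cost here, on first request
        }
        return name;
    }

    private void instantiate(String name) {
        instantiated.add(name); // stands in for constructing the real bean
    }
}
```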

Point to note

The decision between these two methods depends solely on application-specific requirements. Some applications need to start as quickly as possible, while many others are willing to spend more time at startup in order to serve client requests faster. However, some of the beans defined in a configuration may be used only rarely, so instantiating such classes at startup would not be a wise decision. Similarly, some instances may consume significant resources, which argues against instantiating them at startup.

By Kamal Mettananda

Monday, September 8, 2008

Using Spring MVC Controllers in Grails

Groovy is slower than Java, sometimes dramatically so. Realistically, this has little impact on a web application since response time is affected more by the database and network latency, so as long as the slowdown isn't too dramatic, the benefits of Groovy and Grails far outweigh these concerns. And Grails is still way faster than Rails :)

But having said that, I was wondering how to use a regular Java Spring MVC controller and JSP instead of a Grails controller and a GSP (both of which use Groovy). Turns out it's pretty easy:

  • Register the traditional Spring dispatcher servlet in web.xml (you'll need to have run grails install-templates). In this example the name (SpringMVC) isn't important, use whatever you want, and I've chosen to map *.action URLs to this servlet and let Grails handle the rest:

    <servlet>
        <servlet-name>SpringMVC</servlet-name>
        <servlet-class>org.springframework.web.servlet.DispatcherServlet</servlet-class>
        <load-on-startup>1</load-on-startup>
    </servlet>

    <servlet-mapping>
        <servlet-name>SpringMVC</servlet-name>
        <url-pattern>*.action</url-pattern>
    </servlet-mapping>

  • Generate web-app/WEB-INF/SpringMVC-servlet.xml:

<?xml version='1.0' encoding='UTF-8'?>

<beans xmlns='http://www.springframework.org/schema/beans'
       xmlns:p='http://www.springframework.org/schema/p'
       xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'
       xsi:schemaLocation='http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-2.5.xsd'>

<bean id='mvcHandlerMapping'
      class='org.springframework.web.servlet.handler.BeanNameUrlHandlerMapping'
      p:order='0'>
    <property name='interceptors'>
        <list>
            <ref bean='openSessionInViewInterceptor' />
            <ref bean='localeChangeInterceptor' />
        </list>
    </property>
</bean>

<bean id='mvcViewResolver'
      class='org.springframework.web.servlet.view.InternalResourceViewResolver'
      p:prefix='/WEB-INF/jsp/' p:suffix='.jsp' />

<bean name='baseSimpleController' abstract='true' p:cacheSeconds='0'/>

<bean name='jspController' class='JspController'
      parent='baseSimpleController' abstract='true' />

<!-- actions -->

<bean name='/test.action' class='TestController' parent='baseSimpleController' />

<bean name='/other.action' parent='jspController' p:successView='other' />

</beans>

And that's it. Some notes:

  • the handler mapping uses the id mvcHandlerMapping since Grails will create one using the standard name of handlerMapping
  • since handler mappings are auto-discovered by default, you need to set the order attribute to something lower than the Grails mapping's (which uses the default value of Integer.MAX_VALUE) so this mapping is accessed first
  • the HandlerInterceptors that are configured for the Grails mapping (OpenSessionInView, LocaleChange) won't be automatically available to this mapping, but it's simple to borrow them since they're registered as beans; you can also add other custom interceptors to the list
  • I've created an optional abstract parent controller bean (baseSimpleController) for simple controllers (single-page, i.e. not form or wizard controllers)
  • I've also created a simple controller that just shows a JSP – this is useful for pages that don't have any controller logic:


    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.springframework.web.servlet.ModelAndView;
    import org.springframework.web.servlet.mvc.AbstractController;

    public class JspController extends AbstractController {

        private String _successView;

        protected ModelAndView handleRequestInternal(
                final HttpServletRequest request,
                final HttpServletResponse response) {

            return new ModelAndView(_successView);
        }

        public void setSuccessView(final String view) {
            _successView = view;
        }
    }
I've mapped two sample URLs – /test.action, which uses a controller, and /other.action, which uses JspController to just show other.jsp.

Note that it is possible to use JSPs with Grails; Grails looks for a GSP using the specified name, but if it doesn't find one it looks for a JSP (under /WEB-INF/grails-app/views/) and uses that if it exists. So another option is to use Grails controllers and JSP.

Big caveat: I haven't used this in production yet – I'm just prototyping so I'll have this available in the future just in case.

original post

Tuesday, September 2, 2008

RESTful URLs with Spring MVC and UrlRewriteFilter

While Spring 3.0 promises to support REST style URLs out of the box, it won’t ship until sometime later this year. Spring 3.0 M1 will offer this functionality for anyone brave enough to work with potentially unstable technologies and should be available soon. And while this is good news for those starting out on new projects (or those willing to undergo a significant refactoring), most won’t want to migrate their applications just for RESTful URLs. All hope is not lost though, thanks to Paul Tuckey’s UrlRewriteFilter.

With this short tutorial, I’ll demonstrate how easy it is to configure UrlRewriteFilter for use within your Spring MVC application. I should mention that this technique will work well for just about any Java-based web framework, like JSF or Struts.

For those of you that want to skip to the end, I’ve created a small, functional sample application which demonstrates the techniques described in this tutorial. It can be downloaded here.

Getting started, we need to register the filter within our application’s web.xml file.



    <!-- UrlRewriteFilter -->
    <filter>
        <filter-name>UrlRewriteFilter</filter-name>
        <filter-class>org.tuckey.web.filters.urlrewrite.UrlRewriteFilter</filter-class>
    </filter>

    <!-- UrlRewriteFilter Mapping -->
    <filter-mapping>
        <filter-name>UrlRewriteFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>

That out of the way, we can dig into the application itself. Let’s review an example controller class as it might exist in your application today.


@Controller
public class SprocketsController {

    private final SprocketService service;

    @Autowired
    public SprocketsController(SprocketService service) {
        this.service = service;
    }

    @RequestMapping("/sprockets/list.do")
    public String list(ModelMap model) {
        model.addAttribute("sprockets", service.list());
        return "sprocket/list";
    }

    @RequestMapping("/sprockets/display.do")
    public String display(@RequestParam("sprocketId") int sprocketId, ModelMap model) {
        model.addAttribute("sprocket", service.find(sprocketId));
        return "sprocket/display";
    }

    @RequestMapping("/sprockets/edit.do")
    public String edit(@RequestParam("sprocketId") int sprocketId, ModelMap model) {
        model.addAttribute("sprocket", service.find(sprocketId));
        return "sprocket/edit";
    }
}
Our controller contains actions for displaying a list of sprockets, drilling down into each sprocket’s details, and editing an individual sprocket. Each action is mapped to a URI path via the @RequestMapping annotation and, as you can see, all three defined here handle requests mapped to the *.do extension. The beauty of the solution I’m demonstrating here is that you should not need to make any changes to your application code at all.

With that in mind, let’s move on to the interesting part: configuring our application to respond to a REST style URL like http://localhost:8080/sprocket/1234/edit instead of what we’ve got now: http://localhost:8080/sprocket/

You’ll need to create a new configuration file named urlrewrite.xml and make it available on the classpath. (For those with a distaste for XML based configuration, the latest version of the filter offers a means to generate the configuration via annotations. Details on that can be found here.) We’ll start our configuration by defining a few rules to handle the incoming requests.

<urlrewrite>

    <rule>
        <from>^/sprockets/$</from>
        <to>/sprockets/list.do</to>
    </rule>

    <rule>
        <from>^/sprocket/([0-9]+)$</from>
        <to>/sprockets/display.do?sprocketId=$1</to>
    </rule>

    <rule>
        <from>^/sprocket/([0-9]+)/edit$</from>
        <to>/sprockets/edit.do?sprocketId=$1</to>
    </rule>

</urlrewrite>
Let’s take a look at what each one of these rules does. The first rule states that the UrlRewriteFilter should transparently forward each request for http://localhost:8080/sprockets/ to the existing application URL of http://localhost:8080/sprockets/ The second and third rules handle the display and edit requests. These two rules define simple regular expressions whose captured values are appended to the destination URI as query string parameters. If you have more than one query string parameter, you’ll need to use the &amp; XML entity. (It should be noted that you can use wildcard matching (*) instead; however, it lacks some of the flexibility offered by regular expressions.)

That takes care of the inbound URLs, but we still have a problem. Within our application, we have a number of links which point to the old URL structure. No worries, UrlRewriteFilter tackles this issue with ease. By examining the response, the filter can rewrite the existing links defined within anchor tags, i.e. <a href="<c:url value='/sprockets/'/>">Return to the list.</a>. Let’s take a look at the rule definitions to handle this.

<outbound-rule>
    <from>/sprockets/display\.do\?sprocketId=([0-9]+)$</from>
    <to>/sprocket/$1</to>
</outbound-rule>

<outbound-rule>
    <from>/sprockets/edit\.do\?sprocketId=([0-9]+)$</from>
    <to>/sprocket/$1/edit</to>
</outbound-rule>

More or less, these outbound rules are just the inverse of the inbound rule definitions. All it takes is a simple regular expression to parse out the sprocketId. One common gotcha to note here is that you need to add the \ character before the start of the query string marked by the question mark. If you don’t do this, the filter won’t be able to process your links.
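The reason for the backslash is that each rule is a regular expression, and an unescaped ? is a metacharacter that makes the preceding token optional rather than matching a literal question mark. A quick check in plain Java illustrates the difference:

```java
import java.util.regex.Pattern;

// Rule patterns are regular expressions: '?' makes the preceding token optional,
// so a literal question mark (the query-string separator) must be written as \?.
public class QueryStringRegexSketch {
    static boolean matches(String rulePattern, String url) {
        return Pattern.compile(rulePattern).matcher(url).matches();
    }
}
```

With the escaped pattern the sample URL matches; with the unescaped one, the literal ? in the URL has no counterpart in the pattern and the match fails.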

Pretty easy, right? With your rules in place, fire up your application and give it a try. You can always examine the status of the filter by visiting its built-in status page. If you’re a bit hesitant to try this out on your own app, fret not: I’ve created a simple, yet functional sample application which you can use to experiment with. The example source code can be downloaded here.

by Carl Sziebert

Saturday, August 30, 2008

Database Testing with Spring 2.5 and DBUnit

We’ve been using DBUnit on our Java projects for years, and the mechanics of how it’s used have evolved over time. I’ve recently spent some time making it work a little nicer for how we typically write database tests. What I’ve created makes using DBUnit on a project that is already using Spring, and the testing support added in Spring 2.5, a little easier through the application of convention and annotations.

In general, we’ve adopted the convention of loading data from a flat dataset file named after the test and located next to the test on the classpath. For example (in the Maven standard directory structure):

  • src/test/java/com/acme/ - Java Test Code
  • src/test/resources/com/acme/TripRepositoryTest.xml - DB Unit Data Set for TripRepositoryTest
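The convention amounts to deriving a classpath location from the test class name. A hypothetical helper, not the listener’s actual code, could look like:

```java
// A hypothetical helper showing the naming convention (not the listener's real code):
// the dataset is a classpath resource next to the test class, named after it.
public class DataSetLocationSketch {
    static String defaultLocation(Class<?> testClass) {
        // com.acme.TripRepositoryTest -> classpath:/com/acme/TripRepositoryTest.xml
        return "classpath:/" + testClass.getName().replace('.', '/') + ".xml";
    }
}
```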

For most tests, the data set is loaded inside the test’s transaction and rolled back when the test completes so that nothing needs to be cleaned up (see Spring’s reference). For other tests — service or integration tests — the data is loaded outside of a transaction and must be cleared out manually. Most projects have a mix of both strategies and both should be easily supported.

When Spring 2.5 came out with its new testing framework, I threw together a custom TestExecutionListener that looks for test methods that are annotated with @DataSet, and when found, loads the data using DBUnit. Here’s a transaction-per-test example:

@ContextConfiguration(locations = {"classpath:applicationContext.xml"})
public class TripRepositoryImplTest extends AbstractTransactionalDataSetTestCase {

    @Autowired TripRepository repository;

    @Test
    public void forIdShouldFindTrip() throws Exception {
        Trip trip = repository.forId(2);
        assertThat(trip, not(nullValue()));
    }
}
The high-level execution path for this example looks like:

  1. Inject dependencies (DependencyInjectionTestExecutionListener)
  2. Start transaction (TransactionalTestExecutionListener)
  3. Load dbunit data set from TripRepositoryImplTest.xml (DataSetTestExecutionListener) using the setup operation (default is CLEAN_INSERT)
  4. Execute test
  5. Optionally cleanup dbunit data using the tear down operation (default is NONE)
  6. Rollback transaction (TransactionalTestExecutionListener)

Here’s the trimmed down log output for this test:

INFO: Began transaction (1): transaction manager; rollback [true]
INFO: Loading dataset from location 'classpath:/eg/domain/TripRepositoryImplTest.xml' using operation 'CLEAN_INSERT'.
INFO: Tearing down dataset using operation 'NONE', leaving database connection open.
INFO: Rolled back transaction after test execution for test context

For this to work in its current incarnation, a single datasource must be available for lookup in the application context. One of the interesting details is what to do with the connection used to load the data. The framework assumes that if it’s a transactional connection it should be left open because whatever started the transaction should do the closing. When it’s non-transactional it’s closed after the dataset is loaded. This convention works well for how I typically write my database tests.

In addition to the @DataSet annotation, we must add the DataSetTestExecutionListener to the set of listeners that are applied to the test class. As in the above example, you can extend AbstractTransactionalDataSetTestCase which does this for you or you can specify the listener using the class-level annotation @TestExecutionListeners (see example). It’s important that the listener is triggered after the TransactionalTestExecutionListener.

If all test methods use the dataset, then the test class (or super class) can be annotated and every test will load the dataset. Also, if a different dataset should be loaded, the name of the resource can be specified in the annotation (e.g. @DataSet("TripRepositoryImplTest-foo.xml") or @DataSet("classpath:/db/trips.xml")). Lastly, the setup and teardown database operations can be overridden (e.g. @DataSet(setupOperation = "INSERT", teardownOperation = "DELETE")).

This functionality is part of the C5 Test Support package and is available in our maven repository. To use it, first add the C5 Public Maven repository to your pom.xml, and then add the test support dependency.

Check out the sample application for details. It’s mavenized and utilizes an in-memory database. Just check it out of subversion, look over the code, and give it a run using your IDE or from the command-line (mvn install). I’d be psyched to hear what you think and of course, welcome comments and suggestions.


By Carbon Five Community

Sunday, August 17, 2008

Using Spring Web Flow 2

I recently got the opportunity to work with Spring Web Flow 2 on a project; here I share my personal views on it with you.

Let me first tell you about the nice things in the recent Spring stack (Spring 2.5 and above). Two things which improved a lot with the recent releases are annotation support and specific namespaces.

Annotations let you spend your time writing code rather than wiring components through XML. Of course Spring fails fast if you have messed up a configuration, but annotations are still a lot better at avoiding that in the first place. With the improved @Repository, @Service and @Component it’s easy to configure beans with the required specific responsibilities by default.

The namespace improvements help keep the XML configuration minimal and free of typos. Schema definitions help validate your configuration as you type, and with the convention-over-configuration approach they have reduced the lines of XML needed to wire up objects. If you want to replace a component with your custom implementation, sometimes it’s easy using the auto-wire option; sometimes you have to configure it the old way (i.e. using the beans namespace and manually declaring most of the configuration), which is more painful after you get used to the new way.

With the Spring test framework it’s fairly easy to write integration test cases. With a simple annotation, Spring automatically loads the application context on test startup. With @Timed you can even clock your test method and make it fail if it exceeds a specified time. It also supports transactional tests with automatic rollback by default, so you can write tests which don’t dirty up the database.

Let’s come back to the original topic, Spring Web Flow. Spring Web Flow works as advertised: it is for applications which have a natural flow behind the business, where the UI acts as a way to capture input for the flow and to display something back. It is not for applications with requirements different from those.

Everything is a flow; each flow has a starting point and an end point, and can have any number of transitions in between. As part of a transition you can go to a sub-flow and come back to the original flow later, but these transitions can only happen at pre-defined places in the flow. It would be tough to implement a free-flow (random browse) kind of application with it.

It serializes all the information you add to the flow context and restores it as you resume a flow after UI interaction, so every object (entities, repositories, and whatever else) must implement Serializable. This restricts what you can share in the flow context.
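The requirement boils down to flow-scoped state surviving a serialize/deserialize round trip between requests. A small plain-Java check, with illustrative names, shows what every flow-scoped object has to support:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Flow-scoped state is serialized when a flow pauses and restored when it resumes;
// this round trip is what every object placed in flow scope must survive.
public class FlowStateSketch {
    static class Order implements Serializable {  // hypothetical flow-scoped entity
        private static final long serialVersionUID = 1L;
        final String item;
        Order(String item) { this.item = item; }
    }

    static Order roundTrip(Order order) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            ObjectOutputStream out = new ObjectOutputStream(bytes);
            out.writeObject(order);                                  // on pause
            out.flush();
            ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()));
            return (Order) in.readObject();                          // on resume
        } catch (Exception e) {
            throw new IllegalStateException("flow state must be Serializable", e);
        }
    }
}
```

A non-Serializable field anywhere in the object graph makes the writeObject call fail, which is exactly why sharing arbitrary objects in the flow context is restricted.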

Most of the decisions for transitions can easily be handled in the flow definition; this avoids creating Action classes which return just an outcome.

In the JSF UI:

<h:commandButton action="save" />

In the flow definition:

<view-state id="orderView">
    <transition on="save" to="confirmation">
        <evaluate expression="validator.validate(model)" />
    </transition>
</view-state>

As you can see, you don’t need an Action class which returns the outcome ’save’; you can directly specify a transition on the command button. OK, now you could ask what happens if ’save’ should only fire under a certain condition (say, only after validation passes on the entity). For that you can have an expression executed on the transition: the transition will proceed only if the validator returns true, and if it returns false, control comes back to the same view. The expression accepts any EL method expression, so it need not be a validator; you can run any action before the transition. As you can see, the method executions in the action class are moved to the flow definition. This looks elegant only if the number of calls made at a transition is small, or your application is well thought out and designed to share less information in state, keeping the method calls down. (Basically this is a nice feature, but it would go awry for huge apps, and for apps with no defined business flow behind them.)
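A matching validator bean might look like the following; the names here are illustrative, since Web Flow only requires an EL-reachable method returning a boolean:

```java
// A validator bean of the kind referenced by validator.validate(model) above.
// Names are illustrative; Web Flow only needs a boolean-returning method.
public class OrderValidator {

    public static class OrderModel {   // hypothetical model object bound to the view
        public String item;
        public int quantity;
    }

    public boolean validate(OrderModel model) {
        // The transition fires only when this returns true; otherwise the view re-renders.
        return model != null && model.item != null && model.quantity > 0;
    }
}
```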

Spring Web Flow also supports inheritance of flows, so you can inherit common transition rules from a parent flow, which is a nice feature for keeping the definitions DRY as far as possible.

What makes a flow definition look ugly? Having many trivial actions called in transitions just to set a variable, or to retrieve a variable from flowScope and set it back into viewScope, and so on. One thing I had to do multiple times in flow definitions was transforming a List into a DataModel for the UI, so I could use listName.selectedRow to identify the item selected by the user.

Adding these kinds of non-business method executions and transformations to the flow definition makes it bulky, and also alienates the flow from resembling the business definition. This defeats the very cause of having a flow definition.

Web Flow provides convenient default variables like resourceBundle, currentUser, and messageContext in the flow context, which you can refer to directly in the flow definition, pass as arguments to bean action methods, or call actions on.

When a root flow ends, all the information is discarded. This is nice for cleaning unwanted data from memory, but it also means that you cannot share anything with the user after the flow has ended. Suppose I would like to tell the user at the end of the flow that they have successfully placed an order: I cannot do that! You could ask why not keep the confirmation as part of the flow; well, it depends on when you are committing the changes to the db, or how you are sharing a persistence context, and besides, since it’s just an end message, there should not have to be another interaction from the view just to end the flow.

It’s like redirecting to the home page after successfully placing the order and showing a banner saying “Thank you for shopping with us!”, which is just not possible.

One last point: with a UrlMapper definition in the configuration you can make a simple URL the starting point of a flow, but otherwise you generally can’t use a RESTful GET URL to reach a page in the flow.

What’s your experience with Spring Web Flow?

by Srinivasan Raguraman

Friday, July 25, 2008

Spring Web services and Axis2

As you know, Axis2 is a Web service framework with support for many things. It has support for scripting languages, for data services, and for EJB, CORBA, etc. In addition, it has long had support for Spring: you can deploy a Spring bean as a Web service in Axis2. Yes, I agree that is yet another way of getting the thing done, but I also realized it is not enough for Spring developers. They need everything to work within Spring.

To solve that, at WSO2 we came up with a solution where we integrated Axis2 into Spring. In doing so we converted all the Axis2 configuration files into bean descriptors; for example, we came up with a set of beans for axis2.xml. With this we have integrated Axis2 smoothly into Spring, and anyone can easily expose a bean as a Web service and get the power of all the other WS-* support, such as security, reliability, etc. Above all, you get the power of Axis2 while you are in the Spring container.

With this approach you can make a bean into a Web service using just the following few lines:

<bean id="services" class="">
    <property name="services">
        <list>
            <bean id="helloService" class="">
                <property name="serviceBean" ref="helloworld"></property>
                <property name="serviceName" value="helloWorldService"></property>
            </bean>
        </list>
    </property>
</bean>

You can read more about the Spring support via the following links:

WSO2 Web Services Framework for Spring

Hello World with WSO2 WSF/Spring

by Deepal Jayasinghe

Tuesday, July 22, 2008

Unit test with Spring Dynamic Modules

This post continues our story about developing services with Spring Dynamic Modules. We are developing our application and have already migrated it to OSGi. During the migration we had some problems with our test classes: we wanted to run our unit tests in an OSGi environment (so we call them integration tests; integration testing in an OSGi environment is the better choice because, besides making sure all business rules run properly, we can check class resolution among bundles). We already have a lot of unit tests written with JUnit 4 (based on Unitils, and we have developed a bunch of test modules such as Servlet Container, LDAP, etc.), and unfortunately they cannot run in an OSGi environment. This post describes our experience developing integration tests based on the Spring DM testing framework. The cost of migrating our unit tests was actually not much; most of the time went into researching how to do it and simply replacing the relevant pieces of JUnit 4 with JUnit 3, since the Spring DM testing framework currently supports integration tests with JUnit 3 only. Here is the general scenario by which Spring DM runs integration tests (quoted from the Spring DM reference document):

  • Start the OSGi framework (Equinox, Knopflerfish, Felix)

  • install and start any specified bundles required for the test

  • package the test case itself into an on-the-fly bundle, generate the manifest (if none is provided) and install it in the OSGi framework

  • execute the test case inside the OSGi framework

  • shut down the framework

  • pass the test results back to the originating test case instance that is running outside of OSGi

Setting up the integration test environment

Make sure that the following jars are on your class-path:

  • spring-osgi-core-1.1.0
  • spring-osgi-extender-1.1.0
  • spring-osgi-io-1.1.0
  • spring-osgi-test-1.1.0

Creating the first OSGi integration test

Every integration test class must inherit from org.springframework.osgi.test.AbstractConfigurableBundleCreatorTests. The first step is creating our base class for all OSGi integration tests:

/**
 * The base class of integration testing of Engroup. All integration testing
 * classes must be derived from this class.
 */
public class AbstractEngroupOsgiTest extends
  AbstractConfigurableBundleCreatorTests {

Configuring boot bundles of Spring DM

By default, the Spring DM testing framework loads some default bundles before loading the bundles specific to your test. You can see which default bundles are loaded in org.springframework.osgi.test.internal.boot-bundles of the spring-osgi-test project. In some cases you need to replace the Spring DM boot bundles with your own. The typical scenario is that you already have some bundles to which you have added extra OSGi metadata (such as a DynamicImport-Package header that lets a bundle load classes dynamically at runtime); having the same package with the same version installed twice is prohibited. In addition, since we are doing integration testing, making the test environment as close as possible to the real environment is a must. Spring DM lets you inject a new boot-bundle configuration by simply overriding the method getTestingFrameworkBundlesConfiguration of AbstractConfigurableBundleCreatorTests.

Example 1: Inject the new configuration of boot bundles for Spring DM testing

private static final String TEST_FRAMEWORK_BUNDLES_CONF_FILE = "/META-INF/";

protected Resource getTestingFrameworkBundlesConfiguration() {
  return new InputStreamResource(AbstractEngroupOsgiTest.class
      .getResourceAsStream(TEST_FRAMEWORK_BUNDLES_CONF_FILE));
}

Customizing the bundle content

Because your test class is packaged into the on-the-fly bundle, you can customize the manifest file of this bundle. In some cases this is a must, for example when the test class uses classes from other bundles during test execution. To do this, simply override the method getManifestLocation. (Note that you can also change the content of the manifest programmatically by overriding the method getManifest; however, we prefer to use a manifest file.)

Example 2: Customizing the test manifest file

protected String getManifestLocation() {
    return "classpath:META-INF/MANIFEST.MF";
}

Example 3: Customized manifest file

Manifest-Version: 1.0
Embed-Directory: lib
Implementation-Title: Engroup Osgi Integration Test
Spring-Version: 2.5.5
Bundle-Activator: org.springframework.osgi.test.JUnitTestActivator
Implementation-Version: 2.5.5
Tool: Bnd-0.0.160
Bundle-Name: engroup-integration-test
Created-By: 1.6.0 (Sun Microsystems Inc.)
Bundle-Version: 0.0.1
Bnd-LastModified: 1207763595575
Bundle-ManifestVersion: 2
Bundle-ClassPath: .,
Bundle-SymbolicName: engroup.server.engroup-integration-test
Include-Resource: {src\test\resources}
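If you need to check what actually ended up in a generated manifest like the one above, the JDK's java.util.jar.Manifest can parse it. The snippet below is a standalone sketch (not part of the Spring DM API) that reads a manifest from a string and prints a couple of the headers shown in Example 3:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

public class ManifestCheck {
    public static void main(String[] args) throws Exception {
        // A manifest must end with a blank line to be parsed completely
        String mf = "Manifest-Version: 1.0\r\n"
                + "Bundle-SymbolicName: engroup.server.engroup-integration-test\r\n"
                + "Bundle-Version: 0.0.1\r\n"
                + "Bundle-ManifestVersion: 2\r\n\r\n";
        Manifest manifest = new Manifest(
                new ByteArrayInputStream(mf.getBytes(StandardCharsets.UTF_8)));
        // Main attributes hold the bundle headers
        Attributes attrs = manifest.getMainAttributes();
        System.out.println(attrs.getValue("Bundle-SymbolicName"));
        System.out.println(attrs.getValue("Bundle-Version"));
    }
}
```

This is handy when debugging why a class from another bundle cannot be resolved: dump the on-the-fly bundle's manifest and inspect its Import-Package entries the same way.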

Some tips for writing OSGi integration tests

  1. Set the default wait time for creating the Spring context:
    This waiting time is how long Spring DM waits until all dependencies are resolved before starting test execution. In practice the waiting time varies, but in our project it stays under 20s. However, to support debugging while running an integration test, a longer value should be set, to prevent Spring DM from interrupting the test while developers are debugging it.

    Example 4: Set the wait time, overriding the default value of Spring DM

        protected long getDefaultWaitTime() {
            return 3000;
        }
  2. Inject Spring beans into the test class:
    We use Spring DM so that Spring beans can be exposed as OSGi services and used in other bundles. Naturally, we want to test those Spring beans in our integration tests. To use them, make sure the integration test class loads the Spring context files first.

    Example 5: Load the Spring context files

         protected String[] getConfigLocations() {
             return new String[] {
                 "META-INF/spring/db-context-osgi-test.xml" };
         }

    Note: these Spring context files should declare only the Spring beans exposed by other bundles, and remember to import the necessary packages in your manifest file.

    Example 6: Spring context file declares the spring bean service exported by other bundles and they serve for testing purpose

    <beans xmlns="">
     <osgi:reference id="companyService"
      interface="" />
     <osgi:reference id="divisionService"
      interface="" />
    </beans>

    Now we are ready to inject our Spring beans into the test class, by injecting each Spring bean into a protected field of the test class.

    Example 7: Inject the spring bean to test class protected fields

       protected CompanyService companyService;

       public AbstractEngroupOsgiTest() {
           // let Spring inject protected fields by type
           setPopulateProtectedVariables(true);
       }

    Now you can use the companyService Spring bean exposed from the other bundle in your test methods.

by haiphucnguyen

Sunday, July 13, 2008

Seam, Spring and jBPM integration HowTo

This HowTo describes a way to integrate Seam, Spring and jBPM in order to use the same Hibernate SessionFactory in both Spring and jBPM (and of course, Seam).

At first, make sure you use the latest version, 2.1.0, of Seam, since you can run into trouble with 2.0.1 and Spring transactions.

The relevant parts of the configuration are:

- in your Spring bean config, define your Hibernate sessionFactory as usual and pay special attention to the following properties:

<bean id="hibernateSessionFactory"

<!-- The hibernate properties -->
<property name="hibernateProperties">

<prop key="">update</prop>
<!-- set to create-drop to NOT maintain state between two executions of the app -->

<prop key="hibernate.connection.release_mode">


<!-- this property must be set to false so we can use independent sessions -->
<property name="useTransactionAwareDataSource">

<property name="mappingResources">

<!-- here you have to list all the *.hbm.xml files for jBPM -->
<!-- see the default hibernate.cfg.xml file from jBPM -->


- second, for the Seam-Spring integration we need two beans:

<bean id="sessionFactory"
<property name="sessionName" value="hibernateSession" />
<bean id="localTransactionManager"
<property name="sessionFactory" ref="hibernateSessionFactory" />

That’s it for Spring configuration.

Now in components.xml, we need

<!-- use the power of Spring transactions -->
<spring:spring-transaction platform-transaction-manager-name="localTransactionManager"/>

<persistence:managed-hibernate-session name="hibernateSession" auto-create="true"

<component class="org.jboss.seam.bpm.Jbpm">
<property name="processDefinitions">processdefinition.jpdl.xml</property>

In order to use the hibernateSession in jBPM, I subclassed the DbPersistenceService from jBPM. You need two classes:

package your.namespace.jbpm.integration;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

import org.jboss.seam.Component;
import org.jboss.seam.contexts.Contexts;

import org.jbpm.svc.Service;

/**
 * @author Frank Bitzer
 */
public class DbPersistenceServiceFactory extends
        org.jbpm.persistence.db.DbPersistenceServiceFactory {

    private static final long serialVersionUID = 997L;

    SessionFactory sessionFactory;

    /**
     * {@inheritDoc}
     */
    public Service openService() {
        // create instance of own service implementation
        return new your.namespace.jbpm.integration.DbPersistenceService(this);
    }

    /**
     * Retrieve Hibernate sessionFactory.
     */
    public synchronized SessionFactory getSessionFactory() {

        if (sessionFactory == null) {

            // access the Seam component holding the session
            // ("hibernateSession" is the managed-hibernate-session
            // declared above in components.xml)
            Session session = (Session) Component.getInstance("hibernateSession");

            // and extract the sessionFactory
            sessionFactory = session.getSessionFactory();
        }

        return sessionFactory;
    }

    /**
     * Set sessionFactory.
     */
    public void setSessionFactory(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }
}


package your.namespace.jbpm.integration;

import org.hibernate.Session;
import org.jbpm.JbpmContext;
import org.jbpm.persistence.db.DbPersistenceServiceFactory;
import org.jbpm.svc.Services;
import org.springframework.orm.hibernate3.SessionFactoryUtils;

/**
 * @author Frank Bitzer
 */
public class DbPersistenceService extends
        org.jbpm.persistence.db.DbPersistenceService {

    private static final long serialVersionUID = 996L;

    public DbPersistenceService(
            DbPersistenceServiceFactory persistenceServiceFactory) {
        this(persistenceServiceFactory, getCurrentServices());
    }

    static Services getCurrentServices() {
        Services services = null;
        JbpmContext currentJbpmContext = JbpmContext.getCurrentJbpmContext();
        if (currentJbpmContext != null) {
            services = currentJbpmContext.getServices();
        }
        return services;
    }

    DbPersistenceService(DbPersistenceServiceFactory persistenceServiceFactory,
            Services services) {

        this.persistenceServiceFactory = persistenceServiceFactory;
        this.isTransactionEnabled = persistenceServiceFactory
                .isTransactionEnabled();
        this.isCurrentSessionEnabled = persistenceServiceFactory
                .isCurrentSessionEnabled();
        this.services = services;
    }

    /**
     * Use the Hibernate sessionFactory to retrieve a Session instance.
     */
    public Session getSession() {

        if ((session == null) && (getSessionFactory() != null)) {

            session = getSessionFactory().openSession();

            mustSessionBeClosed = true;
            mustSessionBeFlushed = true;
            mustConnectionBeClosed = false;

            // do not start our own transaction if the session is
            // already managed by a Spring transaction
            isTransactionEnabled = !SessionFactoryUtils.isSessionTransactional(
                    session, getSessionFactory());

            if (isTransactionEnabled) {
                // start a jBPM-managed transaction, as in the base class
                transaction = session.beginTransaction();
            }
        }

        return session;
    }
}

To finish up, simply use the brand-new DbPersistenceService in jbpm.cfg.xml like this:

<service name="persistence">
<bean class="your.namespace.jbpm.integration.DbPersistenceServiceFactory">
<field name="isTransactionEnabled">


Also make sure your Spring WebApplicationContext is initialized before Seam starts up. This can be achieved by placing the org.jboss.seam.servlet.SeamListener after the listener for Spring in your web.xml.
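In web.xml terms, this ordering looks like the following sketch (the listener classes are the standard Spring and Seam ones; the surrounding web-app element is omitted):

```xml
<!-- Spring must be initialized first -->
<listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
</listener>

<!-- Seam starts afterwards and can see the Spring beans -->
<listener>
    <listener-class>org.jboss.seam.servlet.SeamListener</listener-class>
</listener>
```

Servlet containers invoke listeners in the order they are declared, which is why simply swapping the two elements is enough.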

That’s it! Now everything should work fine.

Note that I also contributed this HowTo to the official Seam Knowledge Base. You can find it here.

Saturday, June 28, 2008

Spring and Lingo = Easy JMS

With Lingo from Codehaus, Spring remoting can be extended to support JMS.
Here's a great article on Sanjiv Jivan's blog that shows some of its capabilities (synchronous calls over JMS and asynchronous callbacks).

One really seductive feature is asynchronous callbacks over JMS
with POJOs (without a single line of JMS code).
The Lingo site does not provide much documentation on it,
so thanks to the author of this nice article.

We applied this technique in our build system to distribute load
across different machines to speed up the process (we only have
single-processor build machines) and get informed via callbacks when tasks are done.
We used Spring 2.0M4 and ActiveMQ 3.2.2 in standalone mode.
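For reference, the wiring looks roughly like this (a hedged sketch: the bean ids, the BuildService interface and the destination are made up, and the property names are as we recall them from Lingo 1.1, so check the Lingo docs before copying):

```xml
<!-- Server side: export a POJO over JMS with Lingo -->
<bean id="serverExporter" class="org.logicblaze.lingo.jms.JmsServiceExporter">
    <property name="connectionFactory" ref="jmsFactory"/>
    <property name="destination" ref="buildQueue"/>
    <property name="service" ref="buildService"/>
    <property name="serviceInterface" value="com.example.BuildService"/>
</bean>

<!-- Client side: a proxy that looks like a plain BuildService bean -->
<bean id="buildServiceProxy" class="org.logicblaze.lingo.jms.JmsProxyFactoryBean">
    <property name="connectionFactory" ref="jmsFactory"/>
    <property name="destination" ref="buildQueue"/>
    <property name="serviceInterface" value="com.example.BuildService"/>
</bean>
```

The client code then just calls methods on buildServiceProxy; Lingo handles the JMS request/reply plumbing, including the asynchronous callback arguments.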

Note that I had trouble making it run with WebSphere MQ 5.3.
First, recent MQ JMS 1.1-compliant jars must be used, and
a misinterpretation of the JMS specs by WebSphere seems to break
the Lingo Spring JMS service exporter; see the referenced workaround,
which is for JMS templates but can also be applied to Lingo.

Here’s the diff of org.logicblaze.lingo.jms.JmsServiceExporter between unpatched and patched version for Websphere MQ:

diff -aur lingo-1.1/src/java/org/logicblaze/lingo/jms/JmsServiceExporter.java lingo-1.1-patch/src/java/org/logicblaze/lingo/jms/JmsServiceExporter.java
--- lingo-1.1/src/java/org/logicblaze/lingo/jms/JmsServiceExporter.java	2006-06-13 13:45:12.716722400 +0200
+++ lingo-1.1-patch/src/java/org/logicblaze/lingo/jms/JmsServiceExporter.java	2006-06-13 13:44:49.899847100 +0200
@@ -180,7 +180,7 @@
         }
         else {
-            return session.createConsumer(destination, messageSelector, noLocal);
+            return session.createConsumer(destination, messageSelector);

Monday, June 23, 2008

Spring entityManagerFactory in jta and non-jta modes

This blog post is about using JPA with Spring in 2 contexts :

  • production with a JTA transaction manager
  • testing with transactions handled by the JPA transaction manager.

It was inspired by Erich Soomsam's blog post.
You can achieve such a configuration with a PersistenceUnitPostProcessor, a single persistence.xml file, and 2 Spring context files (1 for each environment).

Since you are likely to have at least 2 different Spring dataSource definitions (1 for production that performs a JNDI lookup to find a bound datasource, and 1 for development that uses a local, Spring-declared datasource backed by a JDBC connection pool such as C3P0 or DBCP), place the entityManagerFactory declaration in the same file as the datasource declaration.

Let’s say that the default persistence.xml uses the non-JTA datasource:

<persistence xmlns=""
   <persistence-unit name="seamphony" transaction-type="RESOURCE_LOCAL">
          <!-- Scan for annotated classes and Hibernate mapping XML files -->
          <property name="hibernate.archive.autodetection" value="class, hbm"/>
          <property name="hibernate.dialect"

Here’s how you can use Spring to post-process the persistence unit and configure it for production (here
with a MySQL datasource and the JBoss transaction manager):

<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
   <property name="dataSource" ref="dataSource"/>
   <property name="jpaVendorAdapter">
      <bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
         <property name="database" value="MYSQL"/>
         <property name="showSql" value="true"/>
         <property name="databasePlatform" value="org.hibernate.dialect.MySQLDialect"/>
      </bean>
   </property>
   <property name="jpaPropertyMap">
      <map>
         <entry key="hibernate.transaction.manager_lookup_class" value="org.hibernate.transaction.JBossTransactionManagerLookup"/>
         <entry key="hibernate.transaction.flush_before_completion" value="true"/>
         <entry key="hibernate.transaction.auto_close_session" value="true"/>
         <entry key="hibernate.current_session_context_class" value="jta"/>
         <entry key="hibernate.connection.release_mode" value="auto"/>
      </map>
   </property>
   <property name="persistenceUnitPostProcessors">
      <list>
         <bean class="JtaPersistenceUnitPostProcessor">
            <property name="jtaMode" value="true"/>
            <property name="jtaDataSource" ref="dataSource"/>
         </bean>
      </list>
   </property>
</bean>

<!-- Datasource Lookup -->
<bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
   <property name="resourceRef">
   <property name="jndiName">
</bean>

<!-- Transaction Manager -->
<bean id="transactionManager" class="org.springframework.transaction.jta.JtaTransactionManager">
   <property name="transactionManagerName" value="java:/TransactionManager"/>
   <property name="autodetectUserTransaction" value="false"/>
</bean>

Here’s the class that reads the JTA mode property and configures the transaction type accordingly:

import javax.persistence.spi.PersistenceUnitTransactionType;
import javax.sql.DataSource;

import org.springframework.orm.jpa.persistenceunit.MutablePersistenceUnitInfo;
import org.springframework.orm.jpa.persistenceunit.PersistenceUnitPostProcessor;

public class JtaPersistenceUnitPostProcessor implements
		PersistenceUnitPostProcessor {

	private boolean jtaMode = false;

	private DataSource jtaDataSource;

	private PersistenceUnitTransactionType transacType = PersistenceUnitTransactionType.RESOURCE_LOCAL;

	public void postProcessPersistenceUnitInfo(MutablePersistenceUnitInfo mutablePersistenceUnitInfo) {
		if (jtaMode) {
			transacType = PersistenceUnitTransactionType.JTA;
		}
		mutablePersistenceUnitInfo.setTransactionType(transacType);
	}

	public boolean isJtaMode() {
		return jtaMode;
	}

	public void setJtaMode(boolean jtaMode) {
		this.jtaMode = jtaMode;
	}

	public DataSource getJtaDataSource() {
		return jtaDataSource;
	}

	public void setJtaDataSource(DataSource jtaDataSource) {
		this.jtaDataSource = jtaDataSource;
	}
}

Spring really helps in tuning your persistence unit for different environments. The same could be achieved by a custom build task that alters the persistence.xml file, but since this example assumes Spring is already in use, that can be avoided.

Saturday, June 21, 2008

Jackrabbit OCM and Spring

The Content Repository API is becoming more and more popular nowadays; however, it is rather difficult to use because it takes time to learn. The OCM (Object Content Mapping) module is a great tool that saves developers a lot of time when building content-driven applications, and Spring is a great DI platform that hides most of the complexity of initializing, creating and executing the repository. We are developing the engroup ECM module based on Jackrabbit OCM and Spring Modules. During development we looked for help in many forums and websites, but unfortunately we did not find a full solution, so we hope this article provides a full example of using Jackrabbit and Spring in a real application. Part of the engroup ECM code base is included in the attached file; it is developed against Jackrabbit 1.5 (snapshot version, which you can get from the Apache Maven repository), Spring Modules 0.9 and the spring-ocm patch.

First, create the POJOs for JCR repository:

  @Node(jcrMixinTypes = "mix:versionable")
  public class Content {
    protected String id;
    protected String path;
    protected String name;

Note: if you want a POJO to inherit the JCR fields of its parent class, you must use the extend property of the Node annotation, as in the following example:

  @Node(jcrMixinTypes = "mix:versionable", extend = AbstractFile.class)
  public class File extends Content {
    protected byte[] content;

The next step is creating the Spring beans to initialize the repository and register node types and POJOs with the repository. Here is part of the configuration file (you can see the full file in the attachment):

  • Initialize the repository:
        <bean id="repository" class="org.springmodules.jcr.jackrabbit.RepositoryFactoryBean">
          <property name="configuration" value="classpath:jackrabbit-repo.xml" />
          <property name="homeDir" value="file:/tmp/repository" />
        </bean>
        <bean id="jcrSessionFactory" class="org.springmodules.jcr.jackrabbit.ocm.JackrabbitSessionFactory">
          <property name="repository" ref="repository" />
          <property name="credentials">
            <bean class="javax.jcr.SimpleCredentials">
              <constructor-arg index="0" value="superuser" />
              <!-- create the credentials using a bean factory -->
              <constructor-arg index="1">
                <bean factory-bean="password" factory-method="toCharArray" />
              </constructor-arg>
            </bean>
          </property>
          <property name="nodeTypes2Import" value="nodetypes/custom_nodetypes.xml" />
        </bean>
  • Make the mapping between POJOs and annotation mapper
        <bean id="jcrMappingDescriptor" class="org.apache.jackrabbit.ocm.mapper.impl.annotation.AnnotationMapperImpl">
          <constructor-arg index="0">
            <!--Put all your POJOs in this list-->
          </constructor-arg>
        </bean>
  • Declare the Content Service Bean with transaction management support
     <bean id="internalContentService"
        <property name="jcrTemplate" ref="jcrMappingTemplate" />
     <bean id="contentService" parent="baseTransactionProxy">
        <property name="proxyInterfaces">
        <property name="target">
          <ref bean="internalContentService" />
        </property>
        <property name="transactionAttributes">
          <props>
            <prop key="*">PROPAGATION_REQUIRED</prop>
          </props>
        </property>

Well, all configuration tasks are done. Now you can access the content of the repository using POJOs. Thanks to Spring Modules, you are spared a lot of code for initializing and managing the repository. The remaining tasks are simply like working with Hibernate entities: everything is done in Java code (of course, you need a little advanced knowledge of the repository to customize data types etc. for your needs). Now that the service is created, it is time to write a little unit test to make sure all configurations are set properly :)

  public class ContentServiceTest {
    private ContentService<Content> contentService;

    public void testSave() {
      File file = createFile();
      // (save call elided)
      file = (File) contentService.findByPath("/nextss");
      Assert.assertThat(file.getPath(), is("/nextss"));
      Assert.assertThat(file.getFileType(), is(FileType.UNDEFINED));
    }

    private File createFile() {
      File file = new File();
      file.setTitle("Test Exam");
      file.setContent("Hello world".getBytes());
      return file;
    }
  }

I am happy when the unit test runs well :). I hope this complements Jackrabbit OCM and Spring Modules OCM nicely. Any comments from you are welcome.


Tuesday, June 17, 2008

Who hides in your Spring factory?

Using schema-based configuration in the Spring framework is powerful, yet leads to less transparent configuration. Here I introduce a single class that allows you to unveil all the details of your Spring context.

Schema based configuration

The Spring framework has a very interesting and powerful feature: the ability to use custom namespaces within an ordinary Spring configuration file.

Using that functionality, it's possible to create custom XML elements (by providing the necessary XSD schema to Spring) that will be parsed and used to declare the specific beans that correspond to the custom tags from the schema.

That feature is really cool since, on the one hand, it allows you to create your own DSL (domain-specific language) that is plugged into the usual Spring declaration. Using a DSL instead of a plain Spring declaration is pretty convenient, since it gives you clean, compact and, more importantly, domain-specific markup instead of generic markup.

In addition, by introducing support for such functionality, Spring encourages vendors of third-party tools and libraries to plug them into Spring as components, with their functionality exposed via elements in a custom namespace provided by the vendor.

In general, support for custom namespaces in the Spring context brings a higher level of abstraction and increases the overall productivity of developers.


A higher level of abstraction, as usual, adds more complexity and leaves many things under the hood. As long as you use a namespace you've implemented yourself, you know which beans will be created in the Spring context when you use it (well, at least for some reasonable period of time after developing it). But what if you simply use tags provided by someone else? First, you get that library somehow, and the description of the tag promises that if you use it you can throw away all your old configuration for persistence, Hibernate etc., since at the moment you add that custom tag to your context it scans your brain and does everything much better than you can even imagine... Sure enough, you think it could be great and use it...

And as long as everything works fine, no one cares what is under that tag (it scans the brain, after all!). But if something goes wrong, well, with an ordinary Spring config you have a chance to look at the configuration and find the source of the problem. Not so with custom tags: all details are hidden from the outside world.

Seriously, using custom namespaces in a Spring config has this drawback: in general, you don't know what actually happens when you use such tags.


We at SoftAMIS ran into that problem some time ago while trying to figure out some quite subtle configuration problems. To avoid it in the future, we've created a small utility that dumps the internals of a given Spring context (by examining the registered bean definitions). Of course, it was not practical to invent some new format for such a dump, so the good old Spring configuration format is used; however, it unwraps all custom tags into their internal representation.

It is similar to the approach from the Spring reference :) If you have a declaration like this in your context:


<util:map id="testMap">
  <entry key="key" value="value"/>
</util:map>

<util:property-path path="testMap.values"/>

in corresponding dumped context you'll get:

<bean name="testMap" class="org.springframework.beans.factory.config.MapFactoryBean">
  <property name="sourceMap">
    <map>
      <entry key="key" value="value"/>
    </map>
  </property>
</bean>

<bean class="org.springframework.beans.factory.config.PropertyPathFactoryBean" name="testMap.values" p:propertyPath="values" p:targetBeanName="testMap"/>

How to use

To dump a Spring context, I've written a custom BeanFactoryPostProcessor. It is invoked by Spring automatically (if the factory is created via an application context), simply inspects the internal bean definitions registered in the context, and generates the appropriate XML for them.

Therefore, all you need to do to obtain a dump of a Spring factory is add the following declaration into the context you'd like to dump:

<bean class="org.softamis.tools4spring.dump.DumpBeanFactoryPostProcessor" p:generateSchemaBasedContext="true" p:outputLocation="z:\context.xml"/>

It's possible to specify the location of the file the dump should be written to, and also whether the dump should be based on the DTD or the XSD. In the latter case, applicable attributes will be written using the p: namespace.

Of course, since the class works at the context level, it will happily dump a Spring factory created from several configuration files.

License and download

To download the source code for the class, please use this link. The class is licensed under the Apache License, so it can be used both in open source and commercial applications.

I hope this small utility will be helpful for you and will save you time in some tight situation.

Monday, June 16, 2008

Spring & LDAP

Just as you have JDBC/Hibernate/iBatis templates in Spring, there is also an LdapTemplate. You can download the Spring LDAP library from the project site. I like this template approach simply because it lets us avoid common pitfalls such as not cleaning up resources after using an API (in JDBC it's the connection, statement and result set). Why bother when the template can do this for you? The same holds true for LDAP queries.
For this example I had the following setup:

  • Apache Directory Server 1.5.2. I decided to use the sample directory data.
  • Installed the Apache Directory Studio eclipse plugin.
To confirm your setup, open Eclipse and go to the LDAP perspective. Create a new connection with the following information:
  • hostname - localhost
  • port - 10389
  • Bind DN or user - uid=admin,ou=system
  • password - secret (this is the default password for apache ds)
This should let you into the directory. Under dc=example,dc=com I added two organizations (asia and americas).
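As an aside, distinguished names like the bind DN above can be parsed with the JDK's javax.naming.ldap.LdapName, a standalone snippet unrelated to Spring LDAP itself:

```java
import javax.naming.ldap.LdapName;

public class DnDemo {
    public static void main(String[] args) throws Exception {
        // The bind DN used above for the Apache DS admin account
        LdapName dn = new LdapName("uid=admin,ou=system");
        // JNDI orders components most-significant first,
        // i.e. right-to-left relative to the string form
        System.out.println(dn.get(0));
        System.out.println(dn.get(1));
    }
}
```

This is convenient for building or validating DNs programmatically instead of concatenating strings.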

Now for the Spring stuff. 

package trial;

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.ldap.core.LdapTemplate;
import org.springframework.stereotype.Service;

@Service
public class LDAPSampleImpl implements LDAPSample {

    @Autowired
    private LdapTemplate ldapTemplate;

    public List getOrgNames() {
        return ldapTemplate.list("");
    }
}

The spring XML file looks like:

<?xml version="1.0" encoding="UTF-8"?>

<beans xmlns=""

<context:annotation-config />
<context:component-scan base-package="trial" />

<bean id="ldapContextSource"
    <property name="url" value="ldap://localhost:10389" />
    <property name="base" value="dc=example,dc=com" />
    <property name="userDn" value="uid=admin,ou=system" />
    <property name="password" value="secret" />
</bean>

<bean id="ldapTemplate" class="org.springframework.ldap.core.LdapTemplate">
    <constructor-arg ref="ldapContextSource" />
</bean>

Everything above is self-explanatory. Now for the test case to execute all of this.

package trial;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:spring-context.xml" })
public class DriverTestCase {

    @Autowired
    private LDAPSample ldap;

    @Test
    public void testGreeting() {
        System.out.println(">> " + ldap.getOrgNames());
    }
}

Running this unit test results in the following output:
>> [ou=asia, ou=americas]

Wednesday, June 4, 2008

Send E-mail Using Spring and JavaMail


This brief tutorial will show how to send e-mail using Spring and JavaMail. JavaMail can handle e-mail storage as well, but here we're just worrying about sending e-mail.

I happen to be using Spring 2.5 but this ought to work for earlier versions of Spring as well (at least Spring 2.0 I think).

Let's jump right in. You can configure your JavaMail session either in Spring itself or with JNDI. We'll look at both alternatives.

Alternative 1: Configuring JavaMail with Spring

You may be operating in an environment where you don't have a JNDI enterprise naming context (ENC) available. Or you may have some reason not to use it even if you do have a JNDI ENC. For example, I've written a simple application monitor, and I'll be adding e-mail alerting shortly. This is a standalone app and so there's no JNDI ENC. No problem; I can just configure JavaMail in the Spring application context configuration as shown below.

<!-- Mail service -->
<bean id="mailSender" class="org.springframework.mail.javamail.JavaMailSenderImpl">
    <property name="host" value=""/>
    <property name="port" value="25"/>
    <property name="username" value="yourusername"/>
    <property name="password" value="yourpassword"/>
    <property name="javaMailProperties">
        <props>
            <!-- Use SMTP-AUTH to authenticate to SMTP server -->
            <prop key="mail.smtp.auth">true</prop>
            <!-- Use TLS to encrypt communication with SMTP server -->
            <prop key="mail.smtp.starttls.enable">true</prop>
        </props>
    </property>
</bean>

This bean is, as its name suggests, a mail sender. It is basically a wrapper around JavaMail SMTP, and the configuration reflects that. In the example I'm showing how you would enable SMTP-AUTH (supports authentication to the SMTP server) and TLS (supports message encryption), assuming your SMTP server has those capabilities. For more information see my article SMTP and SMTP-AUTH.

IMPORTANT: You will need to inject mailSender into your mail-sending service bean.

So that's how to configure JavaMail from Spring. Now here's how to do the same thing with JNDI, which you may want to do if you're running in an environment with a JNDI ENC (like an app server or a servlet container).

Alternative 2: Configuring JavaMail with JNDI

Server JNDI configuration (using Tomcat 6 as an example)

First you will need to expose a JavaMail session factory through JNDI in your server environment. This is environment-dependent, but let's look at an example.

Say you're using Tomcat 6. There are a couple things you must do. First, move mail.jar and activation.jar to your tomcat/lib directory. I say "move" rather than "copy" because you will get an odd error if you leave the two JARs in your application classpath. The error is

    Cannot convert value of type [javax.mail.Session] to required type
    [javax.mail.Session] for property 'session': no matching editors
    or conversion strategy found

Second, define your Tomcat JNDI configuration, which might look like this (e.g. in your context.xml file):

Code listing: /META-INF/context.xml

<?xml version="1.0" encoding="UTF-8"?>

<Context path="/myapp" docBase="myapp" debug="5" crossContext="false">

    <!-- JavaMail session factory -->
    <Resource name="mail/Session"

As with the non-JNDI example, I'm configuring for SMTP-AUTH and TLS. If you are using SMTP-AUTH (authenticated SMTP sessions, which you activate using mail.smtp.auth="true"), then you will need to specify the username and password twice, as shown above. Also, if your SMTP server supports it, you can tell JavaMail to encrypt sessions using TLS by setting mail.smtp.starttls.enable=true.

The above discussion applies only to Tomcat 6 (see Apache Tomcat 6.0 JNDI Resources HOWTO for detailed instructions); you'll need to consult your server docs to expose a JavaMail session factory through JNDI in your environment.
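As a hedged illustration of the Tomcat 6 side (the hostname and credentials are placeholders, and the attribute names follow the standard Tomcat JavaMail resource factory, so verify them against the Tomcat docs), a complete Resource entry might look like:

```xml
<Resource name="mail/Session"
          auth="Container"
          type="javax.mail.Session"
          mail.smtp.host="smtp.example.com"
          mail.smtp.port="25"
          mail.smtp.auth="true"
          mail.smtp.starttls.enable="true"
          mail.smtp.user="yourusername"
          password="yourpassword"/>
```

Note how the username appears via mail.smtp.user while the password uses the factory's password attribute, which is the "twice" mentioned above when SMTP-AUTH is enabled.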

Spring configuration

We still need to create a mail sender, but the configuration is simpler since we did all the heavy lifting in context.xml (or whatever, depending on your server environment). So for the Spring application context, all we need is:

<bean id="mailSender" class="org.springframework.mail.javamail.JavaMailSenderImpl">
    <property name="session" ref="mailSession"/>
</bean>
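The mailSession bean referenced above is the JNDI lookup itself. One way to define it, assuming the JNDI name matches the Resource we configured in Tomcat, is with Spring's JndiObjectFactoryBean:

```xml
<!-- Looks up the JavaMail Session the container bound at mail/Session.
     resourceRef="true" prepends java:comp/env/ to the JNDI name. -->
<bean id="mailSession" class="org.springframework.jndi.JndiObjectFactoryBean">
    <property name="jndiName" value="mail/Session"/>
    <property name="resourceRef" value="true"/>
</bean>
```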

IMPORTANT: As before, you will need to inject mailSender into your mail-sending service bean.

So that takes care of configuring the mail sender, and also the JavaMail session factory if you are using JNDI. Let's visit one more topic before we dive into the code itself.

Creating an E-mail Template (Optional)

Sometimes the e-mail you want to send fits inside a standard template (e.g., it always has the same sender, the same recipient, or the same subject). You can define such a template using Spring. Here's an example:

Code listing: Spring app context config file

<!-- Mail message -->
<bean id="mailMessage" class="org.springframework.mail.SimpleMailMessage">
    <property name="from">
        <value><![CDATA[Simple Application Monitor <>]]></value>
    </property>
    <property name="to">
        <value><![CDATA[System Administrator <>]]></value>
    </property>
    <property name="subject" value="SAM Alert"/>
</bean>

You can include as many e-mail templates as you like in a Spring app context configuration, or none at all if you don't need a template. Here I've defined a template that specifies a "from" field, a "to" field, and a "subject" field. It doesn't specify the date or the body. That template works, say, for an application monitoring system, but it wouldn't work for an e-commerce site's order confirmation e-mail, which would need a variable "to" field.

IMPORTANT: I didn't show it above, but you will need to inject any e-mail templates (i.e. mail messages) you create into your mail-sending service bean. Otherwise the service bean has no way to use the template.
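Such wiring might look like the following sketch, where the MonitoringService class and its property names are assumptions for illustration, not part of the example above:

```xml
<!-- Hypothetical mail-sending service bean; the class and property
     names are illustrative. It receives both the mail sender and the
     message template defined earlier. -->
<bean id="monitoringService" class="com.example.MonitoringService">
    <property name="mailSender" ref="mailSender"/>
    <property name="mailMessage" ref="mailMessage"/>
</bean>
```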

So that's that. Time for the Java code that actually sends the e-mail.

How to Send the E-mail from Your Service Bean

It turns out that the coding part is much simpler than the configuration (not that the config was too bad). Let's suppose for the sake of example that we want to send an e-mail based on the template that we defined above, and that you've injected that template into your service bean as mailMessage. Then here's the service bean code that allows you to use the template to send an e-mail:

Code listing: Your service bean

SimpleMailMessage message = new SimpleMailMessage(mailMessage);
message.setSentDate(new Date());
message.setText("Blah blah blah...");
mailSender.send(message);

You can see we're creating a new message based on the mailMessage e-mail template, filling in the date and body, and delivering it with the mailSender that we injected into said service bean.

And that's it! Not too painful, right?

by Willie Wheeler

Monday, June 2, 2008

Spring Batch - Hello World

This is an introductory tutorial to Spring Batch. It does not aim to provide a complete guide to the framework but rather to facilitate the first contact. Spring Batch is quite rich in functionalities, and this is basically how I started learning it. Keep in mind that we will only be scratching the surface.

Before we start

All the examples will have the lofty task of printing "Hello World!", though in different ways. They were developed with Spring Batch 1.0. I'll provide a Maven 2 project and I'll run the examples with Maven, but Maven is of course not a requirement for working with Spring Batch.

Spring Batch in 2 Words

Fortunately, Spring Batch model objects have self-explanatory names. Let's try to enumerate the most important and to link them together:

A batch Job is composed of one or more Steps. A JobInstance represents a given Job, parameterized with a set of typed properties called JobParameters. Each run of a JobInstance is a JobExecution. Imagine a job that reads entries from a database, generates an XML representation of them, and then does some clean-up: we have a Job composed of two steps, reading/writing and clean-up. If we parameterize this job by the date of the generated data, then our Friday the 13th job is a JobInstance, and each time we run this instance (after a failure, for instance) is a JobExecution. This model gives great flexibility regarding how jobs are launched and run. That naturally brings us to launching jobs with their job parameters, which is the responsibility of the JobLauncher. Finally, various objects in the framework require a JobRepository to store runtime information related to the batch execution. The Spring Batch domain model is in fact much more elaborate, but this will suffice for our purpose.
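The relationships above can be sketched in plain Java. These are illustrative types only, not Spring Batch's actual classes, and they use modern Java records purely for brevity:

```java
import java.util.List;
import java.util.Map;

public class BatchModel {
    // A Job has a name and an ordered list of step names.
    record Job(String name, List<String> steps) {}
    // A JobInstance is a Job plus its JobParameters.
    record JobInstance(Job job, Map<String, String> params) {}
    // A JobExecution is one run of a JobInstance.
    record JobExecution(JobInstance instance, int runNumber) {}

    public static void main(String[] args) {
        Job job = new Job("extract", List.of("readWrite", "cleanUp"));
        JobInstance friday13th =
                new JobInstance(job, Map.of("schedule.date", "2008-06-13"));
        JobExecution firstTry = new JobExecution(friday13th, 1);
        JobExecution retry = new JobExecution(friday13th, 2); // same instance, new execution
        System.out.println(retry.instance() == firstTry.instance()); // prints "true"
    }
}
```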

Well, it took more than 2 words and I feel compelled to make a joke about it, but I won't. So let's move to the next section.

Common Objects

For each job, we will use a separate XML context definition file. However, there are a number of common objects that we will need recurrently. I will group them in an applicationContext.xml which will be imported from within job definitions. Let's go through these common objects:


JobLauncher

JobLaunchers are responsible for starting a Job with given job parameters. The provided implementation, SimpleJobLauncher, relies on a TaskExecutor to launch the jobs. If no specific TaskExecutor is set, a SyncTaskExecutor is used.

JobRepository

We will use the SimpleJobRepository implementation, which requires a set of execution DAOs to store its information.

JobInstanceDao, JobExecutionDao, StepExecutionDao

These data access objects are used by SimpleJobRepository to store execution-related information. Two sets of implementations are provided by Spring Batch: Map-based (in-memory) and JDBC-based. In a real application the JDBC variants are more suitable, but we will use the simpler in-memory alternative in this example.

Here's our applicationContext.xml:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-2.0.xsd">

    <bean id="jobLauncher"
          class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
        <property name="jobRepository" ref="jobRepository"/>
    </bean>

    <bean id="jobRepository"
          class="org.springframework.batch.core.repository.support.SimpleJobRepository">
        <constructor-arg>
            <bean class="org.springframework.batch.core.repository.dao.MapJobInstanceDao"/>
        </constructor-arg>
        <constructor-arg>
            <bean class="org.springframework.batch.core.repository.dao.MapJobExecutionDao"/>
        </constructor-arg>
        <constructor-arg>
            <bean class="org.springframework.batch.core.repository.dao.MapStepExecutionDao"/>
        </constructor-arg>
    </bean>

</beans>

Hello World with Tasklets

A tasklet is an object containing any custom logic to be executed as part of a job. Tasklets are built by implementing the Tasklet interface. Let's implement a tasklet that simply prints a message:

import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.ExitStatus;

public class PrintTasklet implements Tasklet {

    private String message;

    public void setMessage(String message) {
        this.message = message;
    }

    public ExitStatus execute() throws Exception {
        System.out.print(message);
        return ExitStatus.FINISHED;
    }
}
Notice that the execute method returns an ExitStatus to indicate the status of the execution of the tasklet.

We will define our first job now in a simpleJob.xml application context. We will use the SimpleJob implementation, which executes all of its steps sequentially. In order to plug a tasklet into a job, we need a TaskletStep. I also added an abstract bean definition for tasklet steps in order to simplify the configuration:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans-2.0.xsd">

    <import resource="applicationContext.xml"/>

    <bean id="hello" class="helloworld.PrintTasklet">
        <property name="message" value="Hello"/>
    </bean>

    <bean id="space" class="helloworld.PrintTasklet">
        <property name="message" value=" "/>
    </bean>

    <bean id="world" class="helloworld.PrintTasklet">
        <property name="message" value="World!"/>
    </bean>

    <bean id="taskletStep" abstract="true"
          class="org.springframework.batch.core.step.tasklet.TaskletStep">
        <property name="jobRepository" ref="jobRepository"/>
    </bean>

    <bean id="simpleJob" class="org.springframework.batch.core.job.SimpleJob">
        <property name="name" value="simpleJob"/>
        <property name="steps">
            <list>
                <bean parent="taskletStep">
                    <property name="tasklet" ref="hello"/>
                </bean>
                <bean parent="taskletStep">
                    <property name="tasklet" ref="space"/>
                </bean>
                <bean parent="taskletStep">
                    <property name="tasklet" ref="world"/>
                </bean>
            </list>
        </property>
        <property name="jobRepository" ref="jobRepository"/>
    </bean>

</beans>

Running the Job

Now we need something to kick-start the execution of our jobs. Spring Batch provides a convenient class to achieve that from the command line: CommandLineJobRunner. In its simplest form this class takes 2 arguments: the XML application context containing the job to launch and the bean id of that job. It naturally requires a JobLauncher to be configured in the application context. Here's how to launch the job with Maven. Of course, it can also be run with the java command directly (you need to specify the class path then):

mvn exec:java
-Dexec.mainClass="org.springframework.batch.core.launch.support.CommandLineJobRunner"
-Dexec.args="simpleJob.xml simpleJob"

Hopefully, your efforts will be rewarded with a "Hello World!" printed on the console.

The source code can be downloaded here.

by Tareq Abed Rabbo