Monday, December 25, 2006

2006 : A retrospective

2006 is not over yet, even by my relatively lazy once-a-week blogging standards, but I am off on a vacation with the kids and some relatives to Disneyland (aka The Happiest Place on Earth), followed by a short visit to Las Vegas (aka Sin City). Personally, I would be happier just staying at home and catching up on some stuff, but then, you gotta give the family what they want sometimes. You know, payback for all the nights and weekends they suffer in silence while you pound away at fixing a bug or building the next great feature. So anyway, the upshot is that this is going to be my last post for this year. Not that I expect the blogosphere to react by going into withdrawal because of this, but I figured that this would be a good time to look back and see what I have accomplished this year with the blog.

I started the blog some time ago, but only began writing actively at the beginning of this year. At first it was once every two weeks, but then I started thinking of more things to write about, so it became weekly. Most of my posts explain a technology I investigated or a problem I solved using a technology that was new (to me, anyway), so it is hard to do this more frequently than once a week. That said, some of the raw material for my posts was generated during a single hour-long train ride, although it would take more time to refine the solution into a blog post.

Many of my posts involve Java, which is not surprising, since I am a Java/J2EE developer. I have been doing Java for about six years now, starting with applets for fun when I was an Informix DBA. Late last year, I set out to learn Python, having convinced myself that it was indeed easier to write and maintain than Perl, my scripting language of choice at the time. While I use Python only for scripting, I have found it quite useful, and I am happy to say that I haven't missed Perl even once this past year.

I was fortunate that Hibernate and Spring became the ORM and MVC framework standards, respectively, at my last job, so I had the opportunity to learn and use them at work. Needless to say, I loved both frameworks, although I still find it hard to force pre-Hibernate data models into the Hibernate straitjacket. However, compared to Struts and WebWork, the other two MVC frameworks I used before Spring, Spring continues to make hard things easy and easy things trivial. My current job uses a custom JDBC framework for data access, but we are gradually introducing Spring and seeing benefits in terms of cleaner code and increased developer productivity.

I also set out to learn Ruby, having been quite impressed by the ease with which one can build database-driven web applications using Ruby on Rails (RoR). Despite my best intentions, however, I continued to be unimpressed by Ruby the scripting language, comparing it unfavorably with my current favorite, Python. While RoR is impressive, I kept thinking that a Java-based framework built with Spring and Hibernate would do the job just as nicely. I looked at Trails for a while, but I did not like the purely annotation-based approach it advocated, and then I looked briefly at Sails, which seems to use IoC and ActiveRecord with Hibernate. I haven't had a chance to look further, but I am convinced that there is a Rails lookalike out there, based on Hibernate and Spring, that's just waiting to be discovered. I even started developing one, using complex Ant scripts and Velocity to generate the DAOs and Controllers needed for the scaffolding, but I gave up midway for lack of time.

Along the way, I also looked briefly at Tapestry, even developing a little web front end to the Drools database repository implementation. What I liked about Tapestry was the ease with which one can plug pre-built or customized components into pure HTML pages. What I would like even more is to integrate it with Spring's MVC, thereby using Spring's IoC instead of the built-in HiveMind, and getting clean URLs. Maybe it's possible now, since I haven't looked at Tapestry in a while.

I briefly tried to learn C++ (I moved directly from C to Java) by reading Bruce Eckel's "Thinking in C++" online e-books. I thought that there might be an immediate need for these skills, but it turns out I was wrong, so there is less urgency to pick them up now. But this is definitely on my to-do list for next year. What I got from it is that C++ is definitely not an easy language to learn. The exercises may look easy, but there are always hidden gotchas waiting to trip you up. My advice for anyone wishing to really learn C++ from this book is to do the exercises and run the code; you will learn a lot from it. You may also consider the CDT (C/C++ Development Tooling) plugin if you are an Eclipse user.

Other little things I looked at were the DWR and GWT Javascript frameworks. Unfortunately, I was unable to convince our resident Javascript gurus that these were fitting adversaries for Prototype, our current Javascript framework of choice. I also settled on Tomcat and JBoss Application Server as my web containers of choice. And I began to use Maven2 pretty heavily for my own projects, both inside and outside work; while the learning curve is, and continues to be, fairly steep, the benefits of a standardized project structure and prebuilt tasks outweigh the annoyances.

Actually, going back to my evolution as a Java programmer, the last four years at my previous job turned me from a backend database/Java developer into a web developer. At my current job, I am gradually morphing into a Java search engineer, using Lucene and Spring to develop searcher modules, and various Apache Commons libraries to parse text for indexing. The cool thing about Lucene programming, and I think search programming in general, is that it's a relatively new field, so there is a lot of room for doing really innovative stuff. I hope to do some of it next year, along with using the web development skills that I already have.

Overall, I think 2006 was quite a good year for me. I learned quite a few new things and had lots of fun at work. Hopefully, 2007 will be as good. Wish me luck, and have a very Happy New Year!

Sunday, December 17, 2006

Spring JdbcTemplate with Autocommit

I recently ran across a situation where I was using Spring's JdbcTemplate and trying to insert a record into a table, then turn around and read from it some data to use for a subsequent insert into another table. Something like this:

while (someCondition) {
  jdbcTemplate.update("insert into table1...");
  ...
}
List<Map<String, Object>> rows = jdbcTemplate.queryForList("select col1, col2 from table1 where...");
for (Map<String, Object> row : rows) {
  jdbcTemplate.update("insert into table2...", new Object[] {
    row.get("col1"), row.get("col2"), ...
  });
}

Inexplicably (when I first started seeing the problem), there would be no rows in table2. Digging deeper, I found that this was because no rows were being returned by the queryForList call, so the code was not entering the for loop at all.

The reason for this strange behavior appears to be as follows. Since JdbcTemplate is configured with a DataSource, which in turn manages a pool of Connections, there is no guarantee that the same Connection object will be handed to the JdbcTemplate on subsequent calls. So the first update and the queryForList call may not run against the same Connection, and the row that was INSERTed (but not yet committed) may not be visible to the SELECT. At least, that's true for Oracle, where the default transaction isolation level is READ COMMITTED. I cannot speak for other databases, because the only other database I have used Spring with so far is MySQL with MyISAM tables, which do not support transactions.

Since the first update and the second queryForList/update are really two distinct blocks of work, the correct approach is either to restrict the JdbcTemplate to a SingleConnectionDataSource, or to put the two operations in their own TransactionTemplate callbacks. Both approaches would have required me to change some code, however. A codeless option would have been to turn autocommit on, but I was using a pre-built reference to the enterprise DataSource, so that was not something I could do either. Finally I hit upon the idea of using AOP to intercept update() calls on JdbcTemplate and commit them on completion. Obviously, this is not ideal when multiple JDBC calls should be grouped into a single transaction, but the concepts used here could be extended to cover that scenario as well, although we would then be intercepting DAO methods instead of JdbcTemplate methods.

First, here is the interceptor. It uses a TransactionTemplate callback to wrap all specified (in our case, update()) methods. The "BEGIN TRAN", "COMMIT TRAN" and "ROLLBACK TRAN" debug calls indicate the transaction boundaries for the update() call.

public class AutocommitInterceptor implements MethodInterceptor {

  private static final Logger LOGGER = Logger.getLogger(AutocommitInterceptor.class);

  private List<String> autoCommitableMethods;
  private TransactionTemplate transactionTemplate;

  public AutocommitInterceptor() {
    super();
  }

  public void setAutoCommitableMethods(List<String> autoCommitableMethods) {
    this.autoCommitableMethods = autoCommitableMethods;
  }

  public void setTransactionTemplate(TransactionTemplate transactionTemplate) {
    this.transactionTemplate = transactionTemplate;
  }

  public Object invoke(final MethodInvocation invocation) throws Throwable {
    if (isAutoCommitableMethod(invocation.getMethod().getName())) {
      return transactionTemplate.execute(new TransactionCallback() {
        public Object doInTransaction(TransactionStatus transactionStatus) {
          LOGGER.debug("BEGIN TRAN");
          try {
            Object retVal = invocation.proceed();
            LOGGER.debug("COMMIT TRAN");
            return retVal;
          } catch (Throwable t) {
            LOGGER.error("A runtime exception has occurred:", t);
            LOGGER.debug("ROLLBACK TRAN");
            throw new RuntimeException(t);
          }
        }
      });
    } else {
      return invocation.proceed();
    }
  }

  private boolean isAutoCommitableMethod(String methodName) {
    return autoCommitableMethods.contains(methodName);
  }
}

We configure a bean to be a Proxy for JdbcTemplate. Since we are proxying a class (not an interface) we need to have the CGLIB JAR in our classpath. A reference to this proxy can then be passed into the beans wherever the JdbcTemplate reference was being passed in. Here is the Spring configuration.

  <!-- The original JdbcTemplate definition -->
  <bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <property name="dataSource" ref="dataSource" />
  </bean>

  <!-- Definition for the autocommit version of JdbcTemplate -->
  <bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource" />
  </bean>

  <bean id="transactionTemplate" class="org.springframework.transaction.support.TransactionTemplate">
    <property name="transactionManager" ref="transactionManager" />
    <property name="propagationBehaviorName" value="PROPAGATION_REQUIRED" />
  </bean>

  <bean id="autocommitInterceptor" class="com.mycompany.interceptors.AutocommitInterceptor">
    <property name="autoCommitableMethods">
      <list>
        <value>update</value>
      </list>
    </property>
    <property name="transactionTemplate" ref="transactionTemplate" />
  </bean>

  <bean id="autoCommittingJdbcTemplate" class="org.springframework.aop.framework.ProxyFactoryBean">
    <property name="target" ref="jdbcTemplate" />
    <property name="proxyTargetClass" value="true" />
    <property name="interceptorNames">
      <list><value>autocommitInterceptor</value></list>
    </property>
  </bean>

So there you have it. With just a single additional interceptor class and a few lines of configuration, all the update() calls on JdbcTemplate become transactional, fixing the problem I was seeing. Database purists may argue that this approach is too simple-minded. I agree that splitting the application into two distinct transaction blocks may be a better idea in terms of performance, and I may even end up using that approach, but for a lot of cases the autocommit behavior that is the JDBC default is quite acceptable.

Saturday, December 09, 2006

Mock objects for Javamail Unit tests

I recently got to use Javamail for the very first time. Javamail provides the Java application developer with a convenient abstraction to send and receive email. My application would use email as an asynchronous remote invocation mechanism. It would need to read some mail from a specified IMAP mailbox, parse and process it, then send a confirmation email that the requested operation succeeded (or failed). However, I noticed that applications of this nature pose certain problems during development.

  • You need to be connected to the SMTP, IMAP or POP servers that your application talks to. This is not a tall order most of the time, except when you are developing at a location outside the company firewall, or are not connected to the Internet at all.
  • You may be working with database dumps of real email addresses, so you may end up inadvertently sending mail to real people during development, something that usually looks bad for you and your company. You can usually get around this by post-processing the email addresses into fake ones on the dev server, or by restricting outbound mail to within a certain network.

Both problems can be addressed by a mock framework that would allow Javamail to "send" messages to an in-memory data structure which can be queried by other Javamail components as they "receive" messages from it. That way, you can develop and test the code in offline mode, and also write round-trip integration tests without actually connecting to any real SMTP or IMAP servers. I searched around for something like this for a while (see resources), but could not find one that would meet all my requirements, so I decided to build one, which is described here.

Extension point

Central to Javamail is the Session object. It is a factory that returns an implementation of Transport (for SMTP), or Store and Folder (for IMAP and POP). It is final, so subclassing is not a viable approach to mocking it. What we can do is override the implementations it returns by specifying property file overrides. The property file overrides should be located in a META-INF directory in your classpath. Since mocks are only used during testing, I decided to locate them under the src/test/resources/META-INF directory of my Maven app.

# src/test/resources/META-INF/javamail.default.providers
# Specifies the mock implementations that would be returned by Session
protocol=smtp; type=transport; class=com.mycompany.smail.mocks.MockTransport; vendor=Smail, Inc.;
protocol=imap; type=store; class=com.mycompany.smail.mocks.MockImapStore; vendor=Smail, Inc.;

# src/test/resources/META-INF/javamail.default.address.map
# RFC-822 docs need to use the SMTP protocol
rfc822=smtp

Since the src/test/resources directory is in the test classpath, the following calls will return our MockTransport and MockImapStore instead of the default Javamail implementations when invoked from JUnit tests.

  // SMTP
  Session session = Session.getInstance(new Properties());
  Transport transport = session.getTransport("smtp"); // returns MockTransport

  // IMAP
  Store store = session.getStore("imap");     // returns MockImapStore
  Folder folder = store.getFolder("INBOX");   // returns the only MockFolder

The Mock Message Store

For the mock message store which the MockTransport would write to and the MockImapStore would read from, I envisioned a Map of email address to List of MimeMessages. The email address would be the owner of the mailbox when reading (or the recipient address when writing a MimeMessage object). It would be a singleton with static methods which would be called by the MockTransport and MockImapStore and would have some dump methods to allow the developer to dump the object within the JUnit test. It is shown below:

// Messages.java
public class Messages {

  public static Map<String,List<MimeMessage>> messages = new HashMap<String,List<MimeMessage>>();

  public static void addMessage(String toEmail, MimeMessage message) {
    List<MimeMessage> messagesForUser = messages.get(toEmail);
    if (messagesForUser == null) {
      messagesForUser = new ArrayList<MimeMessage>();
    }
    messagesForUser.add(message);
    messages.put(toEmail, messagesForUser);
  }

  public static List<MimeMessage> getMessages(String toEmail) {
    List<MimeMessage> messagesForUser = messages.get(toEmail);
    if (messagesForUser == null) {
      return new ArrayList<MimeMessage>();
    } else {
      return messagesForUser;
    }
  }

  public static void reset() throws Exception {
    messages.clear();
  }

  /**
   * Dumps the contents of the Messages data structure for the current run.
   * @return the string representation of the Messages structure.
   * @throws Exception if one is thrown.
   */
  public static String dumpAllMailboxes() throws Exception {
    StringBuilder builder = new StringBuilder();
    builder.append("{\n");
    for (String email : messages.keySet()) {
      builder.append(dumpMailbox(email)).append(",\n");
    }
    builder.append("}\n");
    return builder.toString();
  }

  /**
   * Dumps the contents of a single Mailbox.
   * @param ownerEmail the owner of the mailbox.
   * @return the string representation of the Mailbox.
   * @throws Exception if one is thrown.
   */
  public static String dumpMailbox(String ownerEmail) throws Exception {
    StringBuilder mailboxBuilder = new StringBuilder();
    List<MimeMessage> messagesForThisUser = messages.get(ownerEmail);
    mailboxBuilder.append(ownerEmail).append(":[\n");
    for (MimeMessage message : messagesForThisUser) {
      mailboxBuilder.append(stringifyMimeMessage(message));
    }
    mailboxBuilder.append("],\n");
    return mailboxBuilder.toString();
  }

  /**
   * Custom stringification method for a given MimeMessage object. This is
   * incomplete, more details can be added, but this is all I needed.
   * @param message the MimeMessage to stringify.
   * @return the stringified MimeMessage.
   */
  public static String stringifyMimeMessage(MimeMessage message) throws Exception {
    StringBuilder messageBuilder = new StringBuilder();
    messageBuilder.append("From:").append(message.getFrom()[0].toString()).append("\n");
    messageBuilder.append("To:").append(message.getRecipients(RecipientType.TO)[0].toString()).append("\n");
    for (Enumeration<Header> e = message.getAllHeaders(); e.hasMoreElements();) {
      Header header = e.nextElement();
      messageBuilder.append("Header:").append(header.getName()).append("=").append(header.getValue()).append("\n");
    }
    messageBuilder.append("Subject:").append(message.getSubject()).append("\n");
    messageBuilder.append(message.getContent() == null ? "No content" : message.getContent().toString());
    return messageBuilder.toString();
  }
}

Mocking SMTP

Only Transport needs to be mocked for SMTP. This implementation is almost totally copied from Bill Dudney's blog entry (referenced in resources), replacing System.out.println() calls with LOGGER.debug() calls.

// MockTransport.java
public class MockTransport extends Transport {

  private static final Logger LOGGER = Logger.getLogger(MockTransport.class);

  public MockTransport(Session session, URLName urlName) {
    super(session, urlName);
  }

  @Override
  public void connect() throws MessagingException {
    LOGGER.info("Connecting to MockTransport:connect()");
  }

  @Override
  public void connect(String host, int port, String username, String password) throws MessagingException {
    LOGGER.info("Connecting to MockTransport:connect(String " + host + ", int " + port + ", String " + username + ", String " + password + ")");
  }

  @Override
  public void connect(String host, String username, String password) throws MessagingException {
    LOGGER.info("Connecting to MockTransport:connect(String " + host + ", String " + username + ", String " + password + ")");
  }

  @Override
  public void sendMessage(Message message, Address[] addresses) throws MessagingException {
    LOGGER.info("Sending message '" + message.getSubject() + "'");
    for (Address address : addresses) {
      Messages.addMessage(address.toString(), (MimeMessage) message);
    }
  }

  @Override
  public void close() {
    LOGGER.info("Closing MockTransport:close()");
  }
}

Mocking IMAP

For IMAP, we need to provide mock implementations for both Store and Folder. Most of the methods in my case are unsupported, but those that are supported work against the Messages object. Here they are:

// MockImapStore.java
public class MockImapStore extends Store {

  private static final Logger LOGGER = Logger.getLogger(MockImapStore.class);

  private String ownerEmail;
  private MockFolder folder;

  public MockImapStore(Session session, URLName urlName) {
    super(session, urlName);
  }

  public String getOwnerEmail() {
    return ownerEmail;
  }

  @Override
  public void connect(String host, int port, String username, String password) {
    this.ownerEmail = buildOwnerEmail(host, username);
    this.folder = new MockFolder(this);
    LOGGER.debug("MockImapStore:connect(String " + host + ", int " + port + ", String " + username + ", String " + password + ")");
  }

  @Override
  public Folder getFolder(String folderName) throws MessagingException {
    return getDefaultFolder();
  }

  @Override
  public Folder getDefaultFolder() throws MessagingException {
    return folder;
  }

  @Override
  public Folder getFolder(URLName urlName) throws MessagingException {
    return getDefaultFolder();
  }

  @Override
  public void close() {
    LOGGER.info("MockImapStore.close()");
  }

  /**
   * Converts user at mail.host.com to user@host.com
   * @param host the hostname of the mail server.
   * @param username the username that is used to connect.
   * @return the email address.
   */
  private String buildOwnerEmail(String host, String username) {
    return StringUtils.join(new String[] {
      username,
      StringUtils.join(ArrayUtils.subarray(host.split("\\."), 1, 3), ".")}, "@");
  }

}

// MockFolder.java
public class MockFolder extends Folder {

  private static final Logger LOGGER = Logger.getLogger(MockFolder.class);

  private Store store;
  private List<MimeMessage> messagesInFolder;

  public MockFolder(Store store) {
    super(store);
    this.store = store;
  }

  @Override
  public void open(int mode) throws MessagingException {
    String owner = ((MockImapStore) store).getOwnerEmail();
    LOGGER.debug("MockFolder.open(int " + mode + "), owner=" + owner);
    this.messagesInFolder = Messages.getMessages(owner);
  }

  @Override
  public Message[] getMessages() throws MessagingException {
    return messagesInFolder.toArray(new Message[0]);
  }

  @Override
  public Message[] expunge() throws MessagingException {
    return new Message[0];
  }

  @Override
  public void close(boolean expunge) throws MessagingException {
    LOGGER.debug("MockFolder.close(boolean " + expunge + ")");
  }

  @Override
  public Message getMessage(int index) throws MessagingException {
    try {
      return messagesInFolder.get(index);
    } catch (IndexOutOfBoundsException e) {
      throw new MessagingException(e.getMessage());
    }
  }

  @Override
  public int getMessageCount() throws MessagingException {
    return messagesInFolder.size();
  }

  @Override
  public int getType() throws MessagingException {
    return Folder.HOLDS_MESSAGES;
  }

  @Override
  public boolean hasNewMessages() throws MessagingException {
    return (messagesInFolder.size() > 0);
  }

  @Override
  public boolean isOpen() {
    return (((MockImapStore) getStore()).getOwnerEmail() != null);
  }

  @Override
  public Folder[] list(String arg0) throws MessagingException {
    return new Folder[] {this};
  }

  @Override
  public void appendMessages(Message[] messages) throws MessagingException {
    this.messagesInFolder.addAll(Arrays.asList((MimeMessage[]) messages));
  }

  @Override
  public boolean exists() throws MessagingException {
    return true;
  }

  @Override
  public Folder getFolder(String folderName) throws MessagingException {
    return this;
  }

  @Override
  public String getFullName() {
    return "INBOX";
  }

  @Override
  public String getName() {
    return "INBOX";
  }

  @Override
  public boolean create(int type) throws MessagingException {
    throw new UnsupportedOperationException("MockFolder.create(int) not supported");
  }

  @Override
  public boolean delete(boolean recurse) throws MessagingException {
    throw new UnsupportedOperationException("MockFolder.delete(boolean) not supported");
  }

  @Override
  public Folder getParent() throws MessagingException {
    throw new UnsupportedOperationException("MockFolder.getParent() not supported");
  }

  @Override
  public Flags getPermanentFlags() {
    throw new UnsupportedOperationException("MockFolder.getPermanentFlags() not supported");
  }

  @Override
  public char getSeparator() throws MessagingException {
    throw new UnsupportedOperationException("MockFolder.getSeparator() not supported");
  }

  @Override
  public boolean renameTo(Folder newFolder) throws MessagingException {
    throw new UnsupportedOperationException("MockFolder.renameTo() not supported");
  }

}

Calling code

Once the mock objects are in place, the calling code runs unchanged against them, as long as the property override files are visible on the classpath. In our case, our JUnit test code (under src/test/java) will automatically use the mock objects, while our production code (under src/main/java) will use the Sun implementations to connect to the real servers.
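To illustrate, a round-trip check against the mocks might look like the sketch below. This is my own illustration, not code from the post: the class name RoundTripSketch, the hostnames, and the email addresses are all hypothetical, and it assumes the META-INF provider overrides plus the Messages, MockTransport, and MockImapStore classes above are on the test classpath.

```java
import java.util.Properties;

import javax.mail.Folder;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Store;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class RoundTripSketch {

  public static void main(String[] args) throws Exception {
    Session session = Session.getInstance(new Properties());

    // "Send" a message; MockTransport routes it into the in-memory Messages map
    MimeMessage message = new MimeMessage(session);
    message.setFrom(new InternetAddress("sender@example.com"));
    message.setRecipient(Message.RecipientType.TO, new InternetAddress("user@example.com"));
    message.setSubject("Hello");
    message.setText("Test body");

    Transport transport = session.getTransport("smtp"); // MockTransport
    transport.connect();
    transport.sendMessage(message, message.getRecipients(Message.RecipientType.TO));
    transport.close();

    // "Receive" it back through MockImapStore; buildOwnerEmail() maps
    // ("mail.example.com", "user") to the mailbox key "user@example.com"
    Store store = session.getStore("imap"); // MockImapStore
    store.connect("mail.example.com", 143, "user", "secret");
    Folder inbox = store.getFolder("INBOX");
    inbox.open(Folder.READ_ONLY);
    System.out.println("Messages in INBOX: " + inbox.getMessageCount());
    inbox.close(false);
    store.close();
  }
}
```

Note that the recipient address must line up with the mailbox key that buildOwnerEmail() derives from the connect parameters, or the SELECT side of the round trip will come up empty.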

Resources

  • jGuru's Java Mail Tutorial on the Sun Developer Network - A very quick but comprehensive overview of the Javamail API. Takes about 15 minutes or so to read and contains code snippets you can use to quickstart your Javamail based app.
  • Bill Dudney's "Mocking Javamail" blog entry - this actually got me started on my own mock objects for Javamail unit testing. My needs went beyond just sending mail, so I had a little more work to do, but my MockTransport implementation (described above) started out as a direct cut-and-paste from the code shown in this entry.
  • Dumbster is a fake SMTP server, which stores messages sent to it in an in-memory data structure similar to the mock implementation described here. I considered using this for a while, but I needed something that would work with Javamail and have support for a mock IMAP store. Dumbster, as far as I know, does not work with Javamail, and it definitely does not have support for POP or IMAP.

Saturday, December 02, 2006

Handling Rules with Functors and Spring

The thought process leading to this post began as a result of a conversation about the possibility of customizing an existing process for a new customer. The process is modelled as a series of transformations on an input object, and the customization consisted of pulling out certain parts in the series and replacing them with new ones.

I first thought of using a rules engine such as Drools, but then I realized that it was probably overkill. All I really wanted was to pull the conditional logic out of the code and into configuration. That way I could create a new customized process simply by cloning and changing the configuration, without (hopefully) having to touch a line of existing code. Obviously, the new components supplying the new functionality would have to be coded and inserted into the configuration, unless of course we could reuse some of the existing components by reconfiguring them.

To model conditional logic, I decided to use the Functor classes from the Apache Commons Collections project. The configuration could be stored as structures of Functors in the Spring application context. This article describes a very small proof of concept I wrote to verify to myself that this was possible.

Basically, we can think of the process as a pipeline of multiple small processes, connected by conditionals. The object that is to be processed moves through the pipeline. The input to the pipeline will consist of an almost empty object, which gets progressively populated as it travels through the pipeline. At the output end, we get a fully populated object. The example basically tries to take the following snippet of code and move it out to Spring configuration. The DataHolder is a plain POJO with two member variables - data and result, and associated getters and setters.
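For reference, here is a minimal sketch of the DataHolder POJO described above. This is my reconstruction from the description in this post (two String fields with getters and setters); the field names match the snippets that follow.

```java
// DataHolder.java - minimal reconstruction of the POJO used in the examples
public class DataHolder {

  private String data;
  private String result;

  public String getData() { return data; }
  public void setData(String data) { this.data = data; }

  public String getResult() { return result; }
  public void setResult(String result) { this.result = result; }
}
```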

  public void doPipelineWithoutFunctors() throws Exception {
    if ("foo".equals(dataHolder.getData())) {
      dataHolder.setResult("Got a foo");
    } else {
      dataHolder.setResult("Got a bar");
    }
    LOGGER.debug("Result = " + dataHolder.getResult());
  }

The first iteration using anonymous Closure and Predicate Functor objects looks like this. This is significantly more verbose than our original pipeline, but our objective is to move this logic out of the code in any case, so we are not too concerned about that.

  public void doPipelineWithFunctors() throws Exception {
    Closure[] closureList = {
      new IfClosure(
        new Predicate() {
          public boolean evaluate(Object object) {
            if (!(object instanceof DataHolder)) {
              LOGGER.debug("Bad object");
              return false;
            }
            DataHolder dh = (DataHolder) object;
            return "foo".equals(dh.getData());
          }
        },
        new Closure() {
          public void execute(Object input) {
            ((DataHolder) input).setResult("Got a foo");
          }
        },
        new Closure() {
          public void execute(Object input) {
            ((DataHolder) input).setResult("Got a bar");
          }
        }
      ),
      new Closure() {
        public void execute(Object input) {
          LOGGER.debug("Result = " + ((DataHolder) input).getResult());
        }
      }
    };
    ChainedClosure chain = new ChainedClosure(closureList);
    chain.execute(dataHolder);
  }

Based on the above, we know that the entire process can be modelled as a ChainedClosure consisting of an IfClosure and a regular Closure. The IfClosure uses a Predicate to test the value of the DataHolder.data variable and dispatches to one of its two Closures accordingly. The final Closure simply prints out the value of the result variable.

Once we extract all these values out into the Spring configuration (shown below), our application logic can be as simple as this:

  public void doPipelineWithFunctorsAndSpring() {
    closure.execute(dataHolder);
  }

Of course, TANSTAAFL: while the code is now really short, the configuration is quite large, and arguably as unmaintainable as the code we started out with. However, we do achieve our original goal, which is to pull all the application logic out into the configuration.

<beans ...>
  <!-- Input object -->
  <bean id="dataHolder" class="com.mycompany.functors.DataHolder" /> 
  <!-- Closure definition -->
  <bean id="pipelineClosure" class="org.apache.commons.collections.functors.ChainedClosure">
    <constructor-arg>
      <list>
        <ref bean="checkDataClosure" />
        <ref bean="showResultClosure" />
      </list>
    </constructor-arg>
  </bean>

  <bean id="checkDataClosure" class="org.apache.commons.collections.functors.IfClosure">
    <constructor-arg ref="dataEqualsPredicate" />
    <constructor-arg ref="positiveResultSetterClosure" />
    <constructor-arg ref="negativeResultSetterClosure" />
  </bean>

  <bean id="dataEqualsPredicate" class="com.mycompany.functors.pipeline.DataEqualsPredicate" />

  <bean id="positiveResultSetterClosure" class="com.mycompany.functors.pipeline.ResultSetterClosure" /> 
  <bean id="negativeResultSetterClosure" class="com.mycompany.functors.pipeline.ResultSetterClosure" /> 
  <bean id="showResultClosure" class="com.mycompany.functors.pipeline.ShowResultClosure" />   

  <!-- Driver definition -->
  <bean id="pipelineDriver" class="com.mycompany.functors.PipelineDriver">
    <property name="closure" ref="pipelineClosure" />
  </bean>
</beans>
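The PipelineDriver referenced in the configuration above is not shown in the text; a minimal sketch could look like this. To keep the snippet self-contained, a stand-in Closure interface is declared here; the real driver would of course use org.apache.commons.collections.Closure instead.

```java
// Stand-in for org.apache.commons.collections.Closure, declared
// locally only so this sketch compiles on its own.
interface Closure {
  void execute(Object input);
}

// Hypothetical driver bean: Spring injects the pipelineClosure bean
// through the closure property, and the driver just fires the pipeline.
public class PipelineDriver {
  private Closure closure;

  public void setClosure(Closure closure) {
    this.closure = closure;
  }

  public void process(Object dataHolder) {
    closure.execute(dataHolder);
  }
}
```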

We have also had to move the logic out of the anonymous inner classes into three new standalone Functor classes, two Closures and a Predicate, which are shown below:

public class DataEqualsPredicate implements Predicate {
  public boolean evaluate(Object object) {
    if (!(object instanceof DataHolder)) {
      return false;
    }
    DataHolder dh = (DataHolder) object;
    return "foo".equals(dh.getData());
  }
}

public class ResultSetterClosure implements Closure {
  public void execute(Object input) {
    ((DataHolder) input).setResult("Got a " + ((DataHolder) input).getData());
  }
}

public class ShowResultClosure implements Closure {
  private static final Logger LOGGER = Logger.getLogger(ShowResultClosure.class);
 
  public void execute(Object input) {
    LOGGER.debug("Result = " + ((DataHolder) input).getResult());
  }
}

Obviously, this is a simple example, and to do this kind of thing for any real system, the Spring configuration is likely to be significantly more complicated. But the same would be true of the configuration for a rule engine. What this approach gives us is a way to factor our workflow rules out into configuration, and all this using POJOs.

Saturday, November 18, 2006

More legacy bean mapping strategies with Spring

My last post covered some basic mapping strategies for accessing legacy beans from within Spring's bean context. This post covers three more strategies to expose legacy beans for which the mapping may not be as evident. In all these cases, wrapper code may need to be written or existing code may need to be slightly modified to expose these beans in Spring.

Exposing legacy configuration

The configuration information in the legacy web application is stored in an external directory in a bunch of properties files. By external directory, I mean that it is not in WEB-INF/classes, where you would normally expect it to be. Even though you may cringe at the thought (I know I did, when I first looked at it), there are a number of benefits. First, changing properties is easier for operations folks to do, since the WAR file does not need to be rebuilt, although the application does need to be bounced for the new configuration to take effect. Second, properties can be reused across other web and standalone applications, resulting in less duplication and creating something of an enterprise level configuration. The downside, of course, is that you need a custom strategy to load and customize properties per environment, rather than use Spring's PropertyPlaceholderConfigurer or Maven's environment based filtering that I wrote about earlier.

Properties in the legacy web application are exposed through a Config bean, via static calls such as this:

  String mypropertyValue = Config.getConfig("myproperty").get("key");

This call would go out and load the property file myproperty.properties in the specified external directory (passed in to the application as a system property), if it has not already been loaded in a previous invocation, and get back the value for the property named "key". The properties file itself looks something like this:

# myproperty.properties
key=value
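The Config class itself belongs to the legacy application and is not reproduced here, but its described behavior, a static, lazily populated cache of Properties objects keyed by file basename, can be sketched roughly as follows. This is a hypothetical reconstruction, not the actual legacy code:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

// Hypothetical sketch of the legacy Config bean: lazily loads
// <basename>.properties from the external directory and caches it,
// so repeated getConfig() calls do not re-read the file.
public class Config {

  private static String configDir;
  private static final Map<String, Config> CACHE = new HashMap<String, Config>();

  private final Properties props = new Properties();

  public static void setConfigDir(String dir) { configDir = dir; }
  public static String getConfigDir() { return configDir; }

  public static synchronized Config getConfig(String basename) {
    Config config = CACHE.get(basename);
    if (config == null) {
      config = new Config(new File(configDir, basename + ".properties"));
      CACHE.put(basename, config);
    }
    return config;
  }

  private Config(File file) {
    try {
      FileInputStream in = new FileInputStream(file);
      try { props.load(in); } finally { in.close(); }
    } catch (IOException e) {
      throw new RuntimeException("Cannot load " + file, e);
    }
  }

  public String get(String key) { return props.getProperty(key); }
  public Properties getAll() { return props; }
}
```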

My objective was to have this value exposed through Spring's PropertyPlaceholderConfigurer in the Spring bean context as ${myproperty.key}. I considered building a custom configurer by extending PropertyPlaceholderConfigurer, but then I found an issue on Atlassian's JIRA that discussed strategies to expose configuration specified with Jakarta Commons Configuration, one of which I repurposed for my use.

Basically, what I ended up doing was creating a PropertyExtractor class that iterates through all the properties files in the external directory and loads them into a single Properties object. The key for each property is the original key, prefixed with the basename of the properties file it came from. Once this was done, I could pass the properties to the PropertyPlaceholderConfigurer by invoking the getProperties() method on the PropertyExtractor class. The Spring configuration for the PropertyPlaceholderConfigurer is shown below:

  <bean id="configPropertiesExtractor" class="com.mycompany.util.ConfigPropertiesExtractor">
    <property name="configDir" value="/path/to/external/config/directory" />
  </bean>

  <bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="properties">
      <bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
        <property name="targetObject">
          <ref local="configPropertiesExtractor" />
        </property>
        <property name="targetMethod">
          <value>getProperties</value>
        </property>
      </bean>
    </property>
  </bean>

The code for the PropertyExtractor bean is shown below. It is itself a Spring bean, configured with the external directory name. It calls the legacy Config bean to get the properties and builds a single Properties object, which is then injected into the PropertyPlaceholderConfigurer bean. From this point on, all properties can be referenced in the ${property_file_basename.property_key} format within the rest of the context.

public class ConfigPropertiesExtractor {

  private static final Logger LOGGER = Logger.getLogger(ConfigPropertiesExtractor.class);

  public ConfigPropertiesExtractor() {
    super();
  }

  public void setConfigDir(String configDir) {
    Config.setConfigDir(configDir);
  }

  public Properties getProperties() throws Exception {
    Properties props = new Properties();
    File configDir = new File(Config.getConfigDir());
    if ((! configDir.exists()) || (! configDir.isDirectory())) {
      LOGGER.error("Config dir:[" + configDir.getAbsolutePath() + "] does not exist or is not a directory");
      return props;
    }
    File[] cfFiles = configDir.listFiles(new FileFilter() {
      public boolean accept(File pathname) {
        return (pathname.getName().endsWith(".properties"));
      }
    });
    for (File cfFile : cfFiles) {
      String prefix = FilenameUtils.getBaseName(cfFile.getName());
      Properties cfProps = Config.getConfig(prefix).getAll();
      for (Iterator it = cfProps.keySet().iterator(); it.hasNext();) {
        String key = (String) it.next();
        String value = cfProps.getProperty(key);
        props.setProperty(prefix + "." + key, value);
      }
    }
    return props;
  }
}

Exposing a predefined DataSource

The legacy application was based on JDBC, so there was already a class that built and returned a Connection object from a pool. The DBA had spent considerable effort optimizing the connection pool for our environment, so it made sense to reuse those optimizations. One approach would have been to build our own DriverManagerDataSource with the exact same optimized configuration. The disadvantage of this approach is that the DBA would have to maintain identical information in two different places, or the developers would have to continuously play catch-up with every change. A second approach would have been to add an extra method to the class to return a DataSource object instead of a Connection (since Spring's JdbcTemplate needs to be built with a DataSource). We went with the second approach. The extra code to return a DataSource was minimal, since the existing getConnection() implementation already delegated to an internal DataSource.

public class DbConnectionManager {
  ...
  public static DataSource getDataSource() throws Exception {
    return _dataSource;
  }
  ...
}

The configuration is shorter than the standard one for DriverManagerDataSource: just a call to a static factory method on a predefined class.

  <bean id="dataSource" class="com.mycompany.util.db.DbConnectionManager" 
      factory-method="getDataSource" />

Sometimes the legacy database ConnectionManager does not reference a DataSource object. This was the case with another third-party application, which built the Connection using traditional DriverManager calls, relying on the database driver's pooling capabilities. My solution in that case was to build a DataSource wrapper implementation whose getConnection() method delegates to the ConnectionManager's getConnection() method. Obviously, the other required methods need to have sensible defaults as well.

public class MyDataSource implements DataSource {

  private DbConnectionManager connectionManager;

  public void setConnectionManager(DbConnectionManager connectionManager) {
    this.connectionManager = connectionManager;
  }

  public Connection getConnection() throws SQLException {
    return connectionManager.getConnection();
  }

  // other methods of DataSource
  ...
}

And the configuration for this would look something like this:

  <bean id="dataSource" class="com.mycompany.util.db.MyDataSource">
    <property name="connectionManager" ref="dbConnectionManager" />
  </bean>

Accessing objects from a factory

This arose out of a desire to remove some boilerplate from my own pre-Spring code, which parsed XML using the DOM parser. The pattern for parsing an XML file is as follows:

...
  public void doSomethingWithXmlFile() throws Exception {
    DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
    // set properties for factory
    dbf.setValidating(false);
    dbf.setIgnoringElementContentWhitespace(true);
    DocumentBuilder builder = dbf.newDocumentBuilder();
    // set properties for the builder
    builder.setEntityResolver(myEntityResolver);
    // finally parse the XML to get our Document object
    Document doc = builder.parse(xmlFile);
    ...
  }
...

I wanted to just pass a pre-built DocumentBuilder object to the class, and be done with the boilerplate code on top. I achieved this with the following configuration:

  <bean id="documentBuilderFactory" class="javax.xml.parsers.DocumentBuilderFactory" factory-method="newInstance">
    <property name="validating" value="false" />
    <property name="ignoringElementContentWhitespace" value="true" />
  </bean>

  <bean id="documentBuilder" class="javax.xml.parsers.DocumentBuilder"
      factory-bean="documentBuilderFactory" factory-method="newDocumentBuilder">
    <property name="entityResolver" ref="myEntityResolver" />
  </bean>
  ...
  <!-- documentBuilder can now be referenced in a bean definition -->

and the resulting code after moving the boilerplate out to Spring looked like this:

...
  // the setter for Spring
  public void setDocumentBuilder(DocumentBuilder documentBuilder) {
    this.documentBuilder = documentBuilder;
  }

  public void doSomethingWithXmlFile() throws Exception {
    Document doc = documentBuilder.parse(xmlFile);
    ...
  }
...

All three mapping strategies described here are somewhat involved, and none is readily apparent. However, the XML metalanguage Spring provides for configuring beans is quite powerful, and that power becomes most evident when one has to expose legacy beans rather than beans that are already exposable through Spring's standard setter injection. As I dig deeper into the legacy code and have to interface with more legacy beans, I am sure I will come across more complex situations, whose solutions I will share if I think they are useful.