Tuesday, December 1, 2009

Internal DSL for Criteria - part 3:3

In the previous two articles I described the basics of the new internal DSL for JPA/Hibernate criteria. Let us dive into a more advanced topic. We would like to be able to specify conditional expressions with logical operators, and with grouping of nested conditions.

// A and (B or C)
criteriaFor(Person.class).withProperty(aaa()).eq("A").and().lbrace()
.withProperty(bbb()).eq("B").or().withProperty(ccc()).eq("C").rbrace()
.build();


That should build a syntax tree like:

    and
   /   \
  A     or
       /  \
      B    C


This means that I need to hold the state of the current and previous, not yet completed, operators in the builder. I keep the operators on a stack and give braces a special meaning. The expression builder looks like this:


public class ConditionalCriteriaBuilder<T> {

    private enum ExpressionOperator {
        Not, Or, And, LBrace
    }

    private ConditionalCriteriaBuilder() {
    }

    public static <T> ConditionRoot<T> criteriaFor(Class<T> clazz) {
        return new ConditionalCriteriaBuilder<T>().new RootBuilderImpl();
    }

    public interface ConditionRoot<T> {
        List<ConditionalCriteria> build();
        ConditionProperty<T> withProperty(Property<T> property);
        ConditionRoot<T> and();
        ConditionRoot<T> or();
        ConditionRoot<T> not();
        ConditionRoot<T> lbrace();
        ConditionRoot<T> rbrace();
    }

    public interface ConditionProperty<T> {
        ConditionRoot<T> eq(Object value);
    }

    public class RootBuilderImpl implements ConditionRoot<T> {
        private final SimpleStack<ConditionalCriteria> criteriaStack = new SimpleStack<ConditionalCriteria>();
        private final SimpleStack<ExpressionOperator> operatorStack = new SimpleStack<ExpressionOperator>();

        /**
         * End the expression with this build
         */
        public List<ConditionalCriteria> build() {
            return criteriaStack.asList();
        }

        public ConditionProperty<T> withProperty(Property<T> property) {
            return new PropBuilderImpl(property.getName());
        }

        public ConditionRoot<T> and() {
            operatorStack.push(ExpressionOperator.And);
            return this;
        }

        public ConditionRoot<T> or() {
            operatorStack.push(ExpressionOperator.Or);
            return this;
        }

        public ConditionRoot<T> not() {
            operatorStack.push(ExpressionOperator.Not);
            return this;
        }

        public ConditionRoot<T> lbrace() {
            operatorStack.push(ExpressionOperator.LBrace);
            return this;
        }

        public ConditionRoot<T> rbrace() {
            operatorStack.pop();
            if (criteriaStack.isEmpty()) {
                return this;
            }
            ConditionalCriteria lastCriteria = popCriteria();
            pushCriteria(lastCriteria);
            return this;
        }

        private ConditionalCriteria popCriteria() {
            return criteriaStack.pop();
        }

        private void pushCriteria(ConditionalCriteria criteria) {
            ExpressionOperator currentOperator = operatorStack.peek();
            if (currentOperator == ExpressionOperator.Or || currentOperator == ExpressionOperator.And) {
                ConditionalCriteria compositeCriteria;
                if (currentOperator == ExpressionOperator.Or) {
                    compositeCriteria = ConditionalCriteria.or(popCriteria(), criteria);
                } else {
                    compositeCriteria = ConditionalCriteria.and(popCriteria(), criteria);
                }
                criteriaStack.push(compositeCriteria);
                operatorStack.pop();
            } else if (currentOperator == ExpressionOperator.Not) {
                ConditionalCriteria notCriteria = ConditionalCriteria.not(criteria);
                criteriaStack.push(notCriteria);
                operatorStack.pop();
                if (!operatorStack.isEmpty() && !criteriaStack.isEmpty()) {
                    pushCriteria(criteriaStack.pop());
                }
            } else if (currentOperator == ExpressionOperator.LBrace) {
                criteriaStack.push(criteria);
            } else {
                criteriaStack.push(criteria);
            }
        }

        private class PropBuilderImpl implements ConditionProperty<T> {
            String propertyName;

            PropBuilderImpl(String name) {
                this.propertyName = name;
            }

            public ConditionRoot<T> eq(Object value) {
                pushCriteria(ConditionalCriteria.equal(propertyName, value));
                return RootBuilderImpl.this;
            }
        }
    }
}


The magic is in pushCriteria, which is invoked at the end of a condition, such as eq(), and also when a right brace is reached. It is not a trivial problem, but thanks to the stack metaphor the solution is not that complicated.
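
To make the mechanics concrete, here is how the two stacks evolve for the A and (B or C) expression above (derived from the code, with the top of each stack to the right):

eq("A")  -> pushCriteria(A): operator stack is empty, criteria: [A]
and()    -> operators: [And]
lbrace() -> operators: [And, LBrace]
eq("B")  -> pushCriteria(B): top is LBrace, B is just pushed, criteria: [A, B]
or()     -> operators: [And, LBrace, Or]
eq("C")  -> pushCriteria(C): top is Or, B is popped and combined,
            criteria: [A, (B or C)], operators: [And, LBrace]
rbrace() -> LBrace is popped, (B or C) is popped and pushed again:
            now the top is And, so A is popped and combined,
            criteria: [(A and (B or C))]
build()  -> returns the single criteria (A and (B or C))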

A final thought about the syntax. Look at the expression again:

// A and (B or C)
criteriaFor(Person.class).withProperty(aaa()).eq("A").and().lbrace()
.withProperty(bbb()).eq("B").or().withProperty(ccc()).eq("C").rbrace()
.build();


I use special keywords for left brace and right brace. Why not use grouping with a single method instead, and write the nested expression as a method parameter?

// A and (B or C)
criteriaFor(Person.class).withProperty(aaa()).eq("A").and().group(
criteriaFor(Person.class).withProperty(bbb()).eq("B").or().withProperty(ccc()).eq("C"))
.buildSingle();


That would be possible, and probably easier to implement, but I see a few problems from the end user's perspective:
  1. There is already much clutter of parentheses due to the method calls, and it is hard to see and write the closing parenthesis of the group operation.
  2. You lose context and have to start the nested expression from scratch, i.e. with the static criteriaFor method.

That was all for this exercise. I hope you find it useful as a practical example of how to develop a small internal DSL. You can find the full source code in Subversion, including JUnit tests.

Monday, November 30, 2009

Internal DSL for Criteria - part 2:3

In previous article I introduced the new internal DSL for JPA/Hibernate criteria. Let us look at how to handle the property names.

criteria().prop("lastName").eq("Svensson").build();
The problem with the above expression is that the property lastName is a String. A refactoring of Person.lastName will not be automatically detected. Another issue is that you have to remember or look up that the property name is "lastName" when writing this expression; the IDE will not help you.

We, who are using Sculptor, define the domain object in the design model like this:

Entity Person {
    String firstName
    String lastName
}
From this Sculptor generates the Person class. We can also easily generate metadata for the properties of Person, which makes it possible to write expressions like this:

List<ConditionalCriteria> conditionalCriteria = ConditionalCriteriaBuilder.criteriaFor(Person.class)
.withProperty(PersonProperties.lastName()).eq("Svensson").build();
PersonProperties is generated and contains a static method for each property, returning a Property<Person> object, which internally defines the strings. The builder is also parameterized, using the class given to the criteriaFor factory method. This means that the withProperty method only accepts properties of the correct type, i.e. Property<Person>.
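
As a rough sketch, assuming Property<T> simply carries the property name (which the builder reads via property.getName()), the generated class could look like this; the actual code generated by Sculptor may differ:

public class PersonProperties {

    private PersonProperties() {
    }

    public static Property<Person> firstName() {
        return new Property<Person>("firstName");
    }

    public static Property<Person> lastName() {
        return new Property<Person>("lastName");
    }
}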

Static imports can be used to make the expression more compact and readable:

import static org.fornax.cartridges.sculptor.framework.accessapi.ConditionalCriteriaBuilder.criteriaFor;
import static org.library.person.domain.PersonProperties.lastName;

List<ConditionalCriteria> conditionalCriteria = criteriaFor(Person.class)
.withProperty(lastName()).eq("Svensson").build();

Note that the Eclipse keyboard shortcut for adding a static import is <ctrl+shift+m> (Mac: <cmd+shift+m>).

The expression builder looks like this:

public class ConditionalCriteriaBuilder<T> {

    private final List<ConditionalCriteria> criteriaList = new ArrayList<ConditionalCriteria>();

    private ConditionalCriteriaBuilder() {
    }

    public static <T> ConditionalCriteriaBuilder<T> criteriaFor(Class<T> clazz) {
        return new ConditionalCriteriaBuilder<T>();
    }

    public List<ConditionalCriteria> build() {
        return criteriaList;
    }

    private void addCriteria(ConditionalCriteria criteria) {
        criteriaList.add(criteria);
    }

    public ConditionProperty<T> withProperty(Property<T> property) {
        return new PropBuilderImpl(property.getName());
    }

    public interface ConditionProperty<T> {
        ConditionalCriteriaBuilder<T> eq(Object value);
    }

    private class PropBuilderImpl implements ConditionProperty<T> {
        String propertyName;

        PropBuilderImpl(String name) {
            this.propertyName = name;
        }

        public ConditionalCriteriaBuilder<T> eq(Object value) {
            addCriteria(ConditionalCriteria.equal(propertyName, value));
            return ConditionalCriteriaBuilder.this;
        }
    }
}


Alright, let us refactor the model and introduce a BasicType for the first and last name of the Person:


Entity Person {
    - @PersonName name
}

BasicType PersonName {
    String first
    String last
}


Generate, and you will immediately detect a compilation error in the criteria expression. Fix it and it looks like this:

List<ConditionalCriteria> conditionalCriteria = criteriaFor(Person.class)
.withProperty(name().last()).eq("Svensson").build();


The generated properties classes also define references, so it is easy to navigate associations with full code completion support. Great, our goals of code completion and refactoring are fulfilled.
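
A sketch of how the generated reference support could work, assuming nested property objects concatenate the names into a dotted path (again, the actual generated code may differ):

public class PersonProperties {

    private PersonProperties() {
    }

    // reference to the PersonName BasicType
    public static PersonNameProperties<Person> name() {
        return new PersonNameProperties<Person>("name");
    }
}

public class PersonNameProperties<T> extends Property<T> {

    public PersonNameProperties(String name) {
        super(name);
    }

    public Property<T> first() {
        return new Property<T>(getName() + ".first");
    }

    public Property<T> last() {
        return new Property<T>(getName() + ".last");
    }
}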

That is not the end of this series of articles. In the next article I will illustrate some more advanced operators that require some intellectual effort. See you tomorrow.

Sunday, November 29, 2009

Internal DSL for Criteria - part 1:3

Hibernate has support for criteria queries, and that is an awaited feature in JPA 2.0. The criteria API is rather technical and doesn't read too well with your domain terms, your DDD Ubiquitous Language. Therefore I have developed a small internal DSL in Java to be able to express conditional criteria in a human readable format. It is not intended to handle everything in the criteria API.

I use this to illustrate how to implement an internal DSL. It is not hard once you have learned the tools to play with. Java is not optimal for DSLs, but let us use it as well as possible.

The goal was to create a fluent API that supported code completion and refactoring in a good way. A simple criteria may look like:

criteriaFor(Person.class).withProperty(sex()).eq(Gender.FEMALE).build();

and a more advanced criteria:

criteriaFor(Person.class)
.withProperty(sex()).eq(Gender.FEMALE).and()
.withProperty(ssn().country()).eq(SWEDEN).and()
.lbrace().withProperty(name().first()).like("A%")
.or().withProperty(name().last()).like("A%").rbrace()
.orderBy(name().last())
.build();
I will show how a few pieces of the DSL were developed. I have mostly used Expression Builder and Method Chaining, but also a few other tricks.

The expression builder doesn't create the JPA/Hibernate criteria directly. It creates an intermediate structure of objects, which can be translated to the criteria API. The main reason for that is decoupling and the possibility to have vendor specific implementations.
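
The intermediate structure is essentially a small syntax tree. A minimal sketch of what ConditionalCriteria could look like (the real class in the Sculptor framework has more operators, such as like and orderBy, and accessors):

public class ConditionalCriteria {

    public enum Operator { Equal, And, Or, Not }

    private final Operator operator;
    private final String propertyName;   // used by leaf conditions like Equal
    private final Object firstOperand;   // value, or nested condition for And/Or/Not
    private final Object secondOperand;  // second nested condition for And/Or

    private ConditionalCriteria(Operator operator, String propertyName,
            Object firstOperand, Object secondOperand) {
        this.operator = operator;
        this.propertyName = propertyName;
        this.firstOperand = firstOperand;
        this.secondOperand = secondOperand;
    }

    public static ConditionalCriteria equal(String propertyName, Object value) {
        return new ConditionalCriteria(Operator.Equal, propertyName, value, null);
    }

    public static ConditionalCriteria and(ConditionalCriteria left, ConditionalCriteria right) {
        return new ConditionalCriteria(Operator.And, null, left, right);
    }

    public static ConditionalCriteria or(ConditionalCriteria left, ConditionalCriteria right) {
        return new ConditionalCriteria(Operator.Or, null, left, right);
    }

    public static ConditionalCriteria not(ConditionalCriteria criteria) {
        return new ConditionalCriteria(Operator.Not, null, criteria, null);
    }

    // getters omitted; a vendor specific translator would walk this
    // structure recursively and build the corresponding criteria API calls
}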

First, let me state the obvious: when developing an internal DSL it is perfect to do it the TDD way. You will add more and more features and do a lot of refactoring until you are satisfied with the language.

Enough of the introductory talk, let us go hands on with a simple equals expression:

List<ConditionalCriteria> conditionalCriteria =   
ConditionalCriteriaBuilder.criteria().prop("name").eq("Svensson").build();
This can be implemented in the builder as:

public class ConditionalCriteriaBuilder {

    private final List<ConditionalCriteria> criteriaList = new ArrayList<ConditionalCriteria>();
    private String propertyName;

    private ConditionalCriteriaBuilder() {
    }

    public static ConditionalCriteriaBuilder criteria() {
        return new ConditionalCriteriaBuilder();
    }

    public List<ConditionalCriteria> build() {
        return criteriaList;
    }

    public ConditionalCriteriaBuilder prop(String name) {
        propertyName = name;
        return this;
    }

    public ConditionalCriteriaBuilder eq(Object value) {
        addCriteria(ConditionalCriteria.equal(propertyName, value));
        return this;
    }

    private void addCriteria(ConditionalCriteria criteria) {
        criteriaList.add(criteria);
    }
}

But that is not very good, because it allows invalid expressions, such as criteria().eq("A").prop("aaa").eq("B").

Therefore I introduce an interface for the property level:



public PropBuilder prop(String name) {
    return new PropBuilderImpl(name);
}

public interface PropBuilder {
    ConditionalCriteriaBuilder eq(Object value);
}

private class PropBuilderImpl implements PropBuilder {
    String propertyName;

    PropBuilderImpl(String name) {
        this.propertyName = name;
    }

    public ConditionalCriteriaBuilder eq(Object value) {
        addCriteria(ConditionalCriteria.equal(propertyName, value));
        return ConditionalCriteriaBuilder.this;
    }
}

This also facilitates code completion (ctrl+space) in a good way, which was one of our goals.

In the next post, tomorrow, I will take care of the string property names, which are not very refactoring friendly.

Thursday, November 19, 2009

How we do automated regression testing with Selenium and Hudson

When developing a piece of software that has a lifecycle spanning several years and is released periodically, you have to do regression testing, i.e. make sure that previous features don't break because of new stuff.
And as your software grows and you add new features, the amount of regression tests increases. To avoid drowning in testing you need to automate as much as possible.

The Sculptor team is a bunch of guys who are driven by interest and develop the software in their spare time during late nights. We don't have the time to do deep manual testing, hence automation is very attractive for us. And since we are geographically distributed and don't have a central CI environment, we have to solve some practical problems locally.

What we do (amongst other things) involves having a local Hudson server running in our development environment (i.e. my iMac) that (of course) builds all projects and runs unit tests.
But we also use Selenium to run automated functional tests to make sure that our example application works. I thought I should show some more details about how we use Selenium.

As I said, we have an example application called Library. If you look in the source code for the library-web module you will find the directory:

src/main/webapp/selenium

In there you will find a bunch of tests. The root is the suite file:

___test-suite.xhtml

Selenium tests can be written in various programming languages. We have chosen to keep it simple and implement the tests in HTML.

Having these tests enables us to run them as soon as the code base changes, thanks to Hudson and Maven. In Hudson we just create a new job that is triggered to run as soon as the Library projects compile and the unit tests pass. In the pom file for the Library web project we have a profile that we can use to start a local instance of the Jetty server, deploy the application, run our Selenium tests, and finally stop the server. The Maven configuration for this is:
<profile>
  <id>regression</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.mortbay.jetty</groupId>
        <artifactId>maven-jetty-plugin</artifactId>
        <version>6.1.11</version>
        <configuration>
          <scanIntervalSeconds>10</scanIntervalSeconds>
          <stopKey>foo</stopKey>
          <stopPort>9999</stopPort>
        </configuration>
        <executions>
          <execution>
            <id>start-jetty</id>
            <phase>pre-integration-test</phase>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <scanIntervalSeconds>0</scanIntervalSeconds>
              <daemon>true</daemon>
            </configuration>
          </execution>
          <execution>
            <id>stop-jetty</id>
            <phase>post-integration-test</phase>
            <goals>
              <goal>stop</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>selenium-maven-plugin</artifactId>
        <version>1.0</version>
        <configuration>
          <suite>src/main/webapp/selenium/___test-suite.xhtml</suite>
          <browser>*firefox</browser>
          <results>${project.build.directory}/target/selenium.html</results>
          <startURL>http://localhost:8080/${artifactId}</startURL>
        </configuration>
        <executions>
          <execution>
            <id>run-tests</id>
            <phase>integration-test</phase>
            <goals>
              <goal>selenese</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
But since the number of tests keeps growing, and the time it takes to run them all grows with it, there is a need to be able to run just a single test or a few of them. For instance, it is very convenient to have that possibility when developing new features, fixing bugs, or just doing a refactoring. To enable this we use the very competent Firefox plugin called Selenium IDE. Besides recording test cases, you can also load already defined test cases (or suites) and run them.




Sunday, November 15, 2009

Promote Quality with Sculptor

We have written an article that has been published in the paper magazine JayView Issue 20.

Without a vision of how to design applications within an organization, development can be compared to the lawless Wild West. Development guidelines are often used, but seldom successful over the long haul. We suggest taking the architectural decisions one step further by automating them using a tool such as Sculptor.

When using a general purpose language, such as Java, and its big toolbox of APIs and frameworks, there is a huge freedom of choice. This is a double-edged sword. We meet a lot of companies that have realized that they must narrow down the choices so that each new project doesn't invent its own unique system architecture and product suite. The benefits of a homogeneous architecture are obvious when looking at the big picture.

The reference architecture is often accomplished by writing guidelines and maybe a sample reference application. There are several problems with a reference architecture that is only promoted by documentation. We suggest automating some pieces of the development by using a code generator tool, such as Sculptor, to enforce consistency in the architecture.

Read more in the full article.

Tuesday, November 3, 2009

Mocking with App Engine and Spring

In the previous article I illustrated how easy it is to get started with unit testing in the local App Engine environment. In this article I will go into more advanced interaction based testing, i.e. mocking.

The App Engine APIs are simulated in the local environment. Some local implementations are designed with testing in mind, such as the email API. It is possible to verify the emails that were sent.

LocalMailService localMailService = AppEngineTestHelper.getLocalMailService();
List<MailMessage> sentMessages = localMailService.getSentMessages();
assertEquals(2, sentMessages.size());

Some other local implementations are not suitable for unit testing, such as the URL fetch service, which executes a real remote request. To solve this you need to encapsulate the usage of external communication and make it possible to replace it when unit testing.

Since we are using Spring for dependency injection it is possible to replace any Spring bean for testing purposes. In our customer-supplier sample, the InquiryRepository in the customer application sends inquiries to the supplier application with a REST post.

This can be replaced when testing by defining a stub implementation that overrides the method that sends the inquiries. This is done in the Spring XML configuration (more-test.xml):

<bean id="inquiryRepository"
class="org.customer.inquiry.repositoryimpl.InquiryRepositoryStub"/>


public class InquiryRepositoryStub extends InquiryRepositoryImpl {
    @Override
    protected boolean sendInquiryToSupplier(Inquiry inquiry, Supplier supplier) {
        return true;
    }
}

The next step is to use a mocking framework instead. This makes it possible to verify the interaction, i.e. that the sendInquiryToSupplier method was invoked.

Then it makes sense to extract the sending into a separate class and interface. It is this interface that we want to mock.


public interface InquirySender {
    boolean sendInquiryToSupplier(Inquiry inquiry, Supplier supplier);
}

The real implementation is an ordinary Spring @Component, that is @Autowired in InquiryRepositoryImpl. It is this implementation we want to replace with a mock when testing.

@Component
public class InquirySenderImpl implements InquirySender {


We use the approach described in the first part of Mocking & Spring tests. The FactoryBean is included in Sculptor, so we only need to add the XML definition (more-test.xml):


<bean id="inquirySenderMockFactory"
class="org.fornax.cartridges.sculptor.framework.test.MockitoFactory"
primary="true" >
<property name="type" value="org.customer.inquiry.repositoryimpl.InquirySender"/>
</bean>
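
The idea of such a factory is simple: it creates a Mockito mock for the configured type and, thanks to primary="true", Spring injects it instead of the real @Component. A minimal sketch of how it could be implemented (the real MockitoFactory in the Sculptor framework may differ):

import org.mockito.Mockito;
import org.springframework.beans.factory.FactoryBean;

public class MockitoFactory implements FactoryBean<Object> {

    private Class<?> type;

    // the mocked interface, set via the type property in the XML definition
    public void setType(Class<?> type) {
        this.type = type;
    }

    public Object getObject() {
        return Mockito.mock(type);
    }

    public Class<?> getObjectType() {
        return type;
    }

    public boolean isSingleton() {
        return true;
    }
}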


The JUnit test looks like this:


public class InquiryServiceTest extends AbstractAppEngineJpaTests
        implements InquiryServiceTestBase {

    @Autowired
    private InquiryService inquiryService;
    @Autowired
    private InquirySender inquirySenderMock;

    @Before
    public void initMock() {
        when(inquirySenderMock.sendInquiryToSupplier(any(Inquiry.class), any(Supplier.class)))
                .thenReturn(true);
    }

    @Before
    public void populateDatastore() {
        Inquiry inquiry1 = new Inquiry();
        inquiry1.setMessage("M1");
        inquiry1.setOwnerEmail("foo@gmail.com2");
        getEntityManager().persist(inquiry1);

        Supplier supplier1 = new Supplier("S1");
        supplier1.setUrl("http://localhost:8081/rest/inquiry");
        getEntityManager().persist(supplier1);

        Supplier supplier2 = new Supplier("S2");
        supplier2.setUrl("http://localhost:8081/rest/inquiry");
        getEntityManager().persist(supplier2);
    }

    @Test
    public void testSendInquiry() throws Exception {
        Key key = KeyFactory.createKey(Inquiry.class.getSimpleName(), 1L);
        boolean ok = inquiryService.sendInquiry(getServiceContext(), key);
        assertTrue(ok);
        // there are 2 suppliers
        verify(inquirySenderMock, times(2)).sendInquiryToSupplier(
                any(Inquiry.class), any(Supplier.class));
    }
}

Note that the mock is initialized in the @Before method and then verified at the end of the test method. In this case two messages should be sent, one for each supplier.

Maybe you have noticed that this approach is not at all specific to App Engine; it can be used for any Spring application. We need to learn a lot of new things when using App Engine, but some old knowledge still applies. :-)

Sunday, November 1, 2009

Unit Testing with App Engine and Spring

Sculptor makes it easy to write JUnit tests for Google App Engine. A test case looks like this:


public class SupplierServiceTest extends AbstractAppEngineJpaTests {

    @Autowired
    private SupplierService supplierService;

    @Before
    public void populateDatastore() {
        Supplier supplier1 = new Supplier("S1");
        getEntityManager().persist(supplier1);

        Supplier supplier2 = new Supplier("S2");
        getEntityManager().persist(supplier2);
    }

    @Test
    public void testFindAll() throws Exception {
        List<Supplier> all = supplierService.findAll(getServiceContext());
        assertEquals(2, all.size());
    }

    @Test
    public void testFindByName() throws Exception {
        Supplier found = supplierService.findByName(getServiceContext(), "S2");
        assertNotNull(found);
        assertEquals("S2", found.getName());
    }
}


Very natural!

It is interesting to take a look at the base class. It defines a few annotations and extends AbstractJUnit4SpringContextTests to initialize the Spring environment. This enables usage of ordinary @Autowired dependency injection directly in the test class.

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:applicationContext-test.xml"})
public abstract class AbstractAppEngineJpaTests
        extends AbstractJUnit4SpringContextTests {

The embedded App Engine environment is initialized from a method annotated with @Before, i.e. invoked before each test method.

public static void setUpAppEngine(ApiProxy.Environment testEnvironment) {
    ApiProxy.setEnvironmentForCurrentThread(testEnvironment);

    ApiProxy.setDelegate(new ApiProxyLocalImpl(new File(".")) {
    });

    ApiProxyLocalImpl proxy = (ApiProxyLocalImpl) ApiProxy.getDelegate();
    proxy.setProperty(LocalDatastoreService.NO_STORAGE_PROPERTY, Boolean.TRUE.toString());
    clearSentEmailMessages();
}

public static void tearDownAppEngine() {
    ApiProxyLocalImpl proxy = (ApiProxyLocalImpl) ApiProxy.getDelegate();
    LocalDatastoreService datastoreService = (LocalDatastoreService) proxy.getService("datastore_v3");
    datastoreService.clearProfiles();
    clearSentEmailMessages();
}
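
These static helpers are presumably hooked into the JUnit lifecycle of the base class along these lines (a sketch; a TestEnvironment stub implementing ApiProxy.Environment is assumed to be provided elsewhere in the framework):

@Before
public void setUpAppEngineEnvironment() {
    setUpAppEngine(new TestEnvironment());
}

@After
public void tearDownAppEngineEnvironment() {
    tearDownAppEngine();
}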

It is initialized with an in-memory datastore, i.e. it is empty before each test method. You may populate it with initial data in your subclass in a @Before method; see populateDatastore in the sample above.

I learned one thing when doing JUnit testing in the App Engine environment. When working with ordinary databases I have found the Spring transactional test support useful, i.e. Spring executes each test method in a transaction, which is rolled back after the test method. That is achieved with the following annotations and usage of the annotation @BeforeTransaction instead of the ordinary @Before.

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:applicationContext-test.xml"})
@TestExecutionListeners(TransactionalTestExecutionListener.class)
@TransactionConfiguration(transactionManager = "txManager", defaultRollback = true)
@Transactional
public abstract class AbstractAppEngineJpaTests
        extends AbstractJUnit4SpringContextTests {

That was my initial approach with App Engine as well, but I realized that it was not good. Look at the following test. It will fail on the last assert when using the above transactional support.


@Test
public void testSave() throws Exception {
    int countBefore = countRowsInTable(Supplier.class);
    Supplier supplier3 = new Supplier("S3");
    supplierService.save(getServiceContext(), supplier3);
    int countAfter = countRowsInTable(Supplier.class);
    assertEquals(countBefore + 1, countAfter);
}

The reason is that queries see a snapshot of the datastore as of the beginning of the transaction.

Data isolation between test methods is no problem, since the datastore is initialized (empty) before each test method.

That's all! Try it yourself by running the Maven Archetype for App Engine and fill in the details in the generated PlanetServiceTest.
  1. mvn archetype:generate -DarchetypeGroupId=org.fornax.cartridges -DarchetypeArtifactId=fornax-cartridges-sculptor-archetype-appengine -DarchetypeVersion=1.7.0-SNAPSHOT -DarchetypeRepository=http://www.fornax-platform.org/archiva/repository/snapshots/
  2. mvn clean eclipse:eclipse

Stay tuned, in next post I will describe how to mock.

Saturday, October 24, 2009

Decouple modules with asynchronous event dispatching using Spring and task queues in GAE

To build applications that are maintainable and robust you should strive for decoupling between modules. To build applications that scale you will always benefit from asynchronism and parallelism.
Here we will look at how to accomplish this in Google App Engine, with some help from the Spring framework.
Let's say we have an application where users can register themselves. When they do, the application creates a persistent instance of a User object. But we also want to keep track of how many users we have registered on the site. Now, being in GAE with BigTable lurking in the back, doing queries and calculations (as we are used to with a traditional database) isn't a good idea. So as an alternative we choose to have a separate Counter object that we update whenever a new user registers. Ok, nothing strange here. But there are a couple of flaws:
  1. The User module needs to know about the Counter module.
  2. The User module has to wait for the Counter module to finish updating the count.
Ok, let's solve the first one by using Spring's mechanism for ApplicationEvents. First, let us put some AOP magic to work to intercept the call to UserService.createUser and, when it returns (and we have transaction boundaries on service methods, so no exception means all went well), fire off an event. The Spring config for the AOP stuff:


<bean id="userListener" class="org.fornax.sculptor.UserListener"/>
<bean id="userAdvice" class="org.fornax.sculptor.UserAdvice"/>
<aop:config>
<aop:pointcut id="userCreationPointcut" expression="execution(public * org..UserService.createUser(..))"/>
<aop:advisor pointcut-ref="userCreationPointcut" ref="userCreationPointcut"/>
</aop:config>


Next, here is the advice:

public class UserAdvice implements MethodInterceptor, ApplicationContextAware {

    private ApplicationContext ctx;

    public Object invoke(MethodInvocation invocation) throws Throwable {
        User user = (User) invocation.proceed();
        fireNewUserEvent(user);
        return user;
    }

    private void fireNewUserEvent(User user) {
        ctx.publishEvent(new UserCreatedEvent(user));
    }

    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        this.ctx = applicationContext;
    }
}

The listener that is being notified:

public class UserListener implements ApplicationListener<UserCreatedEvent> {

    @Autowired
    private CounterService counterService;

    public void onApplicationEvent(UserCreatedEvent event) {
        counterService.increment();
    }
}

And the event being passed:

public class UserCreatedEvent extends ApplicationEvent {

    public UserCreatedEvent(User user) {
        super(user);
    }
}

Ok, so now we are halfway. We have the Observer pattern in place. But we still do everything synchronously.
Enter GAE's task queues. Let us modify our UserListener:

public class UserListener implements ApplicationListener<UserCreatedEvent> {

    public void onApplicationEvent(UserCreatedEvent event) {
        TaskOptions task = url("/rest/admin/counter/user").method(POST);
        Queue queue = QueueFactory.getDefaultQueue();
        queue.add(task);
    }
}

And by the wonders of task queues, we now put a task on the queue and thereby do the counting job asynchronously. And of course, we dropped the reference to the CounterService. But we are missing one piece here, right? What does the URL in the task point at? Well, nothing strange here, it is just a Spring MVC controller:

@Controller
public class CounterController {

    private static final Log log = LogFactory.getLog(CounterController.class);

    @Autowired
    private CounterService counterService;

    @RequestMapping(value = "/admin/counter/user", method = RequestMethod.POST)
    public void incrementCounter() throws IOException {
        try {
            counterService.increment();
        } catch (Exception ignore) {
            // doesn't matter if we get an exception here, just log it
            log.error("Failed to increment counter!", ignore);
        }
    }
}

And now we have a more loosely coupled system that scales better. And with a little effort, the code can be generalized so that more features are easy to add with the same pattern.
Of course, the downside of this kind of design is that error handling gets more complicated and you can't always trust it to be 'right'. But that is system design: you have to decide what's best for each situation.

Wednesday, October 21, 2009

Even Weird Naming Conventions are Good

The good thing about naming conventions is that they are toolable. The DBAs at my department have strong opinions about database naming. They have good reasons for that, even though I don't fully understand all of them :-)
  • Table names should be prefixed with application/component identifier.
  • Primary key id column should be prefixed with table name (without application prefix) and followed by _GID.
  • Underscore to separate words.
  • Foreign key column is a concatenation of the role name and the primary key column name of the target table, except when the role and the table have the same name.
(For example, with an application prefix LIB these rules would give an entity Person the table name LIB_PERSON with primary key column PERSON_GID.) Does this mean that we have to specify each and every name twice, once for Java and once for the database? Argh... NO, we are using Sculptor. With a straightforward customization I implemented these conventions in the generator, and we could continue with natural (Java point of view) naming and please the preferences of the DBAs without additional effort.

Naming conventions are important for software quality. Supporting the conventions with a tool is the best way to make sure that they are applied in a consistent way.

Monday, October 19, 2009

Sculptor and Agile

For us, the agile way has always been the way of getting things done. Even before there were fancy names for it, we did things in a way that enabled us to deliver the right things at the right time. This hasn't changed, but now we call it Scrum :-)
Sculptor will help you if you want to work agile.

Things change
They do. There is nothing anybody can do about it. And when things change, there will be delays. Doing things in an agile manner helps minimize those delays. Using Sculptor helps you even more.
I'll show you by an example. Let's say you have an application that is almost ready for production. Testing is done. No more bugs (yeah, right). At the final demo, suddenly, one of the stakeholders realizes that the customer object needs another attribute; let's call it ICE (In Case of Emergency). And, of course, according to the stakeholder you can't go into production without it.
So, what are the changes that need to be done?
From the bottom up:
  • Database schema
  • Persistence layer (JPA, Hibernate, etc):
  • Domain object
  • Service layer (if we have any logic tied to the attribute)
  • Back office Client (for creating a customer object)
  • Public Client (for viewing a customer object)
Most of these changes are boilerplate code: changing declarations etc. Without Sculptor these changes take time and are error prone. If we assume that every change is a risk, and then calculate and compare all the changes we need to do with and without Sculptor, we end up with the following table, where every point is a code change needed to accomplish the last minute requirement:

                                                Without Sculptor   With Sculptor
model:                                                 0                 1
db script:                                             1                 0
db migration script:                                   1                 1
property in domain object:                             1                 0
jpa annotation for property:                           1                 0
logic in service or domain object:                     1                 1
back office client (create/update/list/view):          7                 0
public client (view):                                  1                 1
junit test (save, find):                               2                 2
documentation (class diagrams):                        1                 0
sum:                                                  16                 6

So, keeping it simple, let's say that each point is equally weighted. Under this assumption Sculptor reduces the number of changes, and thereby the risk and time, from 16 to 6 — without Sculptor you do more than two and a half times as many changes.

This can be very valuable when change comes along. And it does, doesn't it...?

Sunday, October 4, 2009

Sculptor in the Cloud

Now you can use Sculptor to speed up and simplify development of applications running in the Google App Engine cloud.


Powered by App Engine


Let's start with a demo of how easy it is to create a new application and deploy it.



For this we are using the Sculptor Maven archetype for App Engine. Try it yourself:
  1. mvn archetype:generate -DarchetypeGroupId=org.fornax.cartridges -DarchetypeArtifactId=fornax-cartridges-sculptor-archetype-appengine -DarchetypeVersion=1.7.0-SNAPSHOT -DarchetypeRepository=http://www.fornax-platform.org/archiva/repository/snapshots/

  2. cd to the new directory

  3. mvn clean

  4. mvn generate-sources

  5. mvn eclipse:eclipse

  6. Import the project in Eclipse

Without any changes the new project is ready to run in the local development server or to be deployed at appspot.com. The sample in the demo is available here: http://sculptor-helloworld.appspot.com

The archetype creates a sample of a RESTful Spring 3.0 controller and JSP pages for the CRUD operations.

The archetype also creates a simple sample model, from which Sculptor generates an Entity, a Repository and a Service with the default CRUD operations: findById, findAll, save, and delete.

The model is defined in a textual DSL with an intuitive syntax, from which Sculptor generates high quality Java code and configuration. It is not a one time shot: the application can be developed incrementally with an efficient round trip loop. The generator is part of the build process (Maven).



Sculptor generates JPA mapping annotations for the domain objects defined in the design model. Relations are very limited in App Engine, since the datastore (BigTable) is not a relational database.

Owned and embedded associations are supported and mapped as ordinary JPA associations. They are specified with aggregate and BasicType in the Sculptor model.



Unowned associations are handled with id references, and you must look up the objects with findById when needed.
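
As a hypothetical illustration (the names below are made up, not taken from the sample), an unowned association stores only the datastore key, and the associated object is fetched on demand:

import com.google.appengine.api.datastore.Key;

public class Inquiry {
    // key of an associated Supplier in another entity group,
    // stored as a plain value instead of a JPA relation
    private Key supplierKey;

    public Key getSupplierKey() {
        return supplierKey;
    }

    public void setSupplierKey(Key supplierKey) {
        this.supplierKey = supplierKey;
    }
}

// in a service, when the associated object is needed:
// Supplier supplier = supplierRepository.findById(inquiry.getSupplierKey());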



Services and Repositories are implemented as Spring components with @Autowired dependency injection. Spring AOP is used for error handling and transaction management.



Behavior is implemented with hand written code in a subclass, separated from the re-generated code in a base class. In the above example the sayHello method is typically implemented in the Service by first using the generated findByKey method in the Repository. Note that the name attribute of the Planet is marked as key.
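
A sketch of what the hand-written part could look like (the class names, the getPlanetRepository() accessor and the greeting logic in Planet follow Sculptor's gap-class convention, but are assumptions here, not the exact generated code):

public class PlanetServiceImpl extends PlanetServiceImplBase {

    public String sayHello(ServiceContext ctx, String planetName)
            throws PlanetNotFoundException {
        // name is the key of Planet, so the generated findByKey can be used
        Planet planet = getPlanetRepository().findByKey(planetName);
        return planet.sayHello();
    }
}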

Sculptor also provides support for JUnit testing with the local App Engine environment. I will cover that in another article some day soon.

Thursday, October 1, 2009

Maven Archetype for App Engine

I have developed a maven archetype for Google App Engine projects. The generated project supports:
  • All dependency jar files are downloaded from maven repositories and copied to lib directory as required by App Engine Eclipse plugin, and local development server.

  • Eclipse project is created with mvn eclipse:eclipse. The resulting Eclipse project has the necessary settings for App Engine Eclipse plugin.

  • Entity classes are processed by DataNucleus enhancer in the build lifecycle.

  • JUnit tests with local App Engine environment can be run from maven.

Setting up all of this is not trivial, so I would like to share the solution; I hope you find it useful.

Eclipse Project
The Maven Eclipse plugin needs a lot of configuration.
<build>
  <outputDirectory>war/WEB-INF/classes</outputDirectory>
  <plugins>
    <plugin>
      <artifactId>maven-eclipse-plugin</artifactId>
      <version>2.5.1</version>
      <configuration>
        <!--
          buildOutputDirectory doesn't work due to
          http://jira.codehaus.org/browse/MECLIPSE-422. A workaround is the
          outputDirectory at project/build level.
          <buildOutputDirectory>war/WEB-INF/classes</buildOutputDirectory>
        -->
        <testOutputDirectory>target/test-classes</testOutputDirectory>
        <classpathContainers>
          <classpathContainer>com.google.appengine.eclipse.core.GAE_CONTAINER</classpathContainer>
        </classpathContainers>
        <buildcommands>
          <buildcommand>org.eclipse.jdt.core.javabuilder</buildcommand>
          <buildcommand>com.google.gdt.eclipse.core.webAppProjectValidator</buildcommand>
          <buildcommand>com.google.appengine.eclipse.core.enhancerbuilder</buildcommand>
          <buildcommand>com.google.appengine.eclipse.core.projectValidator</buildcommand>
        </buildcommands>
        <additionalProjectnatures>
          <projectnature>org.eclipse.jdt.core.javanature</projectnature>
          <projectnature>com.google.appengine.eclipse.core.gaeNature</projectnature>
          <projectnature>com.google.gdt.eclipse.core.webAppNature</projectnature>
        </additionalProjectnatures>
        <excludes>
          <!-- Included in GAE_CONTAINER -->
          <exclude>com.google.appengine:appengine-api-1.0-sdk</exclude>
          <exclude>com.google.appengine:appengine-api-1.0-labs</exclude>
          <exclude>com.google.appengine.orm:datanucleus-appengine</exclude>
          <exclude>org.datanucleus:datanucleus-jpa</exclude>
          <exclude>org.datanucleus:datanucleus-core</exclude>
          <exclude>org.apache.geronimo.specs:geronimo-jpa_3.0_spec</exclude>
          <exclude>org.apache.geronimo.specs:geronimo-jta_1.1_spec</exclude>
          <exclude>javax.jdo:jdo2-api</exclude>
        </excludes>
      </configuration>
    </plugin>


Some dependencies must be excluded, since they are part of GAE_CONTAINER; otherwise JUnit tests will not work when running inside Eclipse. The output directory is changed to war/WEB-INF/classes. There is a bug (MECLIPSE-422) which causes the test classes not to be separated if buildOutputDirectory is used. The local development server doesn't like the test classes. The trick is to define the output at the top build level and define testOutputDirectory.

Copy Dependencies
When running the local development server and deploying to App Engine, all dependent jar files must be located in war/WEB-INF/lib. I have used the Maven dependency plugin to copy the jar files during the Maven clean phase.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy</id>
      <phase>clean</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.springframework</groupId>
            <artifactId>spring-core</artifactId>
            <version>${spring.version}</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <!-- more ... -->
          <artifactItem>
            <groupId>com.google.appengine</groupId>
            <artifactId>appengine-api-1.0-sdk</artifactId>
            <version>${appengine.version}</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>com.google.appengine</groupId>
            <artifactId>appengine-api-1.0-labs</artifactId>
            <version>${appengine.version}</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>com.google.appengine.orm</groupId>
            <artifactId>datanucleus-appengine</artifactId>
            <version>1.0.3</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>org.datanucleus</groupId>
            <artifactId>datanucleus-jpa</artifactId>
            <version>1.1.5</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>org.datanucleus</groupId>
            <artifactId>datanucleus-core</artifactId>
            <version>1.1.5</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>org.apache.geronimo.specs</groupId>
            <artifactId>geronimo-jpa_3.0_spec</artifactId>
            <version>1.1.1</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>org.apache.geronimo.specs</groupId>
            <artifactId>geronimo-jta_1.1_spec</artifactId>
            <version>1.1.1</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
          <artifactItem>
            <groupId>javax.jdo</groupId>
            <artifactId>jdo2-api</artifactId>
            <version>2.3-eb</version>
            <outputDirectory>war/WEB-INF/lib</outputDirectory>
          </artifactItem>
        </artifactItems>
        <!-- other configurations here -->
      </configuration>
    </execution>
  </executions>
</plugin>



DataNucleus Enhancer
Running the JUnit tests from Maven was a primary goal, since I would like to run the tests from a continuous build server. The JUnit tests use the local App Engine environment with an in-memory datastore. Therefore the classes must be processed by the DataNucleus enhancer after ordinary compilation.

<plugin>
  <groupId>org.datanucleus</groupId>
  <artifactId>maven-datanucleus-plugin</artifactId>
  <version>1.1.4</version>
  <configuration>
    <api>JPA</api>
    <mappingIncludes>**/*.class</mappingIncludes>
    <log4jConfiguration>${basedir}/src/main/resources/log4j.properties</log4jConfiguration>
    <verbose>false</verbose>
  </configuration>
  <executions>
    <execution>
      <phase>process-classes</phase>
      <goals>
        <goal>enhance</goal>
      </goals>
    </execution>
  </executions>
</plugin>


Archetype
All of this is packaged in a Maven archetype. Try it like this:
  1. mvn archetype:generate -DarchetypeGroupId=org.fornax.cartridges -DarchetypeArtifactId=fornax-cartridges-sculptor-archetype-appengine -DarchetypeVersion=1.7.0-SNAPSHOT -DarchetypeRepository=http://www.fornax-platform.org/archiva/repository/snapshots/

  2. cd to the new directory

  3. mvn clean

  4. mvn eclipse:eclipse

  5. Import the project in Eclipse

As an extra bonus your new project is configured for Spring 3.0 with a sample of a RESTful controller.

The Sculptor code generator tool is of course also configured and ready to be used in the new project. I will soon write another article about Sculptor's support for App Engine.

Wednesday, September 16, 2009

Customer Specific Addon: Deep Merge

This article illustrates the possibility to add your own features to the Sculptor code generator.

In my customer project we have a need to merge two object graphs. We have a persistent domain model, and we receive messages from production systems when changes occur. There are several production systems sending the data in slightly different formats and with slightly different semantics.

We designed this as a first step that converts the production messages to new transient domain object instances.

Next step is to merge that object graph with present persistent objects.

This feels like a tedious and repetitive programming task. If done manually it would also require maintenance whenever we make changes.

At first I took a look at Dozer, but pretty soon things got complicated and required a lot of XML mapping files. So we gave up that idea.

At home, it struck me that we already have the tool we need. We are already using Sculptor, and it should be a simple addition to generate the merge methods in the domain objects.

Next morning I implemented it like this...

The final Java code to be generated looks like this in each domain object. It copies attributes and new associated objects, and traverses existing associations.

public void deepMerge(Item other) {
    Set<Object> processed = new HashSet<Object>();
    deepMerge(other, processed);
}

public void deepMerge(Item other, Set<Object> processed) {
    if (processed.contains(this)) {
        return;
    }
    processed.add(this);

    if (other.getEstimatedTimeOfArrival() != null) {
        setEstimatedTimeOfArrival(other.getEstimatedTimeOfArrival());
    }

    deepMergeShipment(other, processed);

    deepMergeEvents(other, processed);
}

public void deepMergeShipment(Item other, Set<Object> processed) {
    Shipment currentValue = getShipment();
    if (other.getShipment() != null) {
        if (currentValue == null) {
            setShipment(other.getShipment());
        } else {
            currentValue.deepMerge(other.getShipment(), processed);
        }
    }
}

public void deepMergeEvents(Item other, Set<Object> processed) {
    for (TrackingEvent each : other.getEvents()) {
        if (getEvents().contains(each)) {
            TrackingEvent currentValue = eventForKey(each.getKey());
            currentValue.deepMerge(each, processed);
        } else {
            addEvent(each);
        }
    }
}

protected TrackingEvent eventForKey(Object key) {
    for (TrackingEvent each : getEvents()) {
        if (each.getKey().equals(key)) {
            return each;
        }
    }
    return null;
}

I developed this as a project specific addon, i.e. I invoked a code generation template from SpecialCases.xpt:
«AROUND templates::DomainObject::keyGetter FOR DomainObject»
«targetDef.proceed()»

«EXPAND templates::DeepMerge::deepMerge»
«ENDAROUND»
I started with the simple attributes.
«DEFINE deepMerge FOR DomainObject»

«EXPAND deepMergeMethod»

«ENDDEFINE»

«DEFINE deepMergeMethod FOR DomainObject»
public void deepMerge(«getDomainPackage()».«name» other) {
«EXPAND deepMergeAttribute FOREACH attributes.reject(e | !e.changeable)»
«ENDDEFINE»

«DEFINE deepMergeAttribute FOR Attribute»
if (other.«getGetAccessor()»() != null) {
set«name.toFirstUpper()»(other.«getGetAccessor()»());
}
«ENDDEFINE»
I generated and looked at the result.

I noticed that the auditable fields were included. Ok, then I can use the helper function isSystemAttribute() to skip those.
«EXPAND deepMergeAttribute FOREACH attributes
.reject(e | !e.changeable || e.isSystemAttribute())»

The tricky part is the associations, and I could imagine that we would have some corner cases that wouldn't be covered by the generated pattern. Therefore I created separate methods for each association, so that it will be possible to override the generated methods in gap classes and handle any special cases manually.

I added the templates for references. Starting with the to-one references:

«DEFINE deepMergeOneReference FOR Reference»
public void deepMerge«name.toFirstUpper()»(«from.getDomainPackage()».«from.name» other) {
«to.getDomainPackage()».«to.name» currentValue = get«name.toFirstUpper()»();
if (other.get«name.toFirstUpper()»() != null) {
if (currentValue == null) {
set«name.toFirstUpper()»(other.get«name.toFirstUpper()»());
} else {
currentValue.deepMerge(other.get«name.toFirstUpper()»());
}
}
}
«ENDDEFINE»

Continuing with the to-many case. It is a little bit more tricky, since we need to grab the existing instance from the collection. I added a helper method for that.
«DEFINE deepMergeManyReference FOR Reference»
public void deepMerge«name.toFirstUpper()»(«from.getDomainPackage()».«from.name» other) {
for («getTypeName()» each : other.get«name.toFirstUpper()»()) {
if (get«name.toFirstUpper()»().contains(each)) {
«to.getDomainPackage()».«to.name» currentValue = «name.singular()»ForKey(each.getKey());
currentValue.deepMerge(each);
} else {
add«name.toFirstUpper().singular()»(each);
}
}
}

protected «to.getDomainPackage()».«to.name» «name.singular()»ForKey(Object key) {
for («to.getDomainPackage()».«to.name» each : get«name.toFirstUpper()»()) {
if (each.getKey().equals(key)) {
return each;
}
}
return null;
}
«ENDDEFINE»

I was testing this using the Library sample in Sculptor. I noticed a problem with extended objects, such as Book and Movie that extend Media. The getKey method is not defined in Media. However, I just ignore this for now, since we don't have that kind of association in our model, and the intention is not to develop a general purpose solution.

Not completely done yet. We have the classical case of circular references. To avoid infinite recursion I added a collection that is passed as a parameter to keep track of which objects have been processed.

All this took me 2 hours to implement, probably much less than implementing it manually in all domain objects. The big benefit is that there is much less risk of manual faults, and it requires zero maintenance when making changes to the domain objects.

The final template file below, in case you are interested in implementing something similar:

«IMPORT sculptormetamodel»
«EXTENSION extensions::helper»
«EXTENSION extensions::dbhelper»
«EXTENSION extensions::properties»


«DEFINE deepMerge FOR DomainObject»
«IF !isImmutable()»
«EXPAND deepMergeMethod»

«EXPAND deepMergeOneReference FOREACH references.select(r | !r.many).reject(e | !e.changeable)»
«EXPAND deepMergeManyReference FOREACH references.select(r | r.many)»
«ENDIF»
«ENDDEFINE»

«DEFINE deepMergeMethod FOR DomainObject»
public void deepMerge(«getDomainPackage()».«name» other) {
java.util.Set<Object> processed = new java.util.HashSet<Object>();
deepMerge(other, processed);
}

public void deepMerge(«getDomainPackage()».«name» other, java.util.Set<Object> processed) {
if (processed.contains(this)) {
return;
}
processed.add(this);

«EXPAND deepMergeAttribute FOREACH attributes.reject(e | !e.changeable || e.isSystemAttribute())»

«FOREACH references.reject(e | !e.changeable) AS ref»
deepMerge«ref.name.toFirstUpper()»(other, processed);
«ENDFOREACH»
}
«ENDDEFINE»



«DEFINE deepMergeAttribute FOR Attribute»
«IF isPrimitive() -»
set«name.toFirstUpper()»(other.«getGetAccessor()»());
«ELSE-»
if (other.«getGetAccessor()»() != null) {
set«name.toFirstUpper()»(other.«getGetAccessor()»());
}
«ENDIF-»
«ENDDEFINE»


«DEFINE deepMergeOneReference FOR Reference»
public void deepMerge«name.toFirstUpper()»(«from.getDomainPackage()».«from.name» other, java.util.Set<Object> processed) {
if (other.get«name.toFirstUpper()»() != null) {
«IF to.isImmutable()»
if (!other.get«name.toFirstUpper()»().equals(get«name.toFirstUpper()»())) {
set«name.toFirstUpper()»(other.get«name.toFirstUpper()»());
}
«ELSE»
«to.getDomainPackage()».«to.name» currentValue = get«name.toFirstUpper()»();
if (currentValue == null) {
set«name.toFirstUpper()»(other.get«name.toFirstUpper()»());
} else {
currentValue.deepMerge(other.get«name.toFirstUpper()»(), processed);
}
«ENDIF»
}
}
«ENDDEFINE»

«DEFINE deepMergeManyReference FOR Reference»
public void deepMerge«name.toFirstUpper()»(«from.getDomainPackage()».«from.name» other, java.util.Set<Object> processed) {
for («getTypeName()» each : other.get«name.toFirstUpper()»()) {
if (get«name.toFirstUpper()»().contains(each)) {
«IF to.isImmutable()»
// replace
remove«name.toFirstUpper().singular()»(each);
add«name.toFirstUpper().singular()»(each);
«ELSE»
«to.getDomainPackage()».«to.name» currentValue = «name.singular()»ForKey(each.getKey());
currentValue.deepMerge(each, processed);
«ENDIF»
} else {
add«name.toFirstUpper().singular()»(each);
}
}
}

protected «to.getDomainPackage()».«to.name» «name.singular()»ForKey(Object key) {
for («to.getDomainPackage()».«to.name» each : get«name.toFirstUpper()»()) {
if (each.getKey().equals(key)) {
return each;
}
}
return null;
}
«ENDDEFINE»

Thursday, August 27, 2009

Screencast: Introduction to Sculptor

During the summer I have published a series of articles that illustrate basic usage of Sculptor. They include screencasts so that you get a feeling of what it looks like when using Sculptor.

If you are totally new to Sculptor you might need to read What is Sculptor? before looking at the practical example.

The series illustrates the following, step-by-step:

  1. Jump Start - Initial creation of maven and eclipse projects. Persistent entity and CRUD GUI are created in a few minutes.


  2. The World is Changing - Adding more to the application. Quick development round trip, short feedback loop, it is not a one time shot.


  3. Testing is Simple - Testability is crucial and is of course supported.


  4. Adding Behaviour - Generated code is well separated from hand written code.


  5. Say Hello - Entity, Repository and Service are some of the available building blocks. Yes, it is real DDD-style.


  6. Introducing a Type - Developing a high quality domain model is the core of Sculptor. Small type objects are typically part of a good domain model.


  7. Refactoring - How is refactoring done when having a mix of hand written and generated code?


Tuesday, August 25, 2009

GAE Transactions

I'm trying to understand what we should do to make Sculptor compatible with Google App Engine (GAE).

I feel a bit sad when looking back at what I have just experienced, but I guess I should be happy, since I have learned a lot. In this post I will share my mistakes and insights into GAE transactions and entity groups.

Together with Andreas I'm developing a little sample that consists of 3 interacting applications: Customer, Supplier and Profile apps. User stories for the initial sprint:

  • As a customer I want to specify a request for consultants so that I can allocate resources to my project.

  • As a salesman (supplier) I want to be notified when a customer enters a request for consultants so that I quickly can create an offer to that request.

  • As a salesman I want to offer consultants to a customer so that I can sell our services.

  • As a customer I want to see up to date information in the profiles so that I know that it is not obsolete.


I was developing the form entry of the inquiry in the customer app. I saved the form data in an Inquiry object and sent the request to the supplier app using RestTemplate. No problems so far.

We are using the new REST features in Spring 3.0 and have done some adjustments to Sculptor to make it generate JPA code that is compliant with GAE datastore.

Since one inquiry should be sent to many suppliers, it didn't feel very scalable to send them all in the form entry request. Therefore I separated the sending into a separate job, which would be invoked by the cron service (later, better with a task queue). This is not only more scalable, it is also more fault tolerant, since supplier apps may not be available all the time. By separating it we can easily retry later.

I also created a Supplier entity. In the sendToSuppliers job I got the first problem:

IllegalArgumentException: can't operate on multiple entity groups in a single transaction

Since I had two entities, Inquiry and Supplier, and I was using both in the transaction, I assumed that it was not allowed to query the Suppliers and update the Inquiries in the same transaction. I based that on the GAE documentation:
All datastore operations in a transaction must operate on entities in the same entity group. This includes querying for entities by ancestor, retrieving entities by key, updating entities, and deleting entities.

That assumption was a fatal mistake that got me on the wrong track. I started to separate the retrieval of Suppliers and the update of Inquiries into separate transactions.

I learned from the documentation that it was possible to disable transactions, but that it was a temporary workaround.

After removing all code except the update of the Inquiries, I realized that the Inquiry instances themselves belonged to separate entity groups. I was looping over all Inquiries that had not been sent to suppliers, i.e. I was updating several instances. Of course they belong to separate entity groups; otherwise it would not scale when the number of objects increases.

Then I redesigned the sending job so that it would only send and update one Inquiry instance. The job will have to be run many times to send all Inquiries.

On the way I learned some more things about the GAE datastore:
  • A transaction is necessary for some operations, such as flush; otherwise: "This operation requires a transaction yet it is not active"
  • Queries also require a transaction; otherwise, when iterating over the result: "Object Manager has been closed"
  • Modifying the same entity several times fails: "can't update the same entity twice in a transaction or operation"

In the end I think the defaults for transactions in Sculptor are alright. Normally we define the transaction boundary at the service layer. This is OK for many cases when using GAE also, but one has to design the operations so that they only update one instance (entity group).

There is probably a need for more fine grained transaction control at the repository level. E.g. starting a new transaction for some repository operations. I think we should implement this with @Transactional annotations. Is it possible to mix txAdvice (defaults) with @Transactional (deviations from default)?

Refactoring

Sometimes I get the question "How is refactoring done when you have a mix of hand-written and generated code?" It is a good question, since refactoring is very important. The intention is that the existing IDE refactoring tools will continue to serve you when using Sculptor.

When doing initial prototyping, with no (or little) hand-written code, you can easily change the model and re-generate. Once you have hand-written code you start by using the refactoring tools in the IDE, as you are used to. Thereafter you make the corresponding change in the model and re-generate.

Any mistakes will normally be caught by the compiler and the JUnit tests.

The following screencast illustrates how to rename Planet to Planet2.




Alternative video format (mpg)

Sculptor doesn't make refactoring more difficult. Sometimes it even makes refactoring easier, when the change only affects generated code.

Thursday, August 20, 2009

Introducing a Type

An important building block when creating a high quality domain model is to create small type objects. In this article we will create a Length type for the diameter of the Planet in the helloworld application.



Alternative video format (mpg)

Length is a typical Quantity with a value and unit, e.g. meter, kilometer.

In the design model it looks like this:
BasicType Length {
    BigDecimal value min="0"
    -@LengthUnit unit
}

enum LengthUnit {
    cm, m, km
}

Entity Planet {
    gap
    scaffold
    String name key
    Long population min="0"
    -@Length diameter nullable
    -Set<@Moon> moons opposite planet

    Repository PlanetRepository {
        findByKey;
    }
}


We also need to convert between different units. The behaviour is expressed as a JUnit test:
public class LengthTest {

    // m and km are statically imported LengthUnit constants

    @Test
    public void shouldConvertFromMeterToKilometer() {
        Length length = new Length(new BigDecimal("31000"), m);
        Length lengthInKilometer = length.to(km);
        assertEquals(new Length(new BigDecimal("31"), km), lengthInKilometer);
    }

    @Test
    public void shouldConvertFromKilometerToMeter() {
        Length length = new Length(new BigDecimal("44"), km);
        Length lengthInMeter = length.to(m);
        assertEquals(new Length(new BigDecimal("44000"), m), lengthInMeter);
    }

    @Test
    public void shouldNotConvertSameUnit() {
        Length length = new Length(new BigDecimal("17"), km);
        Length length2 = length.to(km);
        assertSame(length, length2);
    }
}

BasicType objects may contain business logic in the same way as other domain objects. The following screencast illustrates how to implement the conversion.



Alternative video format (mpg)
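
One way to implement the conversion is to let each LengthUnit carry a factor to a base unit. This is only a sketch under that assumption; factorToMeter is an illustrative attribute, and the screencast shows the actual implementation.

// Illustrative: each LengthUnit constant knows its factor to the base unit (meter).
public enum LengthUnit {
    cm(new BigDecimal("0.01")), m(BigDecimal.ONE), km(new BigDecimal("1000"));

    private final BigDecimal factorToMeter;

    LengthUnit(BigDecimal factorToMeter) {
        this.factorToMeter = factorToMeter;
    }

    BigDecimal getFactorToMeter() {
        return factorToMeter;
    }
}

// In Length: convert via the base unit; the same unit returns this,
// matching the assertSame in the test above.
public Length to(LengthUnit newUnit) {
    if (newUnit == getUnit()) {
        return this;
    }
    BigDecimal valueInMeter = getValue().multiply(getUnit().getFactorToMeter());
    // the division is exact, since the factors are powers of ten
    return new Length(valueInMeter.divide(newUnit.getFactorToMeter()), newUnit);
}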

BasicType is stored in the same table as the Domain Object referencing it. It corresponds to JPA @Embeddable.

There are a lot of cases when it is a good idea to introduce types.
  • Identifiers, natural business keys. It is more readable to pass around an identifier type than a plain String or Integer
  • Money
  • Range
  • Quantity
I can recommend reading When to Make a Type by Martin Fowler.

Saturday, August 15, 2009

Say Hello

In the previous article our Planet became capable of constructing a greeting message. This article shows how to make it possible for a client application to say hello to the Planet.



Alternative video format (mpg)

Let us create a PlanetService to expose the sayHello method to clients. We look up the Planet by its name using the built-in findByKey repository operation. In the Sculptor model file this looks like this:
Service PlanetService {
    String sayHello(String planetName) throws PlanetNotFoundException;
}

Entity Planet {
    gap
    scaffold
    String name key
    Long population min="0"
    Long diameter min="0" nullable
    -Set<@Moon> moons opposite planet

    Repository PlanetRepository {
        findByKey;
    }
}

All the hand-written Java code we need to add is for testing, plus two trivial lines in PlanetServiceImpl:
public String sayHello(ServiceContext ctx, String planetName)
        throws PlanetNotFoundException {
    Planet planet = getPlanetRepository().findByKey(planetName);
    return planet.greeting();
}

Tuesday, August 11, 2009

Adding Behaviour

In previous articles we created a simple helloworld application without any hand-written code. This article shows how to add some hand-written business logic.



Alternative video format (mpg)

The behaviour to implement is that the Planet should be able to construct a greeting message based on its population. The test for this behaviour looks like this:
public class PlanetTest {

    @Test
    public void shouldSayHelloWhenHasPopulation() {
        Planet earth = new Planet("Earth");
        earth.setPopulation(7000000000L);
        String message = earth.greeting();
        assertEquals("Hello from Earth", message);
    }

    @Test
    public void shouldBeQuietWhenNoPopulation() {
        Planet pluto = new Planet("Pluto");
        String message = pluto.greeting();
        assertEquals("", message);
    }
}
The video illustrates how to implement this.
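
For reference, a minimal sketch that satisfies the tests (the screencast shows the actual steps in the IDE):

// In the hand-written Planet gap class: greet only when there is a population.
public String greeting() {
    if (getPopulation() == null || getPopulation() == 0L) {
        return "";
    }
    return "Hello from " + getName();
}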

As you can see there is nothing special; you add the business logic in Java as usual.

Separation of generated and manually written code is achieved with a generated base class and a manually written subclass, a so-called gap class. It is in the subclass that you add methods implementing the behaviour of the Domain Object. The subclass is also generated, but only once; it will never be overwritten by the generator.

The gap class is not generated initially. When you need a gap class you specify that in the DSL with the gap keyword.
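
The split looks roughly like this (an illustrative skeleton; details of the generated base are omitted):

// Generated base class: regenerated on every build, never edited by hand.
public abstract class PlanetBase {
    // generated properties, accessors, equals/hashCode based on the key, ...
}

// Hand-written gap class: generated once, then owned by you.
public class Planet extends PlanetBase {
    // add behaviour such as greeting() here
}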

Saturday, August 8, 2009

Testing is Simple

In previous articles we created a simple helloworld application without any tests. This article shows how to do integration testing of services with Sculptor.



Alternative video format (mpg)

We turn on generation of JUnit tests and complete the failing tests from the previous articles. For each Service there is a generated JUnit test class that we are encouraged to implement. It uses Spring transactional test fixtures and DbUnit.

Spring beans are injected into the test with ordinary @Autowired annotations.

public class PlanetServiceTest extends AbstractDbUnitJpaTests
        implements PlanetServiceTestBase {

    private PlanetService planetService;

    @Autowired
    public void setPlanetService(PlanetService planetService) {
        this.planetService = planetService;
    }

    @Test
    public void testFindById() throws Exception {
        Planet found = planetService.findById(getServiceContext(), 1L);
        assertEquals("Earth", found.getName());
    }

    @Test
    public void testFindAll() throws Exception {
        List<Planet> found = planetService.findAll(getServiceContext());
        assertEquals(2, found.size());
    }

    @Test
    public void testSave() throws Exception {
        int countBefore = countRowsInTable("PLANET");
        Planet planet = new Planet("Pluto");
        planet.setPopulation(0L);
        planetService.save(getServiceContext(), planet);
        assertEquals(countBefore + 1, countRowsInTable("PLANET"));
    }

    @Test
    public void testDelete() throws Exception {
        int countBefore = countRowsInTable("PLANET");
        Planet planet = planetService.findById(getServiceContext(), 2L);
        planetService.delete(getServiceContext(), planet);
        assertEquals(countBefore - 1, countRowsInTable("PLANET"));
    }
}


The initial test data is defined in a DbUnit XML file. The database is refreshed for each test method.
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
    <PLANET ID="1" NAME="Earth" POPULATION="7000000000" VERSION="0"/>
    <PLANET ID="2" NAME="Jupiter" POPULATION="0" VERSION="0"/>
    <MOON/>
</dataset>


The tests above only cover the normal cases so far; we should of course add tests for exceptional cases and validation boundaries, such as a negative population.
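
Such a test might look like this (a hypothetical sketch; the exact exception type depends on how the min="0" validation is enforced):

// Hypothetical boundary test for the min="0" validation on population.
@Test(expected = InvalidStateException.class)
public void shouldNotSaveNegativePopulation() throws Exception {
    Planet planet = new Planet("Pluto");
    planet.setPopulation(-1L);
    planetService.save(getServiceContext(), planet);
}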

The tests illustrated here are integration tests; you should also write ordinary unit tests for domain objects and other important classes.

Tuesday, July 28, 2009

The World is Changing

In the previous article the simple helloworld application consisted of one single domain object, the Planet. In this article I will add some more features to the picture, including an association to the Moons of the Planet.



Alternative video format (mpg)

It is good to have a natural business key, which is used for equals and hashCode. The name of the Planet is a candidate. Let us also add properties for the population and diameter of the Planet. Hibernate Validator is supported and validations can be defined directly in the model, e.g. min="0".
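
Key-based equality corresponds roughly to the following illustrative sketch; the generator emits something equivalent in the base class:

// Illustrative: equals and hashCode derived from the natural key (name).
@Override
public boolean equals(Object obj) {
    if (this == obj) {
        return true;
    }
    if (!(obj instanceof Planet)) {
        return false;
    }
    Planet other = (Planet) obj;
    return getName() != null && getName().equals(other.getName());
}

@Override
public int hashCode() {
    return getName() == null ? 0 : getName().hashCode();
}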



We add the Moon Entity and its association to Planet.



Build and start Jetty. The changes are immediately reflected in the generated CRUD GUI.



The DSL and the code generation drive the development; it is not a one-time shot. The application can be developed incrementally with an efficient round-trip loop.