Poll: Help us make Spock rock the (testing) universe!

July 7, 2010

Since its first release in March 2009, Spock has come a long way. Time to ask you where to head next! Please take a moment to share your thoughts. Thanks!

Categories: Spock Framework

What’s New In Spock 0.4? Episode 2: Better Mocking

June 13, 2010

As you may know, Spock comes with its own mocking framework, tightly integrated with the rest of the specification language. While certainly useful and already quite powerful, the mocking framework used to be one of the lesser developed parts of Spock, and had a few rough edges. Well, not anymore! Here are the improvements we’ve made for 0.4:

1. All mocks are thread-safe

Previously, things could go wrong if a mock was called from a thread other than the spec runner’s thread. Now, the mocking framework will behave correctly no matter how many threads you throw at it.
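Here is a minimal sketch of the kind of spec that should now pass reliably (the Worker interface and the spec are made up for illustration):

import spock.lang.Specification

interface Worker {
  void work(String task)
}

class ThreadSafeMockingSpec extends Specification {
  def "mock invoked from a background thread"() {
    def worker = Mock(Worker)

    when:
    def t = Thread.start { worker.work("job-1") } // call the mock off the runner's thread
    t.join()

    then:
    1 * worker.work("job-1")
  }
}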

2. Vararg syntax

In Groovy, all methods whose last parameter has an array type can be called with vararg syntax (even methods implemented in Java). Therefore, it was only logical for Spock to support the same vararg syntax when defining interactions with such methods. Here is an example.

Suppose you have the following Java interface:

interface Subscriber {
  void receive(String... messages); // or: receive(String[] messages)
}

An interaction between a Publisher and its Subscriber could be described as follows:

given:
def publisher = new Publisher() // this is the object under test
def subscriber = Mock(Subscriber)
publisher.add(subscriber)

when:
publisher.sendNotifications() // let's say this should send "foo", "bar", and "baz"

then:
1 * subscriber.receive("foo", "bar", "baz") // vararg syntax

Alternatively, you could describe the interaction as follows:

then:
1 * subscriber.receive(["foo", "bar", "baz"]) // list syntax

Prior to 0.4, only the list syntax would have worked; the vararg syntax would have resulted in an InteractionNotSatisfiedError.

3. Sensible defaults for toString(), equals(), and hashCode()

Previously, a mock object’s toString(), equals(), and hashCode() methods didn’t get any special treatment. If you didn’t say otherwise, they would always return null, false, and 0, respectively. As of 0.4, more sensible defaults are in place: toString() now returns a descriptive message, equals() implements object identity, and hashCode() delegates to System.identityHashCode() (which behaves like Object.hashCode()).
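Inside a feature method, the new defaults play out roughly like this (the exact toString() text isn't pinned down here, so it is only printed):

def subscriber = Mock(Subscriber)

println subscriber.toString()                                        // a descriptive message, not null
assert subscriber == subscriber                                      // equals() means identity
assert subscriber != Mock(Subscriber)                                // a different mock is never equal
assert subscriber.hashCode() == System.identityHashCode(subscriber)  // Object-like hash code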

Another change is that toString(), equals(), and hashCode() no longer match the wildcard method (as in “1 * foo._()”). This prevents tests from failing in the presence of tools like debuggers, which often call these methods for their own purposes.

Despite all this, toString(), equals(), and hashCode() can still be stubbed (and even mocked) like any other method, overriding their default behavior. Most Java mocking frameworks don’t support this feature.
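For example, overriding the default toString() could look like this (a sketch using the Subscriber mock from above):

given:
def subscriber = Mock(Subscriber)
subscriber.toString() >> "a subscriber mock" // stub replaces the default descriptive message

expect:
subscriber.toString() == "a subscriber mock"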

4. _ as an abbreviation for _._

By default, Spock allows interactions that haven’t been specified explicitly, and returns default values (null, 0, false) for them. This helps to avoid over-specification and makes tests more resilient to change. In cases where you want to be more strict, add the following as your last interaction:

0 * _._ // no (other) call of any method on any mock object

In 0.4, this can be abbreviated to:

0 * _  // nothing else!

5. Support for property syntax

Properties and getter methods can now be stubbed with property syntax:

item.price >> 42

However, mocking a property setter still requires method syntax:

1 * item.setPrice(42) // not: 1 * (item.price = 42)
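Putting the two together, a complete sketch might look like this (the Item interface and the feature method are made up for illustration):

interface Item {
  int getPrice()
  void setPrice(int price)
}

def "copy an item's price"() {
  def item = Mock(Item)
  item.price >> 42          // property syntax stubs getPrice()

  when:
  item.price = item.price   // reads 42 via the stub, then invokes the setter

  then:
  1 * item.setPrice(42)     // setter expectation still uses method syntax
}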

6. Ordered interactions

By default, interactions may occur in any order. For example:

when:
...

then:
1 * foo.moo()
2 * bar.baz()

Here we are expecting a total of three calls, but don’t demand a particular order. As a consequence, the following invocation order would be acceptable:

bar.baz()
foo.moo()
bar.baz()

In general this is a good thing, because it prevents you from specifying unimportant details which might change over time. However, sometimes order is really important. Therefore, you can now impose ordering constraints by using multiple then-blocks:

when:
...

then:
1 * tank.fill()
1 * door.close()

then:
1 * plane.takeOff()

To paraphrase: “I don’t care if you first fill the tank or close the cabin door, but you must do both before takeoff!” If this constraint isn’t met, Spock will throw a WrongInvocationOrderError.

That’s it for better mocking in Spock 0.4. In the next part of this series, we’ll have a look at how Spock 0.4 simplifies testing concurrent code.

Categories: Spock Framework

What’s New In Spock 0.4? Episode 1: Data Tables

March 11, 2010

With the Spock Framework 0.4 release around the corner, I thought it was time to demonstrate some of the upcoming new features (which are already available in recent snapshots). So here comes the first episode in the mini-series “What’s New In Spock 0.4”. Today’s topic is data tables, a new way to write data-driven test methods.

Data-driven testing has always been one of Spock’s strong points. To give you some background, let’s have a look at a typical example taken from the spock-example project:

def "maximum of two numbers"() {
  expect:
  Math.max(a, b) == c

  where:
  a << [3, 5, 9]
  b << [7, 4, 9]
  c << [7, 5, 9]
}

This method is testing the Math.max() operation. The expect block contains the test logic (what we expect from Math.max()), and the where block contains the test data. The key to interpreting the where block is to read it from top to bottom. That is, variables a, b, and c will be assigned values 3, 7, and 7 for the first run of the method; values 5, 4, and 5 for the second run; and values 9, 9, and 9 for the third run. We say that the method has three iterations.

Data-driven test methods are a powerful tool, but the where block syntax shown above suffers from two problems:
1. It is a bit noisy (shift operators, brackets, commas)
2. It is read from top to bottom, which is less intuitive than reading from left to right (at least for people used to left-to-right languages).

This is where data tables come in. Their purpose is to provide a convenient syntax for in-line test data. Let’s modify the previous example to use a data table:

def "maximum of two numbers"() {
  expect:
  Math.max(a, b) == c

  where:
  a | b | c
  3 | 7 | 7
  5 | 4 | 5
  9 | 9 | 9
}

Run this example in Spock Web Console

Now the where block reads like a table. The first row (the table header) specifies the data variables, and the remaining rows specify their values for each iteration. Compared to the previous example, this one is both nicer to read and write.

As demonstrated in their specification, data tables can hold not just literals but also more complex expressions. However, other forms of parameterizing a test method (like the one we saw in the first example) haven’t lost their value. For example, loading data from external sources is still best left to multi-parameterizations:

where:
[a, b, c] << sql.rows("select a, b, c from maxdata")

See DatabaseDriven for the full example.

Apart from the different syntax, data tables behave like all other forms of parameterizations. For example, their data variables can be optionally declared as method parameters, and they can be combined with derived parameterizations. To learn more about the different forms of parameterizations, see their documentation.
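As a rough sketch of these two options (the method parameters are shown exactly as declared, and the derived variable merely recomputes the expected value to demonstrate the syntax):

def "maximum of two numbers"(int a, int b, int c) { // data variables as method parameters
  expect:
  Math.max(a, b) == c

  where:
  a | b
  3 | 7
  5 | 4
  9 | 9

  c = [a, b].max() // derived parameterization, computed from a and b per iteration
}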

That’s it for the first episode of “What’s New In Spock 0.4”. Please tune in tomorrow for the second episode. Until then, I’ll leave you with a real-world usage of data tables taken straight from the Sales Intelligence Engine codebase:

@Unroll
def "determine dominant color"() {
  when:
  action.image = image
  action.ignoreWhite = ignoreWhite
  action.execute()

  then:
  action.dominantColor == dominantColor

  where:
  image                   | ignoreWhite | dominantColor
  '/images/white.png'     | false       | 'ffffff'
  '/images/black.png'     | true        | '000000'
  '/images/28843_300.jpg' | true        | 'ffcc33'
  '/images/20341_300.jpg' | true        | 'cc6666'
  '/images/20692_300.jpg' | true        | 'cccccc'
  '/images/7870_300.jpg'  | true        | '993333'
}

Categories: Spock Framework

Empowering Annotations with Groovy Closures

March 4, 2010

A few days ago, we received the following feature request for the Spock Framework:

Run a specification only if JDK 1.6 or higher is present, and skip it otherwise.

Given Spock’s extensible nature, it would have been easy to jump straight in and implement this as an extension activated with @JdkVersion(x). But this approach didn’t feel right to us. Surely, users would soon come up with other constraints (JDK version less than x, OS version equal to y, and so on), and we wanted a solution that could handle them all. If only Groovy allowed us to write:

@RunIf({ jdkVersion >= 1.6 })
class MySpec extends Specification { ... }

Now, this would be heaven. No more @JdkVersion, @OsVersion, and so on. Instead just one annotation that takes an arbitrary Groovy expression (contained in a closure) and is processed by one Spock extension. Problem solved – forever. (At this point you might wonder where “jdkVersion” comes from. I assume we would provide some convenience properties like it, but with the power of Groovy at your fingertips, you could always go beyond them.)
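To illustrate, another constraint might look like this (purely hypothetical, using plain JDK calls rather than any of those convenience properties):

@RunIf({ System.getProperty("os.name").toLowerCase().contains("windows") })
class WindowsOnlySpec extends Specification { ... }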

Staring at my screen, I became a little saddened because I knew that Groovy wouldn’t let me stick a closure into an annotation. But wait – the code in my IDE wasn’t underlined in red! Was it a bug, or was I onto something here? I hastily fired up Groovy’s AST Browser and noticed that Groovy’s grammar does allow closures as annotation values. It is only the compiler’s code generator that eventually raises a compile error. At this point I became nervous. Could it be that I was just one AST transformation away from heaven?

In case you haven’t heard about AST transformations yet, let me quickly explain what they are: a way to hook into the Groovy compiler and participate in a program’s compilation. What’s particularly nice about AST transformations is that they are trivial to activate from the outside: Just put a transformation on the class path, and the compiler will pick it up automatically. Some of Groovy’s built-in features like @Lazy and @Immutable are in fact based on AST transformations, and so are many of Spock’s features.
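To get a feel for what such a transformation buys you, here is plain Groovy exercising one of those built-ins (the Point class is just an example of mine):

@Immutable
class Point {
  int x, y
}

def p = new Point(1, 2)            // tuple constructor generated at compile time
assert p == new Point(x: 1, y: 2)  // so are a map constructor and a value-based equals()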

With a rough idea in my head, I started coding an AST transformation that would allow me to pass closures as annotation values. A few hours later, I had a pretty compelling prototype done. Let me show you an example of what you can do with it today. Let’s say we wanted to build a simple field-based POGO (Plain Old Groovy Object) validator. To start out, let’s define an annotation type:

import java.lang.annotation.Retention
import java.lang.annotation.RetentionPolicy

@Retention(RetentionPolicy.RUNTIME)
@interface Require {
  Class value()
}

The annotation type has one attribute of type Class. That’s where we will stick in our closure. Really, it’s as simple as that. Now let’s code the validator:

class Validator {
  def isValid(pogo) {
    pogo.getClass().declaredFields.every {
      isValidField(pogo, it)
    }
  }

  def isValidField(pogo, field) {
    def annotation = field.getAnnotation(Require)
    !annotation || meetsConstraint(pogo, field, annotation.value())
  }

  def meetsConstraint(pogo, field, constraint) {
    def closure = constraint.newInstance(null, null)
    field.setAccessible(true)
    closure.call(field.get(pogo))
  }
}

To paraphrase the first two methods, a POGO is valid iff all of its fields are valid, and a POGO field is valid iff its constraint is met (or it has no @Require annotation). This leaves us with the meetsConstraint() method, which deserves a closer look. Remember that constraint holds the closure’s Class, which now gets instantiated with Class.newInstance(). Since the closure only needs access to the field’s value, we pass null for owner and thisObject. Finally, we make the field accessible (important for non-public fields), get the field’s value, and call the closure. That’s it!

To demonstrate that our validator works, let us add some test code. First we define the POGO that is going to be validated:

class Person {
  @Require({ it ==~ /[a-z A-Z]*/ })
  String name
  @Require({ it in (0..130) })
  int age
}

What we are saying here is that a Person’s name may only consist of the characters a-z (in lower or upper case) and the space character, and that a Person’s age must lie between 0 and 130. Admittedly these are quite simple constraints, but it’s all just plain Groovy! If necessary, we could compute the answer to the universe instead. Well, almost. But you get the point.

Now it’s time to validate some persons:

def validator = new Validator()

def fred = new Person(name: "Fred Flintstone", age: 43)
assert validator.isValid(fred)

def barney = new Person(name: "!!!Barney Rubble!!!", age: 37)
assert !validator.isValid(barney)

def dino = new Person(name: "Dino", age: 176)
assert !validator.isValid(dino)

If you run this code (read on to learn how it’s done!), you will find that Fred meets both constraints, Barney gets a little too excited about his own name, and Dino is apparently too old (although he certainly doesn’t look like it). What more could we expect from a simple validator?

Although they are in their early days, annotation closures (that’s the working title) can already do more than what I showed you here. For now, they live in my groovy-extensions project on GitHub, and will be updated regularly. This means that you can start experimenting with annotation closures today, and use them in your apps tomorrow. In fact, if you happen to have Groovy 1.7 installed, do yourself a favor right now: Open GroovyConsole, paste in the code you’ve just seen, and kick off a few validations! Thanks to Grape, the groovy-extensions Jar (only a few kB large) will be downloaded on the fly. Alternatively, you can download the Jar from Spock’s build server (see the Artifacts tab).

Of course, my longer-term plan is to roll annotation closures into Groovy 1.8, and I’m pretty confident that it will happen. Spock users will be able to benefit from annotation closures even sooner; after all, someone* out there is waiting for @RunIf!

Now it’s your turn. Fork the groovy-extensions project, start your Groovy engines, and keep the feedback coming!

* Those of you following me on Twitter might know that I myself requested this feature, in order to simplify testing JDK 1.6-only features in Spock’s Spring extension.

Categories: Groovy