Archive for the ‘Experience Report’ Category

I just wrote a report about EuroPython on my company blog.

Today at EuroPython we listened to a keynote about Bletchley Park. This was the centre of British and Allied codebreaking during the Second World War, and the place where Colossus, the first digital, programmable computer, was built. We heard about the current financial plight of the museum there, and the need for investment to renovate the huts in which, among others, Alan Turing worked. Dr Sue Black told us about her experiences lobbying the government for more money for Bletchley Park, using social networking, blogs and Twitter. She recounted that she had recently met an elderly gentleman, one of the surviving codebreakers. She told us how close she felt to history when he related a story about decoding a Nazi message during the war, and his shock when he got to the end and discovered it was signed “Adolf Hitler, Führer”.

As a professional programmer, I think the site where the first digital, programmable computer was built has to be a place worth preserving. I hope that people will be able to visit, see the reconstructed Colossus computer, and be inspired by the stories of innovation and codebreaking that it enabled.

It was particularly poignant for me to think about this when, in the next session, I checked my email and found a message from my mother saying that my grandmother had died this morning. She was a living link to the history of the Second World War for me. During the war she was a wireless operator, transmitting and receiving messages in Morse code. And now she is not there any more. I am kind of in shock. But it just confirms for me that we need museums like the one at Bletchley Park to retain contact with our history.

(I wrote this post yesterday)

Recently I’ve had the privilege of working with a team of developers where I sit in the same room as half of them, and the other half are in China. My role is to help them develop a suite of automated system tests alongside the production code. After a few months’ work, we now have quite a substantial product, with quite a substantial test suite.

When we started, very few of the developers had written much in the way of system tests, and even fewer knew how to write good, maintainable ones. Over the weeks, I have been promoting practices to enhance test readability, reviewing test code, and pointing out areas that need better coverage.

I’ve noticed that with the local developers, reviews and feedback are usually conducted face to face, informally, whereas with the offshore developers, it all goes via email, with a substantial time delay. This has meant that the Swedish developers have learnt faster, since they benefit from shorter feedback cycles, and a richer form of communication. Having said that, the Chinese developers are doing nearly as well. They seem really motivated to deliver what I ask for, and keep requesting and responding to feedback until they have written what I consider to be some pretty good tests.

It’s not all sweetness and light, however. As much as learning the technical skills of writing tests, the team needs to learn the culture of maintaining them. The CI server reports a broken build far too often, because the developers are generally not running the tests before they check in. My perception is that the offshore developers are worse at this, but not because they are somehow less capable developers, far from it. I think they just don’t have the same management support for spending time maintaining the tests as the onshore ones.

Management in Sweden has really bought into the idea that investing in automated tests pays off over the long term, and vigorously supports me in discussions with recalcitrant developers. Management in China has not. My impression is that they see only the costs associated with writing, running and maintaining automated tests, and would rather hire some (ridiculously cheap) Chinese students to run manual tests instead.

I would like to believe that this automated test suite is a really good investment in the future of this product. My experience tells me it should enable regression bugs to be found very soon after they are introduced, and enable much more frequent product releases (you don’t have to wait for a six-week manual test cycle before each release). Over the product’s many-year lifetime, this should significantly outweigh the initial investment we have made in creating it, and the ongoing cost of keeping it running.

The reality may be quite different. Future versions of the product will likely be developed entirely in China, and I suspect that without their Swedish colleagues’ enthusiasm, the Chinese management might decide the test suite should be quietly dismantled and left to rot. That may be the right economic decision, although it makes me weep to think of it. All I can do is console myself with the thought that at least the tests are so readable they will be easy to convert into manual test cases detailed enough for dirt cheap unskilled Chinese students to perform.

At the speakers’ dinner the night before the conference:

Ola Bini: “Do you have any actual code examples in your talk about clean code tomorrow?”
Me: “No”
Ola Bini: “Well, I’m sorry but that means I can’t come and listen to it”

Not such an auspicious start perhaps, but fortunately about 125 other conference participants didn’t seem to mind the lack of actual code, and did turn up for my talk. Some of them even blogged favourably about it. To my surprise, someone came up to me afterwards, said he helped organize the JFokus conference, and asked whether I had a Java talk I could give there.

It was a lot of work preparing my presentation, and I got some really useful feedback from the two practice runs I did, at GothPy and for my colleagues at IBS. It was this feedback that prompted me to take out all the code examples I originally had in the presentation, actually.

Overall the conference seemed to go really well. There were about 450 participants, about 40 speakers, and 6 parallel tracks. I attended some great sessions too, but I’ll leave a summary of them to another post.

Just in case you were wondering, I didn’t go to Ola Bini’s talk either 😉

At Agile2008 I attended a session with Dan North about Behaviour Driven Development. Someone on the Agile Sweden mailing list was asking about it, so I decided to write up my notes here.

Most cellphone and computer software is delivered late and over budget. The biggest contributing factor to cost bloat is building the wrong thing. So what software and business people need is “a shared understanding of what done looks like”.

Test Driven Development is about design, conversations, and writing examples for a system that doesn’t yet exist. It’s not really about testing. However, once the system exists, your examples turn into tests, as a rather useful side effect.

A User Story is a promise of a conversation, and it is in that conversation that things go wrong. The customer and developer rarely agree on what “enough” and “done” look like, which leads to over- or under-engineering.

Dan suggests a format for User Story cards which aims to prevent this communication gap.

On the front of the User Story index card, you have the title and narrative. The narrative consists of a sentence in this format:

As a <stakeholder>
I want <feature>
so that <benefit>

where the <benefit> is something of value to the <stakeholder>.
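
For example (a made-up illustration, not one from Dan’s session), a narrative for a cash machine story might read:

As a bank customer
I want to withdraw cash from an ATM
so that I don’t have to queue in the branch to get money out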

On the back of the card, you have a table with three columns:

Given this context | When I do this | Then this happens

Then you have four or five rows in the table, each detailing a scenario. (If you need more than that, the story is too big and should be split.)
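
A filled-in row for the made-up cash machine story above might look like this:

Given my account has a balance of 100 and the ATM has cash | When I ask to withdraw 20 | Then I am given 20 in cash and my balance becomes 80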

Dan finds that in his work, this leads to conversations about User Stories where “done” and “enough” are discussed and defined.

User Stories should be about activities, not features. In order to check that your User Story is an activity, you should be able to do a thought experiment where you implement the story as a task to be performed by people on rollerblades with paper. You must think about it as a business process, not a piece of software.

When creating the story cards, the whole team should be involved, but it is primarily the business/end-user stakeholders and business analysts who write the title and narrative on the cards. They then bring in a tester to help them write the scenarios.

Are people familiar with the V model of software testing? When it was conceived, they thought the whole process would take two years and span the whole project. Dan usually does it in two days, many times for each project.

Then Dan offered to show us how to do BDD using plain JUnit. He asked for a pairing partner from the audience, so I volunteered. At this point my notes dry up and I am working from memory, but I think the general idea is like this.

You talk about “behaviour specs” not tests. The words you use influence the way you think, and “behaviour specification” gives much better associations than “tests”.

Each behaviour specification should be named to indicate the behaviour it is specifying: not “testCustomerAccountEmpty”, but rather “customerAccountShouldBeEmpty”.

In the body of the spec, you can start out by typing in the prose of one of the scenarios you have on the user story, as comments:

// given we have a flimble containing a schmooz
// when we request the next available frooble
// then we are given a half baked frooble and the schmooz

Then you can fill in code after the “given” comment. When you have code that does what the comment says, delete the comment. Repeat with the “when” and “then” comments.
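
To make this concrete, here is a sketch from memory of how the filled-in spec might end up looking in plain JUnit 4. The Flimble, Schmooz and Frooble classes are of course made up, and I have included trivial stand-in implementations only so the example compiles; this is not Dan’s actual code.

import org.junit.Test;
import static org.junit.Assert.*;

public class FlimbleBehaviourSpec {

    @Test
    public void shouldGiveHalfBakedFroobleTogetherWithTheSchmooz() {
        // given we have a flimble containing a schmooz
        Schmooz schmooz = new Schmooz();
        Flimble flimble = new Flimble(schmooz);

        // when we request the next available frooble
        Frooble frooble = flimble.nextAvailableFrooble();

        // then we are given a half baked frooble and the schmooz
        assertTrue(frooble.isHalfBaked());
        assertSame(schmooz, frooble.getSchmooz());
    }

    // Trivial stand-ins for the imaginary production classes, just so the spec compiles.
    static class Schmooz {}

    static class Frooble {
        private final Schmooz schmooz;
        Frooble(Schmooz schmooz) { this.schmooz = schmooz; }
        boolean isHalfBaked() { return true; }
        Schmooz getSchmooz() { return schmooz; }
    }

    static class Flimble {
        private final Schmooz contents;
        Flimble(Schmooz contents) { this.contents = contents; }
        Frooble nextAvailableFrooble() { return new Frooble(contents); }
    }
}

(Following the process above, you would delete each comment once the code beneath it does what it says; I have left them in here to show how the spec maps onto the scenario.)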

In this way, you build up a behaviour specification that drives your development of the system. A few minutes later (hopefully) you have a system which implements the specification, and at that point your spec magically turns into a regression test which you can run. From then on you can call it a test if you like, but actually it is more helpful to your brain to continue thinking of it as a behaviour specification. It leads to much more constructive conversations about the system.