Quality & Agility in Software: Session With Paul Carvalho

Last week, at Magnet Forensics we were fortunate enough to have a very talented quality expert come in and talk to us. Paul Carvalho was able to bring a great deal of information, perspective, and activities to our development team that truly proved to be an eye opener. We were all extremely excited to get to sit down and hear what Paul had to say.

There’s lots to read about Paul over at his STAQS site, but I’ll reiterate some of it here. Paul has held many roles in software development: he’s been a developer, a manager, a tech support person, and part of quality assurance, so he certainly has a full perspective on how software gets built. Coming from a science background, he does a great job of explaining why things are a certain way, or why you can expect certain results under certain conditions. That made his approach to quality and software much easier for me to get a handle on.

It felt like we covered so much, but I want to provide a high-level summary of the takeaway points that really stuck out for me.

Quality: What Is It?

Paul started us off by getting us to describe what quality means. In the end, we arrived at something like this: quality is providing value (where value is explicitly defined) to an audience (where the audience is also explicitly defined). So what does that mean? Quality depends on who and what you’re talking about, so when you’re trying to deliver quality, it’s important that everyone is aligned on what exactly quality means.

To illustrate his point, Paul had us do an exercise where we could only use a single method of communicating some information to a partner. The result showed that even when sitting right beside your partner, you can lose a ton of information in how you communicate. So having tickets written up, or relying on a one-way communication channel, is almost guaranteed to cause some misunderstanding when it comes to expectations. Part of the solution? Converse. Actually have the other person describe back to you what you said. This way, you both start to home in on what your expectations are. It was a cool exercise, and it definitely proved his point.

Perspective

We also discussed perspective a lot in our session with Paul. He had examples where simply altering your (physical) viewpoint would cause you to describe something completely differently. From there, we drew parallels to our software development process. The old “but it builds on my machine” scenario mirrors this really well. Paul wanted to drive home that product owners, developers, and testers all have different perspectives, so we need to get on the same page to ensure we can deliver quality.

Paul also asked us what seemed like a really simple question, which for a few of us was a hint that it had to be a loaded question underneath the obvious. Pointing at something shown on the TV in our board room, he asked us whether what we were seeing was a particular object. The easy answer sounded like “well, duh, yes”, but he was driving home a huge point about perspective. We weren’t looking at the object; we were looking at a TV showing a digital rendering of a piece of artwork that depicted that object. The answer was kind of abstract, but it was really cool to have that conversation about what you’re really looking at.

Awesome Agile Activity Alliteration

As a team lead, there was one activity I found absolutely awesome, and I’m hoping everyone got the same takeaway from it that I did. Paul had us work as one big group on a single activity: the goal was to circulate as many balls through every individual in the group as we could in two minutes. There were a few rules that I won’t bother explaining, so it wasn’t necessarily that easy.

(Image by http://www.sxc.hu/)

Having 15 people shouting and debating during the first planning portion made things pretty tough, but we came up with a workable solution. Paul had us estimate how many balls we thought we could get through, and we had so little confidence in our design that we said only four. We tried it out and managed to get about 20. Awesome!

We were told to do another iteration, and he gave us another mini planning session. Again, the shouting and mayhem ensued, but we wanted to tweak our first approach. We finally organized, and when we tried it out, we doubled our initial throughput. We were onto something. In the next planning session, the shouting was a little less, and we added a few more tweaks. We noticed the following problems from before:

  • We were moving too fast, so balls were dropping.
  • We were only moving one ball between people at a time.
  • Our throughput was nearing the limit of how many balls we had.
  • Our counting was extremely poor.

Solutions? Slow down. Move two balls at a time instead of one, which should be easy if we’re going slower. Recirculate the balls instead of putting them back in the container. Dedicate one person to counting who wasn’t me (the person who would either inject more balls or collect them from the end to recirculate). We shattered our previous record, nearly doubling it again. The next planning session was similar: we addressed some minor pain points and decided to go for a whopping four balls at once. We added another 25 to our throughput, so we were sitting around 100.

At this point we realized we were nearing the maximum potential throughput for this design, and given the time restrictions for planning, we couldn’t generate much to improve upon. We debated a completely different approach for a bit and realized it was going to be unrealistic, but this added to the takeaway.

So, with that said, what WAS the takeaway from this activity and why did I like it so much?

  • The activity is a direct parallel to our sprints. The balls are the features we’d be developing.
  • The planning process was hectic. Everyone was involved and generated great ideas, but it took a select few “leaders” to actually pull the trigger on decisions. Otherwise, nobody would have agreed on ANY of the great ideas we heard.
  • Dropping a ball was equivalent to a problem sneaking through or something not getting done. In terms of quality, this was like a customer seeing a bug. The more people who had touched it, the smaller or less impactful the bug was likely to be. A cool little quality parallel.
  • We need to be continuously improving. We should leverage our retrospectives to improve our process. It’s important that we keep experimenting with our process to see if it can be improved.
  • There will be a point where we think we can’t improve anymore with our current process. If we’re not happy with throughput, it may require a completely different approach.

Summary

The whole team thought having Paul in was great. We have a ton of things to take back to the drawing board as we try to improve our processes. I think it’s safe to say any one of us would recommend consulting with Paul. His insights are excellent, he conveys his thoughts clearly, and his activities are incredibly well aligned with the points he’s trying to get across.

I’m looking forward to using Paul’s techniques to improve the quality of our software and the processes we use to develop it. Thanks again, Paul.
