Monday, September 10, 2012

Managing Expectations

Heyhey!

I've started several posts, six to be exact, one being a follow-up to the first episode of my Building A Tester series, but I cannot complete any of them. My Achilles' heel.

So I started a new one! :D

It's about an epiphany sparked by something Jon Bach tweeted about the deeper meaning of Pass/Fail testing, and how it ties into one of the three phrases a Sogeti consultant should always remember. It's about promises, wishes and the fulfillment thereof. It's about managing expectations.

Now, let's start with what Jon Bach (brother of James Bach) tweeted:
"Passing a test" means "meeting expectations", but we don't often take time to examine those expectations in the first place.
This is dead-on and quite nicely points out the risk that lies within Pass/Fail testing: expectations are set, or as one of the replies to that tweet pointed out, managed poorly. And since requirements documentation is never a synonym for the actual requirements and the explicit/implicit expectations placed on a piece of software or a system, and since those expectations tend to change and reshape constantly, the ground starts to fall out from under Pass/Fail testing...
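To make that risk concrete, here's a tiny sketch of my own (the shipping function, the pricing rule and the numbers are all made up for illustration, they're not from Jon's tweet or any real project) of how an automated pass/fail check freezes an expectation into a single hard-coded number:

def shipping_cost(weight_kg):
    """Flat fee plus a per-kilo charge (a hypothetical pricing rule)."""
    return 5 + 2 * weight_kg

def test_shipping_cost():
    # The expectation lives in this one number. The check passes as long as
    # the code matches what the test author believed the requirement was at
    # the time of writing; it says nothing about whether that belief still
    # matches what the customer actually wants today.
    assert shipping_cost(3) == 11

test_shipping_cost()

The check will happily keep "passing" even after the pricing rule in the customer's head has changed; the only expectation it examines is the one somebody once wrote down.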

Someone (was it Michael Bolton?) once said something about judging a tree just by looking at the colour of its leaves. Okay, it can tell you something, but don't base your judgment on just that, as there's so much more.

A book is more than its covers.

Now, to those three phrases that guide our work as Sogeti consultants:

  • Manage expectations
  • Always one day ahead
  • You are never alone 

Those are actually not that bad if you come to think of it. The last two are, in short, about staying at the forefront of testing, beating your competition, bringing value to the customer, getting support from the vast Sogeti network (20,000+ people), etc. All very good. But what fascinates me the most is the "Manage expectations" part.

When you manage expectations as a consultant, the first things that leap to mind are the expectations aimed at you and, of course, the means to manage them. The customer has certain wishes that you need to fulfill. They are not experts in the field of testing, so at this point you have to sell them understanding and certain promises about what you deliver. Some consultants, and even whole consulting companies, quite blatantly exploit this possibility and voilà: you've bought a health care system for 1.8 billion euros (article in Finnish, sorry)! :D

Empty promise?
Some consultants walk right into the deep end of the pool by promising too much. Then they have to work long hours over the weekend and vomit blood to fulfill these promises. Apparently they weren't taught as children that you eat what you take. So start small and build your way up! If your performance has a 5-step scale, rising from 2 to 4 looks better than dropping from 5 to 4. This is not about slacking, but about adjusting the expectations to fit your ability to fulfill them.

So, let's take this personal adjustment to the project level. There are several ways (CMMI, Six Sigma, etc.) to measure how well a project is going, but all of them rely heavily on people's ability to use them. Quite often the results aren't that good, and quite often it's because everything is just too big and complex: software, systems and processes are often built like Saint Isaac's Cathedral. People aim to do something extraordinary without ever measuring their capability to achieve it. The whole thing swells beyond control, and no matter how good the methodology or how good the people are, failure is inevitable.

You eat what you take.

Brass tacks: do something you KNOW you're able to do well (looked at from every reasonable angle and dimension) and build from there. You can have grand visions about excellence, but also have a very clear vision about the steps that take you there. Some of them will take you backwards, which is only good. Fail and learn. Fail again and learn again. Adjust! If the only thing you're able to do is walk, do it well. When you're absolutely sure about your ability to walk, start running. Do it well. Fall. Do it well. Walk. Do it well. Run. Do it well. And if you're responsible for someone else's expectations, make sure the easier ones are fulfilled before moving on to the more difficult ones.

Even Pass/Fail testing, which is still good for some occasions, now has a better probability of succeeding. Agile methodologies, for instance, revolve around constant adjustment and building manageable increments in short iterations. You eat what you take, and you take some more if that wasn't enough. That's all there is to it.

I actually felt a bit ashamed when writing this, because I assume you already know all this, dear reader. It is indeed as elementary as it gets. I just had to get it out of my system. But if it made you think and alter your behaviour for the better, I'm truly happy.

Quote time! This is actually one of my favourite quotes, as it captures our key strength as human beings and, of course, as professional testers:
"It is not the strongest of the species that survives, nor the most intelligent, but the one most responsive to change." -Charles Darwin
Yours truly,

Sami "Yey! I completed a blog post!" Söderblom

6 comments:

  1. Hi Sami,

    I have done a lot of thinking on expectations (or on one's purpose in a given testing situation), so I thought I would comment on this.

    I kind of feel that it is often a problem that testers try to set the expectations themselves, or just work by their "default" expectations, instead of exploring all the possible ones. Examples would be that:
    1. Testers tell the customer what their purpose is (so they kind of set the expectations)
    2. Testers just start testing and reporting as many problems as they can.
    Both are more tempting (and dangerous) if the customer does not seem to care about what you are actually doing.

    So basically, the harder bit for me is finding out the meaningful expectations, i.e. the purpose(s) of testing at the given time. So would you see a fourth phrase, "Find out all expectations, and negotiate on them", being useful for Sogeti? Or do you kind of include it in the managing part already?

    Not taking too much onto your plate is another skill then (which I suck at), and I like what you say about it. It's (or is it?) kind of like admitting that you are only human, but at the same time trying to show that you're still possibly a good one for the job..?

  2. Hi Anssi,

    Thanks for your feedback! You are right on the money with exploring all the possible expectations. Many testing knowledge sources, even ISTQB and TMap, talk about explicit and implicit requirements, or expectations in general. The explicit ones are often represented by the requirements documentation, which unfortunately is only a fraction of the truth. The implicit ones, on the other hand, are all the things that those who matter cannot put into words, but which are often the actual "want" placed on the product or service. Implicit requirements/expectations/wants are what makes requirements management so darn hard.

    A good tester sees this and looks beyond cumbersome requirements management processes, digs deep, explores, and finds information even in unexpected places. This is indeed hard work, and many give up. They either move into other positions or, what's worse, revert to handling only the obvious information and manipulating the customer, as you nicely pointed out there. Eventually quality regresses and nobody seems to know why.

    I'm being a bit harsh here, but this is the deal. A tester not willing to dig important information out from beneath the surface shouldn't be doing testing in the first place. They are the reason testers are often considered second-class citizens, an unproductive element which should be discarded, replaced by automation or outsourced to whichever country offers the cheapest deals at the moment. They might have a fine career in verification, development or selling hot dogs, but in testing they do only harm.

    In testing work I put these tough expectations on "managing expectations", but when it comes to handling your workload I cut some slack. There are just too many components involved, and often what is put on your plate cannot be entirely controlled by you. But there's this thing I call "ping-ponging": you swing back everything that might cause you to lose control of your work and the decisions you make. This requires a lot of "Do you understand that by participating in all these meetings I cannot get work done?", "Do you understand that by talking directly to each other we remove a lot of email overhead?", etc. talk. It's a balancing act. Lots of common sense required (even though it's not written anywhere ;). Lots of selling common sense to those who might not have it. Lots of building reason where it's absent. Admitting that you and the people around you are only human, but indeed the best ones for the job.

    Ok, it happened again. I started writing and couldn't stop. Time to read it through if it made any sense... :D

  3. I came to think about oracles, more precisely Bach's/Bolton's HICCUPPS mnemonic and Doug Hoffman's approach. Cem Kaner wrote an excellent article about those: http://kaner.com/?p=190.

    If a tester manages those, there's no reason for me to be angry at all. :)

  4. Agreed. Finding out the WHAT is a big big deal.

    But I was also referring to expectations placed on testing, on top of the expectations that testing has of the product... In other words, the WHY.

    Expectations such as:
    - we are expecting you especially to find certain kinds of bugs (versus all bugs)
    - we are expecting to get reports on only the critical bugs (versus a lot of bugs) (although "critical bug" carries a lot of expectations as well)
    - we are expecting to get testing done cheaply
    - we are expecting a lot of process improvement suggestions
    - we are expecting you to assess the vendor's success
    - we are expecting you to include end users and teach them the game
    - etc. etc.

    There's a lot of stuff you can do in testing, out of which some things are sometimes more important than others.

    Am I making any sense? Or am I just totally missing the point of this blog entry :)

  5. You are making a lot of sense. I think it's me who got sidetracked. A lot fits under the umbrella of managing expectations. More than I could apparently handle... :)

    Those are funny expectations, but still the very default array that a customer throws at you. I've always wondered why they do that. Where have they learned that e.g. certain bugs should be found, sometimes even an expected number of them?

    I've seen companies that give a guarantee that they'll find 50 bugs in a week (or was it in a day). They use it as a sales pitch. However, there's no saying what kind of bugs. I could point out 50 bugs in this blog post and you would be none the wiser!

    This has bred a counteraction: some people have begun to hate metrics blindly. The sight of numbers makes them climb the barricades and write hateful blog posts. They see a clock and they stomp on it. Some of them could even be excellent testers without this handicap.

    I think metric-religious people are at Level 1 in understanding testing. Metric-haters have understood the dangers of metrics, so they could be at Level 2. What about Level 3? A combination of both? Level 4? Level 5? Level N?

    I think that when you understand the expectations you're setting at a deeper level than what just one book or school of testing tells you, when you have thought them through thoroughly, individually and together, and when you have made an actual effort to find the latest knowledge about each expectation you set, you might be on to something.

    Unfortunately this is an unreasonable amount to ask of everyone. That's why there are people like you and me to help people understand. ;)

  6. Somehow I came to think of losing weight to gain health. Some start to exercise, eat healthier, etc. and some only buy a new mirror that makes them look slimmer. The means and results can vary even though the expectations can be identical. I sense a new blog post brewing... ;)
