Archive for July, 2011
Every few months the STE vs. SDET debate reemerges like the crazy outcast relative who comes to visit unexpectedly and sits around complaining about imaginary ailments and reminiscing about how things were in the good ol’ days. We certainly don’t want to be rude to our relatives, so we tolerate their rants while watching the clock and dropping subtle hints about the late hour. But with the ridiculous ‘debate’ between STE and SDET I can be rude: drop it! It’s a baseless discussion without merit. It’s only a title!
In this previous post I explained the business reasons why Microsoft changed the title from STE to SDET. But for some reason people commonly mistake the title for the role or job function. In the good ol’ days our internal job description for an STE at level 59 included ‘must be able to debug others’ code’ and ‘design automated tests.’ Almost all STEs hired prior to 1995 had coding questions as part of their interview and were expected to grow their ‘technical skills’ throughout their careers. That was the traditional role of the STE.
As I explained in this previous post, we established the title of SDET to ensure that testers at a given level in one organization in the company had skills comparable to those of a tester in a different organization. As part of the title change, the company decided that we needed to reestablish the base skill set of our testers to include ‘technical competence.’ Unfortunately, when the career profiles were introduced some managers misinterpreted ‘technical competence’ as raw coding skills and the naive ideology of 100% automation. These same managers now complain their SDETs don’t excel at ‘bug finding’ and customer advocacy.
On my current team, the program managers are big customer advocates. They run their own set of ‘scenarios’ against new builds at least weekly. My feature area is testing private APIs on our platform. Our primary customers are the developers who consume those APIs, but we must also understand how the bugs we find via our automated tests might manifest themselves and impact our customers. So, our team also spends quite a bit of time self-hosting and doing exploratory testing, and we even started a new approach that takes customer scenarios to the n-th degree, which we call "day in the life" testing, to help us better understand how customers might use our product throughout their busy days. Our product has 93% customer satisfaction.
So, if it’s true that the SDETs on some teams aren’t finding bugs and lack customer focus (and I suspect it is for some teams), then they hired the wrong people onto their test teams. If SDETs don’t balance their technical competence with customer empathy, then we have a problem, and I would say it is likely a management problem.
The testing profession is diverse and requires people to perform different roles or job functions during the development process and over the course of their careers. Microsoft didn’t eradicate the STE “role”; we simply changed the title of the people we hire into our testing “roles” and reestablished the traditional expectations of people in that role.
Differentiating between STE and SDET in our industry seems nonsensical to me, and I also think this false differentiation ultimately limits our potential to positively impact our customers’ experience and advance the profession. Testers today face many challenges, and hiring great testers (regardless of job title) is about finding people who not only have the passion and drive to help improve our customers’ experience and satisfaction, but can also solve tough technical challenges to advance the craft and help improve the company’s business.
Last Sunday evening our summer league Monarchs hockey team had a game against the Ice Dogs. We tied our previous game against this team, so I knew this would not be an easy one. To compound things we had a short bench (10 players and a goalie), enough for 2 forward lines and 2 defense lines. It was a hard game, but our team really gelled and we played one of our best games this summer season. Just like the saying goes: ‘when the going gets tough, the tough get going.’ When you have a great team of people, they don’t sit around and cry like a bunch of pantywaists, play the victim card, point fingers, or incessantly complain. A good team buckles down in hard times in spite of the difficulties that might lie ahead and works together to get things done. Individual heroes need not apply. Teams don’t worship heroes; they value every person on the team.
The weekend hockey game was a good break for me. You see, we have been in ship mode for our Mango release on the Windows Phone, and the hockey game was a good outlet for some pent-up frustration. Every D-man had at least 2 shots on goal, and I blocked a couple of shots; one off the mask and one off my inner thigh (of course, where there are no pads), and yes, it left a pretty good bruise. But, as they say, “pain is temporary; a win is forever.”
Seemingly against the odds, we ended up winning the hockey game 5 to 1.
Ship mode often gets a little crazy. Second-guessing takes on a whole new meaning. “Did you test this?” “What about that?” “I have a situation where I do such and such, and when the sun comes out (remember, we’re in Seattle) something bad happens. Have you seen this before?” Some people run around looking for fires; others are trying to start them.
As I get older I have learned not to react to fires as I did in my younger days. I have learned that sometimes fires aren’t really fires at all; it’s just a spark that someone is recklessly trying to fan into a flame. Sometimes there are fires that will burn themselves out, but you have to manage them in a controlled burn. And then there are the fires that have to be dealt with. Dealing with fires late in the product cycle is not fun for anyone on the team. But a team of people is responsible for doing just that, and it is seldom easy; and it happens in the ship room.
Our ship room looks at a lot of data every day throughout the product cycle to help us manage our release schedule and stay focused. In ship mode, data is scrutinized even more closely and every bug goes under the microscope. Managers must now work together to make some hard decisions about whether to take a fix. There is often intense discussion, but you will never hear anyone play the consultant card and say “it depends.” The people in the ship room have been in the game a long time; they know the risks and they know the business. Of course they know “it depends.” They don’t want a bunch of hand-waving and bloviating; they need facts to make hard decisions. If you say an issue is going to adversely affect customers, you had better be able to explain how customers are impacted, how many customers are impacted, how customers get into that predicament, and whether there is a potential work-around.
In the end, a team of senior managers must make hard decisions about which issues to punt and which to fix based on the information that is presented. This is never easy at any time during the product cycle, but in ship mode each issue is carefully investigated down to root cause, the fix is understood, and the impact of the fix and the testing considerations are well defined before the final decision is made. Of course, many products are schedule driven, but at the forefront of every decision in our ship room is customer impact. Perhaps that is why customer satisfaction for Windows Phone 7 is at 93%, and why I am glad to be on a team that works hard to do the right thing for our customers.