Friday, September 13, 2013

A Look at the Relationship Between Testers and Developers

I believe that one of the most important things in the work life of a tester is having a good working relationship with developers. This will not only make a tester's job and the bug advocacy process easier but will also give them the ability to influence quality and promote positive collaboration and learning opportunities.

Roy Osherove interviewed James Bach on “The Role of the Tester” where Bach talks about this. He says that:

‘…developers are the people who create quality; they make the quality happen. Without the developers, nothing would be there; you’d have zero quality.’

After all, without developers on the project, we testers would not have a job!

In the last few years I have worked thoroughly integrated into Agile development teams as a lone tester. I tend to gravitate more towards developers as opposed to testers as I feel that it helps me expand my domain knowledge and technical skills and…be a better tester. Plus programmers are WYSIWYG. I prefer it when people do not beat around the bush when working together.

On occasion I hear and read stories about the challenges and conflicts both sides have experienced in the workplace, as well as tips and recommendations on how to handle difficult situations and work on improvements from the tester’s viewpoint.

I thought it would be interesting to find out from developers what they thought about testers and their skills. I conducted a very unscientific survey of some developers that I have worked with and whom I also consider professional mentors and good friends.

The replies were pretty refreshing. I hope testers find them useful and take them into consideration in how they go about their work.

Here are the results:

1.   In your opinion what skills make a good tester, and what do you consider to be the most important skill/quality a tester should have?

Testers must be highly organized and meticulous. They must be able to identify all the possible ways that a piece of software can be interacted with, all the different possible sequences for performing an action, and all the possible ways that they can trip up the software. They have to know the requirements, use cases, and specifications inside and out. Then, they have to have the ability to do many verifiably repeatable sequences in order to reproduce issues, verify fixes, and find regressions. And they have to be excellent communicators who can compartmentalize the abuse that they often get in return at shops with insecure developers.
If I had to pick one, I would say the ability to capture the complete conditions for a bug or issue. The biggest frustration in fixing bugs is not being able to reproduce them, and software is complex enough that there are almost always environmental conditions or sequences of actions that cause the problem. Understanding the software well enough to include the right information (and erring on the side of too much) means less back-and-forth to clarify, and fewer "it works on my system" situations.
If I could have one more, it would be the ability to communicate well.

Curiosity - I believe it takes more than just "following the user stories." I think the best skill a tester could have is to ask "what if?" Test outside the boundaries. Test beyond what the Product Owner, developer, or manager says the application can do. Better it's found in house than in production. I personally like it when a tester does something out of the ordinary to break my code; it helps me make my code that much more solid. Sometimes I get so lost in the code that I don't see outside it.

I think the single most important quality in a tester is inquisitiveness. Because inquisitiveness leads to a compulsion - ideally an insatiable one - to understand, at an insanely detailed and accurate level, how things work. And that, in turn, feeds into two vitally important skills that any excellent tester must have.

First is the ability to proactively acquire a complete and accurate understanding of the product they're testing. How is this thing *supposed* to work, in every tiny, gory detail? No making assumptions. No guessing. No "close enough" understanding (except in those cases where the tester's product understanding is thorough enough that they know for sure that "close enough" is good enough - and could tell you exactly why that's so). If the process and/or personalities of other team members are such that communication that leads to this kind of thorough understanding is not available, then an excellent tester would consider it their responsibility to address that process and/or those personalities - because otherwise, they simply can't do their own job well.

And then, of course, this painstakingly thorough product understanding enables the excellent tester to do the fun part of their job: coming up with painstakingly thorough attempts to break it. This may require the acquisition of technical skills and knowledge (of tools, code, network protocols, etc., etc.) that the tester never thought they would need. The excellent, inquisitive, insatiable tester looks at that and says: gimme more. Learning leads to breaking, and breaking is fun. :-)

Finally, the ability to thoroughly and clearly document and communicate your findings is also essential.

Desire to make a product better, but acceptance that there is a threshold for release where some problems are acceptable. This one is tough, but always make sure problems are documented.
* Backbone - devs are not always right (even though we totally think we are) and we will tend to defend what we built ardently, but something wrong is still something wrong and needs to be fixed.
* If it's odd - log it. Logging is cheap these days (in good systems). If this causes problems with some (thing|one), then there are bigger issues to deal with that you probably won't be able to change, and you might want to get out.
* Be able to accept rejection. This is hard for any human. Though you might think something is a bug, a dev will see it differently and could reject it for a number of reasons. There should be definitions of bugs and feature requests that the group has accepted.
* Native literacy and grammar - this is really difficult and not the teams' fault, but it can lead to major issues when stress levels grow. It's usually a big problem with outsourcing. This is a direct consequence of company decisions and is not the fault of the outsourced hires or the in-house teams. It's a horrible position for both teams. I'd argue it's one of the biggest sources of frustration between testers and programmers.
* Recognition that the dev doesn't make all design decisions. In fact, they may simply be implementing something they had no say in. This happens more in bigger groups than smaller ones - just be aware of the situation.
* Get out of testing if it's just a job. Likewise if you're working with devs who think that way. The product will always suffer.
* Please be friendly. I really hate working with jerks.

2. Many in software development say that testers and programmers don't get along and don't like working with each other. Do you think it is true? Why? Why not?

Testers and programmers don't get along only if one or the other (or both) have a problem. A tester must be technical enough to speak the developer's language, and organized enough to produce good solid reproduction cases for defects that actually violate a requirement (otherwise a detailed feature request). But, a developer must realize that a bug report is not a personal attack on them, but rather a useful piece of data that helps produce more robust code. Personally, it is my goal as a developer to make life as easy as possible for testers (while making them crazy by not actually producing buggy code).

I actually haven't generally found that to be true. When it is, I think it's more of a management problem — programmers are being pushed to get something out, with deadlines being more important than quality, which makes testers seem like a roadblock to that result. I have an appreciation for testers, because I have worked too many places where we didn't have them, and programmers did the testing.

I also think that the trendiness of concepts like unit testing, test-driven development, etc., has gotten programmers more involved in testing, and produced better understanding.

I think it's true because of, perhaps, a previous bad experience. Perhaps the tester didn't fully explain what caused the problem, or maybe the developer wants to work on the new program but the tester has found a fault in the current program that needs fixing. It could be ego; I have heard it said among developers (I've never said it myself): "Testers are those who can't code their way out of a paper bag, so they become QA."

It depends. In my experience, even the most curmudgeonly of coders (and oh my, we do have curmudgeons among us) will fairly quickly develop respect (lending itself to a good working relationship) with a tester who displays the excellence I outlined above. Some programmers may begin with the assumption that a tester is probably *not* like that - that they are someone who is essentially overhead within the product development cycle - eating up time and energy from developers and other team members, while lacking the skills and diligence to do testing that's anywhere near the equal of the programmer's own testing efforts. It's on the tester to prove them wrong. Unfortunately, it's also often on the tester to carry the lion's share of relationship-building with the programmer, because, sadly, many programmers do not excel at social interaction. And programmers who are hostile to or dismissive of testers, or who try to bypass formal testing because they consider it pointless, or who are overly defensive about their "perfect" code (bull!!) can be a cause of poor working relationships too.

I think it's true in certain venues and is, ultimately, a problem caused by corporate decisions/cultures. When a company doesn't treat testers as first-class citizens of a company/product then it already sets a negative tone. Outsourcing reflects this, in my opinion. I've seen much better results when testers are integrated during development instead of being handed a pile of code and told "Ok, test this". Cultures beget problems and solutions.

3. What are your recommendations/advice to testers for working better with programmers?

For testers: be detailed, be organized, automate EVERYTHING. 
For developers: be nice, take tester reports seriously, but not personally, and be glad for the help. Having development experience is helpful for testers and having testing experience is helpful for developers. But they are necessarily two different types of personality, and if you draw your testers from the developers who weren't good enough to write your code, you're going to have a bad time. Testing is its own profession, and good testers are worth at least as much to a team as a good developer.

The thing to understand is that programmers, unless told otherwise, tend to approach the task of testing as "prove the software works," trying out all the ways they know how it's supposed to work. I was lucky enough to find a book called "The Art of Software Testing" at one of those companies that didn't have testers, and the most important concept I got from it is that the purpose of testing is to find bugs, so if you find bugs, you're doing it right (and if you don't find them, you're not.)

I think this is a concept best explained by project managers, rather than by testers. They can also set up a development process where time for testing and bug fixing is allowed for, rather than being treated as something exceptional. An agile development process, where working software is produced (and tested!) frequently, can make it a lot easier to do this than having a big testing and bug fix phase right before release.

Essentially, a "programmers vs. testers" environment is a failure of management, not the fault of either programmers or testers, and it's not hard for programmers to understand the testing perspective if testing and testers are treated as an important part of the process.

Problem Solving - Sometimes it isn't the code that halted testing.  Perhaps configuration, maybe a user error, or even data is corrupted.  I think a good tester should be able to quickly ascertain the problem.  If not, the next necessary skill would be...

Collaboration - Or the willingness to talk to the developer and work through a problem that doesn't seem quite so obvious. This is a point that developers should work on as well. It takes a village.

CS 101 Coding Skills - Nothing major, but being able to query the database and write minor scripts that create test data would be nice to have.
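As a sketch of the kind of "minor script" this answer has in mind - the table name and records below are entirely hypothetical, invented for illustration - a few lines of Python can both create test data and query it back:

```python
import sqlite3

# Hypothetical example: seed a throwaway SQLite database with test users.
# The "users" table and its columns are made up for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

# Generate a batch of predictable, easy-to-recognize test records.
rows = [(i, f"tester{i}", f"tester{i}@example.com") for i in range(1, 11)]
conn.executemany("INSERT INTO users (id, name, email) VALUES (?, ?, ?)", rows)
conn.commit()

# The "query the database" half: confirm the data landed as expected.
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 10
```

Nothing here goes beyond an introductory course, which is exactly the point: this level of scripting is enough to stop hand-crafting test records one at a time.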

My advice to testers is:
     - Be an excellent tester, as described above in #1. If you believe you lack that quality of near-pathological inquisitiveness, seriously consider another line of work. If you have that quality, though, then stop at nothing in your pursuit of thorough product understanding, and in your dogged efforts to break the product.
     - If you encounter programmers who assume that you add no value, prove them wrong (see above), but do it gently and in a friendly manner. I wish I could say you shouldn't expect to suffer any bad feelings during this process. But unfortunately, the stereotype about programmers' poor social skills does have some basis in reality. But if you persevere, and perhaps work on maintaining thick skin for a while, programmers can also make excellent friends and allies, and a strong team with good working relationships will definitely lead to better products and a more enjoyable work day. A tester's job is not easy, either on a technical or a relational level, but the rewards of working as part of a high-functioning team are many!

*   Be honest. It's hard to have conflicts if no one is hiding anything.

*   Don't tolerate abuse from programmers (or anyone, really)
     This part really sucks because it can cost you or others a job, but no one should be in an abusive environment. It's such a shitty situation to be in and can just drag people (and a product) down. I might be talking on a high-horse but I will gamble my job to end shitty environments. I wish it were that easy for others, and I wish abusive environments didn't exist, but if you have the ability to change/leave it - do not even hesitate to do so. You might also find that you're not alone.

*     Make sure the programmer(s) that you work with are on your team(/side). Being thrown under the bus sucks. Being thrown under the bus repeatedly is abuse. See above.

*   Communicate with your team. I work so much better with testers I keep in constant communication with and everyone benefits because it allows channels of clarity to problems.

I want to thank all the programmers for participating and wish you all a Happy Programmer Day! It's my hope that this will help foster some thinking and quality, level-headed discussion in the tester community. As my anonymous programmer says, it takes a village. Each team needs programmers, testers and more, but most importantly, good leadership from management. And as with a family, it takes understanding and compromise to make it work. Testers have it tough, but we have to be nothing less than knowledgeable, courageous and professional to hold up our end.


  1. "automate EVERYTHING"!? Please explain to me how you automated everything? Can test automation replace the timing and interactions of a human being? Using test automation is also dependent on the context of a project. As a developer, can you automate everything?
    Also, one of the most important traits a tester can have is creativity. Having the ability to approach the software in unique and new ways is one of the best ways to find important defects - think Exploratory Testing.

  2. Greg,

    Brian is unavailable to answer right now, so I will chime in, since I worked with Brian on a project where we wished things had been automated for regression testing and validation, as the system, the underlying algorithms and the data were extremely complex. We were unable to do so due to demanding schedule deadlines and a lack of resources, and we ended up spending way too much time re-testing, re-checking and re-calculating things manually. It was a huge waste of time and pretty costly.

    If/when functionality/backend code is properly separated from presentation (in some contexts/cases), then a team (developers and testers) should be able to figure out what and how to automate - whether it is test data generation, API validation, or whatever contextually has to be scaled. This way testers and the team will not have to waste their time on regression and can focus on new functionality/feature implementation and exploratory testing. It helps to have both sapient and automation testers on a team, however.
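One minimal sketch of what such regression automation could look like - the pricing function and baseline values below are entirely made up for illustration, not anything from the project described - is a script that replays recorded cases against the current code so nobody has to re-check the numbers by hand:

```python
import json

# Hypothetical regression check: replay a stored baseline of inputs and
# expected outputs against the current implementation. In a real project
# the baseline would be a checked-in file; it is inlined here for brevity.

def compute_price(quantity, unit_price, discount=0.0):
    """Stand-in for the complex backend logic under test (invented)."""
    return round(quantity * unit_price * (1.0 - discount), 2)

baseline = json.loads('[{"args": [3, 9.99, 0.1], "expected": 26.97}, '
                      '{"args": [1, 100.0, 0.0], "expected": 100.0}]')

# Collect any cases where current behavior has drifted from the baseline.
failures = [case for case in baseline
            if compute_price(*case["args"]) != case["expected"]]
print(len(failures))
```

Once a check like this runs on every build, the manual re-testing and re-calculating described above becomes a one-command verification, freeing the testers for exploratory work.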

    I understand that saying ‘automate everything’ is too broad of a statement but no need to get defensive. :-) In the answers, he and the other developers do say that testers should think outside the box, so let’s keep things in context in our replies and take it for what it’s worth. Use the feedback if it’s applicable, otherwise don’t. We are mostly testers here, so this should be second nature to us not to take things out of context.