One of the most important non-technical skills for a software test engineer is the ability to communicate clearly. In software testing, you are frequently confronted with situations where you must describe complicated tests, system configurations, and sequences of events. And you sometimes have to do this during stressful moments in project development and test cycles, when the last thing anyone wants to hear is a vague description of yet another "interesting" bug.
While talking with former co-workers from former companies recently, I remembered a couple of instances from years ago where I saw, or was guilty of, less-than-clear communication. Each of these instances presented a different kind of "teachable moment."
The first one involved a lack of substantive detail. I had submitted a test analysis report on the state of a product's quality in which I said it was "barely adequate" for release to beta. My boss then informed me that my report was "barely adequate," as it was more of an editorial than a factual report. ;-) A better approach would have been to compare the state of the product against its defined beta test requirements in clearly measurable terms, not personal opinions.
The second one involved a problem with language and jargon. In reviewing some existing tests, I noticed a comment in the code that read "then NADA packets result." It took me a few puzzled minutes to realize that "NADA" was not an obscure four-letter acronym, but the Spanish word for "nothing." Yes, it would have been a lot easier if the comments had been at least as clear as the code they were meant to augment.
So, to ensure that you're communicating in a better than "barely adequate" manner, stick to the facts, and keep things clear.
(Gracias Cheryl!)
1 comment:
Good post, Len.
As a software tester with a past (as a developer), I know the value of a clear set of steps to reproduce a problem.
When I write a defect report, I always record all the steps that lead to the problem. In fact, I always try to reproduce the problem as I write the report. Sometimes this means I have to throw the report away if I don't get the same behavior, but this way the developers will know exactly how to get there.