There are a few blogs that list the blog posts the author has read that week - or day. One thing I find lacking from these is what the author took from the posts: did they learn anything new, what will they do differently, why should I bother going to read the post, what might I get out of it?
So I thought I'd try to do just that with posts I've read this week - and maybe turn this into a regular series. But as I was doing this I wondered what a newbie to testing might make of these posts, and how they could end up very confused...
Imagine you're a newbie to testing, maybe you've read a book on the basics, had a surf around Wikipedia and you have the idea that there are test strategies, plans, scripts, test cycles and defects.
Determined to get better you start surfing The Interwebz and start reading the blogs....
The first thing you might find out is that The Test Plan Is Dead ( poor old test plans - not long ago there was the 10 Minute Test Plan, and now they're pushing up the daisies )
Reading that post, the noob tester finds out about Confidence Maps. This seems like exciting stuff - he should go and start implementing some of it!
Not so fast: another blog post appears which talks about The Confidence Game and how making confidence the mission of your testing is a mistake!
( and that NOT giving your opinion is the hardest testing skill to master. The noob tester makes a note to start practicing NOT giving his opinion - seems simple enough, as his test manager ignores him anyway, apart from asking how many scripts he has written/run )
Luckily for the noob tester the testing mission is for his test managers to work out, he can concentrate on doing his testing and reporting his bugs.
Or not. He finds out he's not using Hexawise or doing combinatorial testing, which means he's at best only half as productive as he could be.
He hangs his head in shame.
And his scripts never mention running a Shoe Test, maybe that's a special input that Hexawise has, Shoe/Sock/Barefoot ?
He makes a note to make sure he wears fresh socks every day and to wear his best shoes in case he ever has to run this test.
The test lifecycle he is following also makes no mention of when, or how much, creative and critical thinking to do. No worries - he's now learned his lesson and will make sure to include this ( once John Stevenson has detailed what the Best Practice for this is, and precisely how much of each to do in each phase ) in his next Test Plan.
Damn, forgot that Test Plans were dead.
Ah well, stick to logging bugs - oh noes, his defect tracker doesn't have an "unclassified" field for severity. Should he ask his manager to add one?
Undaunted, the valiant noob tester decides to work on his skills ( other than the 'not giving his opinion' one already noted )
But which ones ?
Technical skills, of course, otherwise he'll end up as dead as the Test Plan - but taking care to find a balance, as Manual testers rock and are the heart and soul of testing.
So to balance the technical web testing course he's signed up for: maybe some sociology, along with some whisky, cigars and a history lesson in using soft skills with Hiram Ulysses Grant, US Civil War General.
After all this the noob tester still considers himself lucky - what about his colleague doing UAT ?
Seems that UAT is the hardest testing to do.