Sunday, 21 December 2008
Highlights of 2008 for me
Daughter graduates from uni and gets herself a cool job
Found out I had a hidden talent at doing presentations - done several at my company's monthly meetings which have been very well received
Became a moderator of the Software Testing Club which has now grown to over 2000 members. Got used to deleting posts every day asking for ISTQB sample papers
Been accepted as a delegate to Software Craftsmanship 2009
Met Gojko Adzic after a talk at SkillsMatter, looking forward to reading his new book on Agile Acceptance Testing
Started an OU course on Management which has started to open my eyes on what management is about
Jerry Weinberg wrote a new book on testing
Added far too many blogs to my blog reader
Printed out far too many pages to read
Bookmarked too many sites that I'll never refer to again
Started using Twitter and found like many others that it can be useful
Started blogging again - thanks to everyone who reads it, has linked to it and left comments
Saturday, 20 December 2008
Friday, 19 December 2008
During Bj Rollison's talk on "How We Test At Microsoft" he showed one slide that listed the tools they used
- Code Analysis
- Dynamic Analysis
- Monitoring Tools
- Fault Injection
- Harnesses and frameworks
- Project Management
- Bug Management
- Reporting Systems
- Perf/Stress tools
- Source Control
- Data generators
- Productivity tools
- Add-ons and enhancements
- Visual Studio Team System
- Many, many more
All very impressive, except for a little nagging feeling that maybe it was a sign that something wasn't quite right with the way we build software, in that we need more and more tools
An opinion echoed in a blog by a programmer struggling with productivity
For his "one page" web application he needed
- ASP.NET 2.0 framework classes
- ASP.NET AJAX (.NET 3.5 FCL)
- ASP.NET AJAX Control Toolkit (Tab and Hover controls)
- .NET 3.5 Windows Communication Foundation (WCF) framework classes
- Visual Studio 2008 IDE
- MDbg CLR Managed Debugger (mdbg) Sample application and API framework
- .NET 2.0 Winforms framework classes
- Microsoft Agent SDK (API framework)
- Microsoft Remote Desktop Web Connection Active X component
- Google Earth application
- Prototype.js open source AJAX framework
- IronPython 2.0B2 and Dynamic Language Runtime (DLR) with IronRuby support
- IronPythonTextbox – open source IronPython rich client text box and interpreter
- Color List Box – open source WinForms modified List Control
- RealWorldGrid – open source ASP.NET modified GridView control
That's an awful lot of technology to learn and master
( and install ! )
Monday, 15 December 2008
There wasn't a test environment as the DBAs were never free to set things up - they were too busy firefighting the problems being found in the live system.
And of course there were problems in the live system because insufficient testing was being done because of the lack of a test environment
Lots of vigorous nodding of heads from other delegates there ( including mine )
Sadly though, the delegate wouldn't have gone away with any answers - she would have learned how MS recruits its testers, how exploratory testing is over-rated, what things to consider when deciding what level to write her test cases to and how it was impossible to not visualise a pink elephant when told not to
She did stress how everyone was positive where she worked, there was no developer/tester clash - which maybe was one clue to the problems. No-one wanted to be seen as negative by pointing out the problems or saying that one day the technical debt should be paid off ( or at least pay off more than the interest payments ) - and unlike the auto makers and banks, there won't be the possibility of a government bailout
Post your suggestions for breaking this cycle - and I'll see if the good members of the Software Testing Club have any ideas
Wednesday, 10 December 2008
Opening keynote was from Bj Rollison on Exploratory Testing Exposed where he tried to counter some of the claims of ET experts about its effectiveness. He produced some figures from 7 years of study ( I'd like to see more details about this ) that showed there was no real evidence to prove the case, and that ET was more likely to find behavioural issues and scripted tests more likely to find technical defects - though this seemed to be based on an assumption that ET was almost always done through the UI.
The summary ?
"The overall effectiveness of both exploratory testing and designing effective scripted tests depends heavily on the individual tester's professional knowledge of the system and testing!" - in other words a great tester can write great test scripts and do great exploratory testing, a poor tester will be bad at both ?
Sadly I had to miss the Testoff being held by Stewart Noakes and his PEST team - saw Stewart afterwards and he said he'd had a good chat with Bj....
Instead I attended a talk about context driven test documentation which gave some ideas on what level to write your test scripts to. The answer is ( of course ) that 'it depends' but the talk gave some useful ideas on what factors to take into account when working out what it depends on
Dot Graham then gave a talk on the Three C's that a tester needs - Criticism, Communication and Confidence. Nothing really new in it for me ( the Satir model was touched upon but after reading Mr Weinberg I know all about that ) but I suppose the fact that it was nothing new showed how much I've learnt over the last couple of years
The afternoon workshop was about Soft Skills for Testers but it didn't get off to the best start when the presenter asked us NOT to think about a pink elephant on the ceiling as an example of how it was impossible to not process a negative.
Followed by a story about a dog on a porch whining because he was lying on a nail that wasn't painful enough to make him move - and I was beginning to feel the need to move. But I learnt about time management ( make a priority list and act on the top priority ones first ) and how to handle email overload ( delete the unimportant ones ).
( whoops, don't think I listened closely enough to Dot - being sarcastic is one way NOT to give criticism )
The session did get interesting as it turned into a discussion about whether learning to act and using techniques to get your way was ethical
Would be interesting if a program manager and a test manager went to the same session and then tried the techniques on each other to either ship or delay a release...
A quick presentation on DbFit made it look like it was something to investigate sometime
Final keynote was Bj Rollison again with How We Test at Microsoft ( he and James Whittaker really seem to be out and about at the moment ( James is presenting at the next SIGIST in March ) and there is the MS Testing Book out ( still waiting for Amazon to ship my copy ) )
An interesting talk that might provide some future blog material
Met some faces from the Software Testing Club and also some new people - the networking aspect is a great reason to go to the event
Some interesting ideas to think about - watch this space !
Friday, 5 December 2008
Struck a huge resounding chord with me as I am such a big reader. His blog also linked to an article which suggested holding book reviews where people at a company would get together to read a book and work out how the ideas and principles could be applied to their organisation. Seems a great idea and one I am going to try and get up and running
( if I have time from helping out with the Software Testing Club book )
Then read the latest Rands in Repose blog which was also about books and his belief that there can never be enough books - and I really liked the end of his post where he is putting the proceeds from any T-shirt sales to a nonprofit organization that helps give children from low-income families the opportunity to read and own their first new books.
That doesn't seem to be my problem ( not enough books ), the problem I have is never enough time to read them all
Currently on pre-order I have
How We Test Software at Microsoft (PRO-best Practices)
and Lisa Crispin's new book
Agile Testing: A Practical Guide for Testers and Agile Teams
My Amazon wish list is now at 5 pages and has books ranging from Corps Business: The 30 Management Principles of the U.S. Marines ( supposed to be a great book on Agile Development ) to Adrenaline Junkies and Template Zombies: Understanding Patterns of Project Behavior ( I've definitely worked with both types ) and Dealing with People You Can't Stand: How to Bring Out the Best in People at Their Worst has to be a useful book.
I still like to keep my programming skills going so have my eyes on Scripted GUI Testing with Ruby
One book I read a while ago and really liked was The Goal: A Process of Ongoing Improvement which introduced me to the Theory of Constraints so a book about Thinking for a Change: Putting the TOC Thinking Processes to Use sounds a good read
I really need the Xmas holidays to be a month long...
What's on your Xmas reading and wish list ?
Monday, 1 December 2008
Sent in an application for a place at the Software Craftsmanship 2009 Conference in February. Should be interesting, I've been following Jason Gorman's writings via TestingReflections for a long time now so I'm not expecting this conference to disappoint.
Hopefully I'll get to attend, if not I'm sure there'll be a lot of good material coming out of it
Next week I'm off to the BCS SIGIST conference - 'The Multi-Skilled Tester'.
Bj Rollison ( aka I. M. Testy aka Testing Mentor ) is giving the opening and closing keynotes, I'm booked on a workshop on Soft Skills for Testers and I'm hoping to meet up with people from the Software Testing Club and SQAForums
Full report once I've been
Wednesday, 26 November 2008
Wonder how many have, wonder how many companies even thought about this happening and had plans for it - or will it be a minor Y2K-type problem...
I'll be scanning the news to see how many people get bitten by it and how many unforeseen implications there are
Tuesday, 25 November 2008
Just a nice reminder that the testing field is wide open
And full of great intelligent people wanting to talk about it
Monday, 24 November 2008
Typemock are offering their new product for unit testing SharePoint called Isolator For SharePoint, for a special introductory price. It is the only tool that allows you to unit test SharePoint without a SharePoint server. To learn more click here.
The first 50 bloggers who blog this text in their blog and tell us about it will get a full Isolator license, free. For rules and info click here.
One of the '-ities' I rarely see discussed is maintainability
Are companies so focussed on getting Version 1 to work that they don't think about what happens if it's a success and has customers clamouring for Version 2 ?
As an ex-programmer I am embarrassed about some of the code I left behind and whenever I feel my ears burning I know that some poor programmer is trying to work out what the heck I had for breakfast that day to produce such a mess
Taking over someone else's code was one of the bugbears of being a programmer
Was the code a complete nightmare where you just wanted to start from scratch and rewrite it all ? Or had the code been written by an uber-geek using advanced programming concepts that you'd never come across and you didn't even know which file to start looking at and even if you did the contents seemed to be written in some alien language ?
Given that most projects spend most of their life in maintenance mode then shouldn't testing to see if the code can be easily maintained be a priority ? Or at the very least, something that is thought about ?
Maybe I've just been unlucky and worked on the wrong projects where it's never done but the regular readers of The Daily WTF seem bored with having horrific code samples sent in
Anyone considering maintenance, please leave a comment
And I'll see if anyone on the Software Testing Club does it
Thursday, 20 November 2008
I borrowed the idea of "how many points in a 5 point star" from Gojko Adzic ( who in turn got it from Jerry Weinberg's book Exploring Requirements: Quality Before Design ) and used it to demonstrate how hard it is to get requirements right
( The answers I got were 5 and 10, no strange values like 11 or 15 as Gojko got )
The presentation seemed to go well, I avoided Death By Powerpoint ( Presentation Zen has given me some good ideas as has sitting through many deathly dull ones ) but I got the ending wrong by ending with a question.
Which got no response
I should have taken the advice of an Open University lecturer on my Management Challenge weekend. He did some amateur dramatics so had a lot of tips and tricks for improving presentations, and he said that one guaranteed way to get applause when you end was to wait 5 seconds and then use a rising 'rule of three' phrase. The classic example of this was Tony Blair and "Education, Education, Education"
I should have finished with "Requirements, Requirements, Requirements"
But good learning for next time...
Thursday, 13 November 2008
Grab the nearest book.
Open it to page 56.
Find the fifth sentence.
Post the text of the sentence in your journal along with these instructions.
Don’t dig for your favorite book, the cool book, or the intellectual one: pick the CLOSEST.
Well, page 56 was blank so I'll use page 57
FitNesse allows you to view, change and create Web pages containing documentation and test tables that are to be run by Fit
"Fit for Developing Software. Framework for Integrated Tests"
Rick Mugridge, Ward Cunningham
Tuesday, 11 November 2008
Determined to make a good impression he sent round an email promising a one-to-one meeting with all the testers on the team to find out their skillsets, problems and targets
The email had a mixed response
Eric the Enthusiast felt excited, maybe at last the project would start to really get moving and he jotted down his ideas in his notebook
Clive the Cynic scoffed at Eric and told him it would never happen, he'd seen and heard it all before
Frank the Fence Sitter chuckled at Eric and Clive's squabbles; he had a little hope that Eric would be proved right but also suspected that Clive would be vindicated
Neville's first week went in a blur as he learnt about his new project and he sent out an email apologising for not meeting with everyone but it was firmly on his agenda
Eric felt disappointed but still made another addition to the list in his notebook and tried not to look at Clive
Neville thought his second week had gone much better, he'd written some reports for the senior managers and had useful meetings with other managers on the project. Deadlines were tight but once he'd got his team motivated....
He started to draft his email
Eric's notebook had stayed in the drawer all week and instead Eric and Clive got together to draw up a sweepstake on when their meetings with Neville would happen
Thursday, 30 October 2008
I posted to the discussion about how it was a tester missing a bug that led to me becoming a tester and then I thought about other bugs that had been memorable to me
The 15 second bug
Having established a reputation as someone who could break a program very easily, one day one of the programmers, feeling quite confident about his code, challenged me to see how fast I could break his latest release.
He was somewhat ashen-faced when 15 seconds later I had a crash to show him. It was nothing special - one of the old tricks from the tester's bag, leaving an input field blank - and I knew this particular programmer had a history of doing that ( he never seemed to learn )
Why was this memorable ? Because it established my credibility with the programmers ( I did it in front of a few of them, just like a magician ), they wanted to know how I did it so fast and some of them began to learn how to do some testing themselves - it was the start of getting them test infected.
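The blank-field trick from the story above is easy to show in code. A minimal sketch, assuming a hypothetical quantity field parsed with Python's int() - the function names are made up for illustration:

```python
def parse_quantity(field_text):
    # The careless version: assumes the field is never left blank.
    # int("") raises ValueError - the kind of instant crash described above.
    return int(field_text)

def parse_quantity_defensively(field_text):
    # The defensive version: a blank field becomes a validation result,
    # not a crash.
    text = field_text.strip()
    if not text:
        return None  # caller can report "this field is required" to the user
    return int(text)
```

Leaving the field blank crashes the first version and is handled cleanly by the second - the same 15-second check, automated.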
The Competition Bug
This was when I got my first inkling that I was a tester. I was in a team of programmers working on a new system and one of the programmers found a bug in my part of the code. Much teasing and pointing of fingers. So of course I had to be childish and retaliate and try to find a bug in HIS code. Easy. Then I found another. And another. And another and soon he was begging for mercy.
And we were wondering why the test team weren't finding these bugs so I ended up spending more and more time testing rather than programming.
Sadly at the end of the project I was moved back to programming but it was something I remembered a few years later when I was considering a career change.
( Good thing I wrote code with bugs in it !! )
So, bugs don't always have negative consequences
Sunday, 26 October 2008
Finding one means that you have some evidence to show that it's worth having a tester on your team, that's one fewer defect a customer is going to find ( assuming it does get fixed ) so rah rah RAH to the tester team.
The downside is that it means there is a defect in the product; once again it has been shown that humans + software writing = mistakes
In a recent blog entry, James Whittaker was trying to explain to his son which part he did and was unable to give him a satisfactory answer.
Which led me to thinking about one of the common 'testing is like...' analogies - that being a tester is like being a detective, such as this one
Do detectives/police have the same bittersweet moment when they get called to a crime scene ? Do they get so involved in the process of the work, collecting evidence etc, that they don't see the bigger picture and wish there were no crimes for them to be called out to ? Do they wish they could tackle the root causes of the crime rather than the aftermath ?
Or are they happy to go off and " create a GUI interface in Visual Basic, see if I can track an IP address."
Friday, 24 October 2008
Additionally, this would include an interface that lacks grammar and spelling issues; any time I see those, I just assume the developers are as code-illiterate as they are English-illiterate and that logic defects aren’t far below the surface
This does seem to be a common reaction - if the UI looks bad then assume the underlying code is also as bad ( or worse )
I posed the question on Twitter ( if you see a bad UI do you assume the code is bad ) and got a couple of responses
I would certainly question the quality of code and perhaps the priorities of the coders/testers
I cringe. I fear for the level of overall quality if no-one noticed it. It suggests sloppiness and a lack of pride in the work.
It can be a good heuristic to use but it's not always valid
From my dark days as a programmer there are two situations where it doesn't tell the full story
For a lot of programmers it's all about the code - grudgingly they will fit a UI onto the top of their code so that mere mortals can use it but it's not their top priority. At one company there was always the promise that a professional UI designer would be brought in to take care of the UI but that never happened. So we'd sling together some rough prototype, ask for some feedback ( which never arrived ) and then the code would ship
It could indicate that the management of the company didn't rate quality highly enough to pay for a UI designer ( and tester ) but as a measure of code quality it wasn't a fair indication.
Alternatively, there is also the case that some companies rely on smoke and mirrors and will put a large amount of effort into making the UI look slick and polished ( especially when there is an upcoming trade show to demonstrate at ) and pay little attention to the real functionality behind it
And maybe the UI can put a slight bias on testing efforts - if the UI is sloppy then there must be bugs to find, if it's slick then maybe, just maybe, you won't try as hard. With the increasing number of programmers using unit testing, the correlation between poor UI and poor code is not as fixed as maybe it used to be
Anyway, it was about time I had a blog post with 'heuristic' in the title
Thursday, 23 October 2008
Monday, 20 October 2008
A few weeks ago I was fortunate enough to visit Venice. The guidebooks warned that it was easy to get lost in Venice so I tried to prepare myself with maps, directions, itineraries.
Didn't help, within minutes of being dropped off by the water taxi we were lost.
Wandering round we'd often end up back where we started or totally lost again.
Impossible to walk in a straight line as there are so many turns and twists, canals and bridges, and long Italian place names - so unless you have a life-size map or are at one of the main sights it's really tricky to find yourself on a map. And the narrow alleys and buildings mean there are no landmarks to spot to help
So what has this got to with testing ?
A number of analogies came to mind
It could be used to argue against the waterfall approach - all the careful planning I did just didn't help very much when I was faced with the reality of Venetian streets.
It could be used as a good example of the dangers of ad-hoc testing, wandering off without any plan or direction meant going round in circles. Adopting a fully scripted approach of following directions exactly would not only have been incredibly tedious ( checking where you were every 5 yards ) but would have meant missing out on some great discoveries when we did wander off the direction we were meant to go.
We found that the best approach was to establish a general idea of where we were going and check our progress every so often.
One thing I have found when looking back at my trip is how it reflected my Myers-Briggs personality type. I'm pretty much an ISTJ, I like things organised, so initially Venice was a shock to the system.
As are software projects utilising the CHAOS methodology where there is no order - but knowing how I react to them helps me cope and get to work in making them less chaotic
Wednesday, 15 October 2008
Which led me to finding the bug shown below - there were 10 requests to join, I approved 4 which should leave 6
I then found a bug using Twitter, hitting the Update button was taking me to a page of a user called "update" as shown below. Sadly I didn't have the time to investigate it in any depth but I was having one of those days where everything I touched broke.
Nice to know I haven't lost my touch
Monday, 13 October 2008
It is more and more common to read blogs from developers talking about testing and so I'll try and make it along to the next DeveloperDeveloperDeveloper! Day so I can listen to talks from people like Ben Hall talking about Microsoft Pex - The future of unit testing?
Sadly I was never exposed to any of this in my development days and took the classic code-release-fix approach and was always surprised when bugs were found in my code.
The situation outlined in this blog - Are your applications ‘legacy code’ before they even hit production? was all too familiar.
But if you don’t understand what was wrong with the last project you worked on, you’ll be doomed to repeat all of its mistakes. Even with the best of intentions, new legacy code is written, and without knowing it, you’ve created another maintenance nightmare just like the one before it.
Though after reading Jerry Weinberg's Perfect Software and other illusions about testing, maybe I was in the era when devs didn't do testing. Jerry says
That's why I'm a strong advocate of the test-first philosophy, whereby developers write their tests to include expected results before they write a line of code. It's what we did fifty years ago, but the practice was gradually lost when industry trends separated testing from development.
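The test-first idea Jerry describes can be shown in a small sketch - hypothetical function and test names, with the test ( expected results included ) written before the implementation:

```python
import unittest

def discount(price, percent):
    # The implementation - written only after the test below existed
    # and was failing.
    return round(price * (1 - percent / 100.0), 2)

class DiscountTest(unittest.TestCase):
    # The test came first, expected results and all.
    def test_ten_percent_off(self):
        self.assertEqual(discount(100.0, 10), 90.0)

    def test_zero_percent_changes_nothing(self):
        self.assertEqual(discount(59.99, 0), 59.99)
```

Run with `python -m unittest` - watching it go from red to green is the whole point.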
Wednesday, 8 October 2008
It was a phrase that I happened to come across during some surfing; my first reaction was to dismiss it as marketing bumf but then it led me to think about whether there was a Holy Grail of testing
A quick Google search reveals lots of discussions about the Holy Grail of S/W Development ranging from real-time feedback, simplicity, Software Factories, Software Reuse, “getting to zero” defects and security vulnerabilities and the ultimate - a level 5 score in SEI evaluations
So if the software developers can't agree on what their Holy Grail is, how can we testers test to see if they have it ?
Thursday, 2 October 2008
I was, however, reading a blog post titled In A Web 2.0 World, Quality Is Irrelevant
The author was not writing about Twitter uptime or Facebook apps crashing, he was writing about traditional journalists adapting to the new Web 2 world with a different definition of quality.
Still, I'm not in full rosy concurrence with the idea that we should kick quality completely to the curb. For one, it's not that quality doesn't matter -- it's that the definition of what constitutes quality is changing. The old idea that quality is defined by editing an article six ways from Sunday so that it's denatured of all passion and advocacy, and so that it has every freakin' semicolon and middle initial in the correct place -- that's what's dead
Testers, too, can struggle with different definitions of quality.
A release with a known defect can be the equivalent of a missing full stop in a story - hard to let it go and say it doesn't matter, no matter how many times you repeat that "testers are not the gatekeepers"
Wednesday, 24 September 2008
The giveaway seems to have been a comment in the code
// Completely undocumented from Microsoft. You can find this information by
// disassembling Vista's SP1 kernel32.dll with your favorite disassembler
Usually code commenting is A Good Thing - is this a case where it was actually A Bad Thing ?
It did remind me of a tester I interviewed who said he did code reviews. When I pressed for further details he said that he didn't actually read or understand the code, he just looked through it to make sure there were comments in it...
But at least there were probably comments to be found, one of my first jobs when I joined the s/w industry was to go through a huge pile of Assembler code and document how it worked as the code was written by a third party and the contract didn't state that there had to be code documentation
And finally, be careful with comments or you could waste 357 years
Tuesday, 23 September 2008
It was not a new idea, by now everyone must know the theory that the later in the lifecycle a bug is discovered the costlier it is to fix
What did get me thinking was one of the comments to the post where the question was raised about companies not rewarding and recognising employees who find and fix the defects at an early stage.
I'm sure all testers know the about-to-ship squeeze with late night and weekend bug blitzes and fixes and managers buying in pizza for the testers.
If a defect does slip out to a customer then there's the chance to be a hero and either find a workaround or come up with a bug fix and patch
Does anyone get rewarded for finding the defects early ?
Do all companies wait until the end to do the big push to get the defects found and cleared ?
Monday, 22 September 2008
A NNPP is a programmer who
"inserts enough spoilage to exceed the value of their production"
It made me wonder if there were NNPTs, Net Negative Producing Testers, and then I read a discussion on the Software Testing Club site about low quality bug reports with the great title of
If I were a developer...... I'd hate you too
Having been on the wrong end of such bug reports when I was a developer I knew just what he meant - and then I thought about the amount of time wasted trying to find out what the bug was and how to reproduce it
It wasn't only poor bug reports that wasted time. The majority of the bug reports were low-hanging fruit ones, easy to find ( and fix ) but trivial in nature - tabbing order, buttons misaligned etc. There was no bug triage in place to sort them into priority order so it meant either reading through the entire bug list to find the severe ones - or most commonly just starting with the first on the list and working through it
The result - lots of time spent fixing minor bugs and the customer finding the severe ones.
This then led to management wondering what the point of testing was if they were unable to find the bugs that mattered - why not send it straight out to the customers ?
Make sure the developers hate you for the right reasons and make sure your efforts are helping the project, not hindering it
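The missing triage step described above could be as little as a severity-first sort. A sketch with made-up bug records and an illustrative severity scale:

```python
# Illustrative severity scale - lower rank means fix it sooner.
SEVERITY_RANK = {"critical": 0, "major": 1, "minor": 2, "trivial": 3}

def triage(bug_list):
    # Severe defects float to the top, so fixing no longer starts with
    # whatever happened to be filed first on the list.
    return sorted(bug_list, key=lambda bug: SEVERITY_RANK[bug["severity"]])

# Hypothetical bug reports, the mix of low-hanging fruit and real
# problems described in the post.
bugs = [
    {"id": 1, "summary": "Tab order wrong on login form", "severity": "trivial"},
    {"id": 2, "summary": "Data lost when saving large files", "severity": "critical"},
    {"id": 3, "summary": "OK button misaligned", "severity": "minor"},
    {"id": 4, "summary": "Crash on empty input field", "severity": "major"},
]
```

With this in place, the data-loss and crash bugs get fixed before anyone worries about tabbing order.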
Friday, 19 September 2008
Or I could remind people about the The Pirate Heuristic - "when you run out of ideas - steal someone else's"
Then I found that I couldn't make developers walk the plank as some of them do Pirate Testing themselves
Didn't find any "testers are like pirates because..." analogies to add to my testing cliches list
Any ideas ?
Monday, 15 September 2008
On the SCRUM mailing list I read about a new book that was in progress -
" Succeeding with Agile: Getting Started and Getting Good " by Mike Cohn
Went to the books website and was disappointed to read that
“The publisher has announced a publication date of June 4, 2018”
That’s 10 years away, doesn’t seem very agile to me...
I’ve also been disappointed with another book - the latest one from Jerry Weinberg
Perfect Software And Other Illusions About Testing
I have had it on pre-order with Amazon for ages. Seems people who were at the CAST conference were able to get their hands on a copy but I'm still waiting and waiting and waiting...
Hopefully Lisa Crispin's next book on Agile testing won't run into any difficulties
Good thing there's plenty of blogs to read
Friday, 12 September 2008
It sounded like an interesting, useful and cool tool but sadly the article said things like
"Klocwork [Insight's] static analysis takes the runtime burden away from engineering and QA,"
"However, if engineers are able to see and fix their own code, they are able to preclude that defect from ever being seen by QA or customers."
Surely the use of such a tool is part of a QA process and means that defects picked up by the tool won't be seen by QC ?
Yes, the old QA is not QC argument...
One I had many times at my last company, trying to educate people into why saying they were giving the program to QA to be QA'd was wrong and why I wanted directory names on the test server called "FOR_QA" to be renamed as "FOR_QC"
And now that I've blogged about it I can join the list of QA/QC bloggers such as
John McConda and Antony Marcano ( and again ) on testingReflections
The Braidy Tester
For my next post I need to think which "testing is like..." analogy I can use as that is also a common theme of a testing blog.
Detectives, surgeons, kung-fu, playing pool, driving a car, introducing a guest in your home, dishwashing, a box of chocolates, flossing teeth, growing turnips, toilet paper, hide and seek, marriage and Magic The Gathering have already been done so I'd better get thinking hard if I'm going to come up with a new one
And finally, I gave this post the title 'testing cliches' which is a good example of the problems of the English language and how it can be ambiguous
Does "testing cliches" mean a list of cliches that apply to testing ?
Or does it mean that cliches are being tested to see if they are true ?
Tuesday, 9 September 2008
( which incidentally was one of the first testing websites I found when I was making the switch from dev to test)
It was an interesting talk, dealing with a lot of the misconceptions and objections to Agile, how and why to write and use User Stories, use - or not - of a bug tracking system in an agile project, skills an agile tester should have etc
There were 2 great things about the talk
1 - Antony has been involved in XP/Agile for 8 years and so his talk was full of real life cases about how it can be made to work
2 - The passion he shows for the subject. Even after 8 years doing it he is still so enthusiastic about it and admitted that he is still learning
The talk and Q&A session was an hour and a half but it could easily have gone on for several hours
After the talk some people moved onto the pub where I was able to meet and talk to another of the attendees - Gojko Adzic
A pleasure to meet him and great to listen to him and Antony enthusiastically debating ways of running automated tests that could be written in a customer friendly fashion.
The talk was given the wrong title though - it was called "In The Brain Of Antony Marcano", it should have been "In the HEART of Antony Marcano"
( the talk was being videoed and should be available soon )
Wednesday, 3 September 2008
You can apply for your Visa online, the website is
Initial impressions weren't too good when we had to choose the current state of residence of my partner and found Massachusetts spelt as "Massachusettes"
My testing senses then tingled some more when I saw screens like the one below
As there is no option on this screen to enter any text, why bother to have the instruction that all questions must be answered in English ?
No serious bugs though so we were able to complete the application and got the visa
Once we were back in England the next step was to get a National Insurance number; this is done at your nearest Job Centre, and there is a website to help you find the nearest one.
All you had to do according to the on-screen instructions was enter your town/city and press the 'Find' button
So I entered 'Bracknell' and was told that "Bracknell was not a valid postcode"
Enter my postcode and voila, the location of my nearest centre
All of which led to this sequence of thoughts
My initial thought on finding these problems was "Who tested this ???"
Thinking more, it was possible that the defects had been found but a decision had been made not to fix them. They didn't stop the programs from working. It would be nice if government websites had higher quality standards though.
For the postcode/town problem, what was the real defect ? The spec could have been that a customer enters a postcode and that it was the label telling you to enter a town that was wrong.
Or maybe the tester was so used to entering a postcode when prompted for a location that they didn't pay attention to the label.
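Either way, the form could simply have accepted both kinds of input instead of rejecting one. A minimal sketch of that idea (the pattern and function names here are my own invention, not anything from the actual site):

```python
import re

# Rough pattern for UK postcodes, e.g. "RG12 1AB" (outward + inward code).
# This is a simplification; real postcode validation is more involved.
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.IGNORECASE)

def classify_location(text):
    """Decide whether the user typed a postcode or a town name."""
    if UK_POSTCODE.match(text.strip()):
        return "postcode"
    return "town"

print(classify_location("RG12 1AB"))   # postcode
print(classify_location("Bracknell"))  # town
```

A site that classifies the input like this can route it to the right search rather than telling the customer their town is "not a valid postcode".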
What if I was a uTest tester and reported the postcode defect as the application not recognising "Bracknell" ? Would my bug be rejected, earning me no money, because the actual defect was that the label asked for a town ? Unless I have access to the requirements, how would I know which part was correct ?
Having heard James Whittaker's talk on the future of testing ( which he has now blogged about ) and how crowdsourcing could work then maybe I was proof of the concept - I was part of the crowd and I had found some bugs.
Except that I wasn't going to take the time to report them.
In the future would all programs have a 'report this bug' button ?
Would The Crowd bother to press it ?
Who would the bug report go to ? The website owner ? But what if it was a browser bug or operating system bug - should they be sent all bug reports as well ?
And what happens when the bug report arrives - the tester of the future gets a virtualised copy of my system, a log of my actions - but could still end up looking at the screen and wondering exactly what the problem was and what bug I was reporting.
Or maybe I just think too much and shouldn't be such a grumpy old man when I find mistakes....
Friday, 22 August 2008
Reading it gives me a nice sense of job security...
One of the recent stories on there was an app that was running slowly due to a DB with no indexes or Primary keys
Same thing happened at my last company - a dev working on his own without much DB experience, no senior guy reviewing his work and no performance testing ( that was what customers were for )
Customer complains the program runs like a dog, senior guy looks into the problem and rocks back and forth on his chair laughing like a maniac and pointing at the screen when he finds out the cause.
The small lesson learnt was that performance testing would be a good thing to do.
Another solution would have been to impose a formal code and design review.
However, the main lesson I learnt was about communication - if the senior guy had taken a moment and some interest in what the other person was doing it could have been picked up before it was shipped.
( and I wonder if a devious company could ship their DB like that and when the customer complains about the speed then perform the indexing, ship V2 and hey presto, great increase in speed and a happy customer.... )
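The missing-index problem is easy to reproduce and measure; here's a small sketch using SQLite (the table, data and sizes are invented for illustration) showing the kind of speedup that customer would have seen:

```python
import sqlite3
import time

# Build a table with no index on the column we search by.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 ((i, f"cust{i % 1000}") for i in range(200_000)))

def lookup():
    start = time.perf_counter()
    conn.execute("SELECT COUNT(*) FROM orders "
                 "WHERE customer = 'cust42'").fetchone()
    return time.perf_counter() - start

slow = lookup()   # full table scan
conn.execute("CREATE INDEX idx_customer ON orders (customer)")
fast = lookup()   # index seek
print(f"no index: {slow:.4f}s, with index: {fast:.4f}s")
```

The "hey presto" fix in the story is exactly the `CREATE INDEX` line - no application code changes at all.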
Friday, 15 August 2008
Over on SQABlogs, Peter Nairn was having trouble finding good testers.
Down under in Australia, Dean Cornish seems to be having exactly the same problem
Linda Wilkinson was also getting depressed with the resumes she was reading
Steve Rowe thought we needed a better way to test and in response James Bach thought we needed better testing bloggers ( here I am ! )
On the UK Test Management forums there was a topic posted wondering if there were enough testing resources to cope with the dark side of SOA
Last month I found another blog wondering about the state of the testing industry, and I thought we needed a better understanding of where we came from
I provided him with a link to Lee Copeland's Nine Forgettings talk - which, as I pointed out in a previous blog, is a talk he has been doing for at least the last 2 years
On the other hand, over in Austin there doesn't seem to be the same lack of resources and on TestingReflections Antony Marcano thinks the developer-tester-analyst roles are getting blurred and competency in all three will be required.
From my personal experience as a developer I found it hard to break into the testing industry but persistence paid off.
Interesting times to be a tester for sure
Monday, 11 August 2008
First I read that the Large Hadron Collider Begins Testing - subatomic particles smashing into each other at virtually the speed of light creating between 600 million and 1 billion collisions each second.
I'd love to see the test plan, test cases and test data for that beast !
The stats are just mindblowing - as are the pictures of the machinery involved, some awesome ones can be seen here
Then I read the latest entry of QA Hates You ( 'hate' is such a strong word though ) with yet more examples of how small details are missed out.
Seems we can track 1 billion subatomic particles travelling at 186,000 miles a second but checking for a missing 'www' is too hard...
Thursday, 7 August 2008
The agenda for the next BCS SIGIST in September has been announced.
The opening keynote is by Lee Copeland - "The Nine Forgettings"
It's a really good talk, I know because I saw him do it at the SIGIST in September 2006
Does this mean that Lee is short of material ?
I doubt it, I'm sure he could talk for a week and not repeat himself
Could it be that the message is still valid 2 years on and nothing has really changed since then ?
I know which my money is on.
If you can't make it to the event then a video of it is here
( or a PowerPoint can be found here )
Wednesday, 6 August 2008
On the BA site, if you decide to register then you are presented with a plethora of options for the title you can choose from
Admiral, Air Vice Marshal, Crown Prince, Her Majesty, His Holiness, Marquis, President...
Now I don't mix in such circles but I'm finding it hard to picture a Crown Princess logging in and entering her email address and choosing a password. Perhaps they do, if anyone can enlighten me then please do so.
It did make me wonder where the list came from - was it really a spec to have all of these or was it the developers adding some gold plating ? Or having a joke ?
There's no consistency between the airline sites, the Crown Princess wouldn't be able to use her title on the American Airlines site but AA does offer the options of Speaker and Swami which BA lacks. AA also offers a 'Eur Eng', BA offers a 'Eur Ing'
Just plain old Mr for me
Sunday, 3 August 2008
I have been giving something back to the tester community and trying to repay the help I got when I was making the career change from developer to tester.
I'm now helping Rosie Sherry with the Software Testing Club; it has been running for a while and recently the numbers have been increasing rapidly so she needed some help. I try to keep the signal to noise ratio high and make sure the requests for ISEB materials and vague one-line general questions don't get in the way of interesting discussions.
The Club also has a group on LinkedIn and this has a regular supply of new members that need approving with the occasional spammer that needs removing.
It has made me wonder what all these testers are looking for though. Only a very small proportion of the members join in or start discussions - unless there's a free Fail Whale T-shirt to be won, in which case everyone wants to join in.
Also met up with four London based testers via SQA Forums, nice to put faces to names and swap stories about bugs
Thursday, 31 July 2008
Guest speaker was James Whittaker, a very entertaining speaker, with examples of bugs in the wild, how he became the top cheater on X-Box games and running through Bill Gates's house to test it.
He also outlined Microsoft's vision of the future and how testing fitted into that vision; hopefully he will be putting more details onto his blog.
I got him to sign my copy of How to Break Software, next stop is eBay...
This was followed by a workshop on multi-vendor testing with special regard to PCI and the TJX credit card loss.
Paul Gerrard then raced through his Test Axioms in a Presentation Zen style
The drinks breaks were a good chance to network and I got to meet Ben Hall who I had been emailing and Twittering with for a while, always nice to put a face to a name.
Also met the recruiter who got me my job with Acutest and a chance for me to say 'thanks'
and of course it wouldn't be a meeting full of testers if a bug hadn't shown its head - when trying to show a video, up popped a message to tell us all that Windows Media Player had stopped working
Friday, 25 July 2008
Last month was a company social event, sailing to Cowes. Some people were happy to sleep onboard the boats the night before, others with not so good sea legs preferred a hotel bed so I had to go and book 6 rooms
The hotel chain website only allowed a maximum of 4 from the dropdown for number of rooms
After a search through the Help pages I found I could do a Group Booking...
...for 10 or more rooms
So it had to be the manual option of a phone call
Slight Aggravation # 1
( well, OK, I could have booked 4 rooms then 2 rooms using the website but I wanted to be sure there were 6 available )
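The rules as I ran into them leave a hole between 5 and 9 rooms; a quick boundary-value sketch (the function is my model of the site's behaviour, not its actual code) shows how the standard checks would have exposed it:

```python
# The hotel site's rules as I encountered them: the dropdown handles
# 1-4 rooms and group bookings need 10 or more.
def booking_channel(rooms):
    if 1 <= rooms <= 4:
        return "dropdown"
    if rooms >= 10:
        return "group booking"
    return "phone only"  # the gap nobody designed for

# Classic boundary values around each limit expose the hole:
for n in (1, 4, 5, 9, 10):
    print(n, booking_channel(n))
```

Anyone testing the values either side of each boundary would have hit the 5-9 gap immediately.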
6 rooms booked, a booking reference number for each and the numbers and invoice would be emailed to me
Two days later, still no email
Phoned up, they checked one of the reference numbers, quoted my email address back to me
And it was wrong
Slight Aggravation # 2
And no, they couldn't change it and resend the email
Larger Aggravation # 1
I used the 'Contact Us' page to report my problem, sadly the response had this sentence
"We also tried searching using your email address (assuming you booked with this) but again, no results were found."
Of course there were no results as they had MY WRONG EMAIL ADDRESS and this was the problem I was reporting.
Larger Aggravation # 2 - read the problem the customer is reporting and don't just send a canned response
Fortunately, unlike the original operator, I had copied down all 6 reference numbers correctly and was able to put those into the website to generate an invoice
and that wasn't the end of the story - a few weeks later our email server caught some Special Offer emails from the hotel chain, to my wrong address of course
So make sure your testing includes the possibility that the operator gets things wrong - and that there is a process to correct these errors and not annoy the customer
and maybe I should have Twittered about my problems to see if the hotel chain was listening
Tuesday, 22 July 2008
Spotted this job ad and whilst I love my Friday morning fry-up I don't think it would be good for my health to have it 5 days a week.
Nor would it be good for my health to be working at a place where I was expected to be in work for breakfast, I much prefer to have it at home
Friday, 18 July 2008
Thursday, 17 July 2008
Yesterday was Graduation Day for my daughter, BSc in Computer Science and Management from Royal Holloway, University of London.
Just time for some photos, a celebration lunch and then off to Heathrow for a flight to New York to start her 8 week training at a well known global investment bank
She graduated in good company, at the same ceremony Whitfield Diffie was awarded the Degree of Doctor of Science, Honoris Causa
Tuesday, 15 July 2008
Usability testers will probably be choking when they read that their jobs can be done by reading the online chapters of Don't Make Me Think and using a $20 tool ( any choking usability testers care to comment on this ? )
His conclusion is that unit testing has become popular because most shops can't hire a proper QA person and hope devs can write their own tests - but he doesn't answer the question of WHY these shops won't hire one
Well, I do have a clue why from the last place I worked - it had the ethos of 'anyone can test' and the CEO was seriously thinking of sacking all the programmers because they kept writing code with bugs in it...
And he ends his article listing all the testing techniques that should be used - so testing is hardly overrated. I hope his presentation is better than his blog
Wednesday, 2 July 2008
I'd love to know if their measurement of comment density actually measures whether the comment is useful
# Adds 1 to i
i = i + 1
gets a higher score than
# This is a comment
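A naive density measure would score those two comments identically when attached to the same line of code; a sketch of such a metric (my guess at the approach - I don't know how the real tool counts):

```python
def comment_density(source):
    """Naive metric: fraction of non-blank lines that are comments."""
    lines = [l.strip() for l in source.splitlines() if l.strip()]
    comments = sum(1 for l in lines if l.startswith("#"))
    return comments / len(lines) if lines else 0.0

useful  = "# Adds 1 to i\ni = i + 1"
useless = "# This is a comment\ni = i + 1"
# Both score 0.5 - a density count can't tell a useful
# comment from a useless one.
print(comment_density(useful), comment_density(useless))
```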
I'd also like to know if it can cope with negative numbers as related in this tale from 1982
Worryingly it seems to be a finalist in some CEO awards
I suppose if it proves successful then we can look forward to Testmeter and Managemeter, and if software is an art then maybe it's a good thing there was never a Renaissancemeter - "not enough brush strokes, Leonardo!", "inefficient use of the chisel, Michelangelo!"
Sunday, 25 May 2008
And I should have a blog, Steve Yegge says so
With Testing Reflections still complaining that its filter list cannot be redeclared and the odds on Twitter being up about the same as getting #14 on roulette, there's a lot of material out there for a tester to write about
Plus I need somewhere to start my idea about celebrities doing the ISEB certification to get traffic and earn a fortune from AdSense