Tuesday, July 23, 2013

Notes from the SIGIT seminar: Exploring Exploratory Testing with Lee Copeland

After attending Lee Copeland's seminar on Exploratory Testing at the SIGIT conference on July 23, 2013, I thought about the best way to document my notes and decided in the end to split this post into two parts. The first part covers new thoughts and ideas Copeland spoke about (not all of them his own) that I intend to adopt and pay more attention to in my daily work. The second part describes how I intend to implement these ideas.

Part 1: Exploratory cheese via improvisation

The main gist of exploratory testing is that analysis, design and test execution are performed simultaneously, which makes exploratory testing a natural fit for agile testing. Copeland noted that the problem with the V Model is that we often start planning the tests at the point where we know the least about the system.

To emphasize this point, Copeland had us play a short game suggested by James Bach, known as "twenty questions". After we finally figured out (using more than twenty questions) that Copeland was thinking about "cheese", we all agreed that if we had written down our questions in advance, we would never have arrived at the answer.

This demonstration did a good job of convincing us that exploratory testing is indeed a powerful concept.
Copeland, however, went on to warn us that exploratory testing has its downsides:
• Our daily experience can make us "blind" to parts of our reality.
• Exploratory testing may be a novel method, but it is not always the way to go. Yes, it should be included in our toolbox, but we need to have other tools in that box as well.
To conclude this section, Copeland asked us to associate exploratory testing with improvisation, recalling the popular TV show "Whose Line Is It Anyway?".

Part 2: Schedule the session and do the test

(Since my team works with Jira, this part is based on that tool; feel free to apply the ideas to whatever tool you are using.)
During the seminar, I especially liked the idea of scheduling exploratory testing sessions, just as we schedule any other work in an iteration. Each session will become a task in our Jira project (and the task can also be used to log the results at the end of the process).
Copeland spoke of Bach's idea of performing exploratory testing in timeboxed sessions, known as session-based test management (SBTM). Jira has a great add-on for that called "Bonfire".
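As an illustration, here is a minimal sketch of how such a session task could be opened through Jira's REST API. The server URL, project key, credentials and charter text below are placeholders of my own, not something Copeland or the Bonfire documentation prescribe.

```python
# A minimal sketch: opening an exploratory-testing session as a Jira task
# via Jira's REST API. The base URL, project key ("QA"), credentials and
# charter text are placeholders - adjust them to your own instance.
import requests

JIRA_URL = "https://jira.example.com"   # hypothetical Jira server
AUTH = ("tester", "secret")             # placeholder credentials

def open_session_task(charter, timebox_minutes=90):
    """Create a Jira task that represents one timeboxed exploratory session."""
    payload = {
        "fields": {
            "project": {"key": "QA"},        # assumed project key
            "issuetype": {"name": "Task"},
            "summary": f"Exploratory session: {charter}",
            "description": f"Charter: {charter}\nTimebox: {timebox_minutes} minutes",
        }
    }
    response = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
    response.raise_for_status()
    return response.json()["key"]            # e.g. "QA-123"

# Example usage:
# key = open_session_task("Explore the login flow for session-timeout issues")
```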

Due to my military service as an officer in the IDF, Copeland and I happen to share a great love for mnemonics. He used several throughout the seminar, but I am determined to adopt two particular ones in my daily routine.
1. SFDPOT ("San Francisco Depot") - make sure all the areas are covered.

For exploratory testing, it is a good idea to use a checklist of objectives that covers the following areas: Structure, Function, Data, Platform, Operations and Time. (Note that the boundaries between these areas need not be so sharp.)
In Jira Bonfire, this checklist should go in the session's additional information section.
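To keep the checklist handy, here is a small sketch (my own, not part of the seminar) of the SFDPOT areas as a reusable template whose rendered text can be pasted into that section; the one-line prompts are my own paraphrase of each area.

```python
# A small sketch of the SFDPOT coverage checklist as a reusable template.
# The one-line prompts are my own paraphrase, not part of the mnemonic itself.
SFDPOT = {
    "Structure":  "What is the product made of (files, modules, hardware)?",
    "Function":   "What does the product do?",
    "Data":       "What does it process, store, import and export?",
    "Platform":   "What does it depend on (OS, browser, environment)?",
    "Operations": "How will real users actually use it?",
    "Time":       "How is it affected by timing, dates and concurrency?",
}

def checklist_text():
    """Render the checklist so it can be pasted into the session's notes."""
    return "\n".join(f"[ ] {area}: {prompt}" for area, prompt in SFDPOT.items())

# Example usage:
# print(checklist_text())
```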

2. During the session, use the PROOF mnemonic.

Past: What was done? Compile notes during the session, using the Bonfire extension to record what you have done.

Results: What were your findings? In Jira, defects and screenshots are automatically added to the Bonfire session if you opened the issues through the Bonfire extension.

Obstacles: What slowed down or blocked our session?

Outlook (forward thinking): Where do we go from here?

Feelings: How do we feel about what happened? Since I am a red-headed Israeli, the process will certainly touch my heart and stir up feelings and emotions about the testing activity; but that is something I will cover in a separate post.

At the end of this part, we spoke about the outputs of our activity. At the end of the day, we all have to report something to someone. The report should be generated from the Bonfire session and attached to the session's ticket as a document.
The report should include the following: the charter, metrics, charter opportunities for other sessions, notes, and issues raised during the session.
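As a sketch of how such a report could be assembled from the session data, here is a small helper that formats the fields listed above into a plain-text document ready to be attached to the ticket. The layout and field names are my own assumptions, not a prescribed template.

```python
# A sketch of assembling the end-of-session report from the fields listed
# above. The layout and field names are my own; adapt them to your template.
def build_session_report(charter, metrics, charter_opportunities, notes, issues):
    """Return a plain-text report that can be attached to the session's ticket."""
    lines = [
        f"Charter: {charter}",
        "",
        "Metrics:",
        *(f"  {name}: {value}" for name, value in metrics.items()),
        "",
        "Charter opportunities for other sessions:",
        *(f"  - {item}" for item in charter_opportunities),
        "",
        "Notes:",
        f"  {notes}",
        "",
        "Issues raised during the session:",
        *(f"  - {issue}" for issue in issues),
    ]
    return "\n".join(lines)

# Example usage (placeholder values):
# report = build_session_report(
#     charter="Explore the login flow for session-timeout issues",
#     metrics={"duration (minutes)": 90, "bugs": 2},
#     charter_opportunities=["Explore password reset under load"],
#     notes="Most of the time was spent around the 'remember me' option.",
#     issues=["QA-124", "QA-125"],
# )
```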

Part 3: Concepts that deserve a follow-up post


  • Testing and emotions
  • Is the waterfall as quiet as it sounds?
  • Is an infant born a tester? (thoughts on ad hoc testing)
