Posts in Exploratory Testing
Template for session notes
When I teach people how to do exploratory testing, a common point of confusion is around what to put in your notes. While I often tell people it depends on you, what you're testing, and the company you're working for, they still want some concrete advice. So I often show some examples from past projects and provide the following template:

  • Mission: list out what you're testing with this charter

  • Environment: list out meta information related to your test environment (versions, configuration, location, etc...)

  • Risk: as you test, list out what risks you're looking for while testing

  • Coverage: as you test, list out what areas, features, users, data, or other meaningful dimensions you're covering while testing -- it's worth noting that I also instruct people to list out what they didn't have time to cover...

  • Techniques: as you test, list out what you're doing... what techniques you're using, how you develop tests, etc... (in math class, this would be the "show your work" section of the document)

  • Status: as you test, list out questions that occur to you that you'll need answered later, possible issues/problems/bugs you find while testing, notes about automation or future test sessions, etc...

  • Obstacles: as you test, list out things that get in your way or ideas you have for things that would make your testing more effective -- this can be tools, hardware, information, training, etc...


For those readers who do session-based testing on a regular basis, you'll notice I don't capture some of the classic items like setup time and time spent investigating issues. If you need to capture those metrics (or other metrics your team uses), simply add them in. Over time your session notes will morph to become your own and you'll develop a format that works for you.
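If you like starting every session from the same skeleton, here's a minimal sketch of the template as a Python script that stamps out a dated notes file. The section headings come from the list above; the file naming, markdown headings, and slug logic are my own assumptions, so adapt them to however your team stores notes.

```python
from datetime import date
from pathlib import Path

# Section headings from the template above. Extend this list with any
# team-specific metrics (setup time, time spent investigating issues, etc.).
SECTIONS = [
    "Mission",
    "Environment",
    "Risk",
    "Coverage",
    "Techniques",
    "Status",
    "Obstacles",
]

def new_session_notes(charter: str, directory: Path = Path(".")) -> Path:
    """Create a dated skeleton notes file for one test session."""
    slug = "-".join(charter.lower().split())[:40]
    path = directory / f"{date.today():%Y-%m-%d}-{slug}.md"
    lines = [f"# Session notes: {charter}", ""]
    for section in SECTIONS:
        lines += [f"## {section}", "", ""]
    path.write_text("\n".join(lines))
    return path

if __name__ == "__main__":
    print(new_session_notes("Boundary errors in bullets and numbering"))
```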

I'd be interested to see what other people capture.
Trick for clarifying a test charter
Sometimes when I ask someone what their test charter is, I get a paragraph in response. That's not bad, but I find it often leads to a poorly understood scope, which leads to a lack of focus while testing, which leads to a charter that runs way too long or feels unproductive. I have a trick I use to help simplify the mission of the charter when this happens.

Try using the following template:
"My mission is to test <insert risk here> for <insert coverage here>."

Some examples:

  • My mission is to test for various boundary errors for Microsoft Word's bullets and numbering feature.

  • My mission is to test for accurate error messaging pop-ups for Ford Motor Vehicle's Build and Price website.

  • My mission is to test for SQL injection vulnerabilities for application login and administration screens.

  • etc...


You might then still use the original paragraph to help detail out the charter, but a clear and concise mission helps me better estimate how much time I'll need to test and maintain better focus while testing.
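If you happen to track charters in a script or spreadsheet export, the fill-in-the-blanks structure is easy to enforce. Here's a toy sketch; the function name and the validation are mine, not part of the original trick:

```python
def mission(risk: str, coverage: str) -> str:
    """Build the one-line mission from the fill-in-the-blanks template:
    'My mission is to test <risk> for <coverage>.'"""
    if not risk.strip() or not coverage.strip():
        raise ValueError("a charter mission needs both a risk and a coverage area")
    return f"My mission is to test {risk.strip()} for {coverage.strip()}."

# Matches the SQL injection example above.
print(mission("for SQL injection vulnerabilities",
              "application login and administration screens"))
```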
Thumb vote for priority
Sometimes, when reviewing charters with the team, I look to them to help provide some insight into what we should be focused on with our testing. One technique is to walk your list of charters (all twenty of them, or all two hundred) and have each person provide some insight into where they think it falls in priority. To facilitate this, I sometimes use a thumb vote.

When thumb voting, everyone must vote; there's no sitting on the sidelines. Since I usually use three levels of charter priority (A, B, and C), the thumbs map as follows:

  • thumb up = A

  • thumb sideways = B

  • thumb down = C


What I find is that for most charters, most people on the team are on the same page (or really close). Every now and then you have to make a decision on a close tie (but since you're the test lead, you're used to doing that anyway). Sometimes, however, you see some odd votes, like four up and four down. Those votes normally lead to some really great conversations. Often, people misunderstand the charter or have a different understanding of the risk involved. The vote surfaces those differences.
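If you want to record the votes, the mechanics are trivial to script. Here's a toy sketch; the DISCUSS flag and the near-even-split check are my own assumptions about how you might surface those odd votes automatically:

```python
from collections import Counter

# Thumb-to-priority mapping from the list above.
THUMB_TO_PRIORITY = {"up": "A", "sideways": "B", "down": "C"}

def score_charter(votes: list[str]) -> str:
    """Return the majority priority, or 'DISCUSS' when the room splits."""
    tally = Counter(THUMB_TO_PRIORITY[vote] for vote in votes)
    ups, downs = tally.get("A", 0), tally.get("C", 0)
    # A near-even split of ups and downs (like four up, four down) usually
    # means people read the charter or its risk differently: talk it out.
    if ups and downs and abs(ups - downs) <= 1:
        return "DISCUSS"
    return tally.most_common(1)[0][0]

print(score_charter(["up", "up", "sideways", "up"]))  # A
print(score_charter(["up", "up", "up", "up",
                     "down", "down", "down", "down"]))  # DISCUSS
```

Note that a close tie between adjacent levels (say, A and B) still resolves to a simple majority here, which matches the lead-breaks-ties approach above.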

It's also fun. (In an I'm-a-tester-so-voting-on-charters-is-fun sort of way.)
Wrapping up a session
When I'm doing a 45-minute test session, I typically reserve the last five minutes for wrap-up. Since I'm usually working with a timer of some sort, I'm aware of when I have five minutes to go. Typical activities for wrap-up include:

  • finish whatever test or activity I'm in the middle of

  • write down what I didn't get to that I think I want to come back to

  • double check to make sure I have enough information for the bugs I need to log

  • double check to make sure I captured all the relevant notes on test data for my session notes (which might mean saving a spreadsheet or something)

  • clean up my test environment (if needed)

  • stop my screen recording and save it off (if running)

  • write down a couple notes about how I feel (did I need something I didn't have, energy level, frustrations, etc...)


It looks like a lot, but most of the time it's about five minutes. If it runs over, that's fine, of course. Technically my hands-on testing is done when I'm finished with the first bullet point, but I like to include all the other activities in my time estimates (aka my session time) because I feel they're all important to the work I'm doing.
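The timer itself can be anything, from a kitchen timer to a script. For what it's worth, here's a minimal sketch of one that nudges you when wrap-up time arrives; the 45/5 split is just the one from this post:

```python
import time

def session_timer(total_minutes: int = 45, wrapup_minutes: int = 5) -> None:
    """Sleep through the hands-on portion, then announce wrap-up time."""
    time.sleep((total_minutes - wrapup_minutes) * 60)
    print(f"{wrapup_minutes} minutes left: start your wrap-up checklist.")
    time.sleep(wrapup_minutes * 60)
    print("Session over.")

if __name__ == "__main__":
    session_timer()
```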