In college, I was a huge fan of rock guitar instrumentals. One of my favorite musicians was Steve Vai (ex-Frank Zappa, ex-Alcatrazz, ex-David Lee Roth, ex-Whitesnake). In 1990, he released the album Passion and Warfare, which included the song The Audience Is Listening. In addition to being a great song that I still enjoy today, that title, that phrase, “the audience is listening”, has stuck in my mind throughout my career. It’s a reminder that the people with whom we interact are listening, or have tuned out, based on what we deliver, how we deliver it, and when we deliver it.

With this in mind, it’s clear that audience is an important, albeit often overlooked, facet of an automation initiative.

I define audience as those people interested in the automation initiative. Since quality is a team concern, everyone in the company is potentially in the audience. The delivery teams, those “doing the work”, are interested in both the technology aspects and in the value proposition, the “what’s in it for the team”; I consider them directly interested. The product ownership and leadership tend to be more interested in ensuring the money that the team spends is being used effectively; I consider them indirectly interested. Most organizations have other “interested parties” as well; we need to identify those for each of our own initiatives since they are not always the same across companies.

When considering audience, here are some questions that I tend to ask:

  • Who will be writing the test scripts? Script writing typically requires some measure of programming ability.
  • Who will be executing the test scripts? This may be the same subset of the audience as is writing the scripts, but not always; not everyone is geared toward writing scripts.
  • Who will be viewing the results and the reports? Different audience segments value different information. We may need to provide different information to the developers than we do to the senior leadership team; also, we may need to provide that information differently to the different audience segments.
  • Who will be creating or selecting the automation tools and frameworks? There are lots of options for tools and frameworks out there; we may even decide to create our own if none of them fit our needs. Appropriate expertise needs to be applied here or we need to allow for learning, mistakes, and refactors.
  • Who will be creating and maintaining the test data? Test data can be voluminous and usually requires extensive knowledge of the testing approach as well as the product’s configuration and behavior. Often, more than one person needs to be involved in this activity; those people may well come from different audience segments.
  • Who wants the initiative to succeed and why? If we can identify our friendlies early on, we can gain valuable background information on the products and team dynamics that are difficult to get otherwise. Uncovering “the why” helps uncover unspoken expectations that help us guide our initiatives.
  • Who will be disruptive to the initiative and why? Some audience members will not be in support of automation for myriad reasons, including fear. Take care, however, not to confuse non-support with expectations that are counter to those that we’ve already received; sometimes audience members support the concept but not the implementation, and they have a difficult time divorcing the two when explaining their point of view.

The answers to these questions are key to how we present the interface to our user community, including the technology portion of the initiative. Audience members with no real potential for (or interest in!) programming may benefit from a recorder (gasp!), a keyword-based interface, or some sort of DSL. Those with some programming background or some potential for programming may benefit from a function-based or object-based interface, allowing them to exercise more complex programming concepts. These audience considerations help filter out technology options that are inappropriate for our needs from the vast number of automation technologies that are available.
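To make that distinction concrete, here is a minimal sketch in Python of the same hypothetical login check driven two ways: through a keyword-style runner whose steps read almost like plain English, and through an object-based interface for audience members who are comfortable writing code. Every name here (run_keywords, LoginPage, and so on) is made up for illustration; this is not taken from any real framework.

```python
# Illustrative sketch only: the keyword table and the LoginPage class are
# hypothetical, not taken from any specific automation framework.

# --- Keyword-style interface: steps read almost like plain English ---
def open_login_page(app):
    app["page"] = "login"

def enter_credentials(app, user, password):
    app["user"], app["password"] = user, password

def submit(app):
    app["logged_in"] = app.get("user") == "admin"

KEYWORDS = {
    "open login page": open_login_page,
    "enter credentials": enter_credentials,
    "submit": submit,
}

def run_keywords(app, steps):
    """Execute a list of (keyword, args) pairs against the app state."""
    for keyword, args in steps:
        KEYWORDS[keyword](app, *args)

# --- Object-based interface: full programming constructs are available ---
class LoginPage:
    def __init__(self, app):
        self.app = app

    def login(self, user, password):
        open_login_page(self.app)
        enter_credentials(self.app, user, password)
        submit(self.app)
        return self.app["logged_in"]

if __name__ == "__main__":
    # Non-programmers supply keyword steps; programmers use the object directly.
    app = {}
    run_keywords(app, [
        ("open login page", ()),
        ("enter credentials", ("admin", "secret")),
        ("submit", ()),
    ])
    assert app["logged_in"]

    assert LoginPage({}).login("admin", "secret")
```

Which of these surfaces we expose, or whether we expose both over the same underlying actions, follows directly from who in the audience will actually be writing and maintaining the scripts.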

Throughout my career, when I was lucky, the development teams were indirectly interested in the automation initiatives; by that, I mean they thought the initiatives were valuable and would even change the product code to make it more automatable. When I was not so lucky, the development teams were ambivalent and unhelpful, effectively removing themselves from the audience. These were the worlds I was used to; the people who were directly interested in the automation initiatives, the ones with all of the requirements and expectations, were the test teams. I found this to be true even in organizations that were practicing Scrum. I was able to successfully launch automation initiatives in varied product environments because I was very familiar with the general expectations of my direct audience.

Then, I joined a new organization. This organization’s culture felt very familiar, and I was even able to predict how things would proceed. I was right to a large extent, but I missed a critical piece…something I’d not seen before. The development teams had their own requirements of the automation initiative; they expected it to let them quickly check their code changes. They even expected to participate in the creation of the automated scripts. This was great news from a team delivery standpoint. Unfortunately, those expectations were not totally congruent with the expectations of the test team, so I failed to meet many of them. In this case, I failed to realize that the directly interested people also included the development teams. This caused much discord and necessitated a direction change in the original automation initiative.

As we can see, knowing our audience helps the automation initiatives succeed; missing a critical aspect of our audience puts our initiatives at risk. Remember, the audience is listening.
