Surveying the landscape
Posted by maryvanbe
Stuart Blythe, in “Professional and Technical Communication in a Web 2.0 World,” discussed using survey research to get the information you need. At my work, we recently sent out a survey to all staff about internal communications, and I found the process very interesting. We are lucky enough to have a survey research center at the research and education institute where I work; however, because there is a cost associated with using the center, we decided to write, send, and tabulate the survey ourselves using a free online survey tool. This isn’t the ideal method, as it isn’t grounded in survey science, but we decided it would be acceptable for our nonscientific purposes because the results would never be published.
In our first iteration, we focused on current-state issues and multiple-choice questions, because we had found in past surveys on other topics that people tend to provide amorphous, unhelpful comments when asked open-ended questions. We asked about specific communications vehicles (e.g., the e-newsletter, TV monitors with messaging, weekly huddles, staff meetings, email announcements, and monthly email updates). We also made sure that all questions were pertinent to our audience of internal staff.
The questions were very specific, such as “How often do you read the e-newsletter Institute Connection?” with responses such as “I read all of every issue,” “I read some of the content,” or “I don’t read it at all.” Because we tried to cover every vehicle, the survey got too long, and we worried people wouldn’t complete it. We also questioned whether, because the institute is merging with another institute, it was best to focus on the current state, since we knew everything could change. Plus, would the questions we were asking yield actionable, useful answers?
With that in mind, we ended up rewriting the entire survey to ask much broader questions that could inform future decisions about what kinds of communications vehicles we should offer after the two institutes merge. For example: “Please rank the top five ways you prefer to receive information about the institute.” The choices were “e-newsletter,” “TV monitors with messaging,” and so on. We removed all potentially leading questions, significantly shortened the survey, and added two open-ended questions: “What do you think Central Communications is doing right?” and “How do you think Central Communications could improve?”
We had a response rate of about 60% (a very good rate for internal surveys) and received 80 open-ended responses (the survey went out to about 155 people). While some of the open-ended responses were not useful because they were vague or clearly intended to be unhelpful, most responses were very helpful in informing our communications plan for 2016 and beyond. I have to admit that I was skeptical about offering open-ended questions, but I’m glad we did, because most people offered constructive feedback. As a result, I’ve changed some of my communication practices and am researching ways to change others.
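For readers curious about the arithmetic behind those figures, here is a minimal sketch. The counts (about 155 people invited, a roughly 60% response rate, and 80 open-ended responses) come from the post; the function and variable names are purely illustrative, not from any survey tool we used.

```python
def response_rate(responses: int, invited: int) -> float:
    """Return the response rate as a percentage of those invited."""
    return 100 * responses / invited

invited = 155
respondents = round(invited * 0.60)  # ~60% response rate, about 93 people
open_ended = 80

print(f"Respondents: ~{respondents} of {invited} "
      f"({response_rate(respondents, invited):.0f}%)")
print(f"Open-ended answers: {open_ended} "
      f"({response_rate(open_ended, invited):.0f}% of all invited)")
```

Note that the 80 open-ended responses are counted against everyone invited, so more than half of the staff who received the survey took the time to write something in their own words.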
Another important lesson learned was about the format itself. Because we used a free online survey tool that couldn’t be customized with the branding of the institute, some people thought it was spam and refused to answer it. It reinforced the idea that everything that comes from us has to be branded, even when it is an internal document. To do otherwise is to confuse the audience and lead them to distrust it.
This is the first survey I’ve helped construct to gauge the effects of our work in central communications, and I found the experience very valuable.