Surveying the landscape

Stuart Blythe, in “Professional and Technical Communication in a Web 2.0 World,” discussed using survey research to get the information you need. At my work, we recently sent out a survey to all staff about internal communications, and I found the process very interesting. We are lucky enough to have a survey research center at the research and education institute where I work; however, because there is a cost associated with using the center, we decided to write, send, and tabulate the results ourselves using a free online survey tool. This isn’t the ideal method, as it isn’t grounded in survey science, but we decided it would be acceptable for our nonscientific purposes because the results would never be published anywhere.

In our first iteration, we focused on current-state issues and multiple-choice questions, because we have found in past surveys on other topics that people tend to provide vague, unhelpful comments when asked open-ended questions. We asked about specific communications vehicles (i.e., the e-newsletter, TV monitors with messaging, weekly huddles, staff meetings, email announcements, and monthly email updates). We also made sure that all questions were pertinent to our audience of internal staff.

The questions were very specific, such as “How often do you read the e-newsletter Institute Connection?” with response options such as “I read all of every issue,” “I read some of the content,” and “I don’t read it at all.” Because we tried to be all-inclusive, the survey became too long, and we worried people wouldn’t complete it. We also questioned whether it was best to focus on the current state at all, because the institute is merging with another institute and we knew everything could change. Plus, would the questions we were asking provide actionable, useful content?

With that in mind, we ended up rewriting the entire survey to ask much broader questions that could inform future decisions about what kinds of communications vehicles we should offer after the two institutes merge. These included items such as “Please rank the top 5 ways you prefer to receive information about the institute,” with choices such as “e-newsletter,” “TV monitors with messaging,” and so on. We removed all potentially leading questions. We also significantly shortened the survey and added two open-ended questions: “What do you think Central Communications is doing right?” and “How do you think Central Communications could improve?”

We had a response rate of about 60% (a very good rate for internal surveys) and received 80 open-ended responses (the survey went out to about 155 people). While some of the open-ended responses were not useful because they were vague or clearly intended to be unhelpful, most responses were very helpful in informing our communications plan for 2016 and beyond. I have to admit that I was skeptical about offering open-ended questions, but I’m glad we did, because most people offered constructive feedback. As a result, I’ve changed some of my communication practices and am researching ways to change others.
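
For anyone curious how those rough figures fit together, here is a quick back-of-the-envelope check (the numbers below are only the approximations mentioned above, not exact counts):

```python
# Rough check using the approximate figures from the post.
invited = 155          # the survey went out to about 155 people
response_rate = 0.60   # roughly 60% responded
open_ended = 80        # open-ended responses received

respondents = round(invited * response_rate)   # ~93 respondents
open_ended_share = open_ended / respondents    # ~0.86

print(f"Approximate respondents: {respondents}")
print(f"Share leaving open-ended feedback: {open_ended_share:.0%}")
```

In other words, roughly 86% of the people who responded took the time to answer the open-ended questions, which is part of why I’m glad we included them.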

Another important lesson was about the format itself. Because we used a free online survey tool that couldn’t be customized with the institute’s branding, some people thought the survey was spam and refused to answer it. That reinforced the idea that everything that comes from us has to be branded, even an internal document; to do otherwise confuses the audience and leads them to distrust the message.

This is the first survey I’ve helped construct to gauge the effects of our work in central communications, and I found the experience very valuable.

Posted on November 15, 2015, in Social Media.

  1. Hi Mary, thanks for sharing this process. In particular, your point about branding hit home.

    Also, I’m not sure which tool you used, but I’ve had great success with surveymonkey.com.

    One thing I wanted to add is that when it comes to surveys, it’s best to start with the results you want: identify the data you need and then work backwards to the questions. This has helped me immensely.

    • Thanks for the comment, Aaron. We have used SurveyMonkey, but now we use REDCap pretty consistently. I appreciate the tip about working backwards when developing surveys; I’ll give that a try on my next one.

  2. Great application of the reading to an example from your workplace.
    It sounds like rewriting the survey and really trying to get a handle on what people like, dislike, and need was beneficial to your organization. And as you mentioned, 60% is a great response rate for an internal survey.

    One way you might be able to boost your numbers is to provide some sort of incentive for completing the survey. While it may seem kind of cheesy, my company started offering small incentives (e.g., a $10 Caribou gift card for anyone who completed the survey) and found that our numbers dramatically increased.

    Likewise, when I get a pop-up asking me to rate my experience with a company online, I generally won’t complete it unless there is an incentive (e.g., a discount off my next purchase). For some reason, the “what’s in it for me?” mentality seems to take charge in these situations. If I am giving my time to provide feedback, some sort of small incentive definitely helps sweeten the deal 🙂

    So, long story short: the next time you run a survey, offering something to your participants might be worth a shot!

  3. Thanks so much for your comment. Incentives are a great idea to boost responses. We used to give out a lot of Target gift cards as incentives, but then our budget got tighter and tighter, so now we can use them only on projects deemed important enough. Still, we’re holding a contest soon to name our new company newsletter, and we’ll undoubtedly offer a Target gift card. Like you, I won’t complete an online survey unless there’s an incentive, like free magazines or discounts.

    • I agree with your peers. Every MSPTC graduate needs to complete a field project or thesis, and surveys and interviews are the most frequent methods used. What’s interesting is that we leave that design up to you and the resources made available on the Institutional Review Board site: http://www.uwstout.edu/rs/humansubjects.cfm.
      Other than the brief introduction to methods given in the ENGL 700 course, and perhaps the Qualitative Research Methods course being offered next spring, students don’t get much formal training in survey design, so I think future students will benefit from your post!
