As we’ve said many times before, great research is all about:
- Finding the right people to talk to
- Asking them the right questions
In our last post we talked about how to use the Recruiting Guide to find the right people to talk to. This post is about the next part, asking the right questions.
Whether it's an Interview Guide for an in-depth one-on-one conversation or a Moderator's Guide for a focus group, the process of writing the document is important. During the cycles of vetting with the client, you gain deep insight into the important subtleties of the topic. There are few hard-and-fast rules for Interview Guides, because the topics vary, but here are a few principles that have worked for us in our research.
1) Invest the time needed to get the participant comfortable – The first question has to be easy and get the participant talking. It has to be more than name and job title or the weather. You need to set the stage for being chatty and opinionated. For our IT participants we like to ask about something happening in the industry, or trace the turns of their career. The catch – this takes time. When the Guide starts getting long and you have to cut, expect someone to suggest eliminating the intro. Don’t. Stick to your guns and spend some time on this. It pays off in a dramatically better conversation and stronger research results when people let their guard down and get chatty.
2) “Peel the Onion” – A good research guide starts more broadly and then gets progressively more specific. For example, start with challenges – what is hard about what they do or what is hard about a particular area. If the challenge the project is trying to address is mentioned unprompted, that is very interesting to know. If it’s not mentioned, go to the next layer of the onion and find out why it wasn’t mentioned. Then drill down one more layer into the actual solution being tested, and so on.
3) Use open-ended questions, but be prepared to prompt – An in-depth interview is not a phone survey. You want to give the participant lots of opportunities to tell you their thoughts on the topic and point to issues that you haven’t thought of. But sometimes people need a gentle push in the right direction, so if your open-ended question isn’t getting you anywhere, be prepared with some options that might get the participant talking. Be sure to present the options in a “let me clarify the question …” tone, not in an “is it A or B” tone.
Great research is all about:
- Finding the right people to talk to
- Asking them the right questions
The Recruiting Guide is the researcher’s tool to ensure we get the first one right – talk to the right people. It is an important written agreement between the researcher and the client that must be approved. This ensures that at the end of the study you know you’ve talked to the right people and helps prevent the “Why did we talk to THAT guy?” syndrome.
But Recruiting Guides can be tricky. The core problem is that prospects don’t talk like marketers do. Marketers like to segment their audience into tidy boxes, but in real life people’s roles aren’t that clean. With Recruiting Guides, it is particularly important to be careful about any questions where you ask about responsibility.
IT professionals typically have great pride of ownership in their work – which is great for the businesses that rely on IT business services to function. However, it can be difficult for researchers who need to find a certain type of person, because many people will put up their hands to say they have that responsibility.
In a recent study we were looking for IT procurement participants. We needed to talk to the people who did the hands-on financial and contract work, not the technical buyer who determines the product to purchase and then passes that off to procurement to get the paperwork done. Definitely the way NOT to write the Recruiting Guide in this case was to ask, “Do you have responsibility for IT procurement?” Probably every employee in the IT organization would answer “yes” to that. They all evaluate technology solutions and make recommendations. They feel ownership and responsibility for procurement, even though they don’t have formal procurement roles.
Instead, we asked questions in the guide about reporting structure and the focus of their jobs – how much of their time was spent on procurement? – and we got exactly the people we wanted.
Practical Tip: Think about someone who works with the people you want in your study, but that you don’t want to participate. Imagine how they would answer the questions and pass the screener. It can be challenging, but if you look at every question in the Recruiting Guide and think how someone might qualify for a study even though they aren’t the persona you want, the result will be a better project.
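The procurement example can be made concrete. Here's a minimal sketch of that screener logic – the field names and the 50% threshold are invented for illustration, not taken from the actual study – contrasting a naive yes/no responsibility question with one based on how people actually spend their time:

```python
# Hypothetical screener logic: shows why a yes/no "responsibility"
# question over-qualifies, while a time-spent question does not.

def passes_naive_screen(answers):
    # Almost everyone in an IT organization answers "yes" to this.
    return answers["has_procurement_responsibility"]

def passes_better_screen(answers):
    # Qualify only people whose job actually centers on procurement work.
    return answers["pct_time_on_procurement"] >= 50

# A technical buyer who hands the paperwork off to procurement...
technical_buyer = {"has_procurement_responsibility": True,
                   "pct_time_on_procurement": 10}
# ...versus someone who does the hands-on financial and contract work.
procurement_pro = {"has_procurement_responsibility": True,
                   "pct_time_on_procurement": 80}

print(passes_naive_screen(technical_buyer))   # the wrong persona qualifies
print(passes_better_screen(technical_buyer))  # the better screen rejects them
print(passes_better_screen(procurement_pro))  # and keeps the right one
```

The point of the exercise isn't the code itself – it's that walking each question through a "wrong persona" profile like `technical_buyer` exposes the over-qualifying questions before the study starts.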
We are doing more and more online focus groups. We’ve always done some, but have noticed in the past year that we have definitely “crossed a chasm” and there is now heavy interest in online focus groups.
My take from conversations with clients is that this interest in online focus groups is caused by a mix of the technology getting better and people just plain getting used to the idea. It’s a typical technology adoption curve – in the beginning it is different and you’ve never done it before so it feels risky. Then you do it once, realize it can work, and from then on you’re a convert.
I’ll admit – we’re converts. I’ve heard people online saying that online focus groups are frequently pushed because researchers are too lazy to travel. That’s not us. We LOVE to travel. I look forward to those crazy New York/Chicago/Denver/San Francisco weeks. Throw in some Europe and Asia and I’m in heaven. So we’re converts in spite of the fact that we have to stay home to do research. You can reach so many more people and include participants from the non-major cities that are an important part of the target audience – all for the same budget.
That said, online focus groups are different. It is challenging to keep people focused, especially our IT participants who have email and IM and texting all distracting them during the group. Here are three of our best tips that we’ve learned over the years:
1. Have a Smaller Group
We’ll usually recruit eight IT professionals for an in-person group. For online groups, we recruit six. It’s hard enough keeping these IT guys engaged and off their BlackBerrys and iPhones when you’re in the same room; it is MUCH harder when you can’t see that they’re doing email instead of talking to you. Smaller groups mean the participants each have more active time and are less likely to tune out.
2. Have a Strategy to Keep Intros Short
When possible have the moderator touch base briefly with each participant the day before – this works best when doing customer research. Get the basics on what they do and put that on a slide to share with the group. Cut short the “round the table” introductions and move to the more interesting and engaging topics. We usually use a slide to share first name, industry, and major infrastructure or application responsibilities related to the study. Then in the “round the table” intros we try to do something more interesting like where they went on their last vacation or their best “silly question a non-geek asked them” story.
3. Use the Tools
Every time you call on a participant you grab their attention. You can also grab their attention by switching the visual or adding a poll. Find ways to use the online meeting tools in an interesting way. Use them as a whiteboard for capturing notes, list talking points to reinforce the spoken word, or put the discussion’s declarative statements on a prepared slide rather than just reading from the guide.
Online focus groups definitely require planning and preparation to keep participants engaged, but with a bit of effort, the results are fantastic.
Frankly, we’re not convinced. I’ve seen a bunch of headlines like New Technology Achieves The Same Results As Focus Groups In A Fraction Of The Time And Cost, and have even had a comment left on the Dimensional Research blog from a vendor who offers similar solutions.
This strikes us as so much nonsense. It’s not that these couldn’t be great products – we’re quite sure that these tools can be very useful, and give valuable insights when used to answer the right types of questions. Our beef is the claim that they can produce the same results as focus groups. To assert that any tool can replace qualitative research seems to demonstrate a clear naiveté about the power of having a well-moderated conversation. One of the things we’re quite sure these tools cannot do is ask a really great follow-up question.
Qualitative research – whether focus groups or in-depth interviews – is not about listening to a group chatter about whatever they want, or giving them a topic then letting them have at it. It’s about really listening to participants and asking the question that gets them thinking. Often it’s about guiding the conversation to get at those insights that are just below the surface. The best researchers have a knack for understanding what is NOT being said and finding out why.
What do you think? Have we missed the point of these new tools? As usual, I’ll point out that our business is technology market research – we work with vendors who sell to corporate IT, which is primarily B2B. Perhaps these tools offer more value in consumer research. Either way, we welcome insights from anyone who has gotten good results from a solution like this. If I’m an old fuddy-duddy (and perhaps my use of the phrase fuddy-duddy is proof that I am one), go ahead and tell me.
Or possibly the problem is that these tool providers have a messaging problem? I’m assuming they have NO plans to sell their product through traditional market research providers if their press releases are so dismissive of the work that we do. Perhaps they needed to do some qualitative research on their messaging with some of us? 😉
We love analytics. Of course as soon as I started blogging we installed Google Analytics. We love watching the numbers go up and down and seeing what topics are most interesting to our audience (Our #1 post on this blog is Focus Groups vs. In-depth Interviews, by the way), whether external events, such as being named a Top 10 Research Blog by Quark Magazine, drive traffic more than specific content, and so on.
But analytics – especially web analytics – can be very misleading, because it typically misses the demographics: WHO is engaging in that behavior, and HOW specific parts of your market are behaving differently.
We recently blogged about a client who thought that videos were their most compelling web content because the web analytics showed a lot of video activity. But it turned out that the technical buyers – arguably the most important audience, but a small portion of web site visitors – didn’t use videos, a detail that was completely lost in the overall analytics data.
So here are two very simple tips for using Web analytics for good:
- Assume analytics don’t tell the whole story – There is a lot of useful information in the analytics, and it should absolutely be used. The danger occurs when analytics become undiscussable facts. Start with the assumption that even with the data you don’t know everything, and you’ll be able to ask the right questions to use the best analytics and look for more information when needed.
- Don’t stop talking to customers because you have data – The video example mentioned above popped up after talking to just a handful of people. It doesn’t take many live conversations to figure out what is actually happening with the people who are spending money on your products. Talk to some of them!
InformationWeek ran a very interesting article, Practical Analysis: User Habits And Making Tablets Seem More Like Beer. They did a survey showing that IT prefers HP, then Dell, for tablet computing – and only then the infamous Apple iPad. But how can that be possible? “Everyone knows” the iPad is the market leader! The answer, according to the article, is that IT acts very differently than consumers do, and this was a survey of IT professionals.
We didn’t conduct the survey so we can’t comment on the methodology used (although the methodology, as far as the article reveals it, certainly sounds solid), but we do completely agree with this: When research delivers a finding that contradicts core beliefs, you’d better be able to explain why.
There is almost always an explanation, although perhaps it’s not as obvious as the Infoweek example. We recently did a series of projects that included questions on how various IT professionals use vendor web sites when making purchase decisions. The participants in our study were adamant that they didn’t like videos – they weren’t scannable, tended to be full of “marketing fluff,” and were generally a waste of time. But our clients’ web analytics showed that videos were viewed a lot.
The instinct was for the web analytics people to assume that the research was wrong, because they had real-life data showing that videos were popular. Of course, we had to consider the possibility, since one of the well-documented challenges with research is that people don’t always act the way that they say they will act. But we were pretty confident in our methodologies, and our techniques for getting to actual behavior, so we decided to figure out why. And as they did in the Infoweek article, the first thing we did was look at our demographic. Was there something about the demographic we spoke to that behaved differently than the overall Web audience?
Once we started digging, the answer was obvious: Our study was with product-level decision-makers – the most technical of the technical, the “sharpest pencils in the box,” the brightest ones. They didn’t like videos because they understood things quickly and preferred to scan content. But we discovered these decision-makers also had to educate the rest of their teams – and they would frequently forward videos. Since there were multiple “less technical team members” that watched videos for every “technical decision maker” that didn’t, the behavior was lost in the analytics.
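How a small-but-critical segment disappears inside aggregate numbers is simple arithmetic. Here's a minimal sketch with invented counts (the real study's figures weren't published) showing how a healthy-looking overall view rate can coexist with a key audience that never watches at all:

```python
# Invented illustrative numbers: a few technical decision-makers who
# skip videos entirely, and many less technical teammates who get
# the videos forwarded to them and do watch.
segments = {
    "technical_decision_makers": {"visitors": 100, "video_views": 0},
    "less_technical_team":       {"visitors": 400, "video_views": 300},
}

total_visitors = sum(s["visitors"] for s in segments.values())
total_views = sum(s["video_views"] for s in segments.values())

# The aggregate analytics look great: a 60% video view rate...
aggregate_rate = total_views / total_visitors
print(f"aggregate video view rate: {aggregate_rate:.0%}")

# ...but segmenting reveals the most important audience never watches.
for name, s in segments.items():
    print(f"{name}: {s['video_views'] / s['visitors']:.0%}")
```

With four teammates for every decision-maker, the decision-makers' 0% view rate is simply swamped in the total – which is exactly why the aggregate dashboard told the wrong story.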
If this kind of “what’s the underlying explanation” problem fascinates you the way it does us, we recommend a great podcast from HBR: Strange-But-True Research Insights.
My family and I are living in the south of France for a year. It’s been a really incredible experience, and one of the things that has made it possible is this fantastic job doing market research with technology professionals.
This is truly a profession that can be done from almost anywhere in the world – “location independent living” at its best. Phone and web survey work can be done effectively from anywhere with reliable internet and phone connections. And you have to travel for focus groups anyway, so a few more hours on a plane isn’t much of a difference. I wondered at first if my clients would be concerned that I was farther away, but that simply has not been an issue.
Here are a few tips I’ve learned in the past few months that anyone working overseas temporarily can leverage, but especially market researchers:
1) Vonage is your friend: When you’re doing recruiting or in-depth interview calls, it is a disaster to have a strange foreign number – or worse “Caller Unknown” – show up on CallerID. We brought our Vonage box to France and plugged it into the internet, and now it’s my name and number that shows up when I make calls, so the participants feel comfortable picking up the phone. Of course it’s also great that calls to North America and many other countries in the world are free, and being able to dial 1-800 numbers is a big plus.
2) Try Euro Saver: Finding a reliable calling card with local toll free access was much harder than it should have been. Many of the cards listed toll free numbers that were out of service. The Euro Saver from Cloncom has been great. Good prices and reliable toll-free access across every country I’ve been to in Europe so far.
3) Keep your US Smart Phone: It was important to me to keep my US cell phone number for incoming calls and business consistency. I was surprised that my cell phone bill is actually lower in France than it was in the US – mostly because I use the phone less, of course. I cut my plan down to the minimum number of local minutes, and added the global plan for data, which was not expensive. The calling plan is exorbitant at over $1/minute, but I just don’t use it that often. I make outbound calls from the Vonage line, and for long inbound calls I ask if I can call back from a more reliable land-line.
4) Invest in a monitor and an all-in-one printer: I’m finding that I can live temporarily without most of the things I have in my California office, but I did go ahead and buy a printer/scanner/copier which has been invaluable. You must be able to sign documents and email them if you’re running a business. I also bought an external monitor. When doing analysis and writing reports – especially for quantitative projects – I must have that extra screen real estate.
5) Bring your favorite headset: The Vonage line requires a US phone, so I brought my favorite phone with my favorite headset. It’s the same setup for calls that I had in my office in California.
With that, you’re all set. The one other adjustment I needed to make was my working hours. Researchers almost always have crazy hours. Even if you work just with North America you have four different time zones to cover, and most of us do work with Europe and Asia as well. But when you’re abroad, you have your client calls at crazy hours in addition to your participant calls, and you need to adjust your mindset for this. It’s been fairly easy though. I do more calls in the evening, but on the flip side I really appreciate the interruption-free mornings that let me focus on analysis and writing.
I strongly recommend working overseas to all researchers who have international business. It’s not that difficult, and it gives a great perspective.
Language is a topic that is particularly close to my heart right now. Last August I relocated with my family to the south of France for a year. We wanted to experience life in another country.
I also wanted to improve my French which was best described as “preschool-esque.” My husband is French-Canadian, so French is an important part of his heritage. We wanted our kids to be immersed in the language and I hoped to get my own French to a more functional level.
Being here and struggling to communicate has been very good at helping me understand the experience of many of our international research participants. Dimensional Research clients are technology companies that frequently have global operations and most of our projects include international participants. Since English is the global language of technology, technology professionals typically speak at least some English, albeit with a wide range of ability.
My French is still not very good, but I’m finding that there are certain people here that I can easily communicate with because they create an environment that makes it possible. Any researcher can have a more successful conversation with a participant who is not a native speaker by using these same techniques.
1) Speak slowly and clearly. This is a must for any research conversation, but is particularly important with a non-native speaker. One of my challenges when listening to French is picking out the individual words within the sentence. If someone talks too quickly or runs the words together, I miss words that I know perfectly well and so lose the overall question. It’s much easier to talk with people who distinctly separate each word.
2) Use simple words. When learning a second language, it’s easy to master words like “good” and “nice,” but “majestic” or “phenomenal” don’t come up very often. The simpler the language, the more likely it will be understood.
3) Avoid jokes. Humor and wit are among the most difficult things to grasp in a second language. Unless it’s really, really, REALLY simple it’s just confusing. Even worse, having someone laugh during a conversation I don’t understand makes me wonder if they’re laughing at me, which is discouraging for the overall conversation.
4) Repeat first – then reword. A lot of times I don’t get what someone says to me the first time, especially if it’s a new topic. But if they repeat the words in exactly the same way, I have a second chance to pick up what they’re saying. I know this is not a natural thing to do. Instinct says to use more or different words, since that helps when talking to a native speaker. But a non-native speaker just has more words to filter through, which makes it harder. Of course, if repeating doesn’t work, then reword, sticking to simple language.
5) Give participants time to think. It requires a lot of brain-cycles to carry on a conversation when you’re not fluent in the language. Everyone knows that feeling when they have to search for a word that just isn’t there. When I speak French it ALWAYS feels like that. Translating in real time is difficult and can take time. Simply pausing for a little bit will give the participant time to think and come up with the best way to express themselves.
6) Enable participants to help each other. In a focus group environment, put it on the table. Say that you know it is challenging to speak a second language, and that everybody in the room should help out everyone else to facilitate communication. If you give people permission to struggle a bit, then the group can pitch in to help an individual express their thought.
Of course, there are times when the only real solution to research with non-English speakers is a translator, but sometimes a bit of patience is all you need for spectacular research results.
It’s very common to be invited to participate in a bad web survey or a bad phone survey. But this week I had the dubious honor of being a participant in a truly bad in-depth interview. I had been invited to give my perspective on the use of research during an enterprise sales cycle.
Usually researchers are automatically excluded from these kinds of research projects. However, Dimensional Research develops research reports including customer-based ROI analysis and web surveys, so I was asked to participate and couldn’t pass up the opportunity.
Here’s just a few of the things that went badly during the 45-minute telephone conversation:
1) The call opened with the interviewer telling me that he “was convinced there was a huge gap” and he was doing the research to prove it. Yes – he started out by stating his conclusion and his bias!
2) After a 10-minute preamble, during which time I was easily distracted by a client Skyping me, he finally asked me the first question. His opener, which sets the tone for the entire conversation, was this gem: “Where are you located?” They had already taken my contact information previously, so it surprised me that they opened with this.
Let me clarify – I completely understand a need to confirm information gathered previously, but this seemed like a weak opener. Personally, I prefer starting an in-depth interview asking the person to tell me about whatever their involvement is in the topic of the study. That usually covers a lot of the basics (job title, location, etc.) without asking the boring laundry list of questions. Of course you can always ask those questions later if they’re not covered in a more engaging fashion.
3) The interviewer had sent a supporting document – probably about as many words as are in this post, in about the same font size – and then proceeded to read the whole thing out loud to me. He didn’t seem to get that I CAN READ, was done with the document way before he finished it, and was being distracted by a text message. Again, certainly there are times when it is appropriate to review materials with the individual, but one of the benefits of an in-depth interview is that you can get a real feel for the participant and understand when you can move more quickly or need to progress more slowly.
4) The interviewer ARGUED WITH MY ANSWERS!!! Now I know that all researchers need to dig deeper and sometimes playing devil’s advocate is a good tool. We’ll often use questions like “I understand that you prefer this, but can you explain why that isn’t a preference?” But this guy actually said “Really? That’s odd. In my experience it never happens that way.” Talk about not feeling valued!
5) The interviewer did finally let me get a word in edgewise, and I was half-way through an idea when I got the “Thank you Diane, this was great” and we were done.
I will confess I was a bad research participant and didn’t turn off my instant messaging, email, and other distractions. But I was taking time out of my schedule to share some expertise that I had and the researcher didn’t, and there was no attempt made to make me feel heard or make the experience a positive one for me.
Such a waste of 45 minutes – for both of us.
Seth Godin wrote an interesting blog post about Netflix. In the post he says that “Netflix tests everything. They’re very proud that they A/B test interactions, offerings, pricing, everything.” But then he notes that “The three biggest assets of the company weren’t tested, because they couldn’t be.”
It was (as expected from Seth) an intriguing and thought provoking post, and made us think about the role of research and where it fits.
We’ll confess: in the Dimensional Research client base – mostly B2B technology companies – we don’t spend much time with the true technology innovators. Every successful technology company has a few amazing minds that come up with the technology ideas that change the face of Silicon Valley – the mouse, the browser, virtualization, cloud computing. I’ve worked with some of them and met others socially, and I’ve never heard of a really breakthrough technology idea that came from a focus group. The innovator brains are visionary and are so far ahead that only a small subset of their future customers can even understand the idea when it is explained, let alone articulate the need for innovation.
But those amazing technology innovators are usually only a tiny part of the technology company machine. There are much larger teams of people who need to take those ideas and develop, market, sell, support, collect payment, and so on. Those operations don’t require INNOVATION, they need OPTIMIZATION. How can they do what they’re doing more effectively? How can they generate better business returns for lower cost? It’s here where there ARE testable ideas, where customer feedback makes a huge impact on business results, and market research truly shines.
Yes, innovation is absolutely necessary and companies need those breakthrough ideas. But optimization is also necessary if you’re going to build a great business.
As always, we welcome contradictions so we can evolve. Have you used market research to come up with a truly innovative idea? How?