In part 1 of “The Endless Possibilities of Mobile Market Research” we examined how a mobile device is in essence a miniature, mobilized focus group facility with GPS enabled capabilities that offers endless iterations of how mobile research can be conducted. In part 2, we will examine the methodology of an actual Mobile Market Research tracking case study that was recently presented at the 2014 Future of Consumer Intelligence conference (#FOCI14).
The case began with Jim Kohl, Director of Consumer Insights at Career Education Corporation (CEC), a postsecondary education provider with campus-based and online curricula. Over the past several years, Kohl has conducted an internal “Voice of the Customer” research project and found that while his students had a great online educational experience, their online research experience was subpar. Additionally, as CEC moved forward with a tracking study, Kohl was concerned about whether an online/mobile-only research study would be representative of his student population. One of Kohl’s main objectives was to ensure the insights were indicative of CEC’s customer base.
In an effort to address these issues, Kohl partnered with Added Value, a full-service research company ranked in the Honomichl Top 50. At FOCI14, Brian Kushnir, EVP, Managing Director, and Wai Leng Loh, VP, of Added Value presented three key sampling and methodological takeaways from CEC’s current online/mobile tracking study:
Second, BE DESIGN AGNOSTIC.
Survey takers should be able to seamlessly participate in surveys anywhere, anytime, regardless of platform (online, smartphone, tablet, etc.).
Third, LET IT GO.
As researchers, we often like to ask tons and tons of questions in order to gather as much data as possible, so that our results “stick.” However, new evidence suggests mobile device users are more engaged with their devices and, consequently, less willing to spend as much time taking surveys on them. As such, we as researchers need to “Let It Go” when we conduct mobile research by: (1) shortening the length of survey questions, (2) limiting buttons and images within the survey, and (3) reducing survey questions to basic common-denominator questions within select categories.
In other words, a traditional survey will suffice in the online world. With mobile, however, the same survey should be streamlined and broken down to its basic elements in order to improve completion rates while keeping both the online and mobile portions of the study comparable.
Case in point: although CEC’s study is still in the field, the aforementioned mobile research methodology has enhanced the user experience and improved completion rates. It will be interesting to see the final results of the online/mobile tracking study as it moves forward.