Tag Archives: researcher

Marketers and the Future of DMP Insights

By: Hannah Chapple
Advertisers, agencies, and publishers are swimming in data.
They have so many data points, from so many sources, that they are simply
overwhelmed by it all. Website (cookie) data, social data, CRM data, you name
it, and they’ve likely got it. Sorting all of this data from various (often
siloed) sources in a timely and efficient manner is a nearly impossible human
task.
We all know that the role of a marketer is to reach the
right consumer, at the right time, with the right message. But to do this
effectively, marketers are challenged with interpreting their mass amounts
of data and uncovering actionable insight, at speed and scale.
Interpreting mass
amounts of data is no easy feat.
As the demand for digital marketing and
programmatic/real-time ad buying rises, marketers face more pressure than ever
to target audiences faster, and with laser-precise, data-driven insights. We
know that consumers will only respond to the messages that speak to their
interests, passions, wants, and needs. And in the world of real-time bidding,
technologies only have milliseconds to get that messaging right. And guess
what? These messages cannot be crafted with broad categorization methods like
demographics alone. Demographics on their own are limiting and tell you
nothing about what an individual is interested in, passionate about, or values.
To fill this gap, we have seen marketers seek more and more
data resources. That’s why we see marketers not only trying to make sense of
their first-party data, but also second-party data (from partners) and purchased
third-party data. Can you understand why marketers are swimming in data? It’s
a vicious cycle. So again, we arrive at our original problem: how can
marketers turn mass amounts of data into actionable insight, at speed and scale?
Are DMPs the magic solution in the advertising ecosystem?
To better target potential consumers, many advertisers rely on Data Management Platforms (DMPs) to collect their mass amounts of disparate audience data (including the first-, second-, and third-party data we spoke about) and interpret it. In short, DMPs are cloud-based warehouses used to generate audience segments based on patterns and trends set within defined parameters. The goal, of course, is to deliver high-quality, accurate audience segments to marketers and all other players in the advertising ecosystem, like DSPs. When put into action, these audience segments (generated by the DMP) should result in smarter, optimized ads, efficient media spend, and less ad waste. But is this actually the case?
Marketers are sitting on a wealth of data, with a goldmine
of potential insights to derive from that data. That’s why more and more
companies are investing in DMPs for their business and are hiring highly qualified, expensive professionals to manage them. However, while DMPs are used to extract insights, there is still a lot of wasted potential in these tools.
Here’s a quick DMP lesson: DMPs operate on a ‘hypothesis’ basis. DMP users must set conditions or a query to break down the data sources and form the specific audience segment they want. For a DMP to work properly (with speed and accuracy) and know what data to segment or pair, a DMP user must understand many factors, including media, marketing, analytics, and, of course, data. The DMP will then do its best to match data and form an actionable audience segment for the marketer to leverage.
For example, a marketer could leverage behavioural cookie
data to build an audience of males in Nova Scotia, over 30, who browsed a car
website on their mobile device. This audience can then be used for ad-buying,
media placement, etc. 
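To make that example concrete, here is a minimal sketch of such a rule-based segment definition; the profile records and field names are invented for illustration and do not reflect any particular DMP’s query interface:

```python
# Hypothetical profile records; a real DMP would assemble these from
# first-, second-, and third-party sources.
profiles = [
    {"id": "u1", "gender": "male", "age": 34, "region": "Nova Scotia",
     "device": "mobile", "sites_visited": ["cars.example.com"]},
    {"id": "u2", "gender": "female", "age": 41, "region": "Ontario",
     "device": "desktop", "sites_visited": ["news.example.com"]},
]

def in_segment(profile):
    """The marketer's 'hypothesis', expressed as explicit conditions."""
    return (profile["gender"] == "male"
            and profile["age"] > 30
            and profile["region"] == "Nova Scotia"
            and profile["device"] == "mobile"
            and any("cars" in site for site in profile["sites_visited"]))

audience = [p["id"] for p in profiles if in_segment(p)]
print(audience)  # ['u1'] -- the segment handed off for ad buying
```

The segment is only as good as the conditions the marketer thought to write down, which is exactly the limitation discussed below.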
But marketers don’t know what they don’t know.
But what does this marketer really know about this audience?
What are their interests and passions, outside of cars, and how can they be
determined? This is why, despite the integration of DMPs, marketers still aren’t getting it right. While automated, there is still human error in how DMPs select which data to process and interpret.
Don’t get me wrong; there is incredible value in DMPs, but there is also an incredible opportunity still on the table. Ultimately, the goal of leveraging a DMP is to provide a personalized consumer experience by relating to consumers’ interests and behaviours. But marketers are only grasping at the data that they are currently able to understand. Like I said, DMPs operate on a hypothesis basis, contingent on the user’s understanding of the data.
We, as marketers, haven’t even scratched the surface of what is possible with DMP data. Marketers need a solution that looks beyond predetermined hypotheses and attributes. Instead, we need a solution that interprets data in an unsupervised way and can discover the hidden relationships and insights within audiences that marketers don’t yet know.
How do you foresee 2017 shaping up? How will DMPs evolve? Share what you think below: [Read more on the Affinio blog]

About the Author: Hannah
Chapple is the Marketing & Content Coordinator at Affinio, the marketing
intelligence platform. Hannah holds a Bachelor of Business Administration with
a major in Marketing from the F.C. Manning School of Business at Acadia
University. 

#MediaInsights Day 1 Recap

By: Jim Bono, Vice
President, Research, Crown Media Family Networks

MI&E Conference Director Rachel McDonald started
off the day welcoming this year’s attendees and introducing this year’s
co-chairs: Janet Gallent (NBCUniversal), Rob McLoughlin (POPSUGAR) and Bruce
Friend (Maru/Matchbox).
Bruce then sat down with Turner’s Howard Shimmel for the OPENING KEYNOTE INTERVIEW: Re-Imagining the Future of Television. Recently, at a Cynopsis conference, Shimmel said “we’re at a measurement crisis.” Elaborating on that comment, he explained how it’s 2017 and we still do not have a robust cross-platform measurement solution for our industry. Advertisers want an infrastructure that offers more than just reach and frequency. And with Total Audience, we still don’t know what to do with it.
They also discussed the Turner Ad Lab, and how people go to
Netflix, Hulu, etc., to watch content without ads. What can we do to make the
advertising experience better for the consumer? Howard believes that the industry should have a published
document that mandates what currency data research vendors should provide for
the content providers. As new platforms are emerging, we need to better
understand where those consumers are going to find content.
Bruce asked about big data and how it’s all the rage. As an industry, where do we go from here? Howard explained how there is an abundance of research tools out there. We just haven’t done a good enough job telling our clients that we have all these tools. Big data is a component of an overall data framework. We need to know when to use it and when not to use it. Sometimes Big Data can be the wrong data.
Bruce also raised the point that new companies are great with tech but don’t understand the data they deliver, while other great long-time research companies are very good at analyzing data but don’t have the tech. Howard feels that there’s nothing wrong with using a combination of data sets like Nielsen, MRI, and panel data to come up with the best solution. Unfortunately, there are too many companies that reach out and don’t really understand our businesses.
He still believes that survey research is important to our industry, as data tells you what, but not why.
KEYNOTE: The Importance of Race and Ethnicity in Reaching
Millennials

Cathy Cohen, Professor at the University of Chicago, gave us a very entertaining look at Millennials and the importance of race and ethnicity within this group, especially regarding this year’s election. The majority of Millennials in the US are Hispanic and African-American, and by 2060 Whites will be a minority. In this past year’s election, more African-American and Latino Millennials voted for Democrats, while more white Millennials voted Republican. However, in the 2016 primary vote, the choice among all Millennials (regardless of ethnicity) was Bernie Sanders.
Cohen’s presentation covered:
• The complexity of Millennials through a racial framework
• Researching race and Millennials
• The rise of Millennials in the workforce
• The importance of Millennials as a political force
Millennials are becoming an increasingly important electoral demographic. The share of eligible voters who are Millennials has grown over the last three elections:
• 2008 – 23%
• 2012 – 29%
• 2016 – 36%
Cohen also addressed the six key problems with studying Millennials:
1. Generational frames / over-representation of white Millennials
2. Under-investigation of white Millennials
3. Homogeneous communities of color missing Millennials
4. Segmentation of Millennials of color – pick one!
5. Millennials as experts on Millennials – homophily
6. One-offs or waves – assumes stability in tastes, preferences and decisions
KEYNOTE PANEL: How Consumers Engage with Programming Across
Social Platforms
Sean Casey from Nielsen Social Guide moderated this
morning’s Keynote Panel featuring Brian Robinson (Facebook), Tom Ciszik
(Twitter), Guy Ram (NBC), and Leslie Koch (HBO).
Insights focused on the evolution of social media and how quickly it has grown:
Consumers spend 5.5 hours per week using social media on their smartphones.
64% of consumers use their smartphone while watching TV.
1.2 billion people interact on social media in reference to TV.
After breaking for lunch, our afternoon consisted of Concurrent Tracks. These case studies were broken into three groups:
• Track 1 – Targeting Viewers
• Track 2 – Audience Insights
• Track 3 – Innovation in Media
Track 1 – Targeting Viewers case studies:
From Ordinary Target to Persuadable Target

David Kaplan from Bravo, along with Zach Schessel from NBCU and Peter Bouchard from Civis Analytics, discussed how to hit the right target audience and “swing” viewers. The presentation also looked at how to attract casual viewers without alienating the core viewers.
Key takeaways were:
• A different creative approach is often required for on-air vs. off-channel to drive maximum impact with loyal and casual viewers
• Casual Bravo viewers may all have some affinity for the network, but only the “swing viewers” in this group can be readily persuaded to deepen their commitment and watch more
• An ad’s positive persuadability should be balanced against any potential backlash effects to ensure a net positive effect
• Not all swing viewers are created equal; e.g. consumers in different DMAs can have a varied response to creative hooks
Viewing Predictions & Inventory Optimization: The
Secrets to Success in Audience Targeting

Steve Schmitt of TiVo showed us how TiVo is helping clients get from traditional linear to non-linear content, and how they improved campaign performance using optimizers and brand targeting. His presentation focused on how:
• TV consumption has undergone profound changes, especially among Millennials age 18-34
• Total video consumption continues to expand, with DVR, VOD, SVOD and online/mobile viewing extending the power of linear TV
• Linear TV still has the majority share, but it is declining as on-demand options expand
Concepts on the rise are binge viewing, on-demand, cord-cutting and cord-shaving, while things like appointment viewing and one-size-fits-all are on the decline.
Online Video in the Toolbox: A Must Have

Darlene LaChapelle and Maya Abinakad from AOL talked about the top drivers for video growth, with “social media video offerings” and “better quality creative” leading the way, and how online video growth is driven by mobile devices.
• Online video viewing on a smartphone is on par with that of a computer
• Consumers indicate they have few technical barriers to watching online video on their smartphones, and they get the convenience of watching anywhere, anytime
• 62% said they watch more online video today than one year ago
• 62% said they expect to watch more online video in the next 6 months
The laptop/desktop (70%) is still the leading device on which online video is watched daily, just edging out the smartphone (67%).
How to Engage Multicultural Millennial Influencers in 2017 and Beyond

Our afternoon continued with our only Track 1 panel. The panel was moderated by Horowitz’s Adriana Waterson, and we heard from Michele Meyer (Univision), Tom Kralik (Revolt) and Lia Silkworth (Telemundo) as they discussed their key takeaways about multicultural Millennials and the importance of this audience in our business today, as leading consumers of cross-platform media.
• Hispanics are leading the charge in cross-platform media consumption
• Millennial and Gen Z trends ARE multicultural trends
• Gen Z is more diverse and multicultural, and its members are digital natives
• If you join a multicultural network, your general market skills may not “translate”
The Next Generation of Ad Effectiveness
Our first day concluded with this presentation from Chris
Kelly at Survata.

The Future of Market Research Data Collection

By: Research Now CEO Gary S. Laben
This post was originally published on the Research Now Blog.
The vast expansion of communications technology has
obviously sparked a dramatic change in the way our world
functions. Certainly one of the most ubiquitous and transformational
impacts is that brought on by new technologies that allow virtually everyone to
remain constantly and instantly connected; connected to one
another, certainly, but also to the growing number of systems upon which
we are growing increasingly dependent, if not addicted. Modern
communications systems have given users unprecedented access to information and
services without regard to time or location, letting them get more done faster
than ever before. Even more, the devices and systems continually monitor users’
behaviors to refine the responses to personalize the service
delivered. By providing experiences that are tailored and relevant to each
user’s expectations, this new generation of technology doesn’t just provide a
better user experience, it also preserves the user’s most
valuable resource: time.
The idea that we can use deep knowledge about individual and
groups of users’ situations, preferences, and past behavior to provide a
better, more efficient user experience applies equally well to market research.
Of course, this is not a new idea. We’ve always used profiling data to target
specific communities for research studies and minimize the amount of
information we need to collect in each study. Avoiding collecting redundant data
shortens surveys, reduces participant load, and improves data quality. What’s
changing is the vast volume of data we can mine to automatically extract and maintain components of the user’s profile, even in real time, without the need to explicitly query them. This is the realm of big data.
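Purely as an illustration (a hypothetical question list and profile fields, not Research Now’s systems), here is a minimal sketch of how profile data held on file can strip redundant questions out of a survey:

```python
# Toy questionnaire: each question is tagged with the profile field it would collect,
# or None if it is a core research question.
questionnaire = [
    {"field": "age",      "text": "How old are you?"},
    {"field": "owns_car", "text": "Do you own a car?"},
    {"field": None,       "text": "How satisfied are you with your current car insurance?"},
    {"field": None,       "text": "Which features matter most when you renew a policy?"},
]

# Profile attributes already on file for this participant.
known_profile = {"age": 34, "owns_car": True}

# Only ask what we don't already know; the rest of the survey time goes to core questions.
to_ask = [q["text"] for q in questionnaire
          if q["field"] is None or q["field"] not in known_profile]
print(to_ask)  # the two profiling questions are skipped
```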

Applying big data to market research has tremendous
benefits to all involved in the research process. Data providers can use automation to
maintain more expansive and accurate research databases at a lower cost.
Market researchers can target research communities with greater accuracy and
know more about them in advance of fielding a study, which lets
them devote more of a survey to the core questions of the research rather than
qualifying questions. And finally, and perhaps most
importantly, the study participants benefit from reducing the number of
tedious and repetitive profiling questions asked of them, shortening surveys,
keeping them engaged, and giving them back valuable time.

The allure and promise of big data for market research is
compelling, but not without risks and issues. Technology has created
a window of opportunity for brands to know more about consumers than was ever previously thought possible. But just because we can reach everybody doesn’t mean we should. Technology sometimes presents a facade that can lead researchers to lose sight of the fact that they are dealing with real people, people who have thoughts, feelings, emotions, goals, dreams, and likes and dislikes. Dehumanizing a person into a set of numbers and patterns obscures the advantages that
big data enables. Further, easy collection of data can make us
forget about the very real and important privacy interests of our participants.
If we fail to recognize, respect, and account for these concerns, we will lose
their trust and their willingness to participate.
The market research industry must
use big data as an opportunity to get smarter, quicker, so
that we are able to be more personable in our approach
to collecting information. We need to
maximize participants’ time by creating relevant engagement
for them that is also useful to the researchers. Big data presents a
new opportunity to improve our ability to accomplish both.
At Research Now, having
more data, specifically more accurate data, about
people is what defines the quality of our panels. It allows us to be
less intrusive and more in-the-moment with people who want to engage with
brands. Having more information about whom we’re talking to permits us to put greater focus on core research, bypassing things like screeners and getting right down to the questions our clients are interested in asking.

This improves the participant experience and gives
our research clients the ability to collect more desirable data,
which in turn fuels deeper insights and gives everyone back just
a little more of their precious time.

Is it Worth it? Key Considerations for Social Media Research

By: Terry
Lawlor, EVP Product Management, Confirmit

The role of social media in delivering
business insights is a tricky one. While most researchers consider it to offer real benefits, the big question is ‘how do we do it properly?’ In our
recent survey of Market Research professionals, we asked respondents about
their feelings towards social media. Overwhelmingly, the most popular response
from the five choices offered was ‘A
useful addition to a Market Research project if we can bring the data together
effectively’.

The word to look at there is ‘if’. For many businesses, that ‘if’ is surmountable; for others it isn’t, at least not yet. There are a number of things to bear in mind.
Who is Your Audience?
The changing dynamic of the consumer has a
significant impact on research. Millennials behave differently when it comes to
researching, buying and complaining about products. The audience you’re
targeting has a huge role to play when it comes to establishing the part that
social media has to play in your business.
It Takes More Than Technology
There’s no silver bullet for social media.
It takes a combination of people, process and technology to be successful. You
need technology to sift through the vast quantities of information: to find
and filter data sources, provide intelligent sampling of massive amounts of
content, and perform categorization and sentiment analysis. However, you will
still need people. In our recent study, Political Buzz, we used social media
(as well as traditional surveys) to monitor topics for the UK election. One of
our key findings was that the role of people was critical in researching the
key social and online media channels, and in building the taxonomies on which
your technology must function.
It’s More Than Just Social
When thinking about social media, most
people immediately think of Twitter and Facebook, Instagram and Tumblr, perhaps
YouTube and Pinterest. There are actually many more social media sites than you
think, and there are many different feeds within each social media platform.
And there is a huge array of online media, where people post comments and
stories, and review sites that cover many different categories of products and
services. So you need to think about online media as much as social media, and
you need to think about data sources that amount to tens or hundreds of
thousands of different media channels.
A Double-Edged Sword
As with every ‘next big thing’, social
media research is a double-edged sword. On one hand, because it is largely
unsolicited, you can uncover insights that you never anticipated. However, also
because it is largely unsolicited, it might not address anything useful for
your research program. You may want to research a particular topic but no one
is discussing it, or your target audience just doesn’t use social media.

About the Author: Terry Lawlor has responsibility for all aspects of product
management, including strategy development, product definition, and product
representation in client and marketing activities. Terry is a seasoned and
highly professional enterprise software executive who possesses a wealth of
expertise in the Market Research and customer experience markets.

The polls got it wrong (again) but don’t lose faith in quantitative research

By: Jim
Mann
Like many, I woke unusually early on
Wednesday and reached nervously for my mobile phone. It was US election night
and I was eager to see if, from my perspective, crisis had been averted or the
world really had gone mad. Before I had a chance to tap my favourite news app I
noticed a message from my brother: ‘Another resounding victory for the
polls bruv!’ Detecting sarcasm (I’m smart like that) I knew this could
only mean one thing. Sure enough, Trump was well on course to a victory that
nobody, least of all the pollsters, was anticipating. For the third time in
eighteen months (following the UK general election and EU referendum) the
pollsters had got it wrong!
In the period since May 2015, I’ve had
countless debates with polling sceptics like my brother. His fiercely articulated view is that polling is not simply inaccurate; it also has the potential to sabotage itself. He’s not alone in this belief. Behavioural
economics shows that people generally wish to follow the herd. Therefore, a
poll showing that the majority think in a particular way is likely to
influence, albeit subtly, what they themselves believe. Furthermore, there are
those that cite the possibility that polls could impact rates of voter turnout.
After all, why bother to turn out to vote if the polls have created a strong
belief that your favoured candidate is either assured of victory or has no chance
of winning?
Polling, when first popularised by George
Gallup in the 1930s, was hailed for the positive contribution it made
to the democratic process. Gallup himself was, understandably, steadfast in
this belief. Elmo Roper, another pioneer of the public opinion poll, described
it rather hyperbolically as ‘the greatest contribution to democracy since
the introduction of the secret ballot’. 

But there have always been critics, and
the anti-polling arguments inevitably gain traction when the pollsters get it
wrong. Failure is not a modern phenomenon either. Immediately prior to the 1948 election, George Gallup predicted that Dewey would beat Truman and stated, unwisely as it turned out, ‘We have never claimed
infallibility, but next Tuesday the whole world will be able to see down to the
last percentage point how good we are’. Dewey lost. The anti-polling lobby
had a field day.

So criticisms of polling aren’t new and,
let’s be honest, they would remain niche concerns if the polls were accurately
predicting results. But they’re not, and on the back of a series of high-profile failures it’s increasingly common to deride polling as a ‘devalued
pseudo-science conducted by charlatans’. Yep, my brother again. I hate to give
him the last word so, in order to provide a flavour of wider opinion, I’ll
quote the Guardian’s post-election editorial instead. ‘The opinion polls
and the vaunted probability calculus rarely trended in his (Trump’s) direction;
both are discredited today.’
The purpose of this blogpost is not to
defend political polling; I have my own concerns in that direction and it’s
undeniable that the work of pollsters is becoming harder, due to a combination
of methodological issues and a more fluid, less predictable, political
landscape. However, for the sake of fairness I’d like to mention two things,
neither of which is intended to exonerate the practice.
First, most polls reflect public sentiment
within a nationally representative sample. In the main, but not exclusively,
the polls conducted immediately prior to the election found that, by a
relatively small margin, more Americans intended to vote for Clinton than
Trump. In this they were correct. At the time of writing, the figures show that
59,814,018 Americans voted for Clinton whilst 200,000 fewer (59,611,678) voted
for Trump. However, due to the distribution of votes and the vagaries of the US
political system, this translated into 279 Electoral College votes for Trump
and 228 for Clinton.
Second, most polls conducted by reputable
polling organisations produced results that placed the result well within the
margin of error. ‘What’s that?’ I hear you ask. Well, tucked away at the end of most reports based on a public opinion poll will be a small note about
margin of error. This margin will differ depending on the number of people
interviewed for the poll but, for a standard sample size of 1,000, the margin
of error is +/- 3.5%. This essentially means that if the poll results show that Clinton is projected to win 47% of votes, the reality is likely to be somewhere between 43.5% and 50.5%. Within this context, the result of the election was
well within the margin of error of most polls. It wasn’t so much the polls that
got it wrong, it was the reporting of the polls that failed to sufficiently
stress that the result really was too close to call. But people don’t like
uncertainty so these boring, statistical caveats tend to get overlooked.
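For readers who want the arithmetic behind that caveat, here is a minimal sketch of the standard 95% margin-of-error formula for a sample proportion. (The textbook formula gives roughly +/- 3.1% for a sample of 1,000; the +/- 3.5% quoted above is a slightly more conservative rule of thumb.)

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

p, n = 0.47, 1000                  # e.g. a 1,000-person poll putting Clinton on 47%
moe = margin_of_error(p, n)
print(f"margin of error: +/- {moe:.1%}")                   # about +/- 3.1%
print(f"plausible range: {p - moe:.1%} to {p + moe:.1%}")  # roughly 43.9% to 50.1%
```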

OK, but I said this blog wasn’t designed to
defend polling. So what is it about? Well, I don’t feel the need to defend
polling because I’m not a pollster. However, I am a market researcher working
with quantitative surveys, and what concerns me is the fear that growing
scepticism around polling will negatively impact trust in all forms of
numbers-based research into public attitudes. Maybe I’m just a worrier and
people are perfectly able to distinguish between different forms of survey
based research. However, my own experience suggests that isn’t always the case.
In May 2015 I was working at the Guardian.
The Guardian has invested significantly in data journalism over recent years
and coverage and analysis of polls was given a high degree of prominence in the
run up to the UK general election. At the Editorial conference, held the day
after the election, the mood was subdued. When the conversation turned to the
failure of the polls, some journalists questioned the prominence given to polling numbers, especially as those numbers didn’t chime with their instincts and the evidence of their own on-the-ground experiences. The upshot was a
policy decision, only recently reversed, that editorial coverage of polling
should be suspended. The coverage of polls in the run-up to the US election was
reported under the banner ‘Sceptical polling’, which gives a pretty good
indication of the mood around the organisation.  
As Head of Consumer Insight at The
Guardian, a key element of my role was to advocate for use of consumer research
and promote evidence-based strategic decision-making. My internal clients ranged along a spectrum that ran from research enthusiasts to rejecters. This
latter group, a minority it should be said, believed there was little to gain
from engaging with research. The great polling disaster of 2015 provided a
tailor-made reason to disengage. After all, research had been shown, in the
most public way imaginable, to be unreliable and wrong! Hadn’t it?
I’m sure the Guardian is like most
organisations in having research stakeholders ranging from enthusiasts to
sceptics. To the latter group I would make this plea: don’t conflate political polling with other forms of quantitative market research, and do not deny yourself and your business an incredibly powerful, consistently proven aid to
decision making simply because political polling has been shown to not be a
perfectly accurate crystal ball. As mentioned, polling isn’t quite as
inaccurate as some would have you believe. Furthermore, the stakes are simply
much higher for polling: A couple of percentage points either way (generally
within the margin of error, remember) is the difference between two
diametrically-opposed outcomes and the profound repercussions associated with
that. In contrast, if a representative survey of consumers in a particular
sector suggests that awareness of your brand currently stands at 34% whilst
that of a competitor is 64%, does it really make a huge difference to the
decisions your company will take if the reality is a couple of percentage
points either side?  
Of course, some decisions do require a
higher degree of accuracy. In these instances, market researchers have two huge
advantages over pollsters. We can increase the number of people interviewed in
the study, thus reducing the margin of error and increasing confidence levels.
We can also utilise robust sampling techniques such as random probability
sampling. Generally speaking, neither of these options is available to
pollsters because they are simply too time consuming. Pollsters are required to
provide an almost instantaneous reading of public sentiment, before new events
have a chance to change it, and anything that slows that process is, by
necessity, discarded. If pollsters were given the freedom to use these tools,
it’s likely they would provide far more accurate predictions. How do we know?
Well, following the 2015 general election most polling companies conducted
re-contact surveys with pre-election poll respondents to try and understand
what went wrong. What they discovered was that, even when conducting post-event
research, they were unable to accurately replicate the result. The inquiry commissioned by the British Polling Council concluded that the reason was their use of (attitudinally) unrepresentative samples drawn from panels and
that a random probability sampling approach (that gives every member of a target
population an equal chance of participating in the study) would counteract the
problem. Tellingly, the survey that best replicated the election result was the
British Social Attitudes (BSA) survey conducted by NatCen Social Research. Need
I say that BSA is based on a large sample (3,000) and utilises random
probability sampling?
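As a rough illustration of those two levers (using a toy population, not real polling data), the sketch below shows how a larger sample shrinks the margin of error and how simple random selection gives every member of the target population an equal chance of being chosen:

```python
import math
import random

# Lever 1: larger samples shrink the margin of error (p = 0.5 is the worst case).
for n in (1_000, 3_000):
    moe = 1.96 * math.sqrt(0.5 * 0.5 / n)
    print(f"n = {n:>5}: margin of error +/- {moe:.1%}")   # ~3.1% vs ~1.8%

# Lever 2: random probability sampling -- every member of the (toy) target
# population has the same chance of being drawn, unlike an opt-in panel.
population = [f"person_{i}" for i in range(100_000)]
sample = random.sample(population, k=3_000)
print(len(sample), "respondents selected at random")
```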
I’ve rambled on too long and exceeded my
word count limit by a distance so I’ll finish by saying this: The great jazz
musician, Duke Ellington (or possibly Richard Strauss, it’s disputed) is quoted
as saying ‘there are only two types of music: good and bad’. Market
research is much the same. When done properly it is an incredibly powerful
diagnostic and forecasting tool that can provide a highly accurate picture of
consumer sentiment as it currently exists. Pollsters, through no fault of their
own, are sometimes unable to do it.
Researchers, however, can and do. 
Jim Mann is a senior quantitative director
at the numbers lab @ Firefish

Facts Don’t Have to Die

This post was
originally published on the Sentient
Decision Science Blog

‘Stories last and facts die,’ Kelsey Saulsbury stated on day four of TMRE.
The Schwan Company’s manager of consumer insights and analytics held a fun workshop called ‘Your Voice: The Power to Slay the Two Dragons of Storytelling.’ By leading participants in two creative writing exercises, she encouraged market researchers to ditch the corporate speak that plagues our presentations and find our own human voice.
If it sounds like Saulsbury didn’t know her audience, be
assured she did. She addressed the skepticism certain to be held by any data analysts or behavioral scientists in the room by conceding that the type of presentation should depend on the client.
‘A client once told me before a presentation that if I
showed him one number, he’d walk out,’ an audience member offered.
Saulsbury nodded, asking audience members to cut down on the
slide decks and get to the point faster.
‘When putting together reports we’re often afflicted by the
No Data Left Behind Syndrome,’ she said with a laugh. ‘Less is not lazy.’
Big Data Dominance

Are executives so exhausted by tables and charts that
they’re letting data die? According to Alec Ross,
there’s no way. As data becomes more abundant, industries become more efficient.
And data is incredibly abundant.
‘Ninety percent of the world’s data in the totality of human
history has been produced in the last two years,’ said the former senior
advisor for technology and innovation at the State Department.
‘The sum of all the data from paintings on cave walls to 2003, we now produce that amount of data every two days, across over 16 billion networked devices.’
So how do we leverage that?
In ‘We Are Not in Kansas Anymore: Consumer Insights in the Age of Big Data’, Walmart’s Senior Director of Consumer Insights and Analytics, Heiko Schafer, admitted it can be overwhelming.
‘Big CPGs are under tremendous pressure,’ he said.
Schafer quoted Ross’s note about how 70,000 data points are
available about all of us. The pressure comes in reconciling these new
data streams and business models.
Companies like Walmart use the newly available data sets to
integrate things like geolocationing, sensors, and digital media into their
skill set. The result can be incredibly informed, targeted marketing.
Data as a Storyteller
We urge our analysts at Sentient to also highlight
conflicts between data and its context. Those conflicts might reveal important
insights the client wasn’t even looking for.
Saulsbury suggested that researchers begin presentations with the most compelling findings. “Don’t bury the lede,” she quoted.
Schafer illustrated with a study about the sales
of colored pencils.
Data in graph form showed peaks in colored pencil sales where you might expect them (Christmas, Easter, and at the start of the school year), as well as a general increase in year-over-year sales.
Researchers could have accepted the numbers as they stood, but they knew something was off. Birth rates in the United States had dropped during the Great Recession of 2007-2009, so there were actually fewer school-age children enrolled in the areas they were looking at.
‘What is going on [with the data]? What’s driving this? Why is it happening?’ Schafer posed.
As it turns out, the reason colored pencil sales are going up is that sales of adult coloring books are going up. Why was that
happening? Researchers then looked at social media conversations and Google
Trends and saw that a lot of adults are stressed out. They’re looking
for creative escapes.
Where the data says sales of colored pencils are up, the
true story is that sales of colored pencils are up because burned out adults
are looking for catharsis in coloring books.
Imagine how much money could have been wasted on marketing
to the wrong demographic.
Truth Is Important; So Is How You Share It

If data equals truth, truth should trump all in market
research, right? Not if no one is listening to it.
That’s why storytelling is part of Sentient’s DNA.
Yes, we are a company that’s expert in advanced
implicit research technology, the consumer subconscious, and quantifying the
impact of emotion on choice. Our technologies are coupled with deep knowledge
of behavioral economics, emotional branding, and quantitative models of the
drivers of human behavior.
But the value of our insights comes from the stories we tell
about data.
People support and share ideas they have an emotional
connection to. By crafting our insights in a way
that inspires emotion, we give data a better chance to resonate with
our audience. We don’t just reel off numbers, we help clients understand
why they should care about those numbers.

Facts and data don’t have to die. We can use stories to
help keep them alive.

6 Tips Marketing Researchers Can Learn From Social Media

This post was originally published on Lightspeed GMI’s blog.

Social media has caused a massive shift in the way people
communicate, interact and share experiences and personal interests. Consumers
are always on, always connected. Consumers build unique online relationships;
they are connected to brands, athletes, teams, family, friends and co-workers
on multiple channels. Sharing everything from political views to favorite
products, social media users are leaking valuable information and insights for
researchers to take advantage of.

Marketing researchers have adopted Mobile First best practices; but are we also looking to benefit from the same openness and flexibility that social media platforms have to offer? There are six ways to successfully engage and capture relevant and actionable feedback from your panelists, based on social media best practices:
1. Focus on people, not metrics: Our industry refers to panelists, not people. Are we focusing on why individuals are dropping out of surveys? Are we worried about their enjoyment of a survey or just survey completes? Create consumer conversations, not metrics.
2. Stay authentic: According to Digital Stats, 92% of consumers say they trust earned media, like personal recommendations, above other forms of advertising. Authentic brands do better on social media, but trust is earned over time. If you want to capture genuine consumer insights, treat your online survey as you would a social media account. Be honest and upfront about your intent.
3. Engage, don’t push: Want to get better research? Consider the way you are asking questions. Similar to social media posts, consumers favor shorter, visually appealing surveys with a strong narrative structure. Engage your respondents first, ask questions later.
4. Let the consumer decide: Video, text or photo? Social media platforms are constantly evolving, but they always remain focused on consumer adoption. According to Sprinklr, marketers need to find new ways to capture the attention of the consumer who has seen just about everything. Every day, more and more individuals are starting surveys on their mobile devices rather than PCs. They are deciding when and on what device to take the survey; why not let them decide on the format? We design for cross-device research, so why not design cross-format?
5. Be relevant: Across Facebook, Twitter and Instagram, you want to reach your target audience with relevant content: photos, posts and videos. Like social media, marketing research is a crowded space; panelists are flooded with survey invites daily. Be relevant: ask the right questions, in the right sequence, to the right audience.
6. Interactions first, technology second: Social media planning 101 = interactions first, channel second. Allow your panelists, not technology, to drive the future of the industry. Are marketing researchers allowing technology to dictate the future, or panelists? Are you focused on building mobile research apps or consumer feedback apps?
Gaining success in social media isn’t easy; it’s a process,
a way of thinking. Social media can be used to create and collect customer
intelligence through listening techniques. And this can also ring true in the
online survey world. Think about it: Brands have the capacity to cultivate
conversations with consumers…but often don’t. Researchers who are successful
in gaining insights from surveys are the ones who allow the consumer to take
the wheel and drive how marketers can collect information from them. Platforms
such as Facebook, Twitter, YouTube, Pinterest and Instagram allow users to be
creative and communicate in whatever method is enjoyable to them.  Why not
allow online panelists that same freedom? By allowing panelists to communicate
with you through mediums that are most enjoyable to them, through video for
example, you could garner more authentic and elaborate feedback. Rather than forcing tedious and possibly challenging lengthy open-text responses, try offering the option of text or video responses. Instead of requiring respondents
to rate a product on a variety of features through a MaxDiff exercise, try
engaging them in conversation through communities or discussion boards.

The perfect solution for the survey world isn’t available in
140 characters or less, unfortunately. But the successes of social media are
ours to grow from.

Innovation In A Change-Phobic World

By: Tom Ewing, Senior Director, BrainJuicer Labs

Sometimes it takes a huge event to make people look at their
assumptions in a new light. Taken by surprise by the EU Referendum and
Brexit, British marketers have had to think carefully about how well
they knew the people they were selling to. And, as a fascinating new study
by the Futures Company points out, it’s not just a British thing. All
over Europe and beyond there are vast groups of consumers who feel a
sense of loss in the face of change, and respond strongly to the
familiar. As in politics, so in marketing: for a big chunk of the public, novelty and disruption aren’t particularly big draws.

Of course, this is why innovation research so often focuses on early adopters: get them on board, and diffusion across the population will follow. This is the innovation equivalent of trickle-down economics – an excuse to focus on a segment you’re sympathetic to with the hope that everyone else will benefit… eventually. After all, marketers themselves tend to be novelty-seeking types who talk a big game about
the inevitability of change. Forcing them to put themselves in the shoes of more
conservative or change-averse consumers can be a wrench. But change-aversion isn’t something you can sweep under the segmentation rug. Because while
history may be on the side of change, psychology isn’t.

As human beings we are wired to prefer the familiar: if we recognize
something quickly, it seems like a better choice. This processing
fluency heuristic – or just Fluency, for short – is one of the foundations of decision-making. And it’s used by every consumer to make judgements, not just those who are culturally or politically averse to change.

The Futures Company’s recommendation to appeal to the nostalgic or conservative consumer is ‘cautious innovation’: balancing future-focused new product lines with more immediately familiar ones. This is sound thinking. But given that everyone, not just the change-averse, uses Fluency to make decisions, it’s only half the story. Familiarity isn’t just for nostalgists. The key thing to realise
about Fluency is that you can build it.

If you use research to identify your brand’s unique assets (the things people most immediately associate with it), you can build that into how you present your innovations. This works for the radically new as well as the cosily familiar: you can ‘bake in’ new assets which the launch and promotional campaigns can lean heavily on.

When the iPod was first launched, for instance, it was certainly disruptive new technology. But Apple’s marketing around it didn’t emphasise the tech, or its many features; it relied on the simple image of those distinctive white headphones. The white headphones became such a well-known symbol of the iPod that at least one police force issued warnings against them, as criminals were targeting anyone they saw wearing a set. A brand’s unique assets in action!

With investment and focus, brand assets can become familiar very
quickly. In the UK, banking giant Santander was a newcomer only a few
years ago. Now it’s carved out a large chunk of market share, and ‘owns’
the colour red, thanks to pushing it as an asset in its marketing. But
heritage can certainly play a role. There are many brands whose vaults are stuffed with distinctive assets that were put out to pasture because of a new-broom marketing director, not because the public was tired of them!
Whether you’re launching something new or looking to the past, though, it’s crucial to remember that familiarity and Fluency are at the root of all successful innovation, not just the cautious kind.

Insights as a Vehicle for Influence: Embracing the Omnichannel Customer Journey

By: Amanda Ciccatelli,
Content Marketing & Social Media Strategist, Informa
Insights have become a vehicle for influencing marketing and
ultimately, the world. That’s why next in our Insights as a Vehicle for
Influence series, we sat down with Claire Quinn, principal at Capre Group, to discuss
the ever-changing retail space and how to embrace the omnichannel customer
experience.
Here’s what Claire had to say:
How is digital
reinventing retail?
Quinn: For
Millennial ‘digital natives’ and those GenX and Boomers who are early adopters,
digital has reshaped the traditional ‘role sort’ of consumers and
shoppers. Instead, people are shifting between ‘consuming’ modes and
‘shopping’ modes seamlessly, regardless of their physical location. 
Imagine this scenario to demonstrate these shifting modes: a young woman views a TV ad showing how a new lipstick will last all day (consuming mode); she researches user reviews online with her phone (consuming mode); then she uses her phone app to check whether her local drug store carries the brand (shopping mode); ultimately she orders it for pick-up at her neighborhood store (shopping mode). She has moved along the full path to purchase, shifting between consuming and shopping modes, all from the convenience of her own living room.
It will be critical to understand the different decisions
people are making at each point along this new path, as well as the right
marketing touchpoints and content to share to drive conversion and reinforce
each purchase decision. 
What can retailers do
better to embrace the omnichannel customer journey and experience?
Quinn: The new
omnichannel world presents a great opportunity for retailers to partner with
manufacturers to successfully meet and exceed consumer/shopper
expectations. Manufacturers are experts in their categories, and can
provide deeper category leadership insights and perspectives than most retailers
can create on their own. And retailers know their shoppers extremely well,
having built incredible capabilities to target and engage their shoppers.
Together, manufacturers and retailers are able to
collaborate in new ways, such as to engage shoppers pre-store, getting brands
on the list of relevant household segments or to reinvent the aisle or
check-out experience, to provide added value to shoppers while driving
incremental impulse purchases. Working together to exceed shopper expectations provides a triple win for everyone: shoppers, retailers and manufacturers.
What are some shopper
insights lifecycle best practices you can share?
Quinn: One of the
most important things insights professionals need to keep in mind when planning
research is to ‘begin with the end in mind.’ Specifically, what is the
business problem you are trying to solve and how will the insights drive action
once the results are in?  
At Capre Group, we work with clients to delve into root cause analysis and create hypothesis-driven assertions to guide research design as well as post-research analysis and application. This approach helps to ensure one is thinking through the full insights lifecycle before the Knowledge Brief is even drafted.

Want more on this
topic? Attend OmniShopper International this November in London, England. Learn
more here: http://bit.ly/2aSfoLS 

Get Unprecedented Access to Stephen Dubner at TMRE

TMRE Brings You the Best in Insights, Featuring Exclusive
Access to Best-Selling Author, Stephen Dubner
The world-renowned author of Freakonomics and SuperFreakonomics will reveal how you can leverage the power of incentives to uncover human behavior.

Dubner is taking to the keynote stage at TMRE: The Market Research Event this fall to show you, no matter what industry you’re a part of, how to inspire change both in your organization and in your customers’ minds.

The first 10 people to register with code DUBNER receive
complimentary access to the VIP lunch with Stephen.
Book Your Spot Today: http://bit.ly/2bdhOoM
Stephen Dubner is just one of the amazing insights leaders
at TMRE. Access more than 120 sessions from cross-industry insights leaders,
including:
• Consumer Goods Perspective: PepsiCo’s Director of Insights & Analytics talks about the evolution of their state-of-the-art Gatorade Mission Control Center and its new approach to digital insights.
• Pharmaceutical Perspective: Merck’s Executive Director of Global Customer and Brand Insights details the art of transforming insights into stories and strategies that drive results.
• Financial Services Perspective: MasterCard’s Data Visualization expert details why Data Visualization is the key to empowering powerful business decisions.
• Retail Perspective: Walmart’s Senior Director of Customer Insights & Analytics reveals how the combination of Big Data and traditional research methods can lead to stronger insights.
• Media & Entertainment Perspective: SVPs from AMC and Cablevision come together to share best practices on how to join forces on data-driven business insights.
• Travel Perspective: Marriott’s Senior Director of Consumer Insights reveals low-cost research for high-impact results.
And so much more!
Download the TMRE brochure for the full agenda and
session details: http://bit.ly/2bdhOoM
Use exclusive LinkedIn discount code TMRE16LI for $100 off:
http://bit.ly/2bdhOoM
See you in Boca Raton this fall!
Cheers,
The TMRE Team
@TMRE
#TMREvent

Themarketresearcheventblog.iirusa.com