Tag Archives: Privacy

Next Generation Facial Recognition Software Knows How You Feel

By: Anthony Germinario

Affectiva, a pioneer in emotional recognition software, seems
to be everywhere lately, from discussions in my office about new MR
techniques to a recent article in Wired.
I first heard of their Affdex technology at an ASC conference in London, and was thoroughly
impressed by the ability to capture emotions while respondents view videos
or ads. Facial recognition has been around for a while now (remember when
Facebook started guessing who was in your photos?) but decoding emotions on
those faces is a whole new frontier.

Affdex was developed with altruistic intentions at
the MIT Media Lab: to help autistic people read emotions during daily
interactions. A machine that reads emotions, however, inevitably
caught the attention of many more interested parties. While I can get
excited about using their technology to measure respondent reactions in Market
Research studies, I am even more interested to see where else this will be applied
in my everyday life as a consumer. Which of my devices will read
my emotions, and what will they give me in return?
Affectiva recently offered a 45-day free trial to developers who want to experiment with their API, which got me thinking: what
are some apps or devices I would want to read my face and emotions? I’m not a
developer (just a dreamer), so here is my short list:
1) Apple TV / Roku: Could the device please pause my show when I inevitably doze off while catching up on Sunday evenings?

2) eCommerce sites (Amazon, Gilt, etc.): While I shop, can you tell which items I react positively to, and tailor my experience like a virtual personal shopper?

3) Dating sites: Maybe Tinder could tell exactly how you feel about a potential match, so you don’t have to keep swiping left and right? Perhaps you would find different matches based on your initial emotional response, which you may not even be aware of.

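None of these features exist (to my knowledge), but the Apple TV / Roku wish boils down to a simple rule. Here is a minimal sketch; the once-per-second eyes-open signal and the ten-second threshold are entirely made up for illustration, and a real version would get that signal from a camera SDK such as Affdex, whose actual API is not shown here:

```python
def doze_pause_index(eye_states, doze_seconds=10):
    """Given a once-per-second stream of booleans (True = eyes open),
    return the index at which playback should pause, or None.
    A quick blink resets the counter; only sustained closure counts."""
    closed = 0
    for i, eyes_open in enumerate(eye_states):
        closed = 0 if eyes_open else closed + 1
        if closed >= doze_seconds:
            return i  # tell the player to pause here
    return None

# Five seconds of attentive viewing, then dozing off:
print(doze_pause_index([True] * 5 + [False] * 60))  # 14
```

The blink-reset is the whole trick: you want the show to stop for Sunday-evening dozing, not for every glance at your phone.
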
All dreaming aside, one real concern about any new kind of
data capture, especially involving video, is privacy. Consumers are willing to
trade a good amount of personal privacy for novelty and convenience, but it
certainly is something that must be addressed. Rana el Kaliouby (Chief Science Officer at Affectiva)
assures us that Affdex, while it has amassed a database of millions of faces,
retains no personally identifiable information. So even if we know my face is
in there, she asserts that nobody would be able to pull it out of the system.

I, for one, will take her word for it, and am excited to
see this approach applied in consumer technology. What about the rest of you? Any other
ideas for places you do (or maybe don’t!) want to have your face and emotions read?

About the Author: Anthony Germinario is Director
of Technical Product Management at BuzzBack,
where he is focused on developing and integrating unique respondent and
reporting experiences for online research. He has earned his PMP certification
and holds a B.S.B.A from Boston University’s Questrom School of Business. You
can keep up with him on Twitter @AGermBB and
on LinkedIn,
as well as on BuzzBack’s blog.

Related articles

Big Data: The Good, The Bad, and The Ugly

Big Data is important and can be incredibly useful for finding insights, but be wary of those using big data the wrong way. The use of big data in loyalty programs can be incredibly helpful to businesses such as casinos. On the flip side, big data can be used to harm someone’s image. Adam Tanner discusses this in his book What Stays in Vegas.

Old Vegas vs. New Vegas:

Casinos have always been customer service oriented but there is a new age upon us. Vegas leads the way in marriages, security cameras, and loyalty programs. The marriages are obvious, the cameras make sense for security, and the loyalty programs are changing the way customer service is oriented.

The new loyalty programs are featured in most casinos and give each guest a loyalty card. Customers use the card at restaurants, slot machines, table games, and shows. The card tracks the information the customer provides and enables floor managers to personally welcome and entertain each guest. A manager can then walk up to Mr. Smith with his smartphone and know an incredible amount of data, from personal details to the amount he spends per visit to his favorite drink.

The manager can see in real time how much Mr. Smith is currently losing, greet him, ask how his family is doing back in his hometown, and then offer him free buffet tickets to compensate for his losses. This makes Mr. Smith feel better and come back more often. This is an example of how big data is being used positively in Las Vegas.

The Dark Side:

The dark side of big data is not so prevalent, but it does exist and is something to consider. Busted Mugshots is a site that features mugshots of ordinary people and forces them to pay to have the photos removed. When googling a person’s name, their mugshot would often come up first. Making people pay to have their mugshots removed raises the question: is this legal, or is this blackmail?

Technically, it is legal, because the mugshots are public record and the site’s argument is that they are doing a public service by bringing to light who has been arrested. It is highly unethical and dishonest, but it is legal.

My Ex-Girlfriend:

Another example of the bad side of big data is the ex-girlfriend site. These sites feature incriminating pictures of ex-girlfriends or ex-boyfriends and force the victims to pay for the pictures to be removed. This, again, is not illegal, but it is unethical. These sites use big data to manipulate others and profit from damaging photos of victims, and they are nightmares for the people featured.

Cheer Up, Big Data is Good:

Big data isn’t all bad; in fact, its uses are overwhelmingly positive. Data sharing can be incredibly helpful in several industries. Airlines were the first to adopt it, and casinos and grocery stores have joined in. Overall, the good uses will continue to be successful and the bad uses will ultimately fail.

About the Author:
Ryan Polachi is a contributing
writer concentrating his focus on Marketing, Finance and Innovation. He can be
reached at rpolachi@IIRUSA.com.

Big Privacy: It’s Coming

By Marc Dresner, IIR

My blog last week focused on data brokers and the looming threat of a Big Privacy backlash
in response to Big Data collection run amok.

I want to
stick with Big Privacy this week, because I believe strongly that the
consequences of inaction for those in the consumer insights field could be more
serious than most of us realize.

For starters,
high-profile gaffes by Facebook, Apple (I’m referring to “Locationgate” not the naked photo scandal) and the like have done much to educate
the public on the data-for-service arrangements those of us who didn’t read the Privacy Policy unknowingly entered
into with such companies.

I think most people have since resigned themselves to this trade-off. 

Maybe that’s
because many of us did a rough cost-benefit analysis and, if not ideal, found the
model acceptable, harmless, reasonable…

But I suspect that more likely than not, the relative absence of any evidence that suggests widespread public outrage has to do with the fact that people don’t think they have any choice in the
matter.

A friend I recently mentioned
this to dismissed the idea, noting that Facebook isn’t forcing anyone
to use its network.

That’s true. And it’s pretty much irrelevant to a realistic discussion about privacy, because what matters here is the perception of transparency and ethical conduct.

No one is being forced to Google anything, either. But that didn’t
prevent the European Union Court of Justice from ruling in May that Google must amend search results upon request, a precedent-setting decision that asserts
the rights of the individual to control his/her personal data.

Indeed, it’s this notion of control (and informed consent) that we need to start considering when we talk about privacy.
People are just
now starting to wake up to the fact that information about them is being
collected by unknown others and used for purposes that they aren’t aware of and
might not consent to if they were.

Most of the general
public, I think, knows that privately held data about them (credit reports, purchase
histories, loyalty data) exist and are shared between companies, but
I’d wager few people understand the extent of this sharing or what policies or
rules govern such activity.

Josh Klein, author of “Reputation Economics: Why Who You Know Is Worth More Than What You Have,” points out that most people would probably be surprised to learn that Acxiom and LexisNexis have been aggregating purchase histories to develop health profiles, which they sell to hospitals, which then
use the information to advertise targeted medical services.


“Tell people this sort of
thing and it’s no leap for them to imagine that information going to their
insurance adjustors,” Klein said in a presentation he delivered at TMRE’s sister event, Shopper Insights in Action, this past July.

People would probably be even more shocked to know what can be amassed about them in the public domain: tax
records, voting records, ethnicity, religion, who your neighbors are, whether you’re
married, whether you take care of your parents, whether you have children, etc.

This information isn’t
just available to Big Brother; it’s available to, well, me if I want it.

Klein pointed out that Spokeo combs publicly available sources, aggregates the data and basically provides
a dossier on individuals to subscribers for about $3 per month.

Now, you can opt out of a Spokeo listing, but you cannot close the spigot of publicly available data about you. That alarms some people. 

Surveillance
is a loaded word, but that’s what is happening when we go online, isn’t it? And
on such a massive scale that Orwellian is almost an understatement.

Klein notes that Google only needs 22
points of data to figure out who you are wherever you log on. (Whether you hit the logo to go back to the home page or hit the
home button is one such data point.)
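Klein’s figure of 22 data points squares with simple information arithmetic: uniquely singling out one person among roughly eight billion requires about 33 bits, so a couple dozen weak signals suffice. A back-of-the-envelope sketch (the 1.5-bits-per-signal average is an invented figure, purely to show the shape of the arithmetic):

```python
import math

WORLD_POPULATION = 8_000_000_000  # rough current figure

# Bits needed to single one person out of the whole population
bits_to_identify = math.log2(WORLD_POPULATION)  # ~32.9 bits

# Suppose each observed signal (logo click vs. home button,
# time zone, installed fonts...) carries ~1.5 bits on average.
BITS_PER_SIGNAL = 1.5  # illustrative assumption, not a measured value

signals_needed = math.ceil(bits_to_identify / BITS_PER_SIGNAL)
print(signals_needed)  # 22
```

No single signal identifies you, but the bits add up fast; that is the whole logic of browser and behavior fingerprinting.
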

And then there’s mobile: where you go, what you do on your phone; it’s all being collected. 


People may have signed on, but they are not on board.

So, again, why haven’t we
seen a bigger backlash?
Maybe it’s a matter of ignorance or denial. Maybe people think it’s futile. Maybe we’re just lazy.
Whatever the
case, it is a curious thing and I’m not the only one who believes the situation is unsustainable.

Coming Next: Data Custodianship, Privacy By Design and a Huge Opportunity for Consumer Researchers.

ABOUT THE AUTHOR 
Marc Dresner is IIR USA’s sr. editor and special communication project lead. He is the former executive editor of Research Business Report, a confidential newsletter for the marketing research and consumer insights industry. He may be reached at mdresner@iirusa.com. Follow him @mdrezz.

Data Brokers: Shadow Industry, Privacy Flashpoint, Research Problem

By Marc Dresner, IIR

I attend a lot of research conferences and I’ve
noticed that when the subject of privacy comes up, people frequently check out: laptops
open, fingers wander to phones, sometimes eyes even roll…

I attribute this to the fact that
heretofore privacy has been pretty much a non-issue for researchers. Arguably
no other industry adheres to more rigid privacy standards.
The problem, however, is that we live in a
world where data are no longer rare, and researchers obviously aren’t the only
ones who trade in information nowadays.

Experts and authorities from a variety of
fields and sectors, from public policy to data security, warn of a mounting “Big
Privacy” backlash in response to passive Big Data collection.
And if you’ve not been paying attention to the
public discourse on privacy of late, perhaps it’s time to start, because the outcome of
this debate could affect consumer research.

Flashpoint: Data Brokers
Where there’s a Big Data and privacy
concern, you’re increasingly likely to hear “data brokers” mentioned.
I’ve been surprised by the number of
researchers who’ve told me they haven’t heard of this $250 billion industry,
especially since so many of our companies (or our clients, if you’re a research
firm) do business with them.
Data brokers collect information about
consumers from public records and private sources and sell it to marketers.

The chief complaint against
them is that they often do so without people’s knowledge or consent. 

It’s also a relatively unregulated space, which, coupled with the perceived lack of transparency (some call it a “shadow
industry”), makes people uncomfortable.

Now, these aren’t shadowy guys in sunglasses
lurking in alleyways hawking hot dossiers under their trench coats (although
that’s how they’re increasingly depicted).

They include big, well-known names like
Acxiom and Equifax, as well as a lot of smaller companies you may not have
heard of.
Like any industry, there may be responsible
and not-so-responsible actors, but it’s the possibility of the latter that’s captured the attention of regulators and
the media.

What’s especially troubling here is that the collection and use of consumer information by marketers, in general, is being increasingly characterized as somehow sinister or at least potentially dangerous to people.

Consider…

  • CBS’s ’60 Minutes’ aired a breathless segment on data brokers back in March that made admirable use of government surveillance anxiety in the wake of the NSA/Snowden scandal to scare the heck out of the viewing audience.
  • The FTC issued a report May 27 calling on Congress to regulate data brokers (with only a slightly less ominous tone than the ’60 Minutes’ episode).
  • A subsequent commentary about data brokers published online by the Wall Street Journal (that bastion of left-wing conspiracy nuts) went so far as to compare Big Data to the Nazis’ use of IBM punch cards to identify and round up Jews and enemies of the state. (The lead sentence of that article: “Adolph Hitler used Big Data.”)

There’s more where that came from,
but you get the idea.

Data brokers have come to represent a “threat,” whether
real or imagined, to personal privacy that has already catalyzed a backlash.

Notably, concerns about data brokers have begun to figure into international relations.
The EU has pushed for
suspension of the 2000 Safe Harbor agreement with the U.S. over alleged breach
of consumer privacy by data brokers, and earlier this month, the Center for Digital Democracy filed a complaint with the FTC alleging the same.
The debate is even reportedly spilling over
into the Transatlantic Trade and
Investment Partnership (TTIP) negotiations.
As noted, the FTC has already pushed
Congress for action. It’s not unlikely that we’ll see such calls intensify
domestically.
I’ll also point out that many articles on the subject either don’t bother to distinguish between research companies and data brokers
or, as is the case with the ’60 Minutes’ coverage, simply conflate
the two.

This needs to be taken seriously.

It’s time for the research industry to engage, before the court of public opinion renders a verdict that may not serve the common good.

ABOUT THE AUTHOR
Marc Dresner is IIR USA’s sr. editor and special communication project lead. He is the former executive editor of Research Business Report, a confidential newsletter for the marketing research and consumer insights industry. He may be reached at mdresner@iirusa.com. Follow him @mdrezz.

Top 8 Takeaways on Privacy

One of the overarching themes of this year’s Future of
Consumer Intelligence conference is consumer privacy and the concept of
“Empowerment vs. Endangerment” as it relates to the handling and
usage of data. As researchers, we collect, analyze and utilize consumers’
information to improve products, services and the customer experience.
But really, the true question posed in all of this is,
“Where do we draw the line in privacy practices?” Regardless of
whatever privacy policies consumers have agreed to (without actually reading
the pages and pages of fine print), they still expect companies to act
responsibly with their digital imprint. 
So here are the top 8 takeaways from today’s discussions
about consumer privacy:

1. Businesses typically dictate terms of privacy for consumers. However, consumers should have the right to dictate their own terms and conditions of privacy to businesses, because it is their identity

2. We need to move past the legality of consumer privacy and responsibly consider the morality of consumer privacy within our agreements

3. Clarity is essential and needs to be built into data collection and data mining privacy guidelines, not hidden in fine print

4. Consumer trust will increase as better practice guidelines are built into frameworks and agreements

5. Location privacy is a fundamental part of who we are, as our location reveals our tastes, preferences and identities

6. Privacy equals control, and consumers should control their data and have freedom of choice as to how, where and when it is used

7. Privacy by design should be built into our studies and frameworks. Yes, it costs $$. But preventing a breach will save you even more $$

8. Embed privacy by design into initial frameworks, because they are harder to change down the line
 
MrChrisRuby is
an award-winning expert Marketing Research & Consumer Insights Executive
who has consulted with several Fortune 500 companies. He is passionate about
augmenting product development, the customer experience & corporate
revenue. Follow MrChrisRuby on Twitter @MrChrisRuby,
email him at mrchrisruby@gmail.com or
read The Market
Research Insider
 blog.

Data, Data Everywhere: The Need for BIG Privacy in a World of Big Data by Ann Cavoukian #FOCI14

Ann Cavoukian, Ph.D., is the Information and Privacy Commissioner of Ontario, Canada. This morning, she gave a talk entitled “Data, Data Everywhere: The Need for BIG Privacy in a World of Big Data.” Given that I love all things related to privacy, ethics, and standards, this talk was of great interest to me. Here are some of the key points that Ann addressed.
  • Big data and privacy are complementary interests
  • Her take: “privacy by design” is a win-win proposition
  • www.privacybydesign.ca
  • Privacy = personal control, freedom of choice, informational self-determination; context is key
  • In 2010, this landmark resolution was passed to preserve the future of privacy, and it has been translated into 36 languages because people are so desperate for this information
  • The essence of it is to change the emphasis from a win-lose model to a win-win model; replace “vs.” with “and”
  • You must address privacy at the beginning of a program; embed it into the code from the start
  • The 7 principles:
    1. Be proactive, not reactive; preventative, not remedial
    2. The default condition needs to be privacy
    3. Privacy embedded into design
    4. Full functionality: positive-sum, not zero-sum
    5. End-to-end security: full lifecycle protection, from the outset, from collection to destruction at the end
    6. Visibility and transparency: keep it open, tell customers what you’re doing, don’t let them learn afterward
    7. Respect for user privacy: keep it user-centric
  • Big data will rule the world: during the honeymoon phase, everything else must step aside; forget causality, correlation is enough
  • Then the honeymoon phase ends: “found data,” the digital exhaust of web searches, credit card payments and mobiles pinging the nearest phone mast, is cheap to collect, but it is messy and collected for disparate purposes
  • Big data is now in the trough of disillusionment
  • Google Flu Trends used to work and now doesn’t, because Google engineers weren’t interested in context but rather in selecting statistical patterns in the data: correlation over causation, a common assumption in big data analysis that incorrectly imputes causality
  • MIT professor Alex Pentland has proposed a New Deal on Data: individuals own their data and control how it is used and distributed
  • Data problems don’t disappear just because you are working with big data instead of small data; you can’t just forget about things like data sampling
  • Forget big data; what is needed is good data
  • Data analytics on context-free data will only yield correlations; if you add context, then you might be able to impute causality
  • Once businesses have amassed personal information, it can be hard if not impossible for individuals to know how it will be used in the future (“A Long Way to Privacy Safeguards,” New York Times editorial)
  • People now have to resign when data breaches happen; you must address privacy at the beginning
  • Privacy should be treated as a business issue, not a compliance issue; gain a competitive advantage by claiming privacy, and lead with it
  • Proactive costs money, but reactive costs lawsuits, brand damage, loss of trust and loss of consumer confidence
  • Privacy drives innovation and creativity; privacy is a sustainable competitive advantage
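The Google Flu Trends point above, that big data’s correlations are routinely mistaken for causation, is easy to demonstrate. A minimal sketch (all numbers invented for illustration): two unrelated series that merely share an upward trend correlate almost perfectly, and the apparent relationship largely vanishes once the trend is removed.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(100)

# Two unrelated quantities that both happen to trend upward over time
flu_searches = 50 + 2.0 * t + rng.normal(0, 5, size=100)
doctor_visits = 10 + 1.5 * t + rng.normal(0, 5, size=100)

# The shared trend produces a near-perfect correlation...
r = np.corrcoef(flu_searches, doctor_visits)[0, 1]

# ...which mostly vanishes once the trend is differenced away
r_detrended = np.corrcoef(np.diff(flu_searches), np.diff(doctor_visits))[0, 1]

print(f"raw r = {r:.2f}, detrended r = {r_detrended:.2f}")
```

This is Cavoukian’s “context-free data” warning in miniature: without a causal story, a high correlation coefficient tells you almost nothing.
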

Annie Pettit, PhD is the Chief Research Officer at Peanut Labs, a company specializing in self-serve panel sample. Annie is a methodologist focused on data quality, listening research, and survey methods. She won Best Methodological Paper at Esomar 2013, and the 2011 AMA David K. Hardin Award. Annie tweets at @LoveStats and can be reached at annie@peanutlabs.com.

Privacy Engineering: What Researchers Need to Know

McAfee
Chief Privacy Officer Urges Insights Pros to Own Privacy and Big Data





By Marc Dresner, IIR

“The fantastic advances in the field of electronic communication
constitute a greater danger to the privacy of the individual.”
– Earl Warren, 14th Chief Justice of the U.S. Supreme Court, died 1974

“Privacy is dead, and social media hold the smoking gun.”
– Pete Cashmore, Founder and CEO of Mashable, born 1985
The fellas quoted above
were, I believe, referring to opposing sides of the privacy coin: the former
was talking about government surveillance of the Orwellian sort; the latter, writing
in a 2009 blog post, spoke to people’s increasing compulsion to publicize their
personal lives.
To distinguish between the
two assumes there is a line that can be crossed, aka ‘informed consent.’
But privacy advocates, including
top security and legal experts, have argued that informed consent is more or
less a fallacy, because the information needed to make a fully informed choice
is largely inaccessible to the
average person.
That’s “inaccessible” in three broad categories:

1. Inaccessible by design, for legit* purposes (national security or law enforcement) and also for ethically questionable purposes (ex. Facebook’s privacy gaffes and antics).

*Sorry, but I’ll not kick the Edward Snowden beehive in this forum today.

2. Inadvertently inaccessible, but fixable. Ex. privacy policies that can only be deciphered by lawyers or that will only be read by very patient, unusually suspicious people with lots of time on their hands.

3. Inadvertently inaccessible, but unavoidable. The complex tangle of partnerships, affiliations, agreements and policy overlaps, oversights, contradictions, accidents, etc., that comprises our digital universe (it is called the Web, after all) makes it practically impossible for someone to be completely informed of all the ways information about them may be or is being used.
I’ll leave it to the
intelligentsia (not used in the pejorative here) to debate whether or not we’re
doomed to life in a digital panopticon, but the jury appears to be indefinitely
out when it comes to the ownership and control of all of those data points we’re
generating in the digital realm.
This much is clear: The
privacy debate isn’t going anywhere; it’s just getting started.
For the time being, people
seem generally resigned to, and even comfortable with, the fact that information
about them is being collected by unknown others and used in all kinds of ways
for all sorts of purposes that they aren’t aware of and might not consent to if
they were.
But for how long? It seems a
tenuous peace at best.
I’ve attended sessions at
two of FoCI’s sister events within the past six months (Foresight & Trends
and Media Insights & Engagement, respectively) whose speakers warned their
audiences that the sleeping giant is stirring.
All of this
matters to insights jocks more than one might suppose.
Consumer researchers work hard to build and
maintain respondents’ trust, and I think most would agree that there’s no
privacy bugaboo in taking surveys, participating on panels, etc.
But even if transparent,
double opt-in instruments are still the primary source of consumer intelligence (debatable), they’re
certainly not the only source.
We have Big Data now, pulled
from across the digital universe. The sheer breadth of sources without a doubt increases
the likelihood that we’ve violated someone’s privacy.
So as the consumer
intelligence field becomes increasingly dependent upon and intertwined with technology,
we find ourselves in an increasingly precarious position because we cannot be
guaranteed that the data we’re collecting and analyzing was captured with
informed consent.
Moreover, research
professionals cast in the traditional mold aren’t the only ones accessing and
using these data. We’re not necessarily the gatekeepers and we can’t always
know which information from even our own internal databases is being used, how
and by whom.
That is the domain of the
chief privacy officer, or in lieu of a CPO, typically a mishmash of IT and
legal folks.
Michelle Dennedy
Enter Michelle Dennedy, VP and Chief Privacy Officer at McAfee,
and co-author of “The Privacy Engineer’s Manifesto: Getting from Policy to Code
to QA to Value.”
Dennedy is a top authority whose credentials straddle the legal and
technological aspects of data security and privacy.
She and co-authors Jonathon
Fox and Thomas Finneran have developed a new model, “privacy
engineering,” which endeavors to operationalize privacy and embed it in the products and processes companies use, buy, create and
sell. 
“Privacy
engineering is a way to build respect for information about people back into
our infrastructure and to think about data from the consumer perspective,”
Dennedy told The Research Insighter.
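Getting “from policy to code” can be very concrete. As one hedged illustration (my sketch, not an example from the Manifesto, with invented field names and salting scheme): pseudonymize respondent records at collection time, so downstream analytics never touch raw identifiers.

```python
import hashlib
import os

SALT = os.urandom(16)  # kept separate from the analytics store

def pseudonymize(record, pii_fields=("name", "email")):
    """Swap direct identifiers for stable, non-reversible tokens
    before a record ever reaches storage or analytics."""
    cleaned = {}
    for key, value in record.items():
        if key in pii_fields:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            cleaned[key] = digest[:12]  # same input -> same token
        else:
            cleaned[key] = value
    return cleaned

raw = {"name": "Jane Doe", "email": "jane@example.com", "rating": 9}
safe = pseudonymize(raw)
# The behavioral data survives; the identity does not.
```

Because the tokens are stable, analysts can still link a respondent’s records together; because the salt never ships with the data, nobody downstream can walk a token back to a name.
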
This
is particularly important to the Future of Consumer Intelligence audience
because companies are increasingly looking outside the
research function to data scientists to manage Big Data.
The
approach outlined in “The Privacy Engineer’s Manifesto” appears to offer a blueprint consumer researchers can
use to insinuate themselves into the fundamental discussions that shape not only
privacy policy and practice, but the manner and extent to which companies
harness Big Data moving forward.
(Full disclosure: I have not yet read the book, but I’ve researched it thoroughly and rest assured you don’t need to be an IT specialist to understand it.) 
“At best, most companies probably leverage maybe 1-2% of the true import of data
through analytics that count,” noted Dennedy.
“I think a lot of these Big Data analytics are wrong or bad,” she added, “because
they fail to address the true business problem, and by that I mean a human
problem.”
“Researchers tend to
understand the business case and how data should be leveraged,” she observed.
According to Dennedy, it’s
time for researchers to step up and reach out to their counterparts in
functions they may not normally work with, even if it means taking on projects
outside their current purview.
“Consumer and marketing researchers become quintessentially important when they carry
insights across the aisle,” Dennedy said.
“Make sure those customer insights and pain points are part of the equation from the
start.”
In this podcast for The Research Insighter, the official interview series of the
Future of Consumer Intelligence (FoCI) conference, Dennedy discusses:

  • “Privacy engineering”: what it is and why it matters
  • The problem with Big Data
  • Applications and implications for large and small companies alike
  • What researchers can do today to get involved, and more!

Editor’s note: Michelle Dennedy will present “The
Privacy Manifesto” at The Future of Consumer Intelligence Conference taking place May 19-21 in Universal City,
California.
SAVE 15% on your registration to attend The Future of
Consumer Intelligence when you use code FOCI14BLOG. 

Register here today!

For more information, please visit www.futureofconsumerintel.com

  


ABOUT THE AUTHOR / INTERVIEWER 
Marc Dresner is IIR USA’s sr. editor and special communication project lead. He is the former executive editor of Research Business Report, a confidential newsletter for the marketing research and consumer insights industry. He may be reached at mdresner@iirusa.com. Follow him @mdrezz.

HBO Research Director Tackles Cross-Platform Media Consumption


Lack of Single Source Usage Data, Privacy and Device and Platform Proliferation Vex But Don’t Deter


By Marc Dresner, IIR
Jason Platt Zolov’s daily grind as market research director at HBO would probably overwhelm anyone who isn’t comfortable making sense out of uncertainty.
But that’s the nature of media research today: Platt Zolov manages to deliver reliable insights under challenging market conditions using less than perfect source materials.
In a recent interview with the Research Insighter podcast series, Platt Zolov discussed some of the obstacles he faces and how he’s addressing them.
“Multi-platform usage is really one of our biggest issues,” Platt Zolov told the Research Insighter.
“We know individually how many people tuned in on Sunday night and individually how many people may have clicked on HBO GO or HBO on Demand, but to be able to see how they are moving between those different platforms is something that is very difficult to do,” Platt Zolov said.
“There is also the added layer of tracking all the different devices that people are watching on: mobile phones, tablets, new connected TV experiences with Apple TV and Roku,” he added.

“There are a lot of privacy issues around trying to link multi-platform usage to HBO loyalty’why people continue subscribing and paying that bill every month.”
Without a single source for usage data, Platt Zolov and his colleagues rely on conventional survey instruments and other sources to fill in the gaps, but it isn’t perfect.

And it gets even more complicated when you factor in new, non-traditional competitors like Netflix, Hulu and Amazon, which are developing original programming.

In this episode of The Research Insighter podcast series, Platt Zolov discusses:

  • Balancing usage with self-reported data
  • Privacy concerns related to linking multi-platform usage and loyalty
  • Providing a 360-degree view of viewership in the modern media landscape, and more…
Editor’s note: Jason Platt Zolov will be speaking at the Media Insights & Engagement Conference 2014 taking place January 29-31 in Miami, Florida.
For more information or to register, please visit http://bitly.com/LHwwDX