by Lisa Wilding-Brown, Chief Research Officer, Innovate MR
Some of these terms are borrowed from science; others introduce a new methodology and signal an evolved insight into the questions we seek to answer. And then there are words like “privacy,” whose use, complicated by an inexplicit, culturally amorphous, and rapidly changing definition, can create distance between ourselves and our most valuable asset: the research participant. Just take a look at the average privacy policy and you will quickly discover what I’m getting at.
Last year, The New York Times published a noteworthy article on the oblique, incomprehensible nature of 150 prevalent privacy policies, characterizing them as a “disaster”: the vast majority were deemed unreadable, requiring a college reading level or beyond. The article also noted that many policies serve to protect companies, not consumers; they are written by lawyers, for lawyers. Google’s privacy policy, for example, has grown greatly in complexity and length, taking just 2 minutes to read in 1999 and peaking at 30 minutes in 2018. It is safe to say that our industry is no exception to this troubling status quo: our appetite for buzzy words (and their imposing methodologies) has made its way into our privacy policies, leaving consumers and researchers equally mystified.
Source: The New York Times examined 150 widely known privacy policies and evaluated them using the Lexile test developed by the education company MetaMetrics. The test measures a text’s complexity based on factors like sentence length and the difficulty of vocabulary. According to the most recent literacy survey conducted by the National Center for Education Statistics, over half of Americans may struggle to understand dense, lengthy texts. This means a significant chunk of the data collection economy is based on consenting to complicated documents that many Americans simply can’t comprehend.
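To make the idea concrete: the Lexile measure itself is proprietary, but the public Flesch-Kincaid grade-level formula captures the same intuition, namely that longer sentences and longer words push a text beyond the average reader. The sketch below is illustrative only (it uses a naive vowel-group syllable counter) and is not the methodology the Times used.

```python
import re

def count_syllables(word):
    # Naive heuristic: count groups of consecutive vowels as syllables.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59
```

A short sentence of one-syllable words scores near grade zero, while the long, polysyllabic constructions typical of privacy policies can easily score at “college reading level or beyond.”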
The Data Collection Economy
As more governments and regulatory agencies take swift action against companies that fail to protect consumer privacy by enacting sometimes confusing or complex laws, the market research industry is scrambling to meet new obligations around data privacy and transparency. To understand the breadth and depth of new data regulations and their impact on our industry, we must first understand the wider ecosystem we operate in. Data-hungry industries, including the tech giants, are making moves. Google recently announced its intention to phase out support for third-party cookies in Chrome, with the goal of enhancing consumer privacy via a series of API tools in what Google has baptized the “Privacy Sandbox.” In sharp contrast, Facebook announced a new first-party cookie that will help publishers and advertisers measure and optimize Facebook ads, allowing publishers to analyze browser data otherwise blocked by third-party cookie restrictions.
It’s both interesting and alarming to see two tech goliaths take such different positions on privacy; it’s no wonder that many research companies see privacy as a burden or a hurdle to “overcome.” However, simply preparing to meet tech and legislative requirements as they are introduced is a short-sighted strategy that misses the opportunity to develop methodologies and operating procedures aligned with the ethics and values consumers demand. As we embark on a new decade, a stance of radical transparency is the only position that meets the demands of policy, gives consumers a fair opportunity to understand and protect their data privacy, and invites the public at large to truly embrace market research.
As a veteran member of WIRexec, I had the privilege of crossing paths with Kerry Edelstein a number of years ago. Not only did we grow up in the same hometown and start our research careers at the same company, but we soon discovered a shared passion for several issues. A kindred connection was born, and we have been threatening to collaborate ever since. This past summer, we decided the time had come; our shared commitment to data ethics and privacy was undeniable, and the industry needed to hear our voices. Soon after, InnovateMR teamed up with Kerry’s firm, Research Narrative, to conduct a groundbreaking study to better understand the participant perspective on data privacy. We had a lot of questions: What do consumers know about data privacy? What do they care about? How can researchers be trustworthy stewards of data?
Our Methodology
Our team collected 2,000 interviews with American consumers ages 18-79 via a 22-minute online survey that was balanced for age, ethnicity, and gender representation relative to the US Census. Participants were recruited from InnovateMR’s double-opt-in panel, PointClub, and the sample was balanced for considerations such as panel tenure (length of time in the panel), panel engagement (past participation frequency), and recruitment source (how panelists were sourced to join the InnovateMR panel). Survey responses were screened for several fraud indicators using Innovate’s next-generation fraud mitigation technology.
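The balancing step above can be sketched in a few lines: compare each demographic group’s share of the sample against a census-style target and flag groups outside tolerance. The targets and tolerance here are illustrative placeholders, not the study’s actual quotas.

```python
from collections import Counter

def balance_report(sample, targets, tolerance=0.02):
    """Return {group: (sample_share, target_share, within_tolerance)}.

    sample  -- list of group labels, one per completed interview
    targets -- dict mapping group label to its target proportion
    """
    counts = Counter(sample)
    total = len(sample)
    report = {}
    for group, target in targets.items():
        share = counts.get(group, 0) / total
        report[group] = (round(share, 3), target, abs(share - target) <= tolerance)
    return report

# Toy example: gender balance in 100 hypothetical completes
sample = ["female"] * 51 + ["male"] * 49
targets = {"female": 0.51, "male": 0.49}
print(balance_report(sample, targets))
```

In practice a panel company would run checks like this per quota cell (age, ethnicity, gender) during fieldwork and throttle sampling to the under- or over-represented cells.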
What Did We Uncover? Transparency is Needed!
The vast majority of participants indicated a desire for companies to be more transparent about how they use personal data. A 30-page privacy policy is simply not adequate! To add more complexity to the conversation: in 2019, IBM Security and the Ponemon Institute released the 2019 Cost of a Data Breach Report, based on in-depth interviews with more than 500 companies around the world that had experienced a data breach between July 2018 and April 2019. The analysis considered hundreds of cost factors, from legal, regulatory, and technical activities to loss of brand equity, customer turnover, and the drain on employee productivity. Based on the study’s findings, the average data breach costs a company 8.2 million dollars in the United States and takes an average of 245 days to fully identify and remedy. Further, notorious data breaches such as the Cambridge Analytica scandal have been widely publicized by the media. This combination of data security shortcomings, privacy violations, and dense, opaque privacy policies has created a culture of distrust when it comes to data privacy.
Trust Is Earned, Not Given
According to our research, 88% of Americans wish companies would be more transparent with how they use personal data. Furthermore, 70% of respondents indicated that they felt as though companies today do not want consumers to understand how their data is used. This statistic alone paints a gloomy outlook for researchers and reinforces our greatest fear: if consumers don’t trust companies with their data, do our surveys appeal to a sufficient cross-section of the populations we seek to represent?
The Perception of Industry
Our study revealed that the Social Media industry has a lot of trust to build: 60% of respondents consider Social Media “very” or “somewhat” unethical in its use of personal data, with Advertising taking silver at 54%. Market Research fared better at 29%; however, we are not the star athletes here: Academia (18%) and Healthcare (13%) achieved the lowest ratings on the unethical scale. While our industry should feel some relief that we don’t share the same status as Social Media and Advertising, 29% indicates significant room for improvement and a desire from consumers for us to behave more transparently. Interestingly, the GRBN, a not-for-profit MR industry association, presented a similar narrative in its 2018 Trust Survey, where respondents indicated a 27% trust rating among MR firms. It has been nearly three years since the GRBN collected this data, and sadly, our study suggests we have failed to move the needle.
Private Conversations Are Not So Private
Our data uncovered that consumers find themselves at a troubling intersection where a lack of security, privacy, and transparency has created a very real and material lack of trust in emerging forms of data collection. When presented with several scenarios, consumers indicated heightened fear and concern that passive data collection activities fail to meet basic ethical standards: 73% of consumers indicated concern about companies recording and storing private conversations and messages.
Respect My Data Rights, or I’ll Support Regulation!
While we did see a lack of specific awareness around regulations such as GDPR and CCPA, the result of this distrust is that an overwhelming majority of consumers now seek regulation at both the state and federal level. What is even more fascinating is that our findings were consistent across political affiliation, making this a truly bipartisan demand in an otherwise polarized political environment. Although we can’t agree on many facets of public policy, consumers are squarely aligned on data regulation regardless of where they fall on the political spectrum.
More than three-quarters of Democrat, Independent, and Republican participants expressed support for the following regulations:
Collected Without My Permission
While lengthy privacy policies (in minuscule font) might technically capture consent, our study revealed that consumers don’t recall giving permission for many types of data collection. Over half of the participants surveyed believe their web browsing, search history, and site visitation (54%), along with their purchase behaviors (49%), are being tracked without their permission. Nearly 50% of Americans see companies following them with targeted ads they don’t recall consenting to.
Percentages reflect the % of respondents who believe this information is collected from them, without permission.
While new technologies and emerging methodologies can help us get closer to the truth, we must not lose sight of the impact these approaches have on consumer privacy and the perceptions we curate among research participants. Behavioral-tracking technologies (when bridged with self-reported data) can provide researchers with an exceptionally comprehensive view of the consumer; however, if the ethical and legal considerations are disregarded, we run the risk of compromising the very thing we seek to instill: trust.
Take Action!
While we may feel overwhelmed, transparency can and should be regarded as an opportunity, not a burden. There are material changes MR companies can employ to win over consumers:
Interested in learning more about our Data Privacy study? Do you want to get involved in future dialogue around data privacy and consumer trust? We’d love to hear from you! Request an Executive Summary of our study and contact us at Lisa@innovatemr.com or Kerry@researchnarrative.com