Advertisers are creeping people out -- at least according to a recent report by the Privacy Commissioner of Canada, which found that 73 per cent of Canadians feel they have less protection over their personal information than they did 10 years ago.
A full third of Canadians said they were "extremely" concerned about privacy -- that's up from 25 per cent in 2012. Of those respondents, 49 per cent said that targeted ads were at least partly to blame.
This research has implications for nearly everyone involved: media platforms, agencies, clients, researchers, and the public. As Big Data and the Internet of Things continue to revolutionize how we live, ads will become increasingly personal in nature.
Data pulled from your credit card and social media, coupled with an increase in programmatic media buying, means you're more likely to see ads related to your behaviour.
This is a tremendous opportunity -- but also one rife with ethical landmines.
Even de-identified data has the potential to harm the targeted individual, as Ken Wheaton points out in Ad Age -- ads for personal lubricant appearing on a shared work computer, for instance.
Perhaps it's time to establish a Big Data code of research ethics.
At present, there is no one governing body that oversees data usage by marketers and media platforms. There are codes of ethics put out by the Canadian and American Marketing Associations, as well as individual ethical codes drafted by marketing research associations among others, but who is accountable to them?
The data from online research subjects belongs to everyone and no one. Researchers and their subjects are far removed from each other.
So what might a Big Data research ethics body look like?
1. Independent and Objective Governing Body
An objective group of individuals educated in research and ethics, with practical knowledge of the industry, would monitor industry trends and issues and react accordingly. They would update policies to keep pace with evolving technology and the concerns of citizens.
2. Participation of all Digital Media Platforms
Digital media platforms would be accountable to this body, and be expected to act according to its principles. This takes the burden off individual marketers and agencies, and places it on those responsible for displaying the ads.
Centralizing responsibility with organizations that already oversee the ad-buying process means that ad-vetting requirements can easily be extended beyond violence, sex and language to include ethical factors.
Holding these companies accountable to an independent body also ensures that the data they own can't be used against their own subjects without consent.
As we saw with the Facebook emotional manipulation experiment, not all big data is used to sell products or services on behalf of others. Facebook simply wanted to see what impact algorithm changes would have on its main product -- the News Feed. That didn't make the experiment any more ethical.
3. Meaningful Enforcement
Digital media platforms have a vested interest in selling as many ads as possible, which puts them in a position to turn a blind eye to unethical uses of their data in marketing and advertising.
Oversight, auditing, enforcement of guidelines, and repercussions for violations are essential to policy effectiveness. Given the size and financial resources of companies like Facebook and Google, punishments for violations should be swift and firm.
4. Special Attention to Vulnerable Populations
Some members of society are especially vulnerable to harm from the misuse of their data.
Ads can inadvertently reveal an individual's health condition to coworkers, or out LGBTQ individuals to their friends and family. An ethics board should give special consideration to the content of ads targeting vulnerable groups of people.
5. Standardized Data Storage Requirements
Each company has its own safeguards in place to protect its information. Data is treated as a valuable commodity, but breaches do happen.
Standardizing data storage requirements keeps everyone up to date and reduces the chances of a major data leak.
This is by no means a complete examination of big data and ethics, and it does not address issues around data collection.
However, personal data has been around since the dawn of modern civilization -- it's only recently that we've been able to do anything with it. Therein lies our responsibility to treat it respectfully.
Marketing is evolving, and so should its oversight. We all have a duty to ensure the responsible use of data -- if used well, its benefits can greatly outweigh its risks.