BLOG - How to Regulate Privacy

Updated: Apr 30





Information warfare techniques have long been used to mislead, deceive, and manipulate public opinion. Data analytics can be used to identify trends and biases that can then be exploited by highly targeted campaigns. What’s new is that we now have the ability to personalize attacks to ever-smaller groups.


The best example of personalization in cyberattacks may well be what’s called “phishing.” Unsophisticated scammers send mass emails trying to trick their victims into downloading malware or wiring money to a Nigerian prince.


In recent years, we’ve seen “spear-phishing,” in which a highly personalized version is sent to an individual. This could be an email from what looks like an old friend with a link to a birthday e-card on your actual birthday, or it could be a note to a corporate CFO from the CEO asking for a wire transfer related to a deal. In each case, the attacker knows and exploits personal details about the victim.


Which brings us to data breaches and the “weaponizing” of data. Consumers are increasingly concerned about the misuse of their data following leaks and breaches. If bad actors know your Social Security number, they could try to fraudulently open credit in your name.


If they know from your smart thermostat or social media posts that you aren’t home, they might break into your house. If they know your charitable contribution history, they might try to scam you with fake charity donation requests. Your data can be personalized and weaponized against you.


More nuanced versions of this also target you. Analysis of the news stories you read can be used to filter the news you see in the future, creating a confirmation bias loop. Knowledge of your browsing and shopping searches can be used to talk you into buying a different product from what you seek. Reportedly, Facebook data was misused by a third party to influence voters during an election in the US and the Brexit vote in the UK.


Whose Data Is It?


In all of this there is a fundamental question at stake. Whose data is it?


Many data privacy rules cover content but are less clear about so-called “metadata” describing what you did. When you make a phone call, the phone company creates a “Call Detail Record” that captures who you called and how long you spoke. The owner of the records describing your calls isn’t necessarily you.
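To make the content-versus-metadata distinction concrete, here is a hypothetical Call Detail Record sketched in Python. The field names are illustrative, not any carrier’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

# A hypothetical Call Detail Record: it contains no call content,
# only metadata describing the call itself.
@dataclass
class CallDetailRecord:
    caller: str            # originating phone number
    callee: str            # dialed phone number
    started_at: datetime   # when the call began
    duration_seconds: int  # how long the parties spoke

cdr = CallDetailRecord("+1-555-0100", "+1-555-0199",
                       datetime(2018, 4, 30, 9, 15), 312)
# Metadata alone reveals who spoke to whom, when, and for how long.
print(cdr.duration_seconds)  # 312
```

Note that nothing in this record is the conversation itself, yet the record still describes your behavior in detail.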


When you connect to a website, the site owner may track your IP address, login ID, and more, and may record the content you looked at. Analyzing this allows sites to recommend products, but the records of your behavior don’t belong to you. Your Internet Service Provider might record every site you visit and search you perform, the name of every domain you look up, and the applications or services you use.


Internet providers, apps, and services may disclose what they collect and how they use it in their privacy policies, suggesting it isn’t theirs to use without your consent. Yet you often can’t opt out, suggesting you don’t own and control it either. We are left in a grey zone: data owners clearly should decide how data is shared and used, but ownership is unclear.


Now look at firms like Facebook and Equifax. Both collect large amounts of data about an individual’s consumer behavior. Both make use of it in ways the consumer may not have fully understood or given informed consent for. Both had data leakage incidents. Did they lose control of your data, or of their data about you?


Until recently, many consumers did not consider this question. The ambiguity allows companies like Facebook and Google to make money from your behavior by selling ads targeted specifically at you. The ability to personalize and weaponize data for unintended purposes is now leading consumers to assert that data about their behavior belongs to them: use must be approved by consumers with informed consent, not buried in the fine print of privacy and terms-of-service policies.


Marketing vs Manipulation


Consumers have long been subject to those trying to influence their decisions. It’s called marketing. Over time, marketing has become better at analyzing demographic and other data to fine-tune messages and target delivery. We still recognize when someone is trying to “sell us” and apply whatever filters we want.


Practices like subliminal advertising are considered abusive by some, mostly because they shift from overt selling to attempted subconscious manipulation. When data about our individual behaviors is used to tailor the message to an individual, the manipulation becomes personal. When used for purposes like manipulating votes, it becomes intolerable.


Regulation has limited the commercial impact of deceptive marketing practices by requiring things like truth in advertising. Other regulations specifically address personalization. For example, FINRA, which regulates stock brokers, has Rule 2210, which differentiates between marketing material and investor correspondence based on the number of recipients. Send something to more than 25 retail investors and you are marketing; 25 or fewer, and you are engaged in (personalized) correspondence, governed by different rules.
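The recipient-count test can be sketched in a few lines of Python. This is only an illustration of the threshold described above, not a complete reading of Rule 2210 (which also involves a 30-day window and other conditions):

```python
def classify_communication(num_retail_investors: int) -> str:
    """Illustrative sketch of FINRA Rule 2210's recipient-count test:
    a communication to more than 25 retail investors is treated as
    'retail communication' (marketing); 25 or fewer is 'correspondence',
    which follows different rules."""
    if num_retail_investors > 25:
        return "retail communication"
    return "correspondence"

print(classify_communication(26))  # retail communication
print(classify_communication(25))  # correspondence
```

The interesting design point is that the rule itself recognizes personalization: the smaller and more targeted the audience, the more the communication is regulated as a personal interaction rather than mass marketing.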


How to Regulate Privacy


Thoughtful regulation of data privacy needs to address:

  1. Who owns the data,

  2. Informed consent, and

  3. Permissioned use.

Consumers are now demanding that they own not just data they explicitly produce, but the data that describes their individual actions and behaviors. The phone numbers I dial, the websites I visit, the apps I run, and the news I consume are fine to use if aggregated with others. If personalized to just me, we shift from potential marketing to manipulation and additional controls are needed. Much of the legislation recently proposed fails to clarify this distinction.
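The aggregated-versus-personalized distinction above can be illustrated with a minimal sketch: release behavioral statistics only when a group is large enough that no individual stands out. The minimum group size of 5 here is an arbitrary illustrative choice, not a legal standard:

```python
# Illustrative sketch: release behavioral data only in aggregate.
# A minimum group size suppresses any category small enough to
# point at an individual.
MIN_GROUP_SIZE = 5

def aggregate_site_visits(visits, min_group=MIN_GROUP_SIZE):
    """visits: list of (user_id, site) pairs. Returns per-site counts
    of distinct visitors, omitting sites visited by fewer than
    min_group distinct users."""
    visitors = {}
    for user, site in visits:
        visitors.setdefault(site, set()).add(user)
    return {site: len(users) for site, users in visitors.items()
            if len(users) >= min_group}

visits = [(u, "news.example") for u in range(8)] + [(0, "rare.example")]
print(aggregate_site_visits(visits))  # {'news.example': 8}
```

The site visited by a single user is suppressed: the aggregate tells an analyst something about the population, while the suppressed entry is exactly the kind of record that could be personalized against one person.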


Consent needs to be clear, opt-in, and revocable. This principle appears in some of the proposed legislation and is defined in new European regulations like the GDPR (General Data Protection Regulation). However, without clarity around ownership of personalized data, confusion will likely continue as new data types emerge.


Critically, consent must be based on permission to use data vs permission to know data. Far too much focus has been on data secrecy and not enough on how consumer data is used (see my April 9 blog post).


If data is shared, consumers need to authorize not just the sharing but the purpose for which sharing is permitted and the duration for which the permission is valid. Equating privacy with secrecy ignores the fact that once data is shared (properly or not), all control is lost. It focuses only on data holders instead of also on those using the data for manipulative or fraudulent purposes.
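Here is a minimal sketch of what purpose- and duration-limited consent might look like as a data structure. The class and field names are hypothetical, not any real system’s API:

```python
from datetime import datetime, timedelta

class ConsentGrant:
    """Hypothetical consent record: the consumer grants use of one data
    type for one specific purpose and a limited duration; any other
    use, or any use after expiry, is denied."""
    def __init__(self, data_type, purpose, granted_at, valid_for_days):
        self.data_type = data_type
        self.purpose = purpose
        self.expires_at = granted_at + timedelta(days=valid_for_days)

    def permits(self, data_type, purpose, when):
        # Deny unless data type, purpose, and time window all match.
        return (data_type == self.data_type
                and purpose == self.purpose
                and when <= self.expires_at)

grant = ConsentGrant("browsing_history", "product_recommendations",
                     datetime(2018, 5, 1), valid_for_days=90)
print(grant.permits("browsing_history", "product_recommendations",
                    datetime(2018, 6, 1)))  # True
print(grant.permits("browsing_history", "ad_targeting",
                    datetime(2018, 6, 1)))  # False: different purpose
```

The point of the structure is that consent is not a single yes/no bit: the same data can be permitted for one use and denied for another, and the permission expires rather than lasting forever.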


The recently introduced CONSENT Act is one of the few proposals that begins to address data use rather than just secrecy. This is an area worthy of much more discussion.


If new regulations contain clear principles that define who owns and controls behavioral data, provide for clearly informed consent with restrictions imposed by the owner, and focus that consent on authorized use rather than secrecy, we will solve the problem of data being weaponized. There will still be those who mine it with bad intent, but they will have little ability to use it against most consumers.


Lou Steinberg is a Managing Partner at The Authoriti Network, https://authoriti.net.



Copyright © 2018-2020, The Authoriti Network, Inc. All rights reserved. 
The Authoriti logo and the Permission Code® platform are registered trademarks of The Authoriti Network.