Privacy – in the eye of the beholder

As with most things in our lives, finding friends or a partner, recruiting people, eating habits, hobbies, one's sense of discipline and creativity, it all boils down to choices we make based on our personality. Our personality, in turn, stems from our upbringing and values, genetic heritage, personal beliefs and wisdom, and of course external input and feedback. On top of it all, you are just the person you think you are, making the choices you think are right for you.

“Privacy is the ability of an individual or group to seclude themselves, or information about themselves, and thereby express themselves selectively. The boundaries and content of what is considered private differ among cultures and individuals, but share common themes. When something is private to a person, it usually means that something is inherently special or sensitive to them. The domain of privacy partially overlaps security (confidentiality), which can include the concepts of appropriate use, as well as protection of information. Privacy may also take the form of bodily integrity.” – Wikipedia definition of Privacy

What if a company, small or large, could predict from your ordinary online presence what would convince you of anything: what to buy, how to vote, or whom to meet, and then make suggestions or recommendations on how to act? Would you appreciate it, or would you feel offended?

Research by Professor Michal Kosinski shows that algorithms can predict your personality and future behavior by analyzing simple data, such as Facebook likes, browsing history, or the data from "fun tests" you take online. When companies or governments then use these predictions for their own purposes, targeting people with the message they are most likely to respond to positively (accept, vote yes, buy it, download it, and so on), it is called psychographic targeting.

“His [Kosinski’s] research covers everything from accurately predicting people’s personalities based on their Facebook likes, to using huge data sets to show that people are most likely to be friends with people who have similar personalities to them.” – Interview with Michal Kosinski in The Psychologist

Kosinski states that even friends don't share the most intimate information about themselves, like their sexual orientation, certain diseases, or whether their parents are about to split up. If a computer algorithm can predict these things just from the list of songs you like on Spotify, the likes you made on Facebook, or your browsing history in Chrome, then we are moving into the post-privacy era.
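To make that concrete, here is a minimal, hypothetical sketch of predicting a trait from likes. The tiny dataset, the labels, and the plain logistic regression are illustrative assumptions only; studies like Kosinski's work with millions of users and typically reduce the like matrix before fitting a model.

    # A toy sketch of trait prediction from "likes" (hypothetical data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Rows = users, columns = pages; 1 means the user liked that page.
    likes = np.array([
        [1, 0, 1, 1, 0],
        [0, 1, 0, 0, 1],
        [1, 1, 1, 0, 0],
        [0, 0, 0, 1, 1],
    ])
    # Made-up labels: 1 = "high extraversion", 0 = "low extraversion".
    trait = np.array([1, 0, 1, 0])

    model = LogisticRegression().fit(likes, trait)

    new_user = np.array([[1, 0, 1, 0, 0]])       # likes of an unseen user
    print(model.predict_proba(new_user)[0, 1])   # estimated P(high extraversion)

The unsettling part is not the model, which is ordinary statistics, but that innocuous inputs like page likes can carry a signal about things you never chose to disclose.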

Recommendation, persuasion, and manipulation all mean roughly the same thing; we just perceive them as positive, neutral, and negative. If a friend recommends a book or a movie, you appreciate it. If your spouse persuades you into agreeing to buy a new sofa for the living room, it feels neutral. If a company or a government manipulates you into buying something or voting a certain way, we react quite negatively.

We generate data in everything we do online. We search on Google, make friends on Facebook, share YouTube videos, and find new music on Spotify. We make purchases, play games, fill in forms, request, read, chat, and browse. The data itself is not evil. Most of the time it is tracked by the company behind the website you are visiting, by companies advertising on that website, by analytics companies, and in some cases by other third parties providing services to the site. The data is used as a tool to learn how the website is performing and to optimize it based on users' behaviors and needs. It is also used for user segmentation, profiling, predictions, and recommendations.

None of these things are bad in themselves. It is the intentions of the decision maker, often the leader of a company or government, and the purposes these outcomes are used for, that can be bad: when a presidential candidate uses the predictions to manipulate people into voting for him or her, or when the management of a company chooses to target elderly people with telemarketing to make them agree to terms they don't understand. It's unfair.
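As an illustration, here is a toy sketch of the kind of first-party event tracking and crude segmentation described above. The event names, pages, and segment rule are entirely made up:

    # A toy sketch of first-party event tracking and segmentation.
    from collections import Counter

    events = [
        {"user": "u1", "action": "view",   "page": "/sofas"},
        {"user": "u1", "action": "click",  "page": "/sofas/checkout"},
        {"user": "u2", "action": "view",   "page": "/blog"},
        {"user": "u2", "action": "search", "page": "/search?q=cats"},
    ]

    # Aggregate per-user activity, then assign a crude, made-up segment.
    activity = Counter(e["user"] for e in events)
    segments = {
        user: "engaged_shopper" if any(
            e["user"] == user and "checkout" in e["page"] for e in events
        ) else "browser"
        for user in activity
    }
    print(segments)  # {'u1': 'engaged_shopper', 'u2': 'browser'}

Every line in that event list is mundane on its own; it is the aggregation and the rules applied on top that turn it into a profile.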

The power of algorithms might feel scary or invasive, and it can be, depending on the person interpreting them or making decisions based upon them. But algorithms also come with great advantages, like helping people find the right jobs or monitoring physical health, as Kosinski exemplified. Sharing your data while using a service also comes with great benefits: it lets you use location services to find your way around a new city, monitor your heart rate while exercising, or get relevant personal offers on purchases you've been thinking of making.

Tailoring services and offers to your geographical location, age, or gender is one thing. Targeting ads, offers, and services to your interests is another. Our interests are multidimensional: we might like a cat picture because of its composition and our interest in photography, or check into a sports arena because it's hosting a concert tonight, not a sporting event. It's not you, it's a model. Humans are complex, and what these algorithms do is build a model: your data profile. This means not all of the targeting will be correct and totally on point, but most of it will be, and often the recommendations will let you discover new things rather than trap you in a filter bubble, precisely because your data profile is diverse and your interests are multidimensional.
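A minimal sketch of what such a "model" might look like: a profile as a vector over a few assumed interest dimensions, with items ranked by cosine similarity, the way a simple recommender might do it. The dimensions, scores, and items are all invented for illustration:

    # A toy "data profile" over made-up interest dimensions.
    import numpy as np

    dimensions = ["photography", "cats", "sports", "live_music"]
    # The model's guess at you: likes cat pictures for the photography,
    # checks into arenas for the concerts.
    profile = np.array([0.9, 0.3, 0.1, 0.8])

    items = {
        "camera_lens_ad":  np.array([1.0, 0.0, 0.0, 0.0]),
        "cat_food_coupon": np.array([0.0, 1.0, 0.0, 0.0]),
        "concert_tickets": np.array([0.0, 0.0, 0.2, 1.0]),
    }

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Rank items by similarity to the profile, as a simple recommender might.
    for name, vec in sorted(items.items(), key=lambda kv: -cosine(profile, kv[1])):
        print(name, round(cosine(profile, vec), 2))

Notice that the model never knows why you liked the cat picture; it only sees which dimensions of the vector light up, which is both the source of its mistakes and of its serendipity.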

Before going into what privacy choices you can make, and what to make of all this, you need to decide what privacy means to you. Are you willing to give up location services, recommended stories in the newspaper, or finding new friends and offers online? Are you willing to trade your data for these services? Do you trust the companies using this data to create the data profile, the "model" of your preferences?

There is a legal regulation coming into force in the EU in May this year regarding your digital privacy: the General Data Protection Regulation (GDPR). The regulation puts pressure on companies operating in the EU to do the following (a rough code sketch follows the list):

  • inform users about what data they track and how it is used,
  • structure the data so that it can be delivered back to any user who requests it,
  • delete a user's data upon request, and
  • offer users additional privacy options around personalization and profiling.
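Here is a minimal sketch of what these obligations could look like in a service's backend, using a toy in-memory store and hypothetical function names; real compliance also has to cover backups, logs, and data shared with third-party processors:

    # A toy sketch of GDPR-style data subject rights (hypothetical store).
    import json

    user_data = {
        "user_42": {"email": "a@example.com", "likes": ["cats", "photography"]},
    }
    privacy_prefs = {"user_42": {"personalization": True, "profiling": True}}

    def export_my_data(user_id):
        """Right to access/portability: return the user's data in a portable format."""
        return json.dumps(user_data.get(user_id, {}), indent=2)

    def delete_my_data(user_id):
        """Right to erasure: remove everything stored about the user."""
        user_data.pop(user_id, None)
        privacy_prefs.pop(user_id, None)

    def set_privacy_options(user_id, personalization, profiling):
        """Opt in or out of personalization and profiling."""
        privacy_prefs[user_id] = {"personalization": personalization,
                                  "profiling": profiling}

    print(export_my_data("user_42"))
    set_privacy_options("user_42", personalization=False, profiling=False)
    delete_my_data("user_42")

The hard part for companies is not writing functions like these; it is that decades of databases were never structured so that one person's data could be found, exported, and deleted in the first place.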

This regulation forces companies to rethink their data strategies drastically, and for many of them it entails a huge, long-overdue clean-up of policies and databases. It also forces users to think about their online privacy. This is where we stumble upon the privacy paradox, described in several research studies whose conclusions point to the same thing: people are in general worried, upset, and negative about the lack of, or decline in, privacy online, yet they willingly trade their data for services and convenience, and will even trade their friends' privacy for a small monetary incentive.

It might seem strange that this regulation is arriving now, when we are already talking about the post-privacy era. Researchers such as Kosinski believe that trying to regulate for more privacy online is just a distraction, and that we instead need a long-term, sustainable strategy for dealing with the openness that is coming.

Ending note:

Having it all out in the open, sharing data with companies, governments, family, and friends, might not be such a bad idea after all? It's not the data, nor the predictions made from it, that make us uncomfortable about sharing. Many of us grew up with the value that you should keep things private. But why? Often because if sensitive information was shared, about one's sexual orientation, physical or mental health, or political viewpoint, the reactions from the public or the government were not beneficial. It could lead to shaming, diminished self-worth, isolation, feeling like an outcast, stalking, humiliation, and in some cultures even jail or the death penalty. Meanwhile, today movements like #metoo against sexual harassment, and other online initiatives encouraging young people to share their feelings about mental pressure and stress, are empowering us to share and #talkaboutit.

It's not wrong to share your data; it can actually help the world become a more open place. We could predict who will need help, or how to avoid big economic or societal catastrophes, if we detect such trends in the data. We can become proactive and treat people better. What is still wrong are the effects these predictions can have in the wrong hands; it's the decision makers, the closed, corrupt, non-democratic societies, and the lack of trust and empathy that are wrong in this equation. And I'm afraid we will have much more data, smarter algorithms, and bad decision makers before the people in power step up, work on the real problems, and put legislation in place that protects the people who need to be protected.

If we're heading towards a world without privacy, then the legal entities of the European Union, individual countries, and large companies need to work together on how to organize societies and technology for that world. We need to educate the population on how to retrieve and evaluate information, whether political or social, and work towards more open and tolerant societies in a world where, right now, having the "wrong" sexual orientation or religious view can lead to a death sentence. Let's not regulate privacy through technology; let's make way for a more humane and open world.

Watch more:

  1. The End of Privacy
  2. AI Is Already Smarter Than We Are
  3. Can We Out Evolve Artificial Intelligence?
