Can intelligent Big Data technologies sway election outcomes? Are consumers now "public domain"? How do I go about having an online life while still protecting my privacy? Answers to these questions and more will be provided by the noted social scientist Dr. Michal Kosinski, from the Stanford University Graduate School of Business, when he takes the stage at the CeBIT Global Conferences on 23 March.
Kosinski enjoys international renown as an expert on psychometrics – a fast-growing field of psychology that, among much else, examines the influence of Big Data and other digital technologies. While studying at the University of Cambridge, Kosinski developed a mathematical method that analyzes Facebook likes and publicly available data to determine people's personality traits and predict their behavior. His method was used by the Trump campaign to spread personalized posts via a range of social media channels in the run-up to the presidential election.
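Conceptually, the like-based approach described above is a two-step pipeline: reduce the sparse user-by-page likes matrix to a few dimensions, then fit a simple regression from those dimensions to a known trait score. The sketch below illustrates the idea on synthetic data – the user counts, page counts, and the data itself are invented for illustration and are not taken from Kosinski's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for real data: each user has hidden traits, and those
# traits shape which of 50 hypothetical pages they "like".
n_users, n_pages, n_traits = 300, 50, 3
traits = rng.normal(size=(n_users, n_traits))
page_affinity = rng.normal(size=(n_traits, n_pages))
like_prob = 1.0 / (1.0 + np.exp(-traits @ page_affinity))
likes = (rng.random((n_users, n_pages)) < like_prob).astype(float)

# Target: a personality score driven by the first hidden trait, plus noise.
score = traits[:, 0] + rng.normal(0.0, 0.3, n_users)

# Step 1: SVD-reduce the centered user-by-page likes matrix to a
# low-dimensional profile per user.
U, s, _ = np.linalg.svd(likes - likes.mean(axis=0), full_matrices=False)
profile = U[:, :10] * s[:10]

# Step 2: fit a linear model on a training split, predict held-out users.
train, test = slice(0, 200), slice(200, n_users)
X = np.column_stack([np.ones(n_users), profile])
w, *_ = np.linalg.lstsq(X[train], score[train], rcond=None)
pred = X[test] @ w

# Report accuracy as the correlation between predicted and actual scores.
r = np.corrcoef(pred, score[test])[0, 1]
print(f"held-out correlation: {r:.2f}")
```

With real data the same two steps are simply scaled up: millions of users, tens of thousands of pages, and sparse matrix factorization in place of a dense SVD. Accuracy in the published like-based studies was typically reported as the correlation between predicted and self-reported trait scores.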
Personalized advertising has been part of online life for a long time. But psychometric targeting opens up completely new opportunities – particularly for marketing services and consumer goods. In today's age of Big Data, wearables, car connectivity and the Internet of Things, the average consumer generates vast stores of personal data with everything they do, whether it's driving their car or strapping on a fitness tracker. This data is an absolute bonanza for corporations, which can, within certain legal constraints, harvest it to tailor their marketing messages to specific consumer groups.
"A growing proportion of human activities, such as social interactions, entertainment, shopping, and gathering information, are now mediated by digital services and devices. Our research shows that capturing digital behavior patterns, such as tweets, Facebook likes or web browser logs, is sufficient to build up a detailed picture of an individual's personality, intelligence or political leanings. These types of Big Data analysis do not require any active participation on the part of the data subjects. They can be applied to large populations, are cost-effective and potentially have revolutionary applications in many areas. But in the wrong hands, they pose substantial risks to privacy," warns Kosinski.
For these reasons, Kosinski urges people to support individuals, organizations and corporations that are committed to safeguarding privacy – whether through purchase decisions, clicks in social networks or votes in political elections. The misuse of personal data for political gain undoubtedly harbors real risks. For instance, intelligent Big Data technologies could well be used to influence France's presidential election this spring or Germany's parliamentary elections in the fall.
Hello Mr. Kosinski, is this your first time at CeBIT and CeBIT Global Conferences?
It is my first time as a speaker! I have attended CeBIT before, and it was always an exciting experience.
What will be the key issue of your speech?
I will talk about how algorithms can be used to reveal our intimate traits and predict our future behavior based on the digital footprints we all leave behind.
This article about your work was one of the most heavily discussed news pieces of 2016 in Germany, Austria and Switzerland. In its aftermath, there was a big debate about whether these methods really determined the outcome of the US election. A few months on, what is your personal conclusion? Did these methods contribute to the election of Donald Trump and to Brexit?
We don’t know. What we know for sure, however, is that computers don’t vote – at least not yet. So it is not the algorithms that win elections but the candidates. Also, the fact that candidates can personalize their message for each individual to make it as relevant as possible is great – we need more of it.
Political programs are too long for any single citizen to digest in full, and much of their content is irrelevant to any given voter. Algorithms can select and share with you the aspects that are most relevant to you and that you are competent to judge. This means more informed citizens and more people engaged in the political process.
2017 is an election year in Germany and several other countries. What would be your personal advice to political parties in Germany? Should they use your findings, and how?
Personalizing political messages is not wrong per se – quite the opposite. What I think is fundamentally wrong is that algorithms are used to determine people’s intimate traits behind their backs, and often against their will. I do not think that any political party, or anyone else, should be doing this. I also hope that voters will punish the parties that disrespect their privacy and try to influence them in an underhanded way.
You say: "If used ethically, it could revolutionize psychological assessment, marketing, recruitment, insurance, and many other industries." What would be the key ethical guidelines for applying your findings?
Informed consent and, if so chosen, the ability to use a given service or product in a completely anonymous fashion. Predictive algorithms could radically improve people’s lives, and people will opt in to use them.
But they should always retain control over their data and over what is being done with it. Without clear rules, consumers will lose their trust in technology – and that will be bad for consumers and companies alike.
Which technological developments are currently most important to you? What potential do you see in them?
Matching people with jobs and careers, improving their psychological health, better education, and even saving lives. I really mean it. I will be talking about it at CeBIT.