Metadata bares our souls: Susie Alegre looks at the incursions of technology into our freedom of thought and the urgent need for serious laws to protect it
We are moving very fast into a new digital age where we need to decide what human rights mean to us and to the future of humanity. The complexity of the big data economy and the algorithms that dictate our modern lives, as well as the veil of secrecy drawn over the inner workings of the digital world by big tech companies and governments, makes it hard to understand exactly what the threats are to our human rights. But in the modern world, you don’t need to be a philosopher, a wise woman or a social misfit to have negative inferences drawn about your mind that could dictate your future life chances. Big tech is reading us all everywhere, all the time. And the potential to scale up attacks on our freedom of thought is almost limitless. Once we lose our freedom of thought, we lose what it means to be human.
We are at a crucial juncture. In 2010, Facebook founder Mark Zuckerberg claimed that privacy was no longer ‘a social norm’. Six years later, the General Data Protection Regulation made it clear that, in Europe at least, we are not so ready to give up on human rights in our daily lives. The incursions of technology into our freedom to think for ourselves, in private and without judgement or penalty, are very real. We cannot allow freedom of thought to become an anachronism, a Luddite concept not suited to the modern age. Without freedom of thought we may find ourselves in a world where the violations of human rights of the past will seem petty compared with the scale of human rights abuse that may be perpetrated at the click of a mouse or the tweak of an algorithm. If we want to understand how to protect our freedom of thought for the future, we need to know where the current threats are, and in the digital age, they are everywhere.
The cult of fame has gradually become mainstream – if we are prepared to open up and bare our souls, we can be famous for Andy Warhol’s notorious 15 minutes. All it takes is a few moments of brutally honest exposure, or a carefully timed casual quip on Twitter, and the riches of global adulation can be yours. But this exposure not only has an impact on our privacy; it reveals and affects how we and others think in ways we may not even realise.
Sharing has become the norm without our noticing. And sharing somehow feels private when it is done in the night from our beds, on a device that is practically a part of us. Research on people holding their phones in public, even when not using them, points to only one thing we prefer to phone holding – hand holding with a significant other (if they are physically there). Our phone use has become a surrogate for intimacy. It is the way we get close to people, and we can only get close if we open up and share. But how many of us would speak openly on a phone call if we really believed someone else was listening in and interpreting the subtext in our private conversations? Would we feel comfortable with the idea that someone else was taking notes as we whispered sweet nothings down the line, analysing our tone, comparing it with another call we made the year before, triangulating our emotions and desires, recording it for posterity?
If we thought about it, we probably would not. Our communications would be adapted; heightened or dumbed down. We would not be authentic, if we chose to talk at all. Yet what is happening with our interface with technology is much more intrusive than that. It is not the content we post or even the personal information we provide that matters. Every piece of information about every action we take, or hesitate to take, every thought we explore online, every place we visit, the speed we move at, every word we utter near a voice-activated device can be brought together and analysed in an effort to give a clear picture of our thoughts and desires, our levers and buttons. It is the metadata that bares our souls.
In 2015, the news was peppered with stories claiming that Facebook knows you better than your friends and family. Researchers at Cambridge University had discovered that analysis of the things you’ve ‘liked’ on Facebook was a pretty good predictor of your personality and other personal attributes based on the ‘Big Five’ personality traits of openness, conscientiousness, extroversion, agreeableness and neuroticism. The results of the study show just how hard it would be to cover our online personality tracks as we click and scroll. We are not consciously telling Facebook our innermost thoughts; rather these are being extracted through an algorithmic analysis of our online activity and turned to profit, though not for us. While you might expect that liking a Facebook page for a campaign supporting gay rights could in some way be a predictor of sexual orientation, how would you know that liking curly fries would be a predictor of high intelligence, while liking Harley Davidson or ‘I Love Being a Mom’ would allow someone to assume you were not so bright? Although perhaps the fact that liking ‘Being Confused After Waking Up From Naps’ is a strong predictor of male heterosexuality is less surprising than many men might like to think.
….
‘Commercial companies, governmental institutions, or even one’s Facebook friends could use software to infer attributes such as intelligence, sexual orientation, or political views that an individual may not have intended to share.’ Consent is crucial in drawing the legal line around our inner lives. We cannot give consent to something we do not notice and are not aware of. That is why legislators in Europe were so disturbed by the idea of subliminal advertising being used to bypass our critical faculties, regardless of what it might have been trying to sell us. Freedom of thought and opinion gives us the right to decide which parts of our inner worlds we want to reveal, and to whom. It lets us choose when it is safe to share what is going on inside our minds. Making these inferences on an industrial scale without consent is, effectively, a massive violation of the rights to freedom of thought and opinion.
And it’s a violation that most of us are subjected to each time we interact with technology.
….
In the modern world, it is practically impossible to avoid leaving traces of yourself that may be used to profile and even punish you. Privacy settings will not help. The way these inferences are made and used is beyond our personal control. It needs serious laws and effective regulation to draw the lines around what is acceptable from a human rights perspective for us as individuals and for our societies. This research lights up the bottom line for social media companies – the business model is not about supporting community and connection; it is about understanding and exploiting how the user thinks and feels. How we think and how we can be made to think and behave is what is commercially valuable. Because if you can understand and control how we think and feel, you can control what we buy, what we do and how we vote.
This article is an extract from Freedom to Think: The Long Struggle to Liberate Our Minds by Susie Alegre (Atlantic Books: 2022). Part history and part manifesto, and filled with shocking case studies across politics, criminal justice and everyday life, this ground-breaking book shows how our mental freedom is under threat like never before. Bold and radical, Alegre argues that only by recasting our human rights for the digital age can we safeguard our future.