Netflix, Spotify and TikTok ‘knew journalist was bisexual before she did’
A BBC reporter recently realised that she is bisexual – but the 24-year-old thinks Netflix worked out her sexuality before she was even aware of it.
Writing for the BBC, Ellie House revealed that the streaming platform started recommending queer films with sapphic storylines months before she realised her sexual identity.
One film told the story of a woman cheating on her husband with another woman.
And it wasn’t just Netflix. Music streaming service Spotify was also suggesting “sapphic” playlists with cover art of two women kissing, and her TikTok For You Page was full of videos from bisexual creators.
House says she never searched for LGBTQ+ content, and that she used a work account for TikTok. At the time, she still considered herself straight.
“I realised that I was bisexual in my second year of university, but Big Tech seemed to have worked it out several months before me,” she wrote.
“I’d had one long-term boyfriend before then, and always considered myself straight. To be honest, dating wasn’t at the top of my agenda.
“However, at that time I was watching a lot of Netflix and I was getting more and more recommendations for series with lesbian storylines, or bi characters.
“One show that stuck out was called You Me Her, about a suburban married couple who welcome a third person into their relationship. Full of queer storylines and bi characters, it has been described as TV’s ‘first polyromantic comedy’.”
House asked: “What signs had these tech platforms read that I myself hadn’t noticed?”
Can algorithms know us better than we know ourselves?
There’s a lot of data out there about us. Platforms – not just social media sites – collect masses of information, from our location to our shopping preferences.
Sometimes, it can feel almost like our devices can read our minds – like when your mobile phone serves you an advert for something you were just talking about.
Netflix, meanwhile, uses an algorithm to recommend films and TV that you might want to watch. But can these platforms really know your sexuality before you do?
As part of the BBC investigation, House downloaded all of the information held about her from eight of the biggest platforms. This is something every individual has a right to do under UK data privacy laws.
She found that Facebook had been keeping track of other websites she had visited, as well as the coordinates of her home address.
Instagram had a list of more than 300 different topics that it thought she was interested in, which it used for personalised advertising.
Netflix sent her a spreadsheet which detailed every trailer and programme she had watched, when, on what device, and whether it had auto-played or whether she selected it.
There was no evidence, however, that any of these platforms had tagged anything to do with her sexuality.
In a statement to the BBC, Spotify said: “Our privacy policy outlines the data Spotify collects about its users, which does not include sexual orientation. In addition, our algorithms don’t make predictions about sexual orientation based on a user’s listening preferences.”
Netflix told her that what a user has watched and how they’ve interacted with the app is a better indication of their tastes than demographic data, such as age or gender. The streaming giant’s recommendation system learns as you use it, but generally speaking, the algorithm considers viewing history, how you rated other titles and what other members with similar tastes have enjoyed.
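Netflix has not published the details of its system, but the behaviour described above – scoring titles by what viewers with similar tastes enjoyed – is the basic idea behind collaborative filtering. Below is a deliberately simplified, hypothetical sketch of that idea in Python; the data, names and similarity measure are invented for illustration and are not Netflix's.

# Illustrative only: a toy user-based collaborative filter, not Netflix's actual system.
# 'ratings' is a hypothetical {user: {title: score}} mapping standing in for viewing history.
from math import sqrt

ratings = {
    "ellie":  {"You Me Her": 5, "Drama A": 4, "Thriller B": 2},
    "friend": {"Drama A": 4, "Thriller B": 3, "Comedy C": 5},
}

def cosine_similarity(a, b):
    # Compare two users on the titles they have both rated.
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[t] * b[t] for t in shared)
    norm = sqrt(sum(a[t] ** 2 for t in shared)) * sqrt(sum(b[t] ** 2 for t in shared))
    return dot / norm

def recommend(user, all_ratings):
    # Score titles the user hasn't seen by how similar viewers rated them.
    scores = {}
    for other, theirs in all_ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(all_ratings[user], theirs)
        for title, score in theirs.items():
            if title not in all_ratings[user]:
                scores[title] = scores.get(title, 0.0) + sim * score
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ellie", ratings))  # e.g. ['Comedy C']

In a toy model like this, a handful of overlapping choices is enough to pull a viewer towards whatever titles their nearest "neighbours" watched – one plausible, though unconfirmed, explanation for the recommendations House noticed.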
However, House says that she was being recommended queer TV series that her “friends – people of a similar age, with a similar background, and similar streaming histories – were not being recommended, and had never heard of”.
Netflix maintains that its recommendation system does not use demographic information (such as age or gender) as part of the decision-making process.
“No one is explicitly telling Netflix that they’re gay,” Greg Serapio-Garcia, a PhD student at the University of Cambridge specialising in computational social psychology, told the BBC. But the platform can look at users that have liked “queer content”.
Gay-detecting AI
A 2017 study published in the journal “Big Data” found that Facebook could work out if you are gay “based on a few likes”. In the same year, researchers reported that they had developed an AI that could predict whether someone was gay or straight from a picture of their face.
There are some fears around this. Kevin McKee, a senior research scientist at the AI research laboratory Google DeepMind, previously told PinkNews that facial recognition systems usually categorise gender as a binary concept, classifying individuals strictly as male or female.
By doing this, they overlook and disregard nonbinary and transgender identities.
A Wired article from 2018 also stated that “an advertiser in the future might know your sexual preferences before they are clear to you”.
“Did Big Tech know I was gay before I did?” will be available to download as a podcast on BBC Sounds.