Why Facial Recognition Technology in Pornography Is Morally Justifiable

Nov 13, 2017


Do you know Pornhub? It’s essentially YouTube for pornography, for those of you pretending not to know what that is. And they’ve been using Artificial Intelligence to scan their pornography database.
Essentially, they’ve trained an AI whose job it is to sit and watch the latest pornography all day every day.
 
Nice.
 
Joking aside, AI is being trained via machine learning in the fine art of facial recognition. Then, it can detect who the porn star is in each video, and allow users to find other videos of that porn star.
 

Imagine ‘Amazon recommends’, but everyone’s naked.

 
So far this AI has watched 50,000 videos during its beta testing. Pornhub hopes to have it scan the entire porn database by Summer 2018. It should eventually be able to recognise and account for hair colour, sex position and even fetishes. Not only can you find your favourite performer but you can…well…do whatever it is you want to do. It is also hugely helpful in combating piracy.
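Under the hood, this kind of matching usually boils down to comparing face “embeddings”: a neural network turns each face into a vector of numbers, and faces of the same person end up close together. Here’s a minimal sketch of the lookup step, assuming embeddings already exist; the function names, the catalogue structure and the 0.8 threshold are all illustrative assumptions, not Pornhub’s actual system:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(query: np.ndarray, catalogue: dict, threshold: float = 0.8):
    """Return the performer whose stored embedding best matches the query face,
    or None if no catalogue entry clears the similarity threshold."""
    best_name, best_score = None, threshold
    for name, embedding in catalogue.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Once a face resolves to a name, recommending “other videos of that porn star” is just a database lookup on that name.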
 
The adult entertainment industry has a big copyright and piracy problem. Though ‘porn pirates’ may not sound like a major social issue, the scale of the problem actually rivals that found in music or film. The porn industry grosses as much as $3 billion per year on the internet alone, and that’s not accounting for video, magazine or in-store sales, which in the USA alone rack up an impressive extra $10 billion per year on top of this figure. And 69% of pay-per-view internet content is pornography, meaning it’s actually very big business.
Global porn revenues have declined 50% in the last decade due to the large amount of free porn online. Sites like Pornhub, xHamster, PornMD and YouPorn offer it for free and instead make much of their income from advertising and the sale of data.
 

So what does this have to do with facial recognition?

 
Well, YouTube uses Content ID to make sure that piracy isn’t happening on its platform. Porn sites likewise want to make sure that the videos posted are owned by the person posting them. That means people will no longer be able to record paid-for porn and then post it on a free site. An AI that can detect these copyright infringements and flag them for removal would be a massive boon for those who profit from porn.
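Content ID’s actual algorithm is proprietary, but the general idea behind such systems is well known: compute compact “fingerprints” of a video’s frames, then check uploads against a registry of fingerprints from paid-for content. Here’s a toy sketch using a simple average hash per frame; the hash, the Hamming-distance cutoff and the 50% match ratio are illustrative assumptions, not how YouTube or any porn site actually does it:

```python
import numpy as np

def average_hash(frame: np.ndarray) -> int:
    """Tiny perceptual hash: threshold each pixel of a small grayscale
    frame against the frame's mean, then pack the bits into an integer."""
    bits = (frame > frame.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_pirated(upload_hashes: list, registry: list, max_distance: int = 3) -> bool:
    """Flag an upload when most of its frame hashes sit within a small
    Hamming distance of some registered (paid-for) frame hash."""
    matches = sum(
        1 for h in upload_hashes
        if any(hamming(h, r) <= max_distance for r in registry)
    )
    return matches / len(upload_hashes) > 0.5
```

Perceptual hashes are used instead of exact checksums so that re-encoded or slightly cropped rips still land close to the original in hash space.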
 

So why has this caused widespread uproar throughout the industry and beyond?

 
People think this technology could ruin lives, mainly through ‘revenge porn’. Revenge porn is when someone posts a pornographic video or image online with the intention of embarrassing the person in it. It’s typically done by a rejected ex-partner after a break-up: intimate images from the relationship are shared on social networks.
Likewise, facial recognition software could be used to identify adult entertainers and expose their activity to their families. Needless to say, many amateur performers keep this part of their lives very private, and their families tend not to know about it.
 
On this front, Pornhub have already stated that they will only use this technology on professional porn stars. How professionals will be distinguished from amateurs is so far unclear.
Despite this, many are claiming that professional porn stars have a right to privacy too, and that allowing people to trace their real names and identities is unethical and morally unjustifiable. And this is where they begin to lose touch a bit with reality.
 
Someone who makes their living by showing their face on video and sharing it widely realistically forfeits their right to anonymity. It comes with the territory.
Besides, such a right is almost unenforceable in both law and practice.
 
Am I saying that the ‘average Joe’ or ‘average Jane’ does not have a right to privacy? Not at all. It should be illegal (and in most places it is) for someone to share intimate videos of people without consent.
But for those who produce content for the sole purpose of sharing it far and wide in exchange for money?
Can they be exempted simply because they’ve changed their minds?
That said, there should definitely be a way of grouping porn actors under their ‘porn name’ rather than their real one. If only so an elderly relative Googling their granddaughter’s address doesn’t accidentally give themselves a heart attack if their safe search is off.