Imagine reading interesting celebrity news on a website when a pop-up appears with a fake video of that very celebrity. Totally embarrassing! You were definitely not expecting that sudden awkwardness.

In a shadowy corner of the World Wide Web, a community of around 80,000 people is creating fabricated videos of female celebrities by swapping their faces onto the bodies of porn performers.

Well, that’s cruel. These are deepfakes: highly realistic fake videos built on face swaps. In short, the technique takes the nude body of a porn performer and stitches a celebrity’s face onto it.

The technology is easy enough to use that it has already created an enthusiastic community of users on Reddit, where people share their latest work and compare notes: for example, a fake sex tape of Emma Watson, or Donald Trump’s face swapped onto Vladimir Putin.

In the blink of an eye, a video we are watching can turn out to be a deepfake.

Are deepfakes real?

Multiple deep-learning applications can turn an ordinary video into a deepfake, complete with startlingly natural fake audio, and the result can look entirely real.

It is surely a big deal when real footage turns fake without the viewer ever knowing. Hany Farid, a computer-science professor at Dartmouth College, told The Wall Street Journal: “You can put anything and everything in someone’s mouth, like whatever you want.”

To explain the technology a little better: a local actor sits in front of a camera while Jennifer Lopez appears on the digital screen. When the actor smiles, Ms. Lopez smiles as well. If the actor gives a thumbs up, Ms. Lopez does the same.

Where there’s a fake, there’s a detector. Researchers now have forensic technology to detect the fakes.

The consequences can be bad!

Prof. Farid emphasized that researchers working on these computer-generated technologies must stay aware of the consequences their work has for society.

Existing detectors may be strong against other kinds of manipulation, but they have not yet succeeded at catching face swaps.

The reason: there is currently no reliable way to tell whether a video we see online is fake. Nothing gives us pause to question it; everything we see looks real, and without forensic tools we cannot differentiate the real from the fake.

PornHub says…

Corey Price, Vice President of PornHub, says users have started reporting content they find inappropriate, and the company is taking those reports seriously to keep the situation under control. It also encourages users to visit its Content Removal page and report the issue.


Hands-on Experiment

While getting a hands-on understanding of deepfakes, Prof. Farid notes, with some amusement, that deep-learning applications now let him dance like Jackson.

Every bright side has a dark side, and this concept has gained a dark one. Prof. Farid has heard from a victim who was deeply affected by deepfakes.

Harmful lies like this are nothing new, but machine-learning techniques are making the fakes ever more realistic and more resistant to fake detectors.

Why it’s dangerous

Regrettably, growing demand for such content has brought flagrant violations of the dignity of many celebrities. Emma Watson, Selena Gomez, Natalie Portman, and Sophie Turner are among the growing list of celebrities who have fallen victim to the FakeApp tool.

From questions about technical assistance and application settings to suggestions of new celebrity targets, people seem hooked on this worrying trend of collective fantasies.

For now, use of this face-swapping technology is concentrated on celebrity pornography. But its potential uses are many.

At a time when the fake-news epidemic is a significant concern, such applications could potentially be used to create and circulate fabricated political content.

Limitations and learning

While the results are striking, there are definite limits to what this technology can achieve today:

  1. It only works if there are many images of the target: to put a person in a video, roughly 300-2,000 images of their face are needed so that the network can learn to recreate it. The exact number depends on how varied the faces are and how well they match the target video.

  2. This works well for celebrities or anyone with many photos online. But clearly, it will not let you build a product that can swap anyone’s face.

  3. You also need training data that is representative of the goal: the algorithm cannot generate profile shots of Oliver if you had few examples of Oliver looking sideways. In general, the training images of your target should approximate the orientation, facial expressions, and lighting of the videos you want to insert them into.

  4. So if you are building a face-swapping tool for the average person, whose photos are mostly front-facing, you may have to limit swaps to mostly forward-facing videos. If you are working with a celebrity, it is easier to obtain a diverse set of images. And if you control the data collection yourself, choose enough quantity and variety to be able to insert the target into anything.
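The data requirements above can be expressed as a simple sanity check. This is a minimal sketch: the 300-2,000 image range comes from the text, but the function name, pose buckets, and the 5% diversity threshold are illustrative assumptions, not part of any real deepfake tool.

```python
def dataset_report(num_images, pose_counts, min_images=300, max_useful=2000):
    """Rough sanity check on a face-swap training set (illustrative only).

    num_images:  total face crops collected for the target person
    pose_counts: dict like {"frontal": 900, "profile": 40} describing how
                 many images fall into each head-pose bucket
    Thresholds are assumptions based on the 300-2,000 range cited above.
    """
    issues = []
    if num_images < min_images:
        issues.append(f"too few images: {num_images} < {min_images}")
    elif num_images > max_useful:
        issues.append(f"more than ~{max_useful} images adds little")
    # Limitation 3: training poses must cover the poses in the target video.
    total = sum(pose_counts.values()) or 1
    for pose, count in pose_counts.items():
        if count / total < 0.05:  # assumed 5% floor per pose bucket
            issues.append(f"pose '{pose}' is underrepresented ({count} images)")
    return issues or ["dataset looks adequate under this sketch's thresholds"]

# Example: a small, mostly-frontal set fails the count check.
print(dataset_report(120, {"frontal": 110, "profile": 10}))
```

Run against a 500-image set with a good frontal/profile mix, the same function reports the dataset as adequate, mirroring limitation 4: the more varied the collection, the wider the range of videos you can target.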

Conclusion:

Deepfakes and the problems they cause are fairly new, but it can safely be said that they have passed their infancy at an accelerated pace. That is why the issue has attracted attention from many sectors of society and continues to spread.

Reddit, where the problem arose, may have banned the pages that popularized this trend, but the FakeApp tool itself remains available. That is where the real threat lies: its availability guarantees that the trend will continue.
