Deepfake's Threat to Internet Credibility

With the rapid advancement of technology and artificial intelligence, we are approaching a point where nothing on the internet can be taken at face value.

We’ve already seen this happen with articles and news stories on the internet and on TV. Now a technology called deepfakes threatens to make even video evidence unreliable. By one common definition, deepfakes are “videos in which the face and/or voice of a person, usually a public figure, has been manipulated using artificial intelligence software in a way that makes the altered video look authentic.” Deepfakes have already been used to mislead the public: people have created deepfakes of many politicians and celebrities, and some of them look strikingly real.

Deepfakes are made using machine learning and artificial intelligence. The technology first emerged in the 1990s, when researchers began experimenting with it in the lab. Over the past few decades, however, it has spread far beyond research: today, amateurs and developers across the internet create their own deepfakes. Professionals have also used the technique in film to insert faces into existing movies, and it has found uses in art and other industries as well.

So why are deepfakes so dangerous? They pose a serious threat to politics and democracy: they can be used maliciously to make a politician appear to say something false or controversial that they never actually said. And with the technology advancing so quickly, it is becoming difficult for the human eye to distinguish a deepfake from a real video.

To combat this issue, researchers have started building detection software that turns the very algorithms deepfakes rely on against them. In addition, many internet companies, including Reddit, Twitter, and Google, have agreed to ban deepfakes from their sites.

Although deepfakes can have a positive impact on multiple industries, they also pose a serious threat to the credibility of the internet. We'll have to see for ourselves whether the good outweighs the bad.