YouTube has reminded the general public that deepfakes are banned from its platform, ahead of the US presidential election taking place this November.
YouTube, which is owned by Google, has said it will not tolerate its platform being used to exploit its users, and it will be watching for deliberate attempts to mislead viewers about how to vote and whom to vote for.
A deepfake is a video that has been deliberately manipulated to show something that hasn’t actually occurred, and the results can be very convincing.
YouTube took an early step in this direction last year when it removed a manipulated video showing Nancy Pelosi apparently slurring her speech. The footage had been deliberately slowed down to make it appear as if she were slurring her words, implying that she was under the influence.
YouTube has also reminded its users that videos casting doubt on the authenticity of politicians’ birth certificates, or on where they were actually born, are banned as well. While neither of these policies is new, YouTube thought it wise to clarify its rules ahead of the November 2020 election.
While there are no restrictions on videos that have been edited so that speech is taken out of context, YouTube is actively on the lookout for other deliberately altered videos that it considers deepfakes.
Facebook, which previously refused to take down the Nancy Pelosi video, has just announced a deepfake ban of its own.