Charlotte Foster


How Frank Sinatra was caught singing 20 years after his death


Fans were confused in 2020 when footage of Frank Sinatra seemingly singing about hot tubs went viral.

The iconic singer died in 1998, so many wondered how old audio clips of him had surfaced. In fact, the audio was entirely new.

However, it wasn’t Frank Sinatra singing at all.

The song, titled Hot Tub Christmas, was the product of a new technology known as a “deepfake” that mimicked Sinatra’s iconic voice. 

The video came from a San Francisco tech company that used its AI system, known as Jukebox, to generate new songs and vocals that sound almost exactly like real artists.

So, what is a deepfake?

Deepfakes are realistic video or audio of events that never actually took place and are generated by artificial intelligence.

These videos have been used to trick online users into thinking their favourite celebrities said things they never actually did. 

The tech has been used to create fake videos of Hollywood actor Tom Cruise, which set off alarm bells in national security circles. 

Deepfakes can also be used to manipulate images, where people’s faces have been added into random events and videos. 

Audio deepfakes, like this unusual Frank Sinatra track, have so far received less attention in the media.

One audio deepfake that has garnered a lot of criticism is a recreation of the voice of the late chef Anthony Bourdain for use in his upcoming documentary.

How are deepfakes made?

These audio tracks are created by artificial intelligence that has ingested and examined 1.2 million songs, along with their corresponding lyrics and metadata such as artist names, genres and years of release.

Using this data, AI can create new music samples from scratch and make them seem like they came from the original artist. 

While some celebrities who have been spoofed in deepfakes have expressed their discomfort and irritation with the new tech, singer Holly Herndon believes it is here to stay.

She said, "Vocal deepfakes are here to stay. A balance needs to be found between protecting artists and encouraging people to experiment with a new and exciting technology."

