
New ElevenLabs Feature Empowers Voice Actors to Charge for Usage

Written by Martina Bretous | Feb 27, 2024 9:30:00 AM

Voice actors have been the backbone of ElevenLabs, the AI text-to-speech generator, since its launch. Now, the company is putting money back into their pockets.

Through a new professional voice cloning feature, voice actors can now upload audio of their voices and get paid every time someone uses the resulting clone to generate new audio.

Though it’s marketed to voice actors, anyone can get paid this way. But just because you can doesn’t mean you should: voice cloning can come with scary consequences.

How ElevenLabs’ Professional Voice Cloning Feature Works

ElevenLabs has been insanely easy to use since its inception, and this new feature is no different.

To start getting paid for your voice, you only have to follow a few steps:

  • Have an account with a “Creator” subscription tier or above.
  • Upload between 30 minutes and 3 hours of audio featuring your voice, without any background music, sounds, or effects (a quick way to check your total runtime is sketched after this list).
  • Name and describe your voice, including labels related to accent region, sex/gender, ethnicity, age, and tone.
  • Verify your voice by reading a text prompt.
  • Submit your fine-tuning request, which can take up to 4 weeks as your voice undergoes model training.

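If you want a rough sanity check before submitting, the snippet below totals the runtime of a folder of recordings. It’s a hypothetical helper, not part of ElevenLabs’ tooling, and it assumes your recordings are uncompressed .wav files.

```python
# Hypothetical pre-upload check: total duration of WAV recordings.
# Not part of ElevenLabs' tooling; assumes uncompressed .wav files.
import wave
from pathlib import Path

def total_minutes(folder: str) -> float:
    """Sum the duration, in minutes, of every .wav file in a folder."""
    total_seconds = 0.0
    for path in Path(folder).glob("*.wav"):
        with wave.open(str(path), "rb") as wav:
            total_seconds += wav.getnframes() / wav.getframerate()
    return total_seconds / 60

minutes = total_minutes("voice_samples")
print(f"{minutes:.1f} minutes of audio found")
if not 30 <= minutes <= 180:
    print("ElevenLabs asks for between 30 minutes and 3 hours of clean audio.")
```
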
ElevenLabs emphasizes that what you read is not as important as how you read it. The model will focus on things like tone, inflection, and accent.

Once the uploading process is complete, voice actors can set usage parameters to protect themselves against misuse.

In addition, ElevenLabs verifies each upload to ensure voices aren’t uploaded without the speaker’s consent (think celebrities and public figures).

If you don’t have aspirations of being a professional voice actor, this feature is an opportunity to earn passive income. If you do, it’s a way to build brand awareness.

Now let’s move on to the most important part: The dolla, dolla bills.

How much does professional voice cloning pay?

Voice actors can set their price by selecting a default or custom rate. Payouts can vary greatly depending on a number of factors (there’s a rough example after the list), such as:

  • The pricing tiers of the users cloning your voice
  • The number of characters they generate with your voice (i.e. how much audio they create)
  • The notice period you select, which is the usage length for your voice
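
To make that concrete, here’s a back-of-the-napkin estimate. The per-character rate below is made up for illustration and isn’t ElevenLabs’ actual pricing, and the real figure would also shift with the user’s subscription tier and your notice period.

```python
# Hypothetical payout estimate; the rate is illustrative only, not
# ElevenLabs' real pricing.
rate_per_1k_characters = 0.05   # your chosen rate in dollars (made up)
characters_generated = 250_000  # characters users generated with your voice

payout = characters_generated / 1_000 * rate_per_1k_characters
print(f"Estimated payout: ${payout:.2f}")  # Estimated payout: $12.50
```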


In addition to this self-service feature, voice actors can work with ElevenLabs directly on licensing deals.

This involves serving as one of ElevenLabs’ default voices for usage terms of one to 11 years, recording up to 12 hours of studio-quality audio, and earning premium placement in their voice library.

Now onto the not-so-fun part.

The Scary Implications of Voice Cloning

Deepfake videos and images have been around for a while, but now voice clones are rising in popularity. In the past year, we’ve seen several examples of how dangerous they can get.

Last week, TikTok star Jordan Howlett shared a terrifying encounter in which his voice was cloned to promote a cure for blindness on social media.

“When I heard my voice, I was terrified,” he said to Bloomberg. “They could theoretically get me to say anything.”

Back in August, the NYT reported on a bank account holder who, along with his banker, was targeted by a scam involving a deepfake of his voice: someone used an AI voice clone to try to get his money rerouted to another account.

On a “60 Minutes” episode, an ethical hacker cloned the correspondent's voice and conned her colleague into sharing her passport details.

The scary thing is, anyone with a social presence can fall victim to this. All it takes is tracking down a few videos of the person, isolating their voice, and using software that can clone it.

This makes professional voice actors and content creators more susceptible to AI voice scams. And according to one Forbes article, ElevenLabs hasn’t always protected its contributors.

Just like Howlett, voice actor Allegra Clark shared with Forbes that her voice was cloned using ElevenLabs to say things she never recorded.

She emailed the company asking it to remove her upload and prevent future cloning, but it never took action. ElevenLabs argued it wasn’t clear the clone had been made with its technology and that the audio didn’t feature hateful or defamatory content.

After this incident, ElevenLabs launched additional safety guardrails, like voice captcha, to prevent unauthorized cloning.

Because voices aren’t eligible for intellectual property protection, voice actors have little to no legal protection for their voices. And that doesn’t even cover the fear of being replaced by AI voices.

Though passive income is a great incentive, there’s way more to consider.