The FCC wants to make AI robocalls like this creepy one illegal

Robocalls are annoying enough, but the growth of generative artificial intelligence could make it much easier for phone-based scams to fool people. Now, the Federal Communications Commission (FCC) is hoping to head off this potential threat before it becomes an even bigger problem.

On Wednesday, FCC Chairwoman Jessica Rosenworcel proposed new rules that would make robocalls using AI-generated voices illegal.

“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” said Rosenworcel in a statement. “No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls.”

The calls can be convincingly human-sounding, but some appear to be subject to the same shortcomings as other chatbots. A video on TikTok, posted by user lothalrebels, appears to show a telemarketing cold call from an AI calling itself “Andrew” that claims to be from the real estate firm Keller Williams.

After asking questions about whether the person is interested in buying or selling a home, Andrew asks, “Is there anything else I can assist you with or any other questions you may have?” That’s when things get creepy.

The cold call recipient asks Andrew for a 10th-grade-level essay about the U.S. military’s naval base in Bermuda, and Andrew obliges. (It should be noted, unsurprisingly, that Andrew got some important facts wrong.) Next, Andrew immediately rattles off a recipe for homemade hot fudge when asked.

The voice on the call had a familiar, lilting British accent. Commenters noted it sounded very similar to that of Brian Cox, an English physicist whose lectures have sold out around the world.

Keller Williams spokesperson Darryl Frost, when asked about the video, told Fast Company, “Keller Williams Realty, Inc. has not enabled or encouraged franchisees or agents to use AI to make telemarketing calls to consumers. While there may be exciting opportunities to use AI to make real estate a more efficient marketplace for all, we emphasize and train our Keller Williams franchisees and their affiliated agents that they must comply with all federal, state, and local telemarketing laws in all their communications with consumers.”

Last year saw roughly 55 billion robocalls in the U.S., a bit lower than the 2019 peak of 58.5 billion, as estimated by YouMail, which blocks the calls. And just last week, a robocall imitating President Joe Biden began spreading among New Hampshire voters, telling them not to vote in the state’s primary. That call was likely the catalyst behind Rosenworcel’s proposal.

The proposal would classify AI-generated voices as artificial under the Telephone Consumer Protection Act (TCPA), which would make such calls illegal under existing law and give state attorneys general the ability to pursue legal action against the companies behind them. (The act was passed in 1991 to curb the volume of robocalls.)

Consumers would still be able to give permission for some robocalls, including those using AI-generated voices, so long as they “do not include an advertisement or constitute telemarketing.” So the rule would not affect the sort of automated calls routinely made by pharmacies letting you know your medication is ready.

Last year, the FCC used the TCPA to impose a $5 million penalty on conservative activists who used robocalls to tell Black voters that voting by mail would result in their personal information being put into “a public database that will be used by police departments to track down old warrants and be used by credit card companies to collect outstanding debts.”

Americans lost an estimated $87 billion to robocalls and phone scams in 2022, according to RoboKiller, an app that strives to eliminate spam calls. That’s a 116% increase over 2021. Globally, phone fraud is soaring as well: Hiya, a voice security company, reported screening more than 21.5 billion suspected spam calls in just the first nine months of last year.