HARRISBURG – Attorney General Michelle Henry led a coalition of 26 states in a letter to the Federal Communications Commission (FCC) emphasizing the potential for consumer harm when telemarketers use artificial intelligence (A.I.) and asking the FCC to strictly restrict such use.
In November, the FCC issued a Notice of Inquiry requesting input on the implications and use of A.I. technology in consumer communications and on how the technology fits under the Telephone Consumer Protection Act (TCPA). Specifically, the FCC asked whether A.I. technologies could act as the functional equivalent of a live agent.
Under the TCPA, robocalls are calls made using an artificial or prerecorded voice. Such calls are generally prohibited unless the calling party obtains the consumer's prior express written consent.
In the comment letter, Attorney General Henry argued that marketers wanting to use A.I. to impersonate a human voice should be required to follow the TCPA's rules and regulations governing artificial voices, including obtaining prior express written consent from the consumers they target.
“Technology is advancing and expanding, seemingly, by the minute, and we must ensure these new developments are not used to prey upon, deceive, or manipulate consumers,” Attorney General Henry said. “This new technology cannot be used as a loophole to barrage consumers with illegal calls. I commend the partners in this bipartisan coalition for seeing the potential harm A.I. can present to consumers already overwhelmed by robocalls and text communications.”
Attorney General Henry was joined in the comment letter by the Attorneys General of Alabama, Arizona, California, Colorado, Connecticut, Delaware, the District of Columbia, Hawaii, Illinois, Massachusetts, Maine, Maryland, Michigan, Minnesota, Mississippi, New Jersey, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, South Dakota, Tennessee, Vermont, and Washington.
A copy of the comment letter can be found HERE.
# # #