Deepfake of principal's voice is the latest case of AI being used for harm
The most recent criminal case involving artificial intelligence emerged last week from a Maryland high school, where police say a principal was framed as racist by a fake recording of his voice.
The case is yet another reason why everyone — not just politicians and celebrities — should be concerned about this increasingly powerful deepfake technology, experts say.
“Everybody is vulnerable to attack, and anyone can do the attacking,” said Hany Farid, a professor at the University of California, Berkeley, who focuses on digital forensics and misinformation.
Here’s what to know about some of the latest uses of AI to cause harm:
AI HAS BECOME VERY ACCESSIBLE
Manipulating recorded sounds and images isn’t new. But the ease with which someone can alter information is a recent phenomenon. So is the ability for it to spread quickly on social media.
The fake audio clip that impersonated the principal is an example of a subset of artificial intelligence known as generative AI. It can create hyper-realistic new images, videos and audio clips. It has become cheaper and easier to use in recent years, lowering the barrier for anyone with an internet connection.