As if we needed more reasons to be freaked out by increasingly powerful digital assistants, there's a new nightmare scenario: The music you listen to or conversations you hear on TV could hijack your digital assistant with commands undetectable to human ears.
This is known as a "Dolphin Attack" (because dolphins can hear what humans can't), and researchers have been aware of the possibility for years. The basic idea is that commands could be hidden in high-frequency sounds that our assistant-enabled gadgets can detect, but we are unable to hear.
Researchers proved in 2016 that they could use the technique to trigger basic commands, like making phone calls and launching websites. At the time, they hypothesized that it might be possible to embed these audio cues into music and other recordings, which would significantly amp up the creepy factor.
Now, that day has come. In a paper first reported on by The New York Times, researchers proved it is in fact possible to hide audio inside of other recordings in a way that's nearly undetectable to human ears.
The researchers were able to do this using recordings of music and speech; in both cases, the changes were almost completely undetectable. Notably, the researchers tested this with speech recognition software, not digital assistants, but the implications of the experiment are huge.
In one example, they took a 4-second clip of music which, when fed to the speech recognition software, came out as "okay google browse to evil dot com." They were able to do the same with speech, hiding "okay google browse to evil dot com" inside a recording of the phrase "without the dataset the article is useless."
In both cases, it's nearly impossible for humans to detect any differences between the two clips. The paper's authors note there is some "slight distortion" in the adulterated clips, but it's extremely difficult to discern. (You can listen to them for yourself here.)
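To give a very rough sense of how this kind of attack works (this is a hypothetical sketch, not the researchers' actual code), an attacker with access to a differentiable speech recognition model could search for a tiny tweak to the audio that pushes the model toward a chosen target phrase while staying too quiet for a listener to notice:

import torch

def embed_hidden_command(waveform, transcription_loss, steps=500, lr=1e-3, max_delta=0.005):
    # waveform: 1-D float tensor of audio samples in [-1, 1].
    # transcription_loss: hypothetical callable that scores audio against the
    # attacker's target phrase using a differentiable speech recognition model.
    delta = torch.zeros_like(waveform, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = transcription_loss(waveform + delta)  # lower loss = closer to the target phrase
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            # Keep the perturbation tiny so the altered clip sounds essentially unchanged.
            delta.clamp_(-max_delta, max_delta)
    return (waveform + delta).detach()

The method described in the actual paper is more sophisticated than this toy version, but the core idea is the same: small, targeted changes spread across the audio, rather than anything a human would recognize as a spoken command.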
This research could have troubling implications for tech companies and the people who buy their assistant-enabled gadgets. In a world in which television commercials are already routinely triggering our smart speakers, it's not difficult to imagine pranksters or hackers using the technique to gain access to our assistants.
This is made all the more troubling by the growing trend of connecting these always-listening assistants to our home appliances and smart home gadgets. As The New York Times points out, pranksters and bad actors alike could use the technique to unlock our doors or siphon money from our bank accounts.
Tech companies, for their part, are aware of all this, and features like voice recognition are meant to combat some of the threat. Apple, Google, and Amazon told the Times their tech has built-in security features, but none of the companies provided specifics. (It's also worth pointing out that Apple's HomePod, Amazon's Echo, and the Google Home all have mute switches that prevent the speakers from listening for their "wake words," which would likely be a hacker's way in.)
It doesn't help that the latest research comes at a moment when many experts are raising questions about digital assistants. Earlier this week at Google's I/O developer conference, the company showed off a new tool, Duplex, which is able to make phone calls that sound just like an actual human.
Since the demo, many have questioned whether it's ethical for an AI to make such calls without disclosing that it's an AI. (Google says it's working on it.)
Now, we might have even more to worry about.