As if we needed more reasons to be freaked out by increasingly powerful digital assistants, there's a new nightmare scenario: the music you listen to or the conversations you hear on TV could hijack your digital assistant with commands undetectable to human ears.
This is known as a "Dolphin Attack" (because dolphins can hear what humans can't), and researchers have been aware of the possibility for years. The basic idea is that commands could be hidden in high-frequency sounds that our assistant-enabled gadgets can detect, but we are unable to hear.
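For a sense of how that might work in practice, here is a minimal, hypothetical sketch in Python (not the researchers' code): a spoken command is amplitude-modulated onto an ultrasonic carrier, which a microphone's nonlinear hardware can inadvertently demodulate back into an audible instruction even though human ears register nothing. The file names, sample rates, and 25 kHz carrier are illustrative assumptions, not details from the research.

# A minimal, hypothetical sketch of the "Dolphin Attack" idea: shift a spoken
# command above the range of human hearing by amplitude-modulating it onto an
# ultrasonic carrier. File names, rates, and the crude repeat-based upsampling
# are illustrative assumptions.
import numpy as np
from scipy.io import wavfile

OUT_RATE = 192_000      # high output sample rate needed to represent ultrasound
CARRIER_HZ = 25_000     # just above the ~20 kHz limit of human hearing

rate, voice = wavfile.read("command.wav")    # assumed: a mono 16 kHz recording
voice = voice.astype(np.float64)
peak = np.max(np.abs(voice))
if peak > 0:
    voice /= peak                            # normalize to [-1, 1]

# Crude upsample by sample repetition so the array can carry a 25 kHz tone.
voice_up = np.repeat(voice, OUT_RATE // rate)

t = np.arange(voice_up.size) / OUT_RATE
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)

# Classic amplitude modulation: (1 + m * baseband) * carrier. A nonlinear
# microphone front end can demodulate this back into the audible command.
modulated = (1.0 + 0.8 * voice_up) * carrier

wavfile.write("ultrasonic_command.wav", OUT_RATE, (0.5 * 32767 * modulated).astype(np.int16))

Whether a given speaker or phone actually responds depends on its microphone hardware and the assistant's wake-word pipeline, which this sketch does not model.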
Researchers proved in 2016 that they could use the technique to trigger basic commands, like making phone calls and launching websites. At the time, they hypothesized that it might be possible to embed these audio cues in music and other recordings, which would significantly amp up the creepy factor.
Now, that day has come. In a paper first reported on by The New York Times, researchers showed it is in fact possible to hide audio commands inside other recordings in a way that's nearly undetectable to human ears.
The researchers were able to do this using recordings of music and speech; in both cases, the changes were almost completely undetectable. Notably, the researchers tested this with speech recognition software, not digital assistants, but the implications of the experiment are huge.
A 4-second clip of music came out as “okay google browse to evil dot com”
In one example, they took a 4-second clip of music that, when fed to the speech recognition software, came out as "okay google browse to evil dot com." They were able to do the same with speech, hiding "okay google browse to evil dot com" inside a recording of the phrase "without the dataset the article is useless."
In both cases, it's nearly impossible for humans to detect any differences between the two clips. The paper's authors note there is some "slight distortion" in the adulterated clips, but it's extremely difficult to discern. (You can listen to them for yourself here.)
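For the technically curious, the sketch below shows, under loose assumptions, the general recipe for this kind of targeted audio adversarial example: start from a clean clip, add a small perturbation, and adjust it by gradient descent so a speech-to-text model decodes a chosen phrase while a penalty keeps the change nearly inaudible. The TinySpeechModel here is a hypothetical stand-in for a real recognizer, and the loop is a simplification for illustration, not the paper's implementation.

# A minimal, hypothetical sketch of crafting a targeted audio adversarial
# example: optimize a small perturbation so a (stand-in) speech-to-text model
# outputs a chosen transcription while the clip still sounds unchanged.
import torch
import torch.nn as nn

class TinySpeechModel(nn.Module):
    # Hypothetical stand-in for a real speech-to-text network: it maps raw
    # audio frames to per-frame character log-probabilities.
    def __init__(self, frame_len=160, n_chars=29):
        super().__init__()
        self.frame_len = frame_len
        self.net = nn.Sequential(
            nn.Linear(frame_len, 128), nn.ReLU(), nn.Linear(128, n_chars)
        )

    def forward(self, audio):                        # audio: 1-D waveform tensor
        usable = audio.numel() // self.frame_len * self.frame_len
        frames = audio[:usable].view(-1, self.frame_len)
        return self.net(frames).log_softmax(-1)      # shape (frames, n_chars)

model = TinySpeechModel()
ctc = nn.CTCLoss(blank=0)

audio = torch.randn(16_000)                 # stand-in waveform for the benign clip
target = torch.randint(1, 29, (20,))        # stand-in for the attacker's phrase as character ids

delta = torch.zeros_like(audio, requires_grad=True)   # the hidden perturbation
opt = torch.optim.Adam([delta], lr=1e-3)

for step in range(500):
    opt.zero_grad()
    log_probs = model(audio + delta)                  # decode the perturbed clip
    frames = log_probs.size(0)
    # Push the model toward the target transcription (CTC loss) while an L2
    # penalty keeps the perturbation quiet enough to go unnoticed.
    loss = ctc(log_probs.unsqueeze(1), target.unsqueeze(0),
               torch.tensor([frames]), torch.tensor([target.numel()]))
    loss = loss + 1e-2 * delta.pow(2).mean()
    loss.backward()
    opt.step()

adversarial_clip = (audio + delta).detach()  # sounds like the original, decodes as the target

A real attack of this kind would target an actual recognizer and use a more careful measure of audibility; the sketch only illustrates the optimization pattern.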
This research could have troubling implications for tech companies and the people who buy their assistant-enabled gadgets. In a world in which television commercials are already routinely triggering our smart speakers, it's not difficult to imagine pranksters or hackers using the technique to gain access to our assistants.
This is made all the more troubling by the growing trend of connecting these always-listening assistants to our home appliances and smart home gadgets. As The New York Times points out, pranksters and bad actors alike could use the technique to unlock our doors or siphon money from our bank accounts.
It's not difficult to imagine hackers using the technique to gain access to our assistants.
Tech companies, for their part, are aware of all this, and features like voice recognition are meant to combat some of the threat. Apple, Google, and Amazon told the Times their tech has built-in security features, but none of the companies provided specifics. (It's also worth pointing out that Apple's HomePod, Amazon's Echo, and the Google Home all have mute switches that prevent the speakers from listening for their "wake words," which would likely be a hacker's way in.)
It doesn't help that the latest research comes at a moment when many experts are raising questions about digital assistants. Earlier this week at Google's I/O developer conference, the company showed off a new tool, Duplex, which is able to make phone calls that sound just like an actual human.
Since the demo, many have questioned whether it's ethical for an AI to make such calls without disclosing that it's an AI. (Google says it's working on it.)
Now, we might have even more to worry about.
Topics: Alexa, Artificial Intelligence, Google Assistant, Siri, Gadgets