
A new generation of wearable technology could be silently listening to your brain—and it may be telling others what it hears.
Marketed as wellness gadgets, a surge of consumer neurotechnology devices promises to improve meditation, induce lucid dreams, or even enhance your swiping experience on dating apps. But behind the sleek headbands and dopamine-laced slogans lies a troubling reality: they’re also collecting vast amounts of sensitive neural data—sometimes without clear user consent.
This month, three U.S. Senators—Chuck Schumer, Maria Cantwell, and Ed Markey—sent a pointed letter to the Federal Trade Commission, calling for an investigation into how neurotech companies collect, handle, and sell brain data. The trio wants regulators to impose tighter restrictions on these companies, which operate in what they describe as a dangerous “gray area” of privacy law.
“Neural data is the most private, personal, and powerful information we have—and no company should be allowed to harvest it without transparency, ironclad consent, and strict guardrails,” Schumer told The Verge. “Yet companies are collecting it with vague policies and zero transparency.”
A Legal Loophole for Brain Data
Not all neurotechnologies are created equal in the eyes of the law. Medical devices—like Elon Musk’s Neuralink brain implant—are subject to clinical oversight, and the health data they generate is protected under strict rules such as HIPAA, the Health Insurance Portability and Accountability Act. But wellness products, which don’t require a prescription or clinical oversight, escape those same rules.
These devices are designed to be as accessible as a smartwatch. You can buy them online, have them delivered to your door, and begin tracking your brain’s activity in minutes. Many claim to improve focus, reduce stress, or optimize productivity. But according to the Senators, this ease of access comes at the cost of meaningful oversight.
“Unlike other personal data, neural data — captured directly from the human brain — can reveal mental health conditions, emotional states, and cognitive patterns, even when anonymized,” the lawmakers wrote. “This information is not only deeply personal; it is also strategically sensitive.”
The letter cites a damning 2024 report from the Neurorights Foundation, which reviewed 30 brain-computer interface (BCI) companies whose products are sold directly to consumers. The results: 29 out of 30 collected user data with virtually no restrictions. Most offered minimal options to revoke consent, and fewer than half allowed users to delete their data.
The Uncharted Mind
Stephen Damianos, executive director of the Neurorights Foundation, likens brain data collection to a search of your home—but with no clear sense of what’s inside.
“The analogy I like to give is, if you were going through my apartment, I would know what you would and wouldn’t find… But brain scans are overbroad… It’s extremely hard — if not impossible — to communicate to a consumer or a patient exactly what can today and in the future be decoded from their neural data.”
He adds that the boundary between medical and wellness neurotech is alarmingly blurry. A headset may not be certified to treat depression, but it may still claim to “help with mood” or “optimize emotional balance”—a phrasing that can mislead consumers into assuming medical-grade oversight.
Currently, very few regulations apply to these devices. Only two U.S. states—Colorado and California—have passed laws specifically protecting neural data. Colorado’s 2024 legislation expanded its definition of “sensitive data” to include neural and biological information. California soon followed, amending its Consumer Privacy Act to cover brain data.
But these state-level protections are just a patchwork, say lawmakers, and the stakes are too high to leave this frontier unguarded.
In their letter to the FTC, the Senators urge the commission to:
- Investigate whether neurotech companies are violating consumer protection laws
- Require reporting on how companies use and share neural data
- Apply children’s privacy laws to brain-computer interfaces
- Launch a rulemaking process to set clear standards for data use, storage, and consent
Perhaps most strikingly, they also call for limits on secondary uses of brain data—such as training artificial intelligence systems or creating behavior-based advertising profiles.
“We Want to Get This Moment Right”
The Senators’ letter does not suggest banning neurotechnology outright, nor does it dismiss the field’s promise. On the contrary, the concern is rooted in the belief that neurotech could reshape what it means to be human—for better or worse.
“We believe in the transformative potential of these technologies… We want to get this moment right,” said Damianos. “Enormous risks come from that, but we also believe in leveraging the potential to improve people’s lives.”
But right now, those risks loom larger than the safeguards.
As more companies race to monetize our thoughts, the question becomes urgent: who gets to listen to your brain—and what are they allowed to do with what they hear?