But in recent years, the Colorado doctor turned to consumer headbands, commonly sold online to monitor sleep patterns or boost brain function, to capture the brain activity of some patients suffering seizures.
Cheaper and easier to use, the headbands - which can cost just a few hundred dollars - capture much the same electronic data as state-of-the-art hospital machines, only with far less fuss.
"In the beginning I was thrilled, I thought: 'patients can even do all this themselves, at home,'" he told the Thomson Reuters Foundation.
"But then I thought: 'wait a second, that means all their brain data is going to some private company.'"
Advances in brain science have made it easier to capture detailed streams of data from the human brain and interpret their meaning. Some recent experiments have also shown the possibility of manipulating thoughts through neurological intervention.
By processing brain scans with artificial intelligence (AI) systems, researchers at the University of Texas were even able to accurately predict what words were running through a participant's head.
These kinds of advances have led to big breakthroughs, letting some paralyzed patients communicate via brainwaves or helping rewire dormant neural pathways after spinal injuries.
"I am confident that in the next couple of years there will be many devices that can read your thoughts," said Pauzauskie, who is excited to gain this new insight into how minds work.
But he also fears a potential for abuse.
So Pauzauskie recently joined a coalition of lawmakers and scientists pushing Colorado to become the first U.S. state to enshrine privacy guarantees for brain data.
Legislation passed the Colorado House of Representatives last month, on a vote of 61-1, and now goes before the state Senate.
It is part of a trickle of bills under consideration countrywide, united by the common aim of ensuring that what goes on in a brain belongs to its owner - and can stay private.
Lawmakers in Minnesota introduced their own bill in March.
California - which often sets the pace for privacy rules - is also preparing a law that could be introduced within weeks.
Experts say the push for 'neural rights' is a rare attempt to lock consumer protection into an emerging technology before its mass adoption.
Some gadgets that capture brain data - with the potential to resell or share that information - are already on sale, and a number of start-ups plan to roll out devices soon.
"We want more guidance from lawmakers," said Adam Molner, co-founder of Neurable, a company developing headphones that can monitor brainwaves.
With its first product due to hit the market this year, Neurable says it does not plan to sell raw brain data and will expressly ask users for consent before collecting it.
Major companies have not yet rolled out consumer devices that interact with the brain, though many firms - including Snapchat and Meta - are working on the technology.
Last year, Apple filed a patent application for AirPods that could monitor electrical activity in the brain.
"We have a real patchwork of laws when it comes to this data," said Sara Pullen Guercio, a lawyer with the privacy group at Alston & Bird who is tracking U.S. neural rights legislation.
While neural data gathered in a medical setting is covered by health data protection laws, she said that same data gathered for commercial purposes was much more loosely regulated.
"What we don't want is a Wild West for neural data," she said.
MISSING LINK
The United States has no comprehensive federal privacy law, and while 15 states have passed their own versions of legislation, none directly addresses neural data, said Jared Genser, a lawyer and co-founder of the Neurorights Foundation.
The Neurorights Foundation backed the Colorado bill, which would insert neural data into the state's existing privacy law under the category of "sensitive data".
Companies would then need consent before collecting neural data, and would have to give customers options to limit what can be done with it as well as the right to delete it.
"Neural data is really missing from these existing laws in a large part because people were not thinking of it when they were drafting these laws," Genser said.
The foundation wants to enshrine rights for the brain around the world; last year, Chile's Supreme Court issued the first-ever ruling demanding that illegally collected brain data be deleted.
MIND-READING BILLS
Pauzauskie, the Colorado neurologist, approached his state representative, Cathy Kipp, at a recent fundraiser to share his concerns about unregulated neural data.
"The fact that a company could capture your brainwaves today, and then resell them and then those brain waves could be used for something totally different in five years - that's a problem," she told the Thomson Reuters Foundation.
As the bill moves through the legislature, there's been back and forth with tech industry groups about how to define neural data and the devices that collect it, Kipp said.
She wants to make sure the language remains broad enough to capture future developments in what is a quickly evolving field: neural data that is recorded by a headset today could be gathered by a wristband tomorrow, she reasoned.
David Stauss, a Colorado lawyer who focuses on privacy law, said the Colorado bill is part of a national pattern: states are increasingly tweaking privacy laws to cover sensitive data.
For example, Oregon and Delaware recently amended their bills to add transgender and non-binary status to their definitions of sensitive data.
ENFORCEMENT
In Minnesota, which does not have a state-level privacy law, lawmakers are taking a different tack.
They are proposing a law to enshrine "cognitive liberty" and impose penalties for accessing brain data or using neural technology to interfere with thoughts without consent.
"Far too often we wait for bad things to happen, and we respond to a crisis after people have been harmed," said Walter Hudson, a member of the Minnesota state house pushing the bill.
In California, State Senator Josh Becker is preparing a draft neurorights bill to be unveiled this month.
"I am focused on making sure that people own their own neural data," said Becker, who previously spearheaded state legislation to rein in databrokers.
It's still not clear exactly how this patchwork of laws will impact industry, said Rachel Marmor, a privacy law expert at the firm Holland & Knight.
"Until there’s a bunch of enforcement actions it's going to be hard to move the needle on how businesses approach this," she explained.
But with any privacy law, there's a risk that compliance comes down to checking countless boxes on a myriad of forms and clicking through long, complex privacy policies, she warned.
"Forcing people to read thousands of privacy disclosures a year is not going to be workable here," she said.