

I kept a few recipes from a subscription I was gifted. Honestly, replacing the missing ingredients has been more fun than cooking the boxed meals.
Well shit. That makes a lot of sense.
No no, they listen. How do you think the “Hey Google” feature works? It has to listen for the key phrase. Might as well just listen to everything else.
I spent some time with a friend and his mother and spoke in Spanish for about two hours while YouTube was playing music. I had Spanish ads for 2 weeks after that.
I saw a Copilot prompt in MS PowerPoint today - top left corner of EVERY SINGLE SLIDE - and I had a quiet fit in my cubicle. Welcome to hell.
I’ve seen publishers advertise their other titles within the box, which, honestly, isn’t an issue for me. These, however, are crossing a line.
See Alk’s comment above; I touched on medical applications there.
As for commercial uses, I see very few. These devices are so invasive, I doubt they could be approved for commercial use.
I think the future of Brain Computer Interfacing lies in Functional Near Infrared Spectroscopy (FNIRS). Basically, it uses the same infrared technology as a pulse oximeter to measure changes in blood flow in your brain. Since it uses light (instead of electricity or magnetism) to measure the brain, it’s resistant to basically all the noise endemic to EEG and MRI. It’s also 100% portable. But, the spatial resolution is pretty low.
HOWEVER, the signals have very high temporal resolution. With a strong enough machine learning algorithm, I wonder if someone could interpret the signal well enough for commercial applications. I saw this first-hand in my PhD - one of our lab techs wrote an algorithm that could read as little as 500ms of data and reasonably predict whether the participant was reading a grammatically simple or complex sentence.
It didn’t get published, sadly, due to lab politics. And, honestly, I don’t have 100% faith in the code he used. But I can’t help but wonder.
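For anyone curious what that kind of analysis can look like, here’s a minimal sketch of the general idea (not the lab tech’s actual code, which I never fully trusted anyway). The sampling rate, the window features, and the synthetic data are all assumptions for illustration; the point is just that a short window of multichannel fNIRS data can be turned into features and fed to a plain classifier.

```python
# Toy sketch: classify ~500 ms fNIRS windows as "simple" vs "complex" sentence
# reading. Everything here (100 Hz sampling, 16 channels, the fake signals) is
# made up for illustration - this is not the original algorithm.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_channels, n_samples = 200, 16, 50   # 50 samples ~ 500 ms at an assumed 100 Hz
labels = rng.integers(0, 2, n_trials)           # 0 = simple sentence, 1 = complex sentence

# Fake hemodynamic-ish data: "complex" trials get a slightly stronger response
signals = rng.normal(0.0, 1.0, (n_trials, n_channels, n_samples))
signals += labels[:, None, None] * 0.3

# Crude features per channel: window mean and early-vs-late slope
means = signals.mean(axis=2)
slopes = signals[:, :, -10:].mean(axis=2) - signals[:, :, :10].mean(axis=2)
features = np.hstack([means, slopes])

clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, features, labels, cv=5).mean())
```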
A traditional electrode array needs to be as close to the neurons as possible to collect data. So, straight through the dura and pia mater, into the parenchyma where the axons and cell bodies are hanging out. Usually, they collect local data without getting any long-distance information, which is a limiting factor for this technology.
The brain needs widespread areas working in tandem to get most complicated tasks done. An electrode is great for measuring motor activity because motor signals are pretty localized. But something like memory or language? Not really possible.
There are electrocorticographic (ECoG) devices that place electrodes over a wide area and can rest on the pia mater, on the surface of the brain. Less invasive, but you still need a craniotomy to place the device. They also have lower resolution.
The most practical medical purpose I’ve seen is as a prosthetic implant for people with brain/spinal cord damage. Battelle in Ohio developed a very successful implant and has since received DARPA funding: https://www.battelle.org/insights/newsroom/press-release-details/battelle-led-team-wins-darpa-award-to-develop-injectable-bi-directional-brain-computer-interface. I think that article over-sells the product a little bit.
The biggest obstacle to invasive brain-computer implants like this one is their longevity. Inevitably, any metal electrode implanted in the brain gets rejected by the immune system of the brain. It’s a well-studied process where a glial scar forms, neurons move away from the implant, and the overall signal of the device decreases. We need advances in biocompatibility before this really becomes revolutionary.
ETA: This device avoids putting metal in the brain and instead the device sends axons into the brain. Certainly a novel approach which runs into different issues. The new neurons need to be accepted by the brain, and they need to be kept alive by the device.
If they moved the cell bodies into the brain and had the device house the axons and dendrites (the neurons’ outputs and inputs), they could maybe let the brain keep the device alive. But that is a much more difficult installation procedure.
Fantastic question. Like Will_a said, I’ve never seen a device designed for input to the brain like this.
In this particular example, if someone were to compromise the device, even though it’s not able to “fry” their brain with direct electricity, they could overload the input neurons with a ton of stimulus. This would likely break the device because the input neurons would die, and it could possibly cause the user to have a seizure, depending on how connected the input was to the user’s brain.
That does bring to mind devices like the one developed by Battelle, where the device reads brain activity and then outputs to a sleeve or cuff designed to stimulate muscles. The goal of the device is to act as a prosthesis for people with spinal cord injuries. I imagine that device was not connected to the internet in any way, but in the worst-case scenario where a hacker compromises the device, they could cause someone’s muscles to seize up.
Agree, fascinating question. To be precise, they used genetically modified neurons (aka optogenetics) to test if the device can deliver a signal into the brain. Optogenetics incorporates neurons modified with light-sensitive channel proteins, so the neuron activates when a precise wavelength of light is “seen” by the special protein. One of the coolest methods in neuroscience, in my opinion.
“To see if the idea works in practice they installed the device in mice, using neurons genetically modified to react to light. Three weeks after implantation, they carried out a series of experiments where they trained the mice to respond whenever a light was shone on the device. The mice were able to detect when this happened, suggesting the light-sensitive neurons had merged with their native brain cells.”
Oh neat, another brain implant startup. I published in this field. If anyone has questions, I’m happy to answer.
Unironically, I had to delete this game from my phone because I wasn’t getting work done. This game slaps.
In grad school I worked with MRI data (hence the username). I had to upload ~500GB to our supercomputing cluster. Somewhere around 100,000 MRI images, and wrote 20 or so different machine learning algorithms to process them. All said and done, I ended up with about 2.5TB on the supercomputer. About 500MB ended up being useful and made it into my thesis.
Don’t stay in school, kids.
It’s been in development for a while: https://ieeexplore.ieee.org/abstract/document/1396377?casa_token=-gOCNaYaKZIAAAAA:Z0pSQkyDBjv6ITghDSt5YnbvrkA88fAfQV_ISknUF_5XURVI5N995YNaTVLUtacS7cTsOs7o
Even before the above paper, I recall efforts to connect (rat) brains to computers in the late 90s/early 2000s. https://link.springer.com/article/10.1023/A:1012407611130
It’s a bunch of neurons that speak to a computer through a microelectrode array. So they “speak to” the neurons with electric impulses, and then “listen to” what they have to say. The computer it’s connected to uses binary, but the neurons are somewhere in between analog and digital. Yes, the change in electrical potential is analog, but neurons are typically in their “on” state, recovering from their “on” state, or just chilling out.
The brain is incredible because of the network of connections between neurons that store information. It’ll be interesting to see if a small scale system like this can be used for anything larger scale.
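If it helps to picture that “speak/listen” loop, here’s a toy simulation of the three-state behavior (firing, recovering, chilling out). It’s purely illustrative - real MEA hardware and real neurons are far messier - and every number in it is an assumption.

```python
# Toy "speak to the neurons, listen to the spikes" loop. Not real MEA firmware.
import random

REST, FIRE, REFRACTORY = "rest", "fire", "refractory"

class ToyNeuron:
    def __init__(self, threshold=1.0):
        self.state = REST
        self.potential = 0.0
        self.threshold = threshold

    def step(self, stimulus):
        if self.state == FIRE:
            self.state = REFRACTORY              # recovering from the "on" state
            self.potential = 0.0
        elif self.state == REFRACTORY:
            self.state = REST                    # back to just chilling out
        else:
            self.potential += stimulus + random.gauss(0, 0.1)
            if self.potential >= self.threshold:
                self.state = FIRE                # all-or-nothing spike
        return self.state == FIRE

neurons = [ToyNeuron() for _ in range(8)]
for t in range(20):
    spikes = [n.step(stimulus=0.3) for n in neurons]          # "speak": inject stimulus
    print(t, "".join("|" if s else "." for s in spikes))      # "listen": read the spike raster
```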
Believe it or not, I studied this in school. There’s some niche applications for alternative computers like this. My favorite is the way you can use DNA to solve the traveling salesman problem (https://en.wikipedia.org/wiki/DNA_computing?wprov=sfla1)
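The flavor of that DNA trick is easy to caricature in software: “synthesize” an absurd number of random candidate routes in parallel, then filter out everything that isn’t a valid answer. The sketch below shows the Hamiltonian-path variant from Adleman’s classic demo (described in that Wikipedia article); the graph and the pool size are invented for illustration.

```python
# Software caricature of the DNA approach: generate tons of random candidate
# paths ("strands"), then filter for the ones that are actually valid.
# Toy directed graph and counts - all made up for illustration.
import random

edges = {(0, 1), (0, 3), (1, 2), (1, 4), (2, 3), (2, 4), (3, 4)}
n_nodes, start, end = 5, 0, 4

def random_path(length):
    return [random.randrange(n_nodes) for _ in range(length)]

# Step 1: "synthesize" a huge pool of random candidate paths in parallel
candidates = [random_path(n_nodes) for _ in range(200_000)]

# Steps 2-4: keep only paths with the right endpoints, real edges,
# and every node visited exactly once
def is_hamiltonian(path):
    return (path[0] == start and path[-1] == end
            and all((a, b) in edges for a, b in zip(path, path[1:]))
            and len(set(path)) == n_nodes)

print("valid paths:", {tuple(p) for p in candidates if is_hamiltonian(p)})
```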
There have been other “bioprocessors” before this one, some of which have used neurons for simple image detection, e.g. https://ieeexplore.ieee.org/abstract/document/1396377?casa_token=-gOCNaYaKZIAAAAA:Z0pSQkyDBjv6ITghDSt5YnbvrkA88fAfQV_ISknUF_5XURVI5N995YNaTVLUtacS7cTsOs7o. But this seems to be the first commercial application. Yes, it’ll use less energy, but the applications will probably be just as niche. Artificial neural networks can do most of the important parts (like “learn” and “remember”) and are less finicky to work with.
Thanks for the recommendation, I was worried they would be missing some of my artists but they had 99% of my music. Can’t wait to ditch Spotify.
ETA: dear lord the sound quality is so much better. I had no idea what I was missing.
My favorite AI fact is from cancer research. The New Yorker has a great article about how an algorithm used to identify and price out pastries at a Japanese bakery found surprising success as a cancer detector. https://www.newyorker.com/tech/annals-of-technology/the-pastry-ai-that-learned-to-fight-cancer