Meta's researchers read minds with AI
Researchers came up with a model that reconstructs what you're seeing from brain activity in real time.
Hello Surfers!
Reading minds, every confused teenage boy's dream. Remember those times when you'd try to decode every gesture, every hair twirl, and that deep gaze to figure out, "Did she actually like me or was she just having a good hair day?"
Well, science might soon be able to peek into your mind and tell you whether you should go for that goodnight kiss or not. But for now, it remains the age-old art of "just going for it".
Here's your two minutes of AI news for the day:
ONE PIECE OF NEWS
Meta's researchers came up with a model that reconstructs what you're seeing from brain activity in real time.
You know how sometimes you just wish you could, like, beam your thoughts into someone's brain? Like, not the "I wish I had pizza right now" kind, but the full 3D, crispy-crusted, slightly burnt cheese sensation? Well, science is saying, "Hold my lab coat."
On one hand, you've got Elon Musk wanting to go full sci-fi with Neuralink, which is basically saying, "Let's drill a hole in your skull, plug an AUX cable into your brain and see what tunes play." Yeah, icky.
But what if we didn't have to go all Frankenstein on this? Some scientists are using fMRI to peek into our brain's secret slideshow. But here's the teeny tiny hiccup: fMRI scans are slower than a snail on ketamine, taking two whole seconds for a single image. It's like trying to watch Netflix with your grandma's dial-up internet. Not happening.
Enter Team Meta. These guys used data from a brain monitoring technique called magnetoencephalography (MEG). Think of it like tapping into the brain's live radio station. It's less crisp than MRI, but it's near-instantaneous.
They hooked it up with a self-supervised image model, DINOv2, that trains on images without human annotation. Kind of like babies observing the world and making sense of it without help.
The research focused on feeding the AI brainwaves and seeing if this digital infant could draw what you were looking at.
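If you're wondering what "hooking brainwaves up to DINOv2" might look like in practice, here's a minimal sketch of the general recipe, not Meta's actual code: train a small network to map MEG windows into the same embedding space DINOv2 uses for images. All shapes, layers, and the loss below are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not Meta's actual pipeline):
# learn to map MEG sensor windows into DINOv2's image-embedding space.
import torch
import torch.nn as nn

class MEGEncoder(nn.Module):
    """Maps a window of MEG sensor data to an image-embedding vector."""
    def __init__(self, n_sensors=270, embed_dim=768):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_sensors, 256, kernel_size=5, padding=2),
            nn.GELU(),
            nn.AdaptiveAvgPool1d(1),    # collapse the time axis
            nn.Flatten(),
            nn.Linear(256, embed_dim),  # project into the image space
        )

    def forward(self, meg):            # meg: (batch, sensors, time)
        return self.net(meg)

# One hypothetical training step: pull each MEG embedding toward the
# DINOv2 embedding of the image the person was seeing at that moment.
encoder = MEGEncoder()
meg = torch.randn(8, 270, 180)         # fake batch of MEG windows
image_emb = torch.randn(8, 768)        # stand-in DINOv2 features
pred = encoder(meg)
loss = 1 - nn.functional.cosine_similarity(pred, image_emb).mean()
loss.backward()                        # then step an optimizer
```

Once the two spaces line up, an image generator conditioned on those embeddings can doodle out what the brain was (probably) seeing.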
And here's where I nearly spit out my coffee: it actually did it! Okay, it's far from perfect, but it's kind of magical that an algorithm is now doodling based on our brainwaves in real time.
And the most incredible thing? It suggests that self-supervised AI models learn a lot like we humans do: the artificial neurons in the algorithm tend to fire much like the neurons in our brain. They essentially have a virtual brain.
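For the curious: the standard way researchers back up a "neurons fire alike" claim is a brain score, fitting a simple linear map from the model's features to the recorded brain signal and checking how well it predicts held-out responses. Here's a toy version with fake data (real studies would use actual MEG recordings and DINOv2 features):

```python
# Toy "brain score" sketch on synthetic data: can a linear map from
# model features predict brain responses it hasn't seen?
import numpy as np
from sklearn.linear_model import RidgeCV

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 768))   # pretend DINOv2 features, 500 images
weights = rng.normal(size=(768,))
brain = features @ weights + rng.normal(size=500)  # pretend brain signal

# Fit on 400 images, score on the 100 held out.
ridge = RidgeCV(alphas=[0.1, 1.0, 10.0])
ridge.fit(features[:400], brain[:400])
pred = ridge.predict(features[400:])
score = np.corrcoef(pred, brain[400:])[0, 1]
print(f"brain score (Pearson r on held-out images): {score:.2f}")
```

A high correlation means the model's internal activity carries similar information to the brain's; it doesn't literally prove the two compute the same way.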
We're not at Hollywood-quality mind movies... yet, and much more research needs to be done. But, seriously, the idea that we're on the brink of real-time mind reading? That's the kind of sci-fi stuff that makes you wonder: where will humanity be in 20 years?
ONE MORE THING
Midjourney is now spitting out pictures at 4096 × 4096 px.
Breaking: Midjourney just dropped a new update!
You can now use upscalers to increase the resolution of your image by up to 4x.
How to do it?
Just press the upscale buttons to enlarge your images.
They can go up to a resolution of 4096 × 4096 px!
Chase Lean (@chaseleantj) · 3:33 AM · Oct 19, 2023
If you have one more minute:
Foxconn and Nvidia team up to build 'AI factories'
Will Artificial Intelligence Replace Architects?
How Marketing Executives Are Thinking About Integrating AI Into Their Strategies
Meta's Yann LeCun argues against premature AI regulation
AI Art of the day 🎨
Lego Jesus created by ChatGPT and DALL-E 3, shared by user u/rutan668.
DALL-E is pretty good at LEGO.
That's it, folks!
If you liked it, please share this hand-crafted newsletter with a friend and make this writer happy!