πŸ„Meta's researchers read mind with AI

Researchers came up with a model that decodes what you're seeing from brain activity in real time.

Hello SurfersπŸ„! 

Reading minds, every confused teenage boy’s dream. Remember those times when you'd try to decode every gesture, every hair twirl, and that deep gaze to figure out, "Did she actually like me or was she just having a good hair day?"

Well, science might soon be able to peek into your mind and tell you whether you should go for that goodnight kiss or not. But for now, it remains the age-old art of β€œjust going for it”.

Here’s your two minutes of AI news for the day:

ONE PIECE OF NEWS

πŸ“ˆMeta’s researchers came up with a model that decodes what you're seeing from brain activity in real time.

You know how sometimes you just wish you could, like, beam your thoughts into someone's brain? Like, not the "I wish I had pizza right now" kind, but the full 3D, crispy-crusted, slightly burnt cheese sensation? Well, science is saying, "Hold my lab coat."

On one hand, you've got Elon Musk wanting to go full sci-fi with Neuralink, which is basically saying, β€œLet's drill a hole in your skull, plug an AUX cable into your brain and see what tunes play.” Yeah, icky.

But what if we didn’t have to go all Frankenstein on this? Some scientists are using fMRI to peek into our brain's secret slideshow. But here's the teeny tiny hiccup: fMRI scans are slower than a snail on ketamine, taking two whole seconds for a single image. It's like trying to watch Netflix with your grandma's dial-up internet. Not happening.

Enter Team Meta. These guys used data from a brain monitoring technique called magnetoencephalography (MEG). Think of it like tapping into the brain's live radio station. It’s less crisp than fMRI, but it’s practically instantaneous.

They hooked it up with a self-supervised image model, DINOv2, that trains on images without human annotation. Kind of like babies observing the world and making sense of it without help.

The research focused on feeding the AI brainwaves and seeing if this digital infant could draw what you're looking at.

And here's where I nearly spit out my coffee: It actually did it! Okay, it’s far from perfect, but it’s kind of magical that an algorithm is now doodling based on our brainwaves in real time.

And the most incredible thing is that this suggests self-supervised models learn similarly to us humans: neurons in the algorithm tend to fire much like the neurons in our brain. They essentially have a virtual brain.

We're not at Hollywood-quality mind movies yet, and much more research needs to be done. But seriously, the idea that we're on the brink of real-time mind reading? That's the kind of sci-fi stuff that makes you wonder: where will humanity be in 20 years?

ONE MORE THING

Midjourney is now spitting out pictures in 4096 × 4096 px.

⌚ If you have one more minute:

  • Foxconn and Nvidia team up to build 'AI factories'

  • Will Artificial Intelligence Replace Architects?

  • How Marketing Executives Are Thinking About Integrating AI Into Their Strategies

  • Meta’s Yann LeCun argues against premature AI regulation

AI Art of the day 🎨

Lego Jesus created by ChatGPT and DALL-E 3, shared by user u/rutan668.

DALL-E is pretty good at LEGO.

πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„πŸŒŠπŸ„

That’s it folks!

If you liked it, please share this hand-crafted newsletter with a friend and make this writer happy!