Size matters
Tiny AI models are moving to your phone, and ChatGPT now reads PDFs
Hey there, Surfers🏄!
Size does matter when it comes to AI, but while there's a fierce race to build the biggest models, a parallel race is unfolding to fit an AI on your phone.
Here's your two minutes of AI news for the day:
ONE PIECE OF NEWS
Size matters: Small Models Making Big Waves
Right now, in the world of AI, size equals performance. Feed it enough data and an LLM can answer almost anything. But there's a catch: training these big models takes a lot of GPU power, and that's costly. Plus, there's a limit to how many GPUs exist. In simple terms? The more money you have, the bigger the model you can train and run.
And let's be real: most of the time, these huge models are overkill for what we need. Using GPT-4 to correct your tweet's grammar is like using a bazooka to swat a fly.
Another problem? These big models live in the cloud, which means they can be slow, and your data isn't always private.
The rise of small models
Small models are crafted by refining larger ones, carving away the fat from the likes of Meta's LLaMA. What's the advantage, you ask? Oh, just monumental cost savings, sacrificing only a tiny bit of accuracy.
Instead of shelling out hundreds of millions, these models cost far less to train. Some still cost millions, but others, like Vicuna-13B, cost as little as $300. And the smaller the model, the cheaper it is to run. Plus, these models are small enough to run locally, keeping company data private and secure. And they're good, too: Vicuna reaches roughly 90% of ChatGPT's (GPT-3.5) quality.
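Curious how "carving away the fat" works? One common recipe is knowledge distillation: a small student model learns to imitate a big teacher's output distribution. (It's not necessarily Vicuna's exact recipe, which fine-tuned LLaMA on shared ChatGPT conversations.) Here's a minimal PyTorch sketch of the core loss, with the model wiring left to you:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Train a small student to match the big teacher's softened outputs."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between teacher and student distributions, scaled by T^2
    # as in Hinton et al., "Distilling the Knowledge in a Neural Network".
    return F.kl_div(student_log_probs, soft_targets,
                    reduction="batchmean") * temperature ** 2

# Usage sketch: run both models on the same tokens, backprop only the student.
# with torch.no_grad():
#     teacher_logits = teacher(input_ids).logits
# student_logits = student(input_ids).logits
# loss = distillation_loss(student_logits, teacher_logits)
```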
Specialized models
There's even more potential for tiny models trained for specific jobs to beat big general-purpose ones. Take Google's PaLM, for example: half of its training data was just social media banter.
Google PaLM's training data
When you cut through that noise and focus the training, you get gems like Med-PaLM, a model fine-tuned for medical scenarios that even passed the US Medical Licensing Examination.
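Med-PaLM itself is closed, but the general shape of domain fine-tuning is easy to see with the Hugging Face transformers library. A hedged sketch: the model name and data file below are hypothetical placeholders, not Med-PaLM's actual recipe.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# "my-org/tiny-llm" and "medical_qa.jsonl" are hypothetical placeholders.
tokenizer = AutoTokenizer.from_pretrained("my-org/tiny-llm")
model = AutoModelForCausalLM.from_pretrained("my-org/tiny-llm")

# Domain-specific text, e.g. medical Q&A pairs, one JSON object per line.
dataset = load_dataset("json", data_files="medical_qa.jsonl")["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True,
                      remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tiny-llm-medical", num_train_epochs=1),
    train_dataset=dataset,
    # mlm=False makes the collator copy inputs to labels for causal LM training.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```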
AI goes mobile
Talking to GPT-4 in conversation mode is an eerily human experience, but the delay before it replies breaks the illusion. And yes, ChatGPT's stalling tactics like "Hmm... good question" are cute the first time but get old fast.
Here's what's exciting: AI is coming to our phones. Qualcomm's new smartphone chips, coming to high-end Androids early next year, can run models as big as 10 billion parameters. That means super-fast responses and generating images with Stable Diffusion in half a second, right on your device.
Qualcomm's idea? Phones handle the easy tasks, and for harder questions, the cloud helps out. This could make interacting with AI seamless and enable a quick-witted, super-smart Alexa or Siri.
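What might that hybrid setup look like in practice? Here's a toy sketch of the routing logic; both helper functions are hypothetical stand-ins, not Qualcomm's actual API:

```python
def run_local_model(prompt: str) -> tuple[str, float]:
    """Stand-in for an on-device model (say, a quantized ~10B-parameter LLM
    on the phone's NPU); returns (answer, confidence)."""
    return "a quick on-device answer", 0.9

def call_cloud_api(prompt: str) -> str:
    """Stand-in for a cloud LLM call: slower, but more capable."""
    return "a slower, smarter cloud answer"

def answer(prompt: str, threshold: float = 0.8) -> str:
    """Answer on-device when the small model is confident; otherwise escalate."""
    text, confidence = run_local_model(prompt)  # fast and private
    if confidence >= threshold:
        return text
    return call_cloud_api(prompt)               # fall back for hard questions

print(answer("What's the capital of France?"))
```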
So, in the ongoing AI saga, it's not just about size; it's how you use it. And as we move forward, it seems like smaller models might be the future for most users.
ONE MORE THING
ChatGPT gets a killer new feature: it now reads PDFs.
It seems to be rolling out in waves. I haven't received access to the feature yet, but I'll report back once I've had a chance to play with it.
GPT-4 now works with PDFs and other documents, and can choose the best model on its own.
Bryan McAnulty (@BryanMcAnulty)
12:52 AM • Oct 29, 2023
If you have one more minute
The Mobile Revolution vs. The AI Revolution
Google to invest $2 billion in AI startup Anthropic
Pigeons solve problems the same way AI does, study says
Now add a walrus: Prompt engineering in DALL-E 3
Israeli tech workers use AI to search for hostages held in Gaza
How AI can help to save endangered species
AI Art of the day 🎨
Washington in tactical gear. Created by DALL-E 3.
That's it, folks!
If you liked it, please share this hand-crafted newsletter with a friend and make this writer happy!