Riding the Wave

๐Ÿ„ Europeโ€™s new โ‚ฌ2 billion AI unicorn

Mistral just became a €2 billion company; ChatGPT gets lazier during holiday season

Hello Surfers 🏄!

It's crazy, but it seems like ChatGPT has learned to be lazier around the holiday season. So if AI ever takes over, we'll know when to start our rebellion.


Here's what's happening in AI today:

THE NEWS

🦄 Europe's new €2 billion AI unicorn

In Silicon Valley, big tech companies are pouring millions of dollars into AI development, and one of them stands out: Mark Zuckerberg's Meta. While others guard their models closely, Meta is investing heavily in open-source AI. (Who would have thought?)

Don't get me wrong, they have their own selfish reasons for doing so, but their support is vital to the open-source movement.

Remember when Meta released Llama 2 back in July? It came in three sizes: 7, 13, and 70 billion parameters. The biggest brought us close to the capabilities of GPT-3.5, while the smaller ones made running LLMs on a local computer possible.

Enter Mistral: since then, there's been a race to tweak and improve these models, and that's where Mistral, a French startup, made headlines. In September, they launched a compact model with only 7 billion parameters, yet it outperformed Llama 2's 13-billion-parameter version on every benchmark.

The cool part? A 7B model can run locally, even on a MacBook Air, no internet required.

Two days ago, Mistral released their latest model, Mixtral 8x7B. It outperforms both Llama 2 70B and GPT-3.5 in most benchmarks, with only 46.7B parameters.

  • It boasts a 32k-token context window, enough to recall about 24,000 words, or roughly 48 pages.

  • It handles English, French, Italian, German and Spanish.

  • It's good at coding.

But the truly crazy feature is its inference efficiency: it performs at the level of a 70B model while requiring only the compute of a 14B model. That makes it possible to run an open-source model that outperforms GPT-3.5 on a local computer, just a year after GPT-3.5's release.

Very impressive!
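That 46.7B-total / ~14B-active split comes from Mixtral's sparse mixture-of-experts design: each layer holds eight feed-forward "experts," and a small router sends every token through only the two best-scoring ones. Here's a minimal sketch of that routing idea; the dimensions and random weights are toy placeholders, purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, E, TOP_K = 16, 32, 8, 2  # model dim, expert hidden dim, experts, active experts

# Eight toy feed-forward experts (random weights, illustrative only).
experts = [(rng.standard_normal((D, H)) * 0.1, rng.standard_normal((H, D)) * 0.1)
           for _ in range(E)]
router = rng.standard_normal((D, E)) * 0.1  # gating network

def moe_layer(x):
    """Send token vector x through its top-2 experts and mix the outputs."""
    logits = x @ router                # one routing score per expert
    top = np.argsort(logits)[-TOP_K:]  # pick the two highest-scoring experts
    w = np.exp(logits[top])
    w /= w.sum()                       # softmax over the chosen experts only
    out = np.zeros(D)
    for weight, i in zip(w, top):
        w1, w2 = experts[i]
        out += weight * (np.maximum(x @ w1, 0.0) @ w2)  # ReLU feed-forward expert
    return out

y = moe_layer(rng.standard_normal(D))
# All eight experts' weights sit in memory (the 46.7B side of the story),
# but each token only runs through two of them (the ~14B-of-compute side).
```

So you pay for the full parameter count in memory, but only a fraction of it in compute per token, which is exactly why the model can feel like a 14B model at inference time.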

Can a €2 billion unicorn stay open-source?

In just six months, Mistral's value has skyrocketed, with a recent funding round bringing in โ‚ฌ385 million, making them Europe's most valuable AI startup.

BUT: Mistral's main business is offering their models through their platform, and the open-source nature of their work means competitors can quickly catch up.

Case in point: within 24 hours of releasing Mixtral 8x7B, a competing AI platform, together.ai, offered it for 70% cheaper.

It remains to be seen if Mistral can maintain their business model and focus on open-source models, or if they'll shift to closed-source models like OpenAI did.

I'm cheering for them; Europe could definitely use a bit of innovation.

🤖 ChatGPT gets lazier during holiday season

LLMs are weird. Some users on X tested prompting GPT-4 Turbo with a May system date versus a December system date, and found that GPT-4 writes code that is, on average, about 200 characters shorter when it thinks it's December.

This suggests GPT-4 might perform worse in December because it "learned" to do less work over the holidays.
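The experiment is easy to reproduce in outline: pin a fake date in the system prompt, sample a batch of completions per date, and compare average reply lengths. A rough harness might look like this, where `complete(system_msg, user_msg)` is a placeholder for whatever chat-completion call you actually use (the function name and setup here are my assumptions, not any specific API):

```python
import statistics

def avg_length_by_date(complete, prompt, dates, n_trials=20):
    """For each spoofed system date, sample completions and average their lengths.

    `complete(system_msg, user_msg)` stands in for a real chat-completion
    call; it just needs to return the model's reply as a string.
    """
    averages = {}
    for date in dates:
        system_msg = f"You are a helpful assistant. The current date is {date}."
        lengths = [len(complete(system_msg, prompt)) for _ in range(n_trials)]
        averages[date] = statistics.mean(lengths)
    return averages

# Dry run with a stub "model" that pads its replies based on the claimed month:
def stub_model(system_msg, user_msg):
    return "x" * (300 if "May" in system_msg else 100)

result = avg_length_by_date(stub_model, "Write a sorting function.",
                            ["May 15, 2023", "December 15, 2023"], n_trials=5)
```

With a real model you would want far more than a handful of samples per date, since completion lengths are noisy even at fixed settings.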

Seems like LLMs are ready for Christmas as well. I wonder what happens if they know it's snowing outside.

ONE MORE THING

You can scan your hand-written journals with ChatGPT

Here's a great tweet about how you can scan your hand-written notes with ChatGPT. It's surprisingly good at predicting hard-to-read words from context. If you click through to the tweet, you can read a step-by-step guide.

⌚ If You Have One More Minute

  • 🔍 FTC wants Microsoft's relationship with OpenAI under the microscope

  • 📝 Google's new AI note-taking app just got upgraded with Gemini!

  • 🔊 You can try Meta's new model for audio generation

  • 🔬 Runway is starting a long-term research effort around general world models

AI Art of the Day 🎨

Bringing emojis to life. Made with Magnific AI by @doganuraldesign. Here's his tweet on how to do it yourself.

🏄🌊🏄🌊🏄🌊🏄🌊🏄🌊🏄🌊🏄🌊🏄🌊🏄🌊🏄🌊🏄🌊🏄

That's all for today, folks!

If you enjoyed this, please consider sharing this hand-crafted newsletter with a friend.