“Hey Siri, are you an AI now?”
Discover Cutting-Edge AI: Learn Faster with Our YouTube Summarizer and Stay Ahead with Apple's Newest Innovation.

Hello, innovators and info-seekers!
As we surge into another week packed with tech revelations, it feels like we’re living inside a circuit board, with AI developments firing up on all ends. Apple has rolled out its very own AI model—prompting us all to wonder, "Is it too late, or just in time?" We’re also sharing a brand-new GPT that summarizes YouTube videos, making it easier for you to actually learn on YouTube.
This week’s rundown zooms from Elon Musk’s AI ventures soaring to new heights to ChatGPT not just conversing but actually remembering—creepy or cool? You decide. Plus, ever curious about the nuts and bolts behind these AI conversations? Our AI-Classroom section will dissect how ChatGPT works, turning complex tech into easy explanations. Let's plug in!
Table of contents 📚
Apple has finally released an AI model, but is it too late? 🤔
GPT: Summarize YouTube videos with ease ✍️
The Weekly Rundown:
Elon Musk’s xAI Sparks a Funding Frenzy with a $6 Billion Boost 💰
ChatGPT remembers 🧠
AI-Classroom: How does ChatGPT work exactly?
Apple has finally released an AI model, but is it too late?
Apple Unleashes OpenELM: A Local Powerhouse in AI
Apple's latest marvel, OpenELM, is redefining the AI landscape with a peppy lineup of open-source large language models (LLMs) that are making a splash right on your device. Wave goodbye to cloud dependency; these models, ranging from a sprightly 270 million to a brawny 3 billion parameters, are designed to operate where it counts—locally.
Don’t let their size fool you. Although far smaller than giants like Meta's 70-billion-parameter Llama 3 or GPT-3's immense 175 billion, OpenELM models are all about efficiency and smarts. Their secret sauce? A clever "layer-wise scaling strategy" that optimizes how parameters are distributed across each layer, enhancing both accuracy and computational thrift. The proof is in the pudding: a nifty 2.36% accuracy improvement over Allen AI’s OLMo 1B, achieved with just half the usual pre-training tokens.
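To make that "layer-wise scaling" idea a bit more concrete, here’s a minimal sketch of the intuition (our own illustration, not Apple’s actual code or exact formula): instead of giving every transformer layer the same number of attention heads and the same feed-forward width, the per-layer sizes ramp up from the early layers to the later ones, so the parameter budget goes where it pays off most. All the numbers below are made up for illustration.

```python
# Illustrative sketch of layer-wise scaling: per-layer widths grow from the
# first transformer block to the last instead of staying uniform.
# The layer count, head counts, and FFN multipliers are assumptions for
# illustration only, not OpenELM's real configuration.

def layer_wise_config(num_layers=16, min_heads=4, max_heads=16,
                      min_ffn_mult=1.0, max_ffn_mult=4.0, d_model=1280):
    configs = []
    for i in range(num_layers):
        t = i / (num_layers - 1)  # 0.0 at the first layer, 1.0 at the last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_dim = int(d_model * (min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)))
        configs.append({"layer": i, "heads": heads, "ffn_dim": ffn_dim})
    return configs

for cfg in layer_wise_config():
    print(cfg)
```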
But Apple doesn't stop at just the models. The release comes packed with a full toolkit—the CoreNet library, training frameworks, and even evaluation protocols for public datasets, all designed to ease the development of on-device AI applications. And yes, for those looking to push boundaries, there's more: conversion code for Apple’s MLX library to bring these models to life on your Apple devices.
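Curious what the MLX route might look like on Apple silicon? Something along the lines of the sketch below, though treat it as exactly that: a sketch. It assumes the mlx-lm package supports the OpenELM architecture and that a converted checkpoint exists under the placeholder name shown, so check the actual listings before copy-pasting.

```python
# Hypothetical sketch: running an MLX-converted OpenELM model on Apple silicon.
# The checkpoint name is a placeholder, and OpenELM support in mlx-lm is an
# assumption; verify both against the real Hugging Face / mlx-lm listings.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/OpenELM-270M-Instruct")  # assumed name
text = generate(model, tokenizer,
                prompt="Explain on-device AI in one sentence.",
                max_tokens=60)
print(text)
```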
However, no innovation comes without challenges. OpenELM might outpace OLMo in accuracy, but it does stumble a bit in speed, thanks to a "naive implementation of RMSNorm." But hang tight—Apple’s wizards are already on a quest for speed-enhancing spells.
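For the curious, here’s roughly what a plain, un-optimized RMSNorm does (a generic sketch, not Apple’s implementation): each hidden vector is divided by its root-mean-square and then scaled by a learned weight. Run naively, every step is its own operation, which is exactly the kind of thing a fused, optimized kernel speeds up.

```python
import torch

def rms_norm(x: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Naive RMSNorm: normalize each vector by its root-mean-square over the
    # last dimension, then apply a learned per-channel scale. Each step runs
    # as a separate op, which is simple but not fast.
    rms = torch.sqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)
    return (x / rms) * weight

hidden = torch.randn(1, 8, 1280)  # (batch, tokens, hidden size); sizes are illustrative
scale = torch.ones(1280)
print(rms_norm(hidden, scale).shape)  # torch.Size([1, 8, 1280])
```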
All set to explore? The entire suite of OpenELM models is ready for action on the Hugging Face AI code community under the Apple Sample Code License. They're available in both pre-trained and instruction-tuned forms, but a heads up—proceed with caution as these come sans safety guarantees. So whether you're dabbling out of curiosity or crafting the next big app, OpenELM invites you to tinker with the cutting edge of on-device AI. Let your coding adventure begin!
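If you want a starting point, a first run with the Hugging Face transformers library might look something like this. It’s a minimal sketch: the checkpoint name, the trust_remote_code flag, and the tokenizer pairing are assumptions based on how custom models are usually published on the Hub, so double-check them against the model cards.

```python
# Minimal sketch: generating text with a small OpenELM checkpoint via
# Hugging Face transformers. The model ID, trust_remote_code flag, and
# tokenizer pairing are assumptions; confirm them on the actual model cards.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"  # assumed checkpoint name
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# OpenELM reportedly reuses an existing tokenizer rather than shipping its own;
# the pairing below is an assumption, and the repo may be gated.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

inputs = tokenizer("On-device AI is useful because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```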