Beyond the Giant Models: Alternative Paths for AI

Veena Siva

2025 CSE Graduate | Sustainability Enthusiast | Lean Six Sigma White Belt

⚡ AI Doesn't Have to Live in Data Centers

We often imagine AI as giant models locked away in massive GPU farms. But what if the real future of AI is smaller, faster, cheaper, and everywhere? Before ChatGPT went mainstream, researchers and startups were already exploring alternative pathways for building AI. Some fascinating directions include:

🔹 On-device AI (TinyML, TensorFlow Lite): running models directly on your phone or IoT device.
🔹 Model compression (distillation, quantization, pruning): making giant models lean and efficient.
🔹 Sparse / conditional compute (Mixture-of-Experts): only "waking up" the expert parts of a model relevant to each query.
🔹 Photonic and optical accelerators: using light instead of electricity to move data.
🔹 In-memory & analog compute: chips that compute where data is stored.
🔹 Neuromorphic hardware: brain-inspired chips for ultra-low-power AI.
🔹 Federated learning: training across devices without ever pooling raw data.

👉 Each of these is a radically different vision of AI; some are already practical, some still experimental.

💡 My question to you: if you had to bet on one of these methods shaping the future of AI for everyone, which one would you pick, and why?

Let's go beyond the "more GPUs = more AI" mindset and explore how intelligence could become truly accessible to all. 🌍

#ArtificialIntelligence #AI #FutureOfAI #MachineLearning #EdgeAI #TinyML #NeuromorphicComputing #PhotonicComputing #FederatedLearning #DeepTech #TechTrends #Innovation
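To make the compression idea concrete, here is a minimal sketch of post-training int8 quantization in plain NumPy. The symmetric scale scheme shown is a simplified illustration, not the exact recipe any particular framework (e.g. TensorFlow Lite) uses:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric post-training quantization of a float32 weight array to int8."""
    scale = np.abs(w).max() / 127.0  # map the largest magnitude to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is 4x smaller than float32, at the cost of a small
# rounding error bounded by half a quantization step (scale / 2).
max_err = np.abs(w - w_hat).max()
```

The trade-off is exactly the one in the post: a quarter of the memory and much cheaper integer arithmetic, in exchange for a tiny, bounded loss of precision.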
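And the federated-learning point, training without pooling raw data, is captured by the federated averaging (FedAvg) step: clients train locally and the server combines only their model weights. A minimal NumPy sketch, assuming each client has already trained its own copy of the weights:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Combine per-client model weights into one global model,
    weighting each client by its local dataset size (FedAvg-style).
    Only weights travel to the server; raw data never leaves the client."""
    total = sum(client_sizes)
    global_w = np.zeros_like(client_weights[0])
    for w, n in zip(client_weights, client_sizes):
        global_w += (n / total) * w
    return global_w

# Three simulated clients with locally trained 2-parameter models
clients = [np.array([1.0, 2.0]), np.array([3.0, 0.0]), np.array([0.0, 6.0])]
sizes = [10, 20, 70]  # local dataset sizes

global_model = federated_average(clients, sizes)  # -> array([0.7, 4.4])
```

In a real deployment this averaging step repeats every round, with the global model broadcast back to the devices for the next round of local training.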

Alina Vinay Kumar

Brand storyteller | Editorial and content strategist | Women empowerment | LinkedInfluencer

2w

I think the future of AI won't rely on only one of these methods; it will be a mixture of a few, in order to have a truly optimised model!
