Vijay Shekhar Sharma, the entrepreneurial force behind Paytm, recently weighed in on a fascinating assertion by Christian Keil of Astranis Space Technologies. Keil’s tweet, which sparked discussions online, proposes that all human knowledge might be distilled into a mere 40 GB.
Sharma responded, acknowledging that while the figures presented aren't exact, they are close to the truth. He foresees a future where smartphones will carry AI advanced enough to encapsulate global information, largely eliminating the need for external, cloud-based AI assistance.
What Happened?

The image accompanying Keil's tweet features a list of files seemingly related to Llama 2's language model iterations — Meta's answer to OpenAI's GPT models.
These files vary in size and method but intriguingly, each one fits within a 30-40 GB window, showcasing how aggressively modern quantization techniques can compress a large model. Labels like "q3" and "q4" likely denote 3-bit and 4-bit quantization levels rather than stages of development, while "K_M" and "K_S" suggest medium and small variants of the quantization scheme.
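The 30-40 GB window is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes a hypothetical 70-billion-parameter model and typical effective bit widths for 3-bit and 4-bit quantization; real model files add metadata and mix precision across layers, so actual sizes differ somewhat.

```python
# Rough on-disk size of a 70B-parameter model at various bit widths.
# Assumption: uniform bits-per-weight; real quantized files vary by layer.
PARAMS = 70e9  # hypothetical parameter count

def model_size_gb(bits_per_weight: float) -> float:
    """Approximate size in gigabytes: params * bits / 8 bits-per-byte."""
    return PARAMS * bits_per_weight / 8 / 1e9

fp16 = model_size_gb(16)   # full-precision baseline: 140 GB
q4 = model_size_gb(4.5)    # ~4.5 effective bits/weight: ~39 GB
q3 = model_size_gb(3.4)    # ~3.4 effective bits/weight: ~30 GB

print(f"fp16: {fp16:.0f} GB, q4: {q4:.0f} GB, q3: {q3:.0f} GB")
# prints "fp16: 140 GB, q4: 39 GB, q3: 30 GB"
```

The arithmetic shows why the listed files cluster in the 30-40 GB range: at 3-4 bits per weight, a model of this scale shrinks to roughly a quarter of its full-precision size.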
Why It Matters?

Sharma's vision reflects a significant shift toward on-device AI, hinting at a leap in the way we interact with technology. His commentary aligns with the industry's march towards self-contained devices that handle complex AI tasks, marking a pivot away from cloud-based reliance. This shift could revolutionize access to information, making it instantaneous and personal, all from the palm of one's hand.
In essence, Sharma is not just acknowledging the technical feat of data compression but is spotlighting a forthcoming era where our smartphones could become gatekeepers of human knowledge, powered by AI trained on the collective knowledge of the world.
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.