In a significant move for artificial intelligence (AI) computing, the creators of the popular AI apps WOMBO, Dream, and WOMBO Me have teamed up with io.net to revamp the infrastructure behind their machine learning (ML) models, according to information shared with Finbold on April 11.
These apps, boasting over 200 million downloads combined across iOS and Android, are now set to enhance their computational capabilities using io.net’s decentralized graphics processing unit (GPU) compute network.
Pioneering work in decentralized compute networks
The strategic partnership between io.net, known for its pioneering work in decentralized compute networks, and WOMBO, a leading generative AI startup, aims to harness Apple (AAPL) silicon chip clusters to bolster WOMBO's ML work.
For AI companies like WOMBO, cloud computing often constitutes a significant portion of operational expenses.
The partnership with io.net is a strategic move to address limited hardware supply and intense competition for it, enabling companies, and startups in particular, to access compute resources at a fraction of the usual cost and time.
io.net’s computing power
io.net boasts over 100,000 nodes in its network and offers a diverse range of hardware, from top-tier Nvidia (NVDA) A100 and H100 models to more affordable alternatives such as RTX 4090 or A6000 cards. The collaboration therefore promises to give machine learning engineers the means to deploy clusters rapidly and cost-effectively.
WOMBO CEO Ben-Zion Benkhin expressed enthusiasm about the partnership, highlighting its potential to tap into unused computing power and alleviate the GPU supply shortage:
“We are excited about partnering with io.net to help bring unused computing power and put it to use in groundbreaking AI applications — together, our teams have the potential to put a serious dent in the GPU supply shortage.”
— WOMBO CEO Ben-Zion Benkhin
At the heart of the collaboration is an ambitious project, facilitated by io.net, to use Apple silicon chip clusters to power WOMBO's ML models.
By pairing the Neural Engine in Apple's chips with io.net's mega-clustering capabilities, the initiative aims to put the computing power of hundreds of millions of consumer devices to work on AI workloads.