Apple has open sourced AXLearn, the machine-learning framework underpinning its large language model work, including Ajax GPT, reportedly a 200-billion-parameter LLM. The code is available on GitHub and can be used by anyone to train their own large language models.
This is a departure from Apple’s previously closed-source approach to AI development, for which the company has often been criticized as secretive. According to a report by The Information, Apple is training Ajax GPT on more than 200 billion parameters, potentially surpassing GPT-3.5.
The open sourcing of AXLearn could help foster innovation in the AI research community, since other researchers can now build on Apple’s work. It could also help Apple attract top talent, as engineers are more likely to want to work for a company that is open and collaborative.
AXLearn, built on top of Google’s JAX framework, is designed to facilitate the rapid training of machine-learning models and leverages the power of Google TPUs. Apple’s venture into open source is a stark departure from its traditionally closed-door approach, drawing comparisons to the likes of Meta’s Llama-2 and Anthropic’s Claude-2.
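Since AXLearn builds on JAX, the kind of training loop it automates can be sketched in plain JAX. The example below is a minimal illustration only, using a made-up toy least-squares loss; it does not use AXLearn's actual API.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy loss: mean squared error of a linear model.
# AXLearn wraps this kind of computation in higher-level trainer configs.
def loss(w, x, y):
    preds = x @ w
    return jnp.mean((preds - y) ** 2)

# jax.grad derives the gradient function; jax.jit compiles it via XLA,
# which is what lets the same code target CPUs, GPUs, or TPUs.
grad_fn = jax.jit(jax.grad(loss))

# Tiny synthetic dataset and zero-initialized weights.
x = jnp.array([[1.0, 2.0], [3.0, 4.0]])
y = jnp.array([1.0, 2.0])
w = jnp.zeros(2)

# Plain gradient-descent steps; real frameworks add optimizers,
# sharding, and checkpointing on top of this core pattern.
for _ in range(100):
    w = w - 0.01 * grad_fn(w, x, y)

final = float(loss(w, x, y))
print(final)
```

The point of a framework like AXLearn is that this same JIT-compiled, functional style scales from a toy loop like this to distributed training on TPU pods.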
Apple has not released any details about the model’s potential applications. However, Ajax GPT could plausibly be used to improve the performance of Apple products such as Siri and the Photos app.
The sources for this piece include an article in AnalyticsIndiaMag.