Microsoft has released an open-source AI model that learns by imitating the reasoning of larger AI models like GPT-4.
Developed by Microsoft Research, Orca is a smaller AI model with 13 billion parameters. Compared with large models such as GPT-3.5, which has up to 175 billion parameters, this might seem minuscule, but unlike those models, Orca is designed and tailored for specific use cases.
It also has another trick up its sleeve. Like larger models, Orca can be taught with human supervision, but unlike them, it can learn from step-by-step instructions and, more interestingly, from larger language models themselves.
It does this by imitating the logic and reasoning of larger models like GPT-4, which powers ChatGPT and whose responses serve as a source of training signals for Orca. Microsoft says it is using diverse imitation data to train the model.
Since it's smaller than GPT-4, it can only be used for specific use cases, but this also gives it an advantage: it doesn't need the computational power and resources that bigger models require.
Microsoft says Orca already outperforms similar instruction-tuned models and is on par with ChatGPT in benchmarks. It is also competitive with larger models on academic tests such as the SAT, LSAT, GRE and GMAT.