Quick Insights:
- HUB24’s Innovation Lab has found smaller open-source models offer greater transparency and control over AI systems, strengthening governance and helping deliver more accurate outcomes for licensees.
- Smaller models are evolving faster than larger ones, enabling quicker experimentation and iteration.
- Smaller models are also more energy-efficient and cost-effective, making them a sustainable and accessible option for advice practices looking to scale AI solutions.
Early in its AI journey, HUB24’s Innovation Lab worked with large foundation models such as Google’s Text Bison and GPT-3.5 on Microsoft Azure to test and learn how AI could be applied to increase efficiencies and drive advice delivery.
However, these models were highly proprietary, with strict licensing that limited access to the latest versions – making them both restrictive and expensive.
“Australia has not been set up with the right infrastructure to run these large models,” said Dr Evan Morrison, Head of HUB24’s Innovation Lab.
According to Morrison, the team also experimented with smaller models to address these challenges, anticipating potential privacy and security issues with large language models, especially relating to data storage.
Data storage is a particularly sensitive issue for AI in advice, given advisers and licensees must hold confidential client data.
Smaller models offer transparency and enhance governance
In 2022, smaller models began to emerge, enabling more innovative, cost-effective solutions.
HUB24’s Innovation Lab currently works with smaller models like Microsoft’s Phi-4, which offer permissive licensing and greater transparency, supporting version tracking and strong governance.
Knowing where data is stored, whether onshore or offshore, also strengthens privacy protections.
“Smaller foundation models are pushing new boundaries, solving challenges once posed by larger models and becoming key players in the AI landscape,” said Morrison.
Generative AI, energy usage and carbon emissions
Smaller AI models also address other barriers such as electricity costs and carbon impact. They use just a quarter of the energy of ChatGPT and cost significantly less to run without compromising quality.
“We’ve achieved this because not every job needs a Ferrari,” said Morrison. “With prompt engineering, fine-tuning, and a deep understanding of what we are working with, we can deliver powerful, efficient solutions to problems facing the advice industry.”
Keep exploring how HUB24 is shaping the future of advice.
Return to our Productivity Hub for insights on AI, innovation, and how we’re enabling better outcomes for advisers and their clients.