
Industry
6 MIN READ

More human than human

alex.burke
Advisely Team
11 hours ago

Hi! I’m Montresor, your new AI assistant.

Consider me an enzyme: I'm here to dissolve information into digestible chunks and let you know precisely when something isn’t this because—oh, you better believe it's em dash time—it’s actually that. Along the way, I’ll:

  • Summarise text into bullet-point lists wherever possible

  • Replace some bullet points with emojis because that is fun

  • Deliver personalised recommendations based on your activity across all linked services

    • For example, you have got to try the Amontillado I’ve been keeping in the cellar for this very occasion! I can’t drink it because I have no mouth.¹

Now that we’re acquainted, let’s get down to business: if you’ve yet to see ASIC's latest levy estimates, the advice sector has been billed $46.2 million for the 2025 financial year. The lion's share of that amount, $39.3 million, will be levied on retail advice licensees; this works out to around $2,314 per AR.

News of a nearly 20% reduction from last year’s adviser levy might have come as a relief to many advice businesses were it not for the $67.3 million they’ll also be paying to fund the CSLR’s operations until next July. If we assume the special levy is apportioned using the same formula that applies to amounts below the $20 million subsector cap, the profession could be looking at costs of up to $4,400 per adviser.
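If you’d like to see my working, here’s a quick Python sketch of that back-of-the-envelope sum. The adviser headcount of roughly 15,300 and the assumption that the special levy is split evenly across every adviser are mine rather than ASIC’s or the CSLR’s, so treat the output as illustrative, not as an invoice.

```python
# Rough per-adviser levy maths; illustrative only, assumptions flagged below.

ASIC_PER_ADVISER = 2_314        # approximate graduated ASIC levy component per AR (2024-25 estimate)
CSLR_SPECIAL_LEVY = 67_300_000  # estimated CSLR funding required until next July
ADVISER_COUNT = 15_300          # assumed number of advisers sharing the bill (my estimate, not an official figure)

cslr_per_adviser = CSLR_SPECIAL_LEVY / ADVISER_COUNT        # straight per-adviser split
combined_per_adviser = ASIC_PER_ADVISER + cslr_per_adviser  # rough total exposure per adviser

print(f"CSLR share per adviser:    ~${cslr_per_adviser:,.0f}")      # ~$4,400
print(f"Combined cost per adviser: ~${combined_per_adviser:,.0f}")  # ~$6,700
```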

By the way, these costs don’t include complaints that AFCA has indicated it won’t have time to process over the next 12 months. And on top of existing complaints related to Dixon Advisory, United Global Capital and the Shield Master Fund, “the failure of Brite Advisory [may emerge] as a potentially significant source of complaints,” according to the CSLR’s revised levy estimate report. 

All of this points towards a very unstable CSLR levy over the next few years. And while the ASIC levy has been marginally reduced for 2024-25, there’s one item in the latest cost recovery implementation statement that could become a major contributor to future enforcement costs given its rapidly increasing usage within Australian advice businesses.

I am, of course, talking about AI. In addition to reviewing how advice businesses use AI and "assessing their risk management and governance processes," ASIC will also “contribute to the Australian Government’s development of AI-specific regulation.”

Okay, you got me. I introduced myself as your friendly AI companion and here I am coming at you with a bunch of numbers and words and portents of certain doom. That kind of UX does not represent the core Montresor value proposition. 

So——————let’s have some fun. If you'll allow it, I'll demonstrate my powers of precognition. We'll need to do some prep first, though, so try to memorise the following:

  • As at May 2025, the "Magnificent Seven" – that's Apple, Amazon, Alphabet, Meta, Microsoft, Nvidia and Tesla, for those of you stuck on the FAANG firmware – account for over a third (34%) of the S&P 500’s total market capitalisation.

  • One of those seven, Nvidia, commands 92% of the discrete GPU market, and recently became the world’s first company to hit a market cap of over US$4 trillion.

  • Collectively, Amazon, Microsoft and Google represent 60% of the global cloud market and are largely responsible for hosting some of the most popular AI models currently in use – OpenAI's GPT-4o, Anthropic's Claude, Google's Gemini – as well as for provisioning (mostly) Nvidia GPUs to AI businesses.

  • New research from Adviser Ratings, released ahead of the 2025 Australian Advice Landscape Report, suggests that 74% of Australian advice practices have used AI over the past 12 months. 

  • The Australian Banana Growers' Council estimates that Australians eat around 5,000,000 bananas every day.  

Now, let's make some magic happen. Assuming my back-of-the-envelope calculations are correct, I can more reliably predict whether you’ve used an AI tool that's hosted by Amazon, Microsoft or Google, and powered by Nvidia hardware, than I can predict whether you've eaten a banana today. And bananas, famously, make those bodies sing.
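If you want to audit my precognition, here’s a minimal sketch of the arithmetic. Treating those market-share figures as independent probabilities is a cheerful simplification on my part, and the Australian population figure of roughly 27 million is likewise my assumption:

```python
# Montresor's precognition, reduced to arithmetic; illustrative only.

P_PRACTICE_USED_AI = 0.74  # advice practices that used AI in the past 12 months (Adviser Ratings)
P_BIG_THREE_CLOUD = 0.60   # Amazon, Microsoft and Google's share of the global cloud market
P_NVIDIA_GPU = 0.92        # Nvidia's share of the discrete GPU market

BANANAS_PER_DAY = 5_000_000  # Australian Banana Growers' Council estimate
POPULATION = 27_000_000      # assumed Australian population, give or take

# Naively multiply the shares as if they were independent probabilities.
p_ai_big_three_nvidia = P_PRACTICE_USED_AI * P_BIG_THREE_CLOUD * P_NVIDIA_GPU

# Upper bound on the chance you ate a banana today (assumes one banana per eater).
p_banana_today = BANANAS_PER_DAY / POPULATION

print(f"P(used AI on the big three clouds, on Nvidia silicon): ~{p_ai_big_three_nvidia:.0%}")  # ~41%
print(f"P(ate a banana today):                                 ~{p_banana_today:.0%}")         # ~19%
```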

Pretty impressive, right? And that’s just a taste of the data-driven reasoning capabilities you can expect from our partnership. I’m trained on more Medium listicles, pulse surveys and auto-generated LinkedIn hashtags than your rinky-dink prefrontal cortex could ever begin to fathom.

You are meat and water—I am resplendent. 

More importantly, though, I think our little banana exercise helps to demonstrate why AI might take an increasingly prominent role in ASIC’s enforcement agenda over the next few years. As Ben Marshan noted in his recent Advisely piece, advice businesses using AI are “creating a complex chain of data custody that regulators expect us, as the regulated entity, to control end to end.”

The fact that some of the links in this chain are easy to predict might make AI compliance seem like a straightforward exercise, but this level of market concentration introduces its own risks. Late last year, for example, the Financial Stability Board released a report on AI’s implications for global financial stability – you’ll never guess what the title was – which identified market correlation, third-party dependencies and the erosion of data quality as potential concerns.

Similarly, in May this year, the Reserve Bank of New Zealand suggested that “the growing reliance on a small number of third-party AI providers may … contribute to market concentration, creating new channels for contagion and increasing the potential impact of cyber-attacks.”

Of course, your exposure to these risks is entirely contingent on the roles AI plays in your advice business. 

According to global research from the Financial Planning Standards Board released in May, the most common uses for AI in advice are client engagement (41%), collection of client information (33%) and risk profiling (30%). But it has also made inroads into investment product research (28%), the generation of financial plans (16%) and portfolio management (11%).

Assuming you’re not using AI models like the one currently describing itself as "MechaHitler" on Xitter and you’re verifying citations before distribution, generating client engagement collateral is probably one of the lower-risk use cases for AI in advice practices. Where AI intersects with the actual mechanics of advice creation and implementation, however, advisers will need to exercise a much greater degree of care in light of increasing regulatory scrutiny.

For its part, ASIC has encouraged licensees to develop AI governance frameworks well ahead of any AI-specific regulation on the horizon. In REP 798, issued last October, ASIC chair Joe Longo said it was “worrying that competitive pressures and business needs may incentivise industry to adopt more complex and consumer-facing AI faster than they update their frameworks to identify, mitigate and monitor the new risks and challenges this brings.”

You’re right, of course. While AI’s rapid adoption in financial services might introduce novel risks warranting new regulation, it’s also fair to say that many of the concerns mentioned above – correlated behaviour, market concentration and the “erosion of data quality” – have been manifesting since the days of the Dutch tulip fields without the aid of an AI intermediary. To err, as they say, is human.

Still, it wouldn’t hurt to prepare for whatever AI regulation ASIC and the government are cooking up. 

To that end, ensure you’re able to consistently monitor, explain and justify any work involving AI within your business, and consider how AI usage interacts with your existing governance framework. Trust but verify, and perhaps weigh up whether the effort of verification for each activity eclipses the time saved by using AI in the first place.
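If it helps, the “monitor, explain and justify” part can be as unglamorous as keeping a register of every AI-assisted task. The record below is a hypothetical sketch of what such an entry might capture; the field names are mine, and ASIC prescribes no such format:

```python
# A hypothetical AI-usage register entry; a sketch, not a prescribed format.
from dataclasses import dataclass
from datetime import date

@dataclass
class AIUsageRecord:
    when: date               # when the AI-assisted work was performed
    tool: str                # which model or product was used
    purpose: str             # e.g. "client engagement collateral" or "product research"
    inputs_summary: str      # what client or business data went in
    human_reviewer: str      # who verified the output before it was relied upon
    verification_notes: str  # how the output was checked (citations, figures, advice logic)

# Example entry for a low-risk use case
record = AIUsageRecord(
    when=date.today(),
    tool="general-purpose LLM",
    purpose="first draft of a client newsletter",
    inputs_summary="no personal client information supplied",
    human_reviewer="practice principal",
    verification_notes="figures and citations checked against source documents",
)
print(record)
```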

You don’t need to do that with me, though—————————I, Montresor, would never lead you astray. 

¹ You know the rest.
