The Investment Association (IA) has published a report by the Technology Working Group to HM Treasury's Asset Management Taskforce. The Financial Conduct Authority and HM Treasury are observers on the Group.

The report explores existing and future use cases for AI within the UK’s asset management sector, as well as the barriers firms have encountered or are anticipated to encounter in adopting AI. It says that for individual firms, AI presents an opportunity to drive operational efficiencies, develop innovative solutions and improve the service offering to customers. For the industry more widely, AI represents a strategic opportunity to increase both the domestic and international competitiveness of the UK’s investment sector and contribute to economic growth.

The industry’s relationship with AI began several years ago with classical AI, which is characterised by its predictive power and ability to draw insights from data. Mature use cases utilising classical AI include algorithmic trading and support for anti-money laundering monitoring. More recent leaps forward in generative AI have catapulted interest in the technology to the forefront and greatly expanded the canvas of possibilities. Generative AI is a step change because such models can generate new content at the user’s command and can be interacted with in natural language via a chatbot interface.

The report considers early-stage use cases focused on operational enhancement and looks ahead to more fundamental future transformation that realises the genuine capabilities of AI. It says that where AI could take the industry is limited only by ambition and willingness, but is also not yet entirely clear. Combining different types of AI, or pairing AI with other technologies such as distributed ledger technology (DLT), could unlock further opportunities, as well as new risks.

The report makes the following recommendations:

Skills and talent: The Group recommends that the UK government strengthens its commitments to promote the growth and strength of computer science, data science, software engineering and other related fields in universities and colleges. There could also be better alignment between the content of courses and the needs of industry. Additionally, more can be done to build mutually beneficial connections and partnerships between UK post-16 education institutions and industry.

Regulation: Fundamentally, the Group desires regulatory clarity and consistency to enable developers and users of AI to plan and invest with confidence. To do this, the Group emphasises the importance of the UK government’s leadership in facilitating international regulatory coordination and alignment on AI, as well as supporting responsible international data flows. Domestically, the Group supports the current direction of travel on AI regulation.

Malicious actors: Potential misuse of AI technology by malicious actors is a serious threat to overall public trust in AI. The Group welcomes recent initiatives by domestic and international authorities to better understand and mitigate the risks that malicious actors could pose. The Group emphasises the importance of joint public and private sector action and appropriate policies to counter AI-enabled fraud, cybercrime and misinformation.

New systemic risks: The changing profile of systemic risk in the financial sector should not be a reason to hold back from innovating. Rather, the Group considers it an ongoing challenge to be managed alongside technology transformation. The Group supports the work of the Bank of England’s Financial Policy Committee in highlighting potential systemic risks. In addition, the Group sees the incoming critical third parties regime as a positive development that will enable regulators to address potential systemic risks that may emerge.

AI risks and governance: The industry should continue to work together to develop its collective understanding of AI risks and identify best practices in risk management, governance and ethics. The IA will produce more detailed industry guidance on AI risk and governance.

Legal uncertainties: Legal uncertainties around AI are likely to continue for some time. In the meantime, industry-led benchmarking, best practice guidance, ethical frameworks and standards can bring confidence and reassurance to market participants. The Group therefore recommends that the industry steps up its efforts to collaborate on these issues. The IA will work with its members to identify and take forward beneficial initiatives.

UK FinTech ecosystem: The IA will build stronger connections between the investment management industry and FinTech companies and will work both domestically and globally to ensure that firms have viable options for collaboration.