Amid geopolitical and economic volatility and strengthening regulatory frameworks, the landscape for 2024 presents both challenges and opportunities.
The rise of GenAI is driving digital change, bringing investment in advanced technologies and digital infrastructure. GenAI has also shown the potential of AI to transform businesses, enabling them to adapt, improve efficiency and boost productivity. However, the risks associated with these technologies are becoming more apparent. To benefit from them, investors and companies will need to navigate a complex legal and risk landscape marked by increasing digital regulation, enforcement and litigation, and heightened cyber risk.
We outline how we expect these global trends to shape the legal outlook for the technology sector in 2024.
Leveraging AI
As organizations explore the potential for AI beyond the GenAI hype, many recognize the advantages it can bring in augmenting work to make it more efficient and productive. Yet the reality of the significant costs and risks of implementing AI – particularly GenAI – is becoming evident.
The rush of interest and record investment in GenAI in 2023 has brought a renewed focus from governments and regulators on the risks of AI. While there are concerns about the existential threat AI could pose to humanity, more immediate concerns focus on the lack of transparency, the risks of inaccuracy and biased outputs, and the need to protect privacy and intellectual property. Elections in 2024 raise the spectre of AI being used to generate vast quantities of disinformation, deepfakes and highly personalized propaganda.
Governments and regulators are grappling with how to regulate AI and agree global standards, while protecting national security interests and leveraging AI to drive innovation, productivity, efficiency, and growth. Regulators are already enforcing existing laws to protect consumers, and there are already high-profile AI cases and class actions.
There have been hundreds of AI-related initiatives at international, inter-governmental, regional, and national levels. China has introduced some of the earliest regulations on AI, the EU is progressing its highly anticipated AI Act and President Biden’s Executive Order in October 2023 on AI marked a significant development in regulating AI in the US. We expect the Order to spur further government action in 2024, adding to the complex matrix of digital regulation across the globe.
As companies and investors seek to invest in AI, compete for AI-driven targets in M&A transactions, or roll out AI solutions in their businesses, it will be important for them to approach this with caution and diligence. They will need to establish the right governance and risk management models to safeguard their interests and take account of the rapidly evolving legal landscape.
Cyber Regulation
In a digital world fuelled by data – vast amounts of data in the case of generative AI – cyber risk has become an ever-present danger. High-profile cases have highlighted the significant financial and reputational costs of a cyber-attack. Businesses will need to adapt to safeguard their data, ensure operational resilience and comply with new cyber-specific legislation.
Ten years ago, there was little ‘cybersecurity law’ beyond the general obligations in data protection legislation to take appropriate technical and organizational measures to ensure the security of personal data. Since then, the volume and significance of cyber-attacks has increased dramatically. It is therefore hardly surprising to see a legislative response: cybersecurity laws are being strengthened, including sector-specific and product security laws that mandate regulatory notifications, impose specific technical obligations, and address supply chain security.
In the EU, the impending digital regulation package will ratchet up cybersecurity obligations.
- NIS2, which replaces the Network and Information Systems Directive (NISD), will require, among other things, essential entities (e.g. in the energy, transport, financial, health, water, digital infrastructure, ICT, public administration and space sectors) to use appropriate measures to protect their systems, including in relation to their supply chain.
- The Digital Operational Resilience Act (DORA) will primarily apply to financial services institutions and third-party providers of ‘critical’ ICT services to those institutions. Financial services institutions will need to apply uniform standards to protect against cyber-attacks.
- The Cyber Resilience Act (CRA) will primarily apply to suppliers of products – both software and hardware – containing digital technology that are designed to connect to another device or network.
The UK is unlikely to adopt legislation to mirror changes in the EU. However, the UK has passed a product security law which is similar to the proposed EU Cyber Resilience Act.
In the US, developments and proposals for cyber regulation have emerged at both federal and state level. At the federal level, the U.S. Securities and Exchange Commission (SEC) and the Federal Acquisition Regulatory Council have been actively pursuing cybersecurity rulemaking. The SEC adopted new rules to enhance and standardize cybersecurity disclosures: public companies must make annual disclosures about their cybersecurity risk management, strategy and governance, as well as periodic disclosures about material cybersecurity incidents.
It is therefore more important than ever for companies to ensure that their cybersecurity programs (including policies, procedures, and safeguards) are fit for purpose and reflect the evolving regulatory landscape. Much of this is common sense, and most organizations will already have much of it in place; but the consequences and sanctions for falling short are rapidly increasing.
Moving Forward
In 2024 and beyond, it is critical for companies and investors to stay abreast of these shifting dynamics to make the most of the opportunities that digital transformation presents. Careful monitoring of and adherence to regulatory changes, a commitment to robust risk management, and a considered approach to adopting AI technologies will be essential to thrive in this fast-paced environment. For businesses that skilfully negotiate these complexities, the long-term rewards could be transformative.