There are many cyber-security implications of AI. Here, we focus on two important cyber-related AI risk areas and one cyber-related opportunity.
(For additional points of consideration and discussion, including your defence against AI-enabled cyber-attacks, harnessing the power of AI for cyber-defence, and ensuring that other AI tools are properly safeguarded, please contact an Evelyn Partners Advisor.)
1. Risk: Cyber-criminals are adopting AI, lowering the technical barrier to entry for cyber-crime and reducing the cost of finding and attacking victims. They are doing this by:
- Improving the effectiveness of phishing emails by using generative AI to draft convincing lure text
- Using automation to identify and exploit network vulnerabilities for initial access
- Using generative AI to write malicious code
- Exploiting the proliferation of Ransomware as a Service (RaaS), whereby sophisticated threat groups lease attack tooling and infrastructure to less sophisticated groups (affiliates)
2. Risk: Organisations adopting AI need to consider how to protect themselves against cyber-risk and how to manage regulatory compliance obligations such as those arising from the EU AI Act. The threat landscape for AI users is constantly changing, with new actors and attack techniques targeting the sector. For instance, the Magecart group targets chatbots to install keyloggers and steal data, including credit card details used to commit fraud.
3. Opportunity: Organisations can harness AI for improved cyber-defence and response. Regulations such as the General Data Protection Regulation (GDPR) require organisations to adopt safeguards for systems and data that reflect the state of the art. For instance:
- Adopting machine learning to inspect and alert on a far greater range and volume of systems data than has previously been possible. ML can identify and halt suspicious or malicious behaviour in computer networks and systems.
- Using AI-enabled automation within investigation processes to speed up data collection, and incorporating large language model (LLM) functionality into forensic tools to support the responder's speed and judgement.
- Using AI-enabled automated tools to discover shadow IT and data, unusual storage locations and access permissions, and to undertake advanced testing, such as that required by regulations like the European Union's Digital Operational Resilience Act (DORA).
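To make the anomaly-detection idea in the first bullet above concrete, here is a minimal, hypothetical sketch of the underlying principle: establish a statistical baseline of normal activity and alert on observations that deviate sharply from it. The host names, counts, and threshold below are illustrative assumptions only; production ML detection tools apply far richer models to far larger volumes of telemetry.

```python
import statistics

# Hypothetical hourly failed-login counts observed during normal operation
# (illustrative data, not from any real environment).
baseline = [3, 5, 4, 6, 2, 5, 4, 3, 5, 4]

# Current counts per host; "host-c" shows a sudden spike.
observed = {"host-a": 4, "host-b": 5, "host-c": 48}

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(count, threshold=3.0):
    """Flag counts more than `threshold` standard deviations above baseline."""
    return (count - mean) / stdev > threshold

# A real system would alert on, or halt, the flagged host's activity.
alerts = [host for host, count in observed.items() if is_anomalous(count)]
print(alerts)  # → ['host-c']
```

The same baseline-and-deviation principle generalises from login counts to network flows, process launches, or data-access patterns, which is where ML models add value over a fixed z-score rule.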