Many individuals, including numerous celebrities, have fallen victim to malicious use of ‘deepfake’ technology: the realistic manipulation of a person’s features by artificial intelligence so that they resemble somebody else. This fast-evolving technology is also being used by criminals to target organisations, with potentially catastrophic consequences.

The Financial Times reported last week that the firm which had fallen prey to one of the world’s largest deepfake scams to date, losing HK$200m (£20m) in a highly sophisticated fraud, was British professional services multinational Arup. Senior associate and Crypto Fraud and Asset Recovery network (CFAAR) member Ed Holmes reviews how the Arup fraud was carried out, and how businesses can safeguard themselves from high-tech attacks like it.


How was Arup defrauded?

First brought to the police’s attention in January 2024, the scam began when a member of staff at Arup in Hong Kong received a message about a “confidential transaction” from a person claiming to be the company’s UK-based chief financial officer.

The staff member who received the message then joined a videoconference with multiple individuals, including at least one using deepfake technology to impersonate the company’s CFO. The targeted staff member was convinced to make 15 transfers from company funds to five different Hong Kong-based bank accounts before eventually contacting Arup’s headquarters and discovering the scam.

Hong Kong police have launched an investigation into the fraud, with no arrests made to date. With the equivalent of £20m lost to fraudsters, Arup was forced to issue a statement confirming that the company’s internal systems had not been compromised and that its business remained stable. Beyond confirming that fake voices and imagery were used, Arup has not divulged further details due to the ongoing police investigation.

Arup’s East Asia chair Andy Lee quickly departed the business. The firm’s global chief information officer Rob Greig has voiced the hope that Arup’s experience “can help raise awareness of the increasing sophistication and evolving techniques of bad actors”.


How can businesses protect themselves from deepfake fraud?

Unfortunately, this is unlikely to be the last story of its kind, as deepfake technology is increasingly harnessed by fraudsters for nefarious purposes. The technology is becoming so sophisticated, and developing at such speed, that it is hard for organisations, regulators and legislators to keep up.

In Arup’s case, whilst the finer details remain unclear, some form of two-step verification of payment instructions might have prevented the fraud. However, the nature of the fraud and the deployment of the technology were so advanced that it is hard not to have some sympathy for the individual involved.

To prevent these kinds of losses, firms would do well to engage external specialists to provide up-to-date training — and to do so on an ongoing basis, given the pace of change — so that employees are made aware of the latest techniques being adopted by fraudsters. As for recovery, organisations that fall victim to such incidents may want to consider whether they can seek redress not only from the fraudsters themselves (if they can be traced) but also from the technology providers involved.

Earlier this year in England and Wales, a new false communications offence came into force under s.179 of the Online Safety Act 2023, making it unlawful to send a message conveying information that the sender knows to be false, with the intention of causing non-trivial psychological or physical harm to a likely recipient. While that offence was aimed at preventing behaviour such as the sharing of intimate deepfakes, rather than financial fraud, it indicates that legislators and regulators are at least attempting to keep pace with technological developments in this area.



You can find further information regarding our expertise, experience and team on our Fraud page.

If you require assistance from our team, please contact us.



