
Bulk of generative AI data policy violations involve regulated data, report reveals

Technology
09 April 2026

Regulated data accounts for over half of all data policy violations in the financial services sector, according to Netskope.

The risk of data exposure is growing for financial services organisations as generative AI becomes increasingly embedded in operational workflows and activities that involve sensitive financial data, according to the latest financial services report by cyber security firm Netskope.

The Threat Labs report for financial services found that while organisations have shifted to managed AI tools to improve AI governance, an overlap remains between personal AI accounts and enterprise use.

The researchers at Netskope identified that regulated data accounts for 59 per cent of all data policy violations related to generative AI usage. In addition, intellectual property accounts for 20 per cent of exposure risks, with source code at 11 per cent and passwords and API keys at 9 per cent.


According to the report, 70 per cent of users in the sector actively use generative AI tools, and 97 per cent work with applications that incorporate generative AI capabilities. Ninety-four per cent use generative AI applications that rely on user data for training.

Director of Netskope Threat Labs, Ray Canzanese, said that as “financial institutions accelerate their adoption of generative AI, they are also expanding the number of pathways through which sensitive data can be exposed.”

The generative AI ecosystem continues to expand and diversify as demand for specialised AI capabilities grows. ChatGPT remains the most widely used application, adopted by 76 per cent of organisations, while Google Gemini sits at 68 per cent. Newer tools are also increasing in popularity, such as Google NotebookLM and AssemblyAI.

At the same time, tools such as ZeroGPT, DeepSeek and PolitePost are among the most frequently blocked generative AI applications due to security and compliance concerns.

While the proportion of users relying on personal generative AI applications has dropped 40 per cent over the last year, and adoption of organisation-managed applications has risen, the overlap between personal AI accounts and enterprise tools continues to put company data at risk.

The number of users switching between the two has risen by 6 per cent, increasing the risk of sensitive data moving between secure and unmanaged environments. It is no surprise, then, that regulated data accounted for 65 per cent of data policy violations in these personal apps, with ChatGPT the second-most-used personal app in the financial services sector.

Attacks have reportedly exploited trusted cloud platforms to distribute malware, with GitHub and Microsoft OneDrive being particularly targeted. According to the research, using trusted cloud infrastructure allows attackers to blend malicious activity into normal cloud traffic, making it harder to detect.

With the scale and depth of this risk posing a significant challenge to the financial sector, organisations are taking steps to reduce shadow AI usage – the use of unsanctioned AI tools – especially where company data is involved.

Canzanese said that to reduce risk, organisations need a layered approach: inspecting all web and cloud traffic to stop malware, blocking non-essential applications, and using data loss prevention to protect sensitive information.

“Technologies like remote browser isolation also play a key role in enabling safe access to higher-risk websites,” he added.


About the author


Amelia is a Professional Services Journalist with Momentum Media, covering Lawyers Weekly, HR Leader, Accountants Daily and Accounting Times. She has a background in technical copy and arts and culture journalism, and enjoys screenwriting in her spare time.