Features 05.10.2023

Home-Grown Trouble: Five of the Biggest Insider Threats of 2023

Edward Snowden put the term ‘insider threat’ firmly on the map in 2013. A decade later, insiders are still running amok…

Malicious and negligent employees are a growing cause of cyber risk. Phil Muncaster explores some of the most compelling cautionary tales of the year so far

In the popular imagination, cyber threats are the stuff of shadowy Russian criminals and powerful state-sponsored operatives. Informed by Hollywood and the poorly chosen stock images that illustrate cyber news, the archetypal threat actor wears a mask and a black hoodie and works for a hostile state.

But the truth is somewhat different. While those external threat actors still account for the majority of incidents that security teams must deal with, a growing share of accidental and deliberate breaches can be traced back to employees and contractors. According to Verizon’s analysis of real-world data breaches over the past 12 months, external actors were involved in 83% of breaches and internal actors in 19% (the two categories can overlap). That means nearly a fifth of incidents are home-grown.

Further research highlights the growing insider threat. A DTEX-commissioned study conducted by the Ponemon Institute, based on analysis of 7,343 insider incidents, finds the average cost of an insider breach is $16.2m (£13.3m) – up 40% over the past four years. Worryingly, the study claims that the average time to contain an insider incident has stretched to 86 days, up one day from 2022. This matters because the longer it takes security teams to detect and respond, the more damage can be done and the higher the cost: the estimated financial outlay for incidents that take longer than 91 days to contain jumps from $16.2m (£13.3m) to $18.3m (£15.1m).

Here are five of the most significant insider threat stories to have broken so far in 2023:

1: The Pentagon leaker:

The US Department of Defense has a history of whistleblowing incidents that have seriously disrupted US military planning and intelligence gathering. But few have been as damaging as the classified Pentagon and CIA documents shared by a 21-year-old soldier in the Massachusetts Air National Guard. Hundreds of documents revealed US spying operations against nominal allies and, perhaps most damaging, detailed assessments of Ukraine’s armed forces and operations. This was no act of whistleblowing but one of apparent arrogance and naivety by a young man looking for bragging rights among fellow members of a Discord group.

“It’s about measuring behaviour,” ConnectWise CISO, Patrick Beggs, tells Assured Intelligence.

“If CISOs are not employing machine learning and behavioural analytics across their platforms and users, they are behind the game. Hyper-automation technology helps measure what’s normal behaviour for any machine or environment, human-operated or not.”
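To make Beggs’ point concrete, here is a minimal sketch of the kind of behavioural baselining he describes, using an isolation forest to flag user activity that deviates from the norm. The feature names, sample data and thresholds are illustrative assumptions, not a description of any vendor’s product.

```python
# Minimal behavioural-analytics sketch: flag user sessions that deviate
# from a historical baseline. Feature names and data are hypothetical.
from sklearn.ensemble import IsolationForest
import numpy as np

# Each row is one user session: [login_hour, files_accessed, megabytes_downloaded]
baseline_sessions = np.array([
    [9, 12, 40], [10, 8, 25], [14, 20, 60], [11, 15, 35], [16, 10, 30],
    [9, 9, 20], [13, 18, 55], [15, 11, 28], [10, 14, 45], [12, 16, 50],
])

# Fit a model of "normal" behaviour; contamination is the expected anomaly rate.
model = IsolationForest(contamination=0.1, random_state=42).fit(baseline_sessions)

# Score new activity: a 3am session pulling far more data than usual.
new_sessions = np.array([
    [10, 13, 38],    # looks routine
    [3, 250, 9000],  # bulk download in the middle of the night
])

for session, label in zip(new_sessions, model.predict(new_sessions)):
    verdict = "ANOMALY - review" if label == -1 else "normal"
    print(session, verdict)
```

In practice the baseline would be built from months of telemetry per user and per device, but the principle is the same: model what normal looks like, then surface the outliers for a human to investigate.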


2: Microsoft’s AI researcher:

This recent tale of insider error is at the other end of the spectrum from the Pentagon leaks. The culprit: a Microsoft researcher who misconfigured access to a 38TB trove of AI research data. It included personal employee data, passwords, secret keys for Microsoft services, and over 30,000 internal Microsoft Teams messages from 359 employees. Not only could the security snafu theoretically have given threat actors access to the sensitive information, but it would also have granted them the rights to delete or overwrite the files.
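One lightweight safeguard against this kind of accidental over-sharing is to scan data for obvious credentials before it is published. The sketch below is a simplified illustration using only regular expressions; real secret-scanning tools are far more thorough, and the patterns and directory path here are assumptions made for the example.

```python
# Simplified pre-publication secret scan: walk a directory destined for
# public release and flag files containing credential-like strings.
# Patterns and the directory path are illustrative, not exhaustive.
import re
from pathlib import Path

SECRET_PATTERNS = {
    "password assignment": re.compile(r"password\s*[:=]\s*\S+", re.IGNORECASE),
    "private key header": re.compile(r"-----BEGIN (RSA |EC )?PRIVATE KEY-----"),
    "generic api key": re.compile(r"api[_-]?key\s*[:=]\s*['\"]?[A-Za-z0-9_\-]{16,}", re.IGNORECASE),
}

def scan_for_secrets(root: str) -> list:
    """Return (file, pattern name) pairs for every suspected secret found."""
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(text):
                findings.append((str(path), name))
    return findings

if __name__ == "__main__":
    for file, kind in scan_for_secrets("./dataset_for_release"):
        print(f"BLOCK RELEASE: {file} appears to contain a {kind}")
```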

Skillsoft CISO, Okey Obudulu, tells Assured Intelligence that machine learning can help identify anomalies, although the shift to remote work has added significant complexity to risk mitigation efforts.

“Whether it’s employees leaking data by leaving unprotected data wide open or colleagues discussing sensitive data on an unsecured app, identifying and differentiating between accidental, negligent, and malicious insider threats can be complex,” he adds.

3: Tesla whistleblowers:

Unlike the Pentagon leaker, the two former Tesla employees who leaked highly sensitive corporate information to a German newspaper earlier this year fit the whistleblower mould perfectly. Although the files reportedly contained personal information on around 76,000 current and former employees, including Elon Musk’s Social Security number, the paper wasn’t interested in those details. The real aim of the leak was to expose what the pair considered a serious corporate cover-up of safety issues with Tesla vehicles. The story – revealing internal reports of self-acceleration issues, braking problems and “phantom stops” – could have had a severe reputational impact on the EV giant.

“While whistleblowers are motivated by different things, be it money, emotions or geopolitics, they often show their behaviour early on. Security teams should put enhanced data loss prevention (DLP) controls on sensitive user groups,” argues ConnectWise’s Beggs.
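A very basic illustration of the content-aware DLP controls Beggs refers to might look like the snippet below: checking outbound messages from a sensitive user group for patterns such as US Social Security numbers before they leave the organisation. The group names, domain, patterns and policy actions are assumptions made for the example.

```python
# Toy DLP check: block outbound messages from sensitive groups when they
# contain patterns resembling regulated data (here, US Social Security numbers).
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
SENSITIVE_GROUPS = {"hr", "payroll", "executive_assistants"}  # hypothetical groups
INTERNAL_DOMAINS = {"example-corp.com"}                       # hypothetical internal domain

def outbound_policy(sender_group: str, recipient_domain: str, body: str) -> str:
    """Return 'allow', 'quarantine' or 'block' for an outbound message."""
    contains_ssn = bool(SSN_PATTERN.search(body))
    external = recipient_domain not in INTERNAL_DOMAINS
    if contains_ssn and external and sender_group in SENSITIVE_GROUPS:
        return "block"        # sensitive group sending regulated data externally
    if contains_ssn and external:
        return "quarantine"   # hold for review
    return "allow"

print(outbound_policy("hr", "newspaper.example", "Employee SSN: 123-45-6789"))  # block
print(outbound_policy("engineering", "example-corp.com", "build passed"))       # allow
```

Production DLP inspects far more than regex matches – file fingerprints, classification labels, upload destinations – but the basic shape is the same: inspect content in motion and apply a policy based on who is sending what to whom.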

4: NHS WhatsApp data sharers:

Shadow IT is a serious problem for many organisations – so much so that the National Cyber Security Centre (NCSC) recently published guidance on managing the threat. Whilst it appears pretty innocuous alongside some of the other cases on this list, using unsanctioned devices and services can severely impact compliance programmes and impair IT’s ability to monitor and secure data flows. That was the conclusion data protection regulator the Information Commissioner’s Office (ICO) reached when it reprimanded NHS Lanarkshire after staff shared patients’ personal and clinical information, including images, via WhatsApp on over 500 separate occasions.

Although the WhatsApp group was initially set up to help staff communicate during the early days of the pandemic, it was never approved for sharing patient data. The health board was lucky to escape without a massive fine.

“Mitigating these risks requires proper monitoring and governance,” Skillsoft CISO, Okey Obudulu, tells Assured Intelligence.

“When it comes to shadow IT, CISOs should conduct a risk assessment and develop policies for employees around corporate use before making any app available to download. These policies must be clear and prescriptive, supplemented with training that ensures every employee understands best practices.”
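On the monitoring side of that advice, one simple approach is to regularly compare the apps seen on managed devices against the approved list and flag anything unsanctioned for follow-up. The app names and inventory format below are hypothetical.

```python
# Toy shadow-IT check: compare each device's installed apps against an
# approved list and report anything unsanctioned. All data is hypothetical.
APPROVED_APPS = {"Teams", "Outlook", "Slack", "Zoom"}

device_inventory = {
    "laptop-0412": ["Teams", "Outlook", "WhatsApp"],
    "laptop-0977": ["Slack", "Zoom"],
    "laptop-1203": ["Outlook", "Telegram", "Dropbox"],
}

for device, apps in device_inventory.items():
    unsanctioned = sorted(set(apps) - APPROVED_APPS)
    if unsanctioned:
        print(f"{device}: unsanctioned apps found -> {', '.join(unsanctioned)}")
```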

5: The worst-case scenario, Nickolas Sharp:

Perhaps the most egregious example of an insider threat is the case of former Ubiquiti engineer Nickolas Sharp. He was sentenced in May to six years in prison for stealing tens of gigabytes of confidential data, demanding a $1.9m ransom from his employer, and then publishing the data when his demands were refused. To make matters worse, he posed as a whistleblower, alleging Ubiquiti had mishandled and downplayed the breach he himself had perpetrated – false reports that wiped over $4bn off the firm’s market value.

Sharp apparently executed his plan out of sheer greed, having already lined up another job elsewhere before stealing the data in question. As a software engineer, he went to great lengths to cover his tracks and was only discovered after a temporary internet outage knocked offline the VPN he was using to stay hidden, exposing his home IP address.

Better communication between IT and other teams can help to mitigate malicious threats like this, says ConnectWise’s Beggs.

“CISOs should work with these teams to identify any behavioural challenges employees may be having, undertaking enhanced monitoring if needed,” he argues. “Cross-office collaboration provides a better understanding of departure motivations, helping alert security teams if access needs to be terminated.”

Skillsoft’s Obudulu adds that IT should continuously review access privileges across files, tools and systems.

“This will ensure only relevant people have access to important information such as financial records,” he argues. “Given nearly one-third of employers have suffered a website hack due to ineffective offboarding, CISOs also need a proper strategy to revoke administrative access and deactivate accounts during the offboarding process.”
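To illustrate the offboarding point, the sketch below cross-references a hypothetical HR roster of current employees against active accounts, flagging any account that should have been deactivated and treating leavers who still hold admin rights as urgent. Field names and data are assumptions made for the example.

```python
# Toy offboarding audit: any active account whose owner is no longer on the
# HR roster should be deactivated; admin rights make it urgent.
# All names, fields and data are hypothetical.
current_employees = {"avi", "bea", "chen"}           # from the HR system

active_accounts = [
    {"username": "avi",  "is_admin": False},
    {"username": "bea",  "is_admin": True},
    {"username": "dana", "is_admin": True},           # left last month, never offboarded
    {"username": "eli",  "is_admin": False},          # contractor whose engagement ended
]

for account in active_accounts:
    if account["username"] not in current_employees:
        urgency = "URGENT (admin rights)" if account["is_admin"] else "routine"
        print(f"Deactivate account '{account['username']}' - {urgency}")
```

Run regularly, a check like this closes the gap Obudulu describes between someone leaving the organisation and their access actually being revoked.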

 

According to DTEX, the vast majority (75%) of insider incidents last year were non-malicious. That’s good news on the face of it, but the cases of deliberate malice usually cause the most damage and are the hardest to mitigate. Security teams will need every tool and technique at their disposal to push back. Cautionary tales like these are a good place to start.

 
