The United States Secret Service, in a joint study with the Carnegie Mellon Software Engineering Institute's CERT, released a report on computer crime called "Insider Threat Study: Illicit Cyber Activity in the Banking and Finance Sector." Of the financial cybercrimes studied, 87 percent were accomplished without sophisticated programming trickery, and in 78 percent of those events, the attackers were authorized financial institution users, often employees, seeking financial gain through known flaws in the system. In other words, we shouldn't always suspect that it's some script kiddie in Romania breaking into our banks and financial services institutions; maybe it's the person sitting in the cubicle in the center of the bank.
The Secret Service and Carnegie Mellon research study looked at 23 financial services cybercrimes involving 26 individuals between 1996 and 2002. Of those, 81 percent of the crimes were preplanned, and the same percentage of the crimes were committed specifically for financial gain, rather than from a desire to do damage. Within the cases studied, 30 percent of losses to financial institutions were in excess of half a million dollars.
Foxes guarding the henhouses
In one of the cases, the Secret Service discovered that a contractor writing software for a financial institution learned enough of the network's architecture to manipulate accounts; another wrote code to conceal illegal trading activity; and yet another planted a logic bomb set to detonate at some future date (long after the contracting work was done). In some cases, one supervisor was responsible for both the insider and the auditors who might be able to sniff out fraud; often that supervisor was unable to devote the additional resources necessary for a full investigation. In one case, a credit union employee was terminated, and while his internal account access was disabled, his remote access and root password access to the network remained, allowing him to enter the network from home at a later date.
In another case, the credit card account manager at a financial institution changed the address on an account, ordered a new credit card and PIN, then withdrew money from the account. Although he covered his tracks by restoring the original address information, a review of the logs revealed the change to the account. Thus, the Secret Service report recommends logging as much data-access information as possible, such as read, modify, and delete dates and times.
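That recommendation can be sketched as a simple append-only audit trail. This is a minimal illustration, not the report's own design; the function name, file path, and JSON fields are all mine. The point is that even a "restored" address change leaves two entries in the log, which is exactly what tripped up the account manager above.

```python
import json
import time

def audit(log_path, user, action, record_id, details=None):
    """Append one audit entry; actions might be 'read', 'modify', 'delete'.
    Appending (never rewriting) means a cover-up adds entries instead of
    erasing them."""
    entry = {
        "ts": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "user": user,
        "action": action,
        "record": record_id,
        "details": details or {},
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Hypothetical replay of the case above: the address is changed, then
# "restored" -- but the log now holds both modifications.
audit("audit.log", "acct_mgr_7", "modify", "acct:1234", {"field": "address"})
audit("audit.log", "acct_mgr_7", "modify", "acct:1234", {"field": "address"})
```

In production you'd also want the log on a host the monitored users can't write to; a log an insider can edit proves nothing.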
When criminals don't fit a pattern
But who are these employees? Inside criminals ranged in age from 18 to 59 years old, with 42 percent female, 54 percent single, and 31 percent married. The insiders came from varied racial and ethnic backgrounds. Thirty-one percent were in service trades, 23 percent were in administration, 23 percent were technical, and 19 percent were professionals. Only 17 percent had administrative privileges prior to an incident. Twenty-seven percent had prior arrests; only 15 percent of these individuals had been deemed by management to be problem employees. Thirteen percent showed previous interest in "hacking," but only 5 percent were ranked by supervisors as "untrustworthy." Nineteen percent were perceived by superiors to be disgruntled employees.
Here's what's surprising to me: 27 percent had come to the attention of supervisors for suspicious activities prior to the crime they were caught committing. These included increasing complaints regarding salary, increased personal phone use in the office, refusal to work with new supervisors, increased outbursts, and isolation from coworkers. The behaviors mentioned are not always indicators of future criminal activities, but they should at least prompt guidance from human resources and, in the case of handling sensitive information, perhaps even warrant relocation within the company to handle less sensitive data (remember, we're talking financial institutions).
When security software doesn't work
Another interesting finding: in 61 percent of the cases, insiders were caught through classic auditing methods, not automated security software. The report found that anomaly-detection software, such as applications that monitor predefined user behavior, is very expensive and only minimally effective. Thirty-five percent of the insiders were caught through customer complaints; 26 percent after system failures (for example, after the insider crashed the system); 20 percent through routine auditing; and 13 percent were found out by their supervisors. Given that the profile of the typical inside criminal is all over the map, it doesn't surprise me that behavior-predicting software would fail to catch these individuals. Good human management is the better solution here.
This is just one study
The Secret Service and Carnegie Mellon limited this report to financial institutions, but I'd be willing to bet the data would be similar for medical, academic, and technical institutions as well. Given the lack of sophistication used in these attacks and the fact that they often happen from within a secure perimeter, companies need to take a serious look at security basics, such as implementing passwords, requiring employees to change those passwords frequently, and encouraging employees and supervisors to discuss potential behavior problems early on, before they escalate to a criminal level. Clearly, much of the money businesses spend on perimeter defenses could be better spent elsewhere.
Got a security question? Let me hear about it!