Antivirus products like to make a point of popping up a small window in the system tray to show you that they have updated their detection definitions. So your software is up to date and ready to catch all the latest malware, right?
In a test described in its State of Infections Report Q4 2014, Damballa analyzed tens of thousands of sample files that enterprise organizations sent in for review. The files that its Failsafe scanning system detected as malicious were also scanned by the four most commonly deployed antivirus products, although Damballa declined to name names.
It found that within the first hour of identifying suspicious code, the antivirus products caught only 30% of the malware. After 24 hours, 66% of the files had been identified as malicious, which means one-third were still slipping through. After seven days, the identification rate rose to 72%, and after one month to 93%. It took six months before all of the malicious files were identified.
This kind of inaccuracy is compounded by the sheer volume of attacks on companies on any given day. Damballa cited a 2015 Ponemon Institute report that showed the average enterprise receives 17,000 malware alerts weekly from its IT security products. Only 19% of those alerts are deemed reliable and just 4% are ever investigated, which suggests security teams don’t have the time or resources to keep up.
In a real-world environment, an antivirus product typically scans a file just once, usually when it first arrives via email. If the average security team receives 17,000 weekly alerts, or roughly 2,430 a day, then by Damballa’s reckoning AV products with a 30% day-one detection rate would miss 796 malicious files every day.
Damballa’s conclusion is that while prevention-based defenses remain important, companies need to put greater emphasis on detection and response. “If you can reduce the time between the initial infection and its discovery and remediation, you reduce your risk of damage,” it wrote.
Naturally, Damballa happens to sell one of those discovery solutions, but its recommendations were not entirely self-serving. It recommends automation to handle detection, since 86% of companies surveyed report being short-staffed with cybersecurity experts.
“If security teams can integrate high-fidelity detection with response mechanisms, like endpoint security tools and network access control systems, they can make headway. Instead of a judgment call, decisions are policy-driven,” it said.
Mind the gaps
Don’t be mistaken: antivirus software is a crucial part of any security arsenal and every day malware scanners the world over detect and throttle millions of malicious software strains. This is not a category of software that we should live without.
Antivirus tools work by scanning both static files and programs running in memory. They use several techniques to try to detect malicious activity.
Signature scanning, which looks for known byte patterns in files, is a well-established method of finding software nasties. Behavioral analysis complements it: scanning code runs in memory, watching for potentially malicious activity as it happens.
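As a simplified illustration of the signature-scanning idea, here is a minimal sketch in Python. The byte patterns and detection names are invented for the example; real engines use vast, frequently updated signature databases and far more efficient matching algorithms.

```python
# Toy signature scanner: look for known byte patterns in file data.
# The "signatures" below are made-up examples, not real malware patterns.
SIGNATURES = {
    b"EVIL_PAYLOAD_V1": "Example.Trojan.A",
    b"\xde\xad\xbe\xef": "Example.Dropper.B",
}

def scan_bytes(data: bytes) -> list[str]:
    """Return the detection names of any signatures found in the data."""
    return [name for pattern, name in SIGNATURES.items() if pattern in data]

sample = b"harmless header...EVIL_PAYLOAD_V1...rest of file"
print(scan_bytes(sample))  # -> ['Example.Trojan.A']
```

Matching raw substrings like this is why signature scanning is fast but brittle: any malware variant whose bytes differ from the recorded pattern slips straight through, which is exactly the gap that behavioral analysis tries to close.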
These are solid, reliable tools, but when attackers are determined enough, antivirus software alone may not stop them from grabbing your data.
The malware industry thrives on zero-day attacks – exploits using obscure or completely unknown vulnerabilities. A hacker smart enough to devise one – and there are plenty – can get past malware detectors.
The smart IT manager uses complementary technologies to reduce the risk of attack. One approach is to examine the potential delivery channels for malware.
Web protection software can reduce that risk by blacklisting certain sites or categories of sites. Filtering web access is a good way to cut the risk of infection, simply by prohibiting access to sites that are not needed for work.
It is also a worthy complement to antivirus software, which will attempt to detect anything installed via the browser. This multi-faceted protection is a basic tenet of modern cyber security.
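A toy sketch of the blacklist idea, assuming a simple set-based blocklist (the domain names are invented; commercial products use large, curated category databases):

```python
# Minimal sketch of web filtering by domain blacklist. Blocks a URL
# if its host is a blocked domain or any subdomain of one.
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"malware-example.test", "gambling-example.test"}

def is_blocked(url: str) -> bool:
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

print(is_blocked("http://downloads.malware-example.test/payload.exe"))  # True
print(is_blocked("https://intranet.example.com/reports"))               # False
```

Checking subdomains as well as the bare domain matters: attackers routinely host payloads on throwaway subdomains of a known-bad parent.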
Another important vector is email, which has gained huge traction among attackers, who use it for phishing and, in some cases, spear phishing that targets specific companies.
Attackers can gather information about a company’s organizational structure and employees from an endless list of sources, ranging from annual reports through to social media posts.
That information can then be used to socially engineer employees into revealing login details or opening a file containing a zero-day exploit.
Employee training is all-important here, but it must be backed by a technological solution too. All it takes is for one user to open a file, or click a link to a fake IT administrator page and enter their single sign-on password as part of a bogus “security audit”, and you can wave goodbye to the integrity of your network.
The best way to counter threats delivered via email is to choke them off before employees even see them. Monitoring and filtering emails is therefore an important part of any corporate cyber-security strategy.
Email can be scanned for viruses, and it can be controlled still further by scanning for known spam signatures and characteristics. This alone can root out the lion’s share of malicious or pestering emails, increasing employee productivity as well as reducing the risk of compromise.
Adding blacklists for known bad domains and whitelists for recognized sources, such as business partners and customers, is a useful extra technique for locking email down.
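A simplified sketch of that sender-domain triage, assuming allow and block lists of domains (all the domains here are invented examples; a real gateway would layer this on top of spam-signature and virus scanning):

```python
# Triage inbound mail by sender domain: deliver known partners,
# reject known-bad senders, and hold everything else for scanning.
from email.utils import parseaddr

ALLOWLIST = {"trusted-partner.test", "bigcustomer.test"}
BLOCKLIST = {"spam-factory.test"}

def triage(from_header: str) -> str:
    _, addr = parseaddr(from_header)
    domain = addr.rpartition("@")[2].lower()
    if domain in ALLOWLIST:
        return "deliver"
    if domain in BLOCKLIST:
        return "reject"
    return "quarantine-for-scanning"

print(triage("Alice <alice@trusted-partner.test>"))  # deliver
print(triage("offers@spam-factory.test"))            # reject
```

Note that sender addresses are trivially forged, which is why list-based triage is only one layer: anything not explicitly allowed still goes through full content scanning rather than straight to the inbox.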
The further companies can keep unscrubbed email away from their IT architectures the better. Unfiltered email streams contain not only infected files but also large volumes of spam, which serve only to clog bandwidth and servers.
Having these filtered offsite by a third-party service mitigates the problem, ensuring that only clean communications touch company servers.
Patch and mend
Even after all these measures have been taken, there is still the chance that a company’s systems can be compromised.
The likes of Gonzalez, or the Sony Pictures hackers, are determined assailants. The battle doesn’t stop with web protection or email scanning.
Keeping the software running on the network up to date is another important aspect of any cyber-security strategy, because it stops attackers exploiting known vulnerabilities in operating systems and applications.
Patch management processes and tools are critical, especially as companies grow larger and IT infrastructures become more complex. Understanding what has been rolled out and when can help IT administrators prevent dangerous holes from appearing in the system.
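As a hypothetical sketch of what “understanding what has been rolled out” can look like in practice, the snippet below compares deployed package versions against a desired baseline to flag unpatched hosts. The host names, package names, and version numbers are all invented; real patch-management tools pull this inventory automatically and understand version ordering properly.

```python
# Hypothetical patch-gap check: report any host whose installed
# package versions differ from the approved baseline. All names
# and versions are invented examples.
deployed = {
    "host-01": {"openssl": "3.0.2", "browser": "121.0"},
    "host-02": {"openssl": "3.0.13", "browser": "121.0"},
}
baseline = {"openssl": "3.0.13", "browser": "121.0"}

def unpatched(deployed: dict, baseline: dict) -> dict:
    """Map each out-of-date host to its stale packages and versions."""
    gaps = {}
    for host, pkgs in deployed.items():
        stale = {p: v for p, v in pkgs.items() if v != baseline.get(p)}
        if stale:
            gaps[host] = stale
    return gaps

print(unpatched(deployed, baseline))  # {'host-01': {'openssl': '3.0.2'}}
```

Even a crude report like this gives administrators the one thing attackers count on them not having: a current picture of where the known holes are.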
All of these measures, layered onto antivirus software, can help to reduce the risk of a successful cyber attack.
Here’s the dirty little secret of cyber security, though: nothing is 100 per cent secure. The key is to make things so difficult for attackers that they decide to move on to easier targets.
The way to do that is to layer your defenses, using multiple tools and protecting different parts and communications channels of the IT infrastructure.
Managing these defenses centrally also gives you a single point of visibility, helping you not only to quash incidental attacks but also to spot any emerging trends that could indicate a sustained, targeted assault on your company.
This concept reflects a long-established military strategy: defense in depth, in which successive layers wear down an attacker’s ability to mount an offensive.