Posted on July 27, 2015 by Venkat Sundaram
Data breaches have become an everyday occurrence. Remediation is expensive, and full resolution of a breach costs far more. The overall impact depends on the type of data stolen; consequences range from threats to national security and industrial espionage to credit fraud, risk to personal finances, and the economic fallout from stolen trade secrets. A 2013 Ponemon Institute study of health data put the average out-of-pocket cost incurred by victims of medical identity theft at $18,660 per person. Ponemon's report on “Malware Containment,” released in January 2015, also found that of 3,218 alerts, only 19% were reliable and just 705 were investigated. We should recognize that the end goal of most malicious actors upon intrusion is to escalate privileges and access data.
Over the last few years, we have seen incidents that stem from a variety of motivations. Some have been identified as inside jobs, some stem from lost computers, and others from poor security practices or processes. Irrespective of the post-breach analysis, all of these incidents exploited a vulnerability, whether in people, process or technology. The breach is always attributable to some form of access abuse, which ultimately leads to data-loss notification and disclosure.
Today, data is everywhere – on mobile devices, with remote employees, in co-location data centers, and on cloud infrastructure both inside and outside the enterprise boundary – and it is in motion, at rest and in use. The attack surface is very large, and without an enterprise-wide approach it’s nearly impossible to manage and protect data assets from malicious actors. Understanding their modus operandi provides some clarity around who, what, why, when, where and how. Typically an attempt to breach a network for some type of gain is segmented into three basic stages:
1. Reconnaissance and targeting – scanning for exposed systems and selecting a victim
2. Intrusion – exploiting a vulnerability or abusing access to gain a foothold and escalate privileges
3. Exfiltration – locating the data of interest and moving it out of the network
This inexpensive three-stage attack methodology, combined with social engineering and a plethora of vulnerable systems, puts system owners at great risk. System owners typically have finite resources, contrasted with the vast resources of malicious actors, which include free automated reconnaissance tools that many organizations are simply unaware of. After reconnaissance and targeting, the next two stages of a breach are relatively easy: not all systems are patched consistently, multi-factor authentication is not enforced, privileged user access is not monitored, and, most importantly, systems are not designed with security from the ground up. To complicate the defensive security posture, a large gap exists within organizational roles and responsibilities – application owners and data custodians optimize system performance and invest heavily in user experience, while the information security organization monitors network-related issues with a plethora of tools and performs periodic certification of systems. This separation of priorities leaves a very large attack surface, and most attacks are not detected for months.
Data theft actors are very sophisticated and utilize other service providers to facilitate attacks – many use malware toolkits sold on the dark web with add-on services like 24x7 live agent support. This is far more sophisticated than a detailed system security test and security certification, which is usually valid for just an instant in time. The gap between defensive rigor and the sophistication and low operating cost of a malicious actor is a very hard pill to swallow. The goals for an organization should be to limit the time malware has to replicate and to shorten incident response time should there be a breach. Organizations should build security into the dev-ops process to harden exposed targets, patch known vulnerabilities, enforce good identity management practices, and protect data at rest, in use and in motion. Information security is not just a pile of gear like firewalls and intrusion detection systems, but rather a combination of people, process and technology that work together to create usable and secure systems.
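As one concrete illustration of protecting data at rest, an integrity seal lets you detect tampering with stored records. The sketch below is hypothetical (the key and record are illustrative, and in practice the key would live in a key-management system, not in source code); it uses only the Python standard library and shows integrity protection only, not confidentiality, which would additionally require encryption via a crypto library or a database feature such as transparent data encryption.

```python
import hashlib
import hmac

# Hypothetical sketch: sealing a record at rest with an HMAC so tampering
# is detectable. The key below is illustrative; a real deployment would
# fetch it from a key-management system (key escrow), never hard-code it.
KEY = b"demo-key-from-kms"


def seal(record: bytes) -> bytes:
    """Return an HMAC-SHA256 tag to store alongside the record."""
    return hmac.new(KEY, record, hashlib.sha256).digest()


def verify(record: bytes, tag: bytes) -> bool:
    """Constant-time check that the stored record was not modified."""
    return hmac.compare_digest(seal(record), tag)


row = b"acct=1001;balance=250.00"
tag = seal(row)
print(verify(row, tag))                          # True: record is intact
print(verify(b"acct=1001;balance=999.99", tag))  # False: record was altered
```

`hmac.compare_digest` is used instead of `==` so the comparison takes constant time, closing off timing side channels when the check runs on an exposed system.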
CrowdStrike, a cyber-intelligence leader, analyzed the “Deep Panda” executable: a dropper that self-extracts a malicious payload onto the target system as a dynamic-link library, its clear-text function names obfuscated by single-character substitutions and hidden within the dropper binary. The payload, a fully functioning tool with encrypted communication channels to command-and-control locations, also sets up remote desktop functionality, like VNC over a custom protocol, allowing the adversary to view the target graphically and control the keyboard and mouse. CrowdStrike also noted that the malware had sophisticated controls such as multiple-instance protection, anti-debugging protection, hooking and helper functions, and obfuscated registry entries, showing that the code samples drew on a variety of tools, techniques and procedures (TTPs). In this example the actors assume the role of legitimate users and traverse the network to identify systems of interest. They often lie dormant for periods of time and act as helpers to other malicious actors attacking these systems. Many times, the goldmine systems are relational database assets that hold data critical to operations. Over the years, Oracle has addressed this issue to become a leader in database security; it also provides capabilities that extend to IBM and Microsoft database technologies to monitor and enforce rules on data-tier traffic by IP address, user attributes, protocol and more.
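To illustrate how cheap the string hiding CrowdStrike describes can be, here is a hypothetical Python sketch of single-character substitution applied to function-name strings. This is not the actual Deep Panda algorithm, just a minimal example of the technique: each character is swapped via a fixed table, which defeats naive string searches yet is trivially reversed once an analyst recovers the table.

```python
# Hypothetical sketch of single-character substitution obfuscation,
# the kind of trivial string hiding described in the Deep Panda
# analysis. Not the actual malware algorithm.

# Fixed substitution table: shift each lowercase letter forward by one.
SUB = str.maketrans("abcdefghijklmnopqrstuvwxyz",
                    "bcdefghijklmnopqrstuvwxyza")
# An analyst who recovers the table simply inverts it.
REV = str.maketrans("bcdefghijklmnopqrstuvwxyza",
                    "abcdefghijklmnopqrstuvwxyz")


def obfuscate(name: str) -> str:
    """Hide a clear-text function name with a per-character substitution."""
    return name.translate(SUB)


def deobfuscate(blob: str) -> str:
    """Recover the original name using the inverted table."""
    return blob.translate(REV)


hidden = obfuscate("createremotethread")
print(hidden)               # dsfbufsfnpufuisfbe
print(deobfuscate(hidden))  # createremotethread
```

The point of the example is how asymmetric the economics are: the obfuscation costs the attacker a few lines, while the defender's signature-based tooling no longer matches the original strings.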
It’s not all doom and gloom with data security. Mythics has developed a five-step, repeatable process for managing information technology risks and demonstrating techniques to identify sensitive data and protect it. For more information visit https://www.mythics.com/datasecurity. At a high level, the art of mitigating risk to information technology assets is threefold: people, process and technology. At a domain level, it requires an understanding of TTPs, network protections, system patching and identity management. Very often, system-specific risk attributes are forgotten because of the segregation of duties between security staff focused on network protections and application owners focused on system performance. For example, not all systems have a complete inventory of data elements, an analysis of attributes, a classification of personally identifiable information, personal health information or other sensitive data, key escrow, or access controls, among other things. If you cannot define what data is at risk, how can you protect it adequately?
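As a minimal sketch of that first discovery step, identifying sensitive data elements before you can classify and protect them, consider a simple pattern-based scanner. The categories and regular expressions below are illustrative only (they are not Mythics' methodology, and real classifiers must handle far more formats and false positives), but they show the basic shape of a data-discovery pass:

```python
import re

# Hypothetical data-discovery sketch: the categories and patterns are
# illustrative, not an exhaustive or production-grade classifier.
PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}


def classify(text: str) -> dict:
    """Return a map of sensitive-data category -> matches found in text."""
    found = {}
    for label, pattern in PATTERNS.items():
        hits = pattern.findall(text)
        if hits:
            found[label] = hits
    return found


sample = "Contact jdoe@example.com, SSN 123-45-6789, phone 555-867-5309."
print(classify(sample))
```

Even a toy scan like this answers the question the paragraph above poses: until you can enumerate which fields hold PII, PHI or other sensitive data, you cannot decide where encryption, masking or access controls are needed.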
Over the course of the next few weeks, our blog will cover practical strategies for improving data security, best practices for implementing security controls around Oracle technology, and policies that help enforce compliance. We will also discuss breach notification laws and safe harbor use cases. If you or your team is exploring a strategy to evaluate your data risk and then remediate it, drop us a line or review our Data Security programs to see if we are a good fit to help. We have helped Federal agencies, State & Local governments, and some of the most famous commercial brands in the world develop processes and leverage technology (often technology they already own) to better protect their data now and in the future.
Venkat Sundaram, CISSP, Enterprise Architect, Mythics Inc.
Next security blog articles will include: