As flow data is rich in metadata and continues to be extended with more context, NetFlow forensics offers an effective way to address a large part of network security.
Topics: Network Forensics
Part 3 in our series, "How to counter-punch botnets, viruses, ToR and more with Netflow," focuses on ToR threats to the enterprise.
ToR (aka onion routing) and anonymized p2p relay services such as Freenet are where we can expect to see many more attacks, as well as malevolent actors who are out to deny your service or steal your valuable data. It's useful to recognize that flow analytics provides the best and cheapest means of de-anonymizing or profiling this traffic.
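One common flow-based approach to profiling this traffic is matching flow peer addresses against a published list of known ToR relay IPs. The sketch below illustrates the idea with a hypothetical relay list and a generic flow-record layout; a real deployment would consume your flow exporter's actual schema and a regularly refreshed relay list.

```python
# Minimal sketch, assuming flow records as dicts and a hypothetical
# (hard-coded) relay list. Not a specific vendor's schema or feed.
TOR_RELAY_IPS = {"185.220.101.1", "199.87.154.255"}  # illustrative sample

def profile_tor_flows(flows):
    """Return flows whose source or destination is a known ToR relay."""
    suspect = []
    for flow in flows:
        if flow["src_ip"] in TOR_RELAY_IPS or flow["dst_ip"] in TOR_RELAY_IPS:
            suspect.append(flow)
    return suspect

# Example flows: one internal host talking to a listed relay, one normal.
flows = [
    {"src_ip": "10.0.0.5", "dst_ip": "185.220.101.1", "bytes": 48212},
    {"src_ip": "10.0.0.7", "dst_ip": "93.184.216.34", "bytes": 1204},
]
suspect_flows = profile_tor_flows(flows)
```

Because flow records carry the end-point addresses regardless of what the payload looks like, this kind of matching works even when the traffic itself is encrypted.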
Data Retention Compliance
Hosts that communicate with more than one known threat type should be designated high risk. Repeated threat breaches involving those hosts or their dependent hosts can be marked as repeat offenders, providing an early warning of a breach or an attack.
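The scoring idea above can be sketched in a few lines. The event format here, a hypothetical (host, threat_type) pair, and the repeat threshold are assumptions for illustration, not a specific product's data model.

```python
from collections import defaultdict

def score_hosts(events, repeat_threshold=3):
    """Mark hosts seen with multiple threat types as high risk,
    and hosts with repeated hits as repeat offenders."""
    threat_types = defaultdict(set)   # host -> distinct threat categories
    hit_counts = defaultdict(int)     # host -> total threat hits
    for host, threat_type in events:
        threat_types[host].add(threat_type)
        hit_counts[host] += 1
    return {
        host: {
            "high_risk": len(threat_types[host]) > 1,
            "repeat_offender": hit_counts[host] >= repeat_threshold,
        }
        for host in threat_types
    }

# Illustrative events: one host hit by two threat types, three times total.
events = [
    ("10.0.0.5", "botnet"), ("10.0.0.5", "tor"), ("10.0.0.5", "botnet"),
    ("10.0.0.9", "malware"),
]
scores = score_hosts(events)
```

Feeding such a score back into the flow analyzer is what turns raw threat matches into the early-warning signal described above.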
It would be negligent of me not to mention that the same flow-based end-point threat detection techniques can be used as part of data retention compliance. In my opinion, this approach better preserves individual privacy: rather than monitoring everyone, analysts can focus on profiling known bad end-points, and qualify visitors to known end-points used in illicit p2p swap sessions, or to specific kinds of subversive or dangerous sites that have hosted such traffic in the past.
You can't secure what you can't see and you don’t know what you don’t know.
Many network and security professionals assume they can simply analyze data captured by their standard security devices, such as firewalls and intrusion detection systems. They quickly discover these devices' limitations: they are not designed to record and report on every transaction, lacking the necessary granularity, scalability and historic data retention. Network devices such as routers, switches, Wi-Fi access points or VMware servers also typically lack any sophisticated anti-virus software.
The mark of a well-constructed traffic analyzer is presenting information in a way that lets security teams act quickly: simple views, backed by deep contextual data that supports the summaries. Teams are not bogged down in detail unless it is required, and even then the tool offers elegant means of extracting forensics, with simple but powerful visuals for quickly grasping the context and impact of a security event.
In the age of the Internet of Things (IoT), billions of connected devices - estimated at 20 billion by the year 2020 - are set to permeate virtually every aspect of daily life and industry. Sensors that track human movement in times of natural disasters, kitchen appliances reminding us to top up on food supplies and even military implementations such as situational awareness in wartime are just a few examples of IoT in action.
Exciting as these times may be, they also highlight a new set of risk factors for Security Specialists who need to answer the call for more vigorous, robust and proactive security solutions. Considerations around security monitoring and management are set to expand far beyond today's norms as entry points, data volumes and connected hardware multiply at increasing rates in the age of hyper-interconnectedness.
Topics: Cyber Security
From helping prevent loss of life in the event of a natural disaster, to aiding marketing teams in designing more targeted strategies to reach new customers, big data seems to be the chief talking point amongst a broad and diverse circle of professionals.
For Security Engineers, big data analytics is proving to be an effective defense against evolving network intrusions, thanks to the delivery of near real-time insights based on high volumes of diverse network data. This is largely due to technological advances that make it possible to transmit, capture, store and analyze swathes of data on high-powered, relatively low-cost computing systems.
With the increasing abstraction of IT services beyond the traditional server room, computing environments have evolved to be more efficient but also far more complex. Virtualization, mobile device technology, hosted infrastructure, Internet ubiquity and a host of other technologies are redefining the IT landscape.
From a cybersecurity standpoint, the question is how best to manage the growing complexity of environments and the changes in network behavior that come with every new technology.
In this blog, we'll take a look at how anomaly detection-based systems add an invaluable weapon to Security Analysts' arsenal in the battle against known and unknown security risks that threaten the stability of today's complex enterprise environments.
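At its core, anomaly detection compares current behavior against a learned baseline and flags large deviations. The sketch below shows the simplest form of this idea, a standard-deviation check on traffic volume per interval; the sample data and the threshold of three standard deviations are illustrative assumptions, and a real system would use richer features than byte counts alone.

```python
import statistics

def is_anomalous(history, current, k=3.0):
    """Flag `current` if it deviates from the baseline `history`
    by more than k standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return current != mean  # flat baseline: any change is anomalous
    return abs(current - mean) / stdev > k

# Hypothetical baseline: bytes observed per monitoring interval.
baseline = [1000, 1100, 950, 1050, 990, 1020]
normal = is_anomalous(baseline, 1015)    # within the baseline's spread
spike = is_anomalous(baseline, 25000)    # far outside it
```

The appeal of this approach is that it needs no signature of the attack: anything that departs sharply from the host's own history, known threat or not, gets surfaced for investigation.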
Topics: Anomaly Detection
Information retention, protection and data compliance demands are an important concern for modern organizations. And with data being generated at staggering rates, and new entry points to networks (mobile devices, wireless networks, etc.) adding their own levels of complexity, adherence to compliance obligations can prove challenging. In addition, when considering high-profile network hacks such as the Sony, Dropbox and Target intrusions, it quickly becomes clear that no organization is immune to the possibility of having its systems compromised. This backdrop demonstrates the importance of finding a network monitoring solution that can walk the tightrope between meeting regulatory requirements and not placing too much strain on hardware resources.
With the pace at which the social, mobile, analytics and cloud (SMAC) stack is evolving, IT departments must quickly adapt their security monitoring and prevention strategies to match the ever-changing networking landscape. By the same token, network monitoring solution (NMS) developers must walk a tightrope of their own: providing the detail and visibility their users need, without a cost to network performance. Yet much of security forensics depends on the ability to drill down into both live and historic data to identify how intrusions and attacks occur. This leads to the question: what is the right balance between collecting enough data to gain the front foot in network security management, and ensuring performance isn't compromised in the process?