Big data offers cyber-attack early warning

Connecting more devices to the internet is increasing the attack surface for hackers. But delegates at a techUK briefing in London yesterday were told that the sheer volume of new data generated by ever more sources also presents an opportunity: big data analytics can provide a more granular picture of what activity is normal – and thus identify abnormal behaviour earlier.

A key point made during presentations was that simply identifying an attack signature would not prepare you for a new type of attack, whereas big data analysis in near-real time – based on an understanding of attack methods and approaches – can potentially identify new types of attacks on a network while they are happening and, ideally, before they have accomplished their objectives.
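
By way of illustration, here is a minimal sketch of that kind of behavioural baselining – flagging deviation from a host's own history rather than matching a known signature. The metric, window size and threshold below are illustrative assumptions, not details given at the event:

```python
import statistics
from collections import defaultdict, deque

# Rolling per-host baselines of outbound connection counts per minute.
# Window and z-score threshold are illustrative values, not from the event.
WINDOW = 60          # minutes of history kept per host
Z_THRESHOLD = 3.0    # how many standard deviations counts as "abnormal"

baselines = defaultdict(lambda: deque(maxlen=WINDOW))

def is_anomalous(host, connections_this_minute):
    """Flag activity that deviates sharply from this host's own baseline,
    rather than matching a known attack signature."""
    history = baselines[host]
    anomalous = False
    if len(history) >= 10:  # need some history before judging
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1.0  # avoid divide-by-zero
        z = (connections_this_minute - mean) / stdev
        anomalous = z > Z_THRESHOLD
    history.append(connections_this_minute)
    return anomalous

# Typical traffic builds the baseline; the sudden spike at the end is flagged
# even though it matches no previously catalogued signature.
for minute, count in enumerate([12, 9, 14, 11, 10, 13, 12, 11, 10, 12, 95]):
    if is_anomalous("host-a", count):
        print(f"minute {minute}: {count} connections looks abnormal for host-a")
```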

During the event, entitled How can big data analytics predict, prevent and manage cyber threats?, moderator Sue Daley, head of programme, big data, cloud and mobile at techUK, asked panellists, “How do you get board-level buy-in to use data analytics for cyber defence?”

Panellist Bob Tarzey, analyst at Quocirca, suggested that awareness of companies being targeted by hackers had risen across organisations up to board level, and that his own company's research report – Masters of Machines II – had shown that better insight at board level delivered better security awareness. This in turn increases concern about cyber-security, which is translating into increased expenditure.

However, while this is certainly a positive trend, he noted that compliance with legal obligations remains the main driver of expenditure on cyber-security.

James Hodge, principal product manager at Splunk, agreed, but explained that his approach with clients was to look at business impacts first, to see where investment in cyber-security might make the biggest difference.

Tony Marques, lead cyber-security consultant at Endcode Group, agreed, noting that well-publicised breaches meant it was less of a challenge than three years ago to convince companies that their perimeter defences would be overcome, and that what mattered now was to mitigate the effect by detecting breaches early.

However, he added, the biggest challenge was translating awareness of breaches into something meaningful for assessing the risk – including the cost of remediation, the impact on the business in terms of human resources, time and materials, and potential reputational damage to the company and to the CEO personally.

During the Q&A session, many questions concerned whether SMEs could afford big data analysis in real time, and whether public bodies could integrate disparate legacy systems to get a unified view.

The vendors present responded that there were tools, including cloud-based options, that could scale to organisations of any size – starting with those appropriate for, say, a company with a turnover of £10 million, and upwards. Integrating different data sources via APIs could present difficulties with incompatible products, but processing the sheer volume of data was seen as simply a hardware issue.

The point was not to build a single database but to bring data onto a common platform where data of interest could be searched. For organisations with large legacy infrastructure, the advice was to adopt cloud services for new projects.
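
A minimal sketch of what a “common platform” can mean in practice is normalising incompatible feeds into one schema, so that a single search spans both. The field names and formats below are invented for illustration and are not taken from any product discussed:

```python
import json

# Hypothetical raw events from two incompatible legacy sources; field names
# are invented for illustration, not drawn from any specific product.
firewall_event = {"ts": "2015-07-15T10:42:00Z", "src": "10.0.0.5", "action": "DENY"}
app_log_line = "2015-07-15 10:42:03 user=jsmith status=login_failed"

def normalise_firewall(evt):
    # Map the firewall's native fields onto the common schema.
    return {"time": evt["ts"], "source": "firewall",
            "actor": evt["src"], "event": evt["action"].lower()}

def normalise_app_log(line):
    # Parse a space-delimited key=value application log line.
    date, time_, *pairs = line.split()
    fields = dict(p.split("=", 1) for p in pairs)
    return {"time": f"{date}T{time_}Z", "source": "app",
            "actor": fields["user"], "event": fields["status"]}

# With a common schema, one query covers both feeds.
events = [normalise_firewall(firewall_event), normalise_app_log(app_log_line)]
failures = [e for e in events if "fail" in e["event"] or e["event"] == "deny"]
print(json.dumps(failures, indent=2))
```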

The benefit of early detection was seen as greater for small organisations, as they are less able to absorb the impact of a malicious breach – and, of course, via the supply chain, these smaller companies in turn affect larger enterprises.

Hodge noted that the cloud is actually easier to secure as a smaller number of people within the client organisation have access, and very few within the cloud provider have access to the data, reducing the insider threat.

Looking to the future, the growing amount of data available on every individual would greatly increase the granularity of response, both in offering services and in building defences: knowing what is typical behaviour not only for that role, in that company and in that industry, but for that individual.
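
One way to picture that granularity is a hierarchy of baselines that falls back from the individual to the role and then the industry when finer-grained data is not yet available. The names and figures in this sketch are assumed purely for illustration:

```python
# Baselines keyed from most specific (individual) to most general (industry).
# All names and numbers are illustrative assumptions.
baselines = {
    ("finance", "acme", "analyst", "jsmith"): {"logins_per_day": 9},
    ("finance", "acme", "analyst", None):     {"logins_per_day": 11},
    ("finance", None, None, None):            {"logins_per_day": 14},
}

def expected_behaviour(industry, company, role, user):
    """Return the most specific baseline available for this person,
    falling back to role-level and industry-level norms."""
    for key in [(industry, company, role, user),
                (industry, company, role, None),
                (industry, None, None, None)]:
        if key in baselines:
            return baselines[key]
    return None

print(expected_behaviour("finance", "acme", "analyst", "jsmith"))   # individual
print(expected_behaviour("finance", "acme", "analyst", "newhire"))  # role fallback
```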

Machine learning would also help by building up data and an understanding of how a person's behaviour might change from childhood, through adulthood, to retirement.

This would throw up privacy issues, including who owns the data, who has a right to look at it, and what can be done with it – as well as how securely it is stored, in a format that can be re-used in the future.

Quocirca research showed that enterprise commercial transactions in Europe currently generate an average of 42,929 data elements per day. And one delegate referred to a project which created a zettabyte (10²¹ bytes, or a billion terabytes) of data every few weeks.

Issues then included how to process such volumes of data – and even whether doing so would be worth it. And even with machines processing the bulk of it, where would the analysts come from to deal with the exceptions?

A government response via the education system, working with industry, remains key – a recipe for the skills shortage that is familiar to the cyber-security industry.
