
Reset 2018: All-female expert lineup for cybersec conference breaks mould

Reset 2018, held in central London yesterday, is a cyber-security conference with a difference: 15 female experts in cyber-security explained the evolving cyber-threat landscape, making it perhaps the only cybersec event where women are in the majority (though men were welcome, and made up perhaps 25 to 30 percent of attendees).


Joint organiser Saher Naumaan, threat intelligence analyst at BAE Systems, told SC Media UK that the event is: "Very much about cyber-security content rather than gender, so it's women not simply talking about what it is like to be a woman in the industry, but about cyber-security issues."

Saher Naumaan (left) and Kirsten Ward

Co-organiser Kirsten Ward, threat intelligence analyst at BAE Systems, explained the background to setting up the event, telling SC Media UK: "Attending cyber-security events internationally, there was a lack of diversity in speakers and audiences. But we knew that there are lots of women working in the sector and we wanted to counter the misconception that there are not enough women experts in cyber-security to be speakers. We quickly compiled a list of 100 and started reaching out to them to prove the pipeline shortage was a myth - and it was."


BAE Systems is the primary event sponsor.


Naumaan added: "Discussion of women in STEM and in the industry is important, but today we hope to have realised a content-driven event focussed on the actual work women in the sector do, talking about their subject matter experience and expertise."


SC Media UK runs cyber-security events and strives to ensure diversity among its panellists and presenters, but sometimes it fails, so we asked: what should we do if women cancel and leave us with an all-male panel?


Ward responded: "You should try harder to get a woman to replace her - there is no shortage, despite the imbalance in the industry." She admitted it can sometimes take time and might require more effort, though for this event, she said, they "didn't find it difficult". So we'll be trying harder at SC and reaching out to this clearly identified wider pool.


The event itself, given the founders' backgrounds, was strong on threat intel, but ranged across all branches of the industry, from crime to network security, industrial controls to social engineering, some technical, some strategic - reflecting the breadth of expertise available.


Ward pointed out that the speakers “have diverse backgrounds, experience and education, ranging from computer science to social sciences, and there is a place for all of those in cyber-security - so not just STEM, but say War Studies is just as important. It's not just handling data, but also putting it into context - geopolitical, dynamic environments etc - so different backgrounds and different ways of analysing the problem are a benefit. It's not just one type of person with one type of skillset.”


She concluded: “If I were hiring I'd be focussing on logical thinking rather than hard tech which can be learned.”


There were some 350 registered for the event, with an anticipated two-thirds turnout.


So what about the content?


How do cyber-immune systems compare to biological immune systems?


In the opening presentation of Reset 2018, Mary Haigh, product director at BAE Systems, dissected the analogy between cyber-immune systems and biological immune systems, concluding there were indeed parallels - but that it was not an exact fit.


The idea that a cyber-immune system - adaptive defence - is a self-healing system that adapts to mutations and environmental threats is an attractive one, but it suggests that it will cope by itself - that the analytics will learn without feedback. In reality these systems always need feedback on what is good and bad, says Haigh.


However, she notes that a human immune system also needs a lot of help to remain healthy - eg for the flu virus, we research how it mutates and what vaccinations we need - work that goes on in the background. Then there are the things we choose to do ourselves: if we go to exotic locations like a jungle, we get jabs before we go. So there are background factors, and environmental choices we make ourselves.


Unfortunately we do not have the kind of world map of threats showing what threats we face in each country and what mitigation strategies we should choose. But we can feed in threat intelligence to understand the threat landscape - though it can't be done in isolation. Eg if you were to isolate and close down a server to patch malware and it were mission critical, it could be the wrong business decision. And sometimes we will choose to accept the extra risk for the extra opportunity - whether that is moving to the cloud, taking on new partnerships in a supply chain or undertaking an acquisition. And as with our jabs for an exotic trip, we should be aware of the risk and take appropriate mitigation measures.


Moving on to how we handle the information we get, Haigh cited figures from a Ponemon study in which 77 percent described threat intel reports as a good idea, whereas only 50 percent of incident responders use threat data when deciding how to respond to threats, and only 27 percent actually find threat intel effective for pinpointing cyber-threats.


Reasons suggested included too much data and too much complexity - with 32 percent reporting blocking legitimate traffic due to misinterpreting threat intel. It was suggested that this is down to a failure to have a holistic view, as the data is collected and analysed by different individuals and groups. Also, workflow is not always as timely and accurate as is wanted, and there is a need for normalising data from disparate sources. “It's not easy - users need to embrace the concept (of a holistic approach) and then break it down to usable amounts - but don't just expect self-healing.”
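To illustrate that point about normalising data from disparate sources - a minimal hypothetical sketch, not anything presented by Haigh - the Python below maps two imaginary threat-intel feeds, each with its own field names and confidence scale, into one common indicator format, then filters the result down to a "usable amount". All feed names, fields and values are invented for illustration.

    # Hypothetical sketch: normalising indicators from two imaginary feeds
    # into a single schema, then filtering by confidence.
    from dataclasses import dataclass


    @dataclass
    class Indicator:
        value: str        # eg an IP address, domain or file hash
        kind: str         # "ip", "domain", "sha256", ...
        source: str       # which feed it came from
        confidence: int   # normalised to a 0-100 scale


    def from_feed_a(record: dict) -> Indicator:
        # Feed A (invented) reports confidence as a 0.0-1.0 float under "score".
        return Indicator(
            value=record["ioc"],
            kind=record["ioc_type"],
            source="feed_a",
            confidence=int(record["score"] * 100),
        )


    def from_feed_b(record: dict) -> Indicator:
        # Feed B (invented) already uses a 0-100 integer but different field names.
        return Indicator(
            value=record["indicator"],
            kind=record["type"].lower(),
            source="feed_b",
            confidence=record["confidence"],
        )


    if __name__ == "__main__":
        raw_a = [{"ioc": "203.0.113.7", "ioc_type": "ip", "score": 0.9}]
        raw_b = [{"indicator": "malicious.example.com", "type": "DOMAIN", "confidence": 60}]

        merged = [from_feed_a(r) for r in raw_a] + [from_feed_b(r) for r in raw_b]
        # Keep only the "usable amount": indicators above a confidence threshold.
        for ind in (i for i in merged if i.confidence >= 50):
            print(ind)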


Stuxnet and industrial control attacks


The day's keynote speaker was Kim Zetter, an investigative journalist and author of an acclaimed book on Stuxnet (Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon).


She took the audience through the chronology, not just from introduction to discovery, but from the US investigating the idea that Russia, China or North Korea could cause physical damage with digital code.


The various iterations of Stuxnet were considered, and how they were introduced - naming the supplier companies used - as well as a detailed explanation of how the worm worked.


For those few of our readers who may not know, the Stuxnet worm was the first known digital weapon to impact the physical world. It was initially introduced into the Iranian nuclear facility at Natanz in mid-November 2007 using a USB drive. After recording normal activity, which would subsequently be shown on the monitors during its destructive phase, it would speed up or slow down the centrifuges that processed and enriched uranium hexafluoride gas so that they were out of sync, causing rotors to crash. Safety controls were disabled during this phase. Catastrophic failure was avoided; instead the attack slowed down progress, requiring constant centrifuge replacements while wasting the gas used - presumed to be a ploy to buy time for diplomacy.

To avoid being caught, it scrubbed code blocks on their way to a monitoring station, and then restored the malicious code if new blocks of code were injected.


Then, in June 2010, researchers at anti-virus company VirusBlokAda's offices in Belarus were given remote access to an infected machine and found suspicious files that used zero-day exploits to install malware which deployed only for the specific configuration of this plant, then unleashed seven attacks.


Zetter explained how the discovery followed the Israelis adding reporting/spreading mechanisms to the worm. But Zetter also quoted the US's General Cartwright, speaking during a leak investigation, as saying that there was no point in having a digital weapon if your opponent doesn't know you have one. Hence the suggestion that it was enough to let the Iranians know that everything they were doing was seen.


While Iran would understandably regard the action as sabotage by an enemy state, Zetter notes how it was a very calibrated attack that had strenuously avoided collateral damage.


In contrast, on 23 December 2015, a power company in the Ivano-Frankivsk region of western Ukraine had 60 substations taken offline by cyber-attack, leaving 230,000 residents in the dark for one to six hours, as the malware prevented closing of breakers and locked out controllers, changing their passwords. There was also a DDoS attack on the company's telephone customer call centre, making it harder to know who was impacted and slowing initial reporting of the problem.


So we now know for a fact that attackers can achieve physical destruction with code. Multiple nations now plan to launch their own offensive capability - “at least 20 countries we know of” - and the capability is there for anyone to use for any purpose. Zetter concluded that while Stuxnet was a precision weapon, not everyone will be as capable and careful.


Open banking, opening up to third party risk?

 

Adizah Tejani, director of marketing EMEA at Token and advisor at UCL School of Management, was next up with a very articulate presentation on the need for organisations - banks in particular, as they take on third-party apps and provide API access - to consider security from the outset. Regardless of Brexit, the UK is aligned with the EU in its approach to Open Banking, and thus there is a need to achieve a balance between usability and security. That can be difficult, as we don't know exactly how secure third parties will be: they will have registered to be approved, but we still don't know exactly how secure they are, and thus how secure customer data is. And there is a need for this discussion to happen outside the security team and in the business teams too, hence we need to be able to talk cross-functionally.

 

Public/Private roles in securing cyberspace


A panel discussion, chaired by Naumaan, addressed public/private roles in securing cyberspace. Panellists included Elke Bachler, chief information security officer, Hiscox; Miriam Howe, cyber-security architect, BAE Systems Applied Intelligence; and Emma W, commissioning editor for advice and guidance, NCSC Comms Directorate.


Howe noted that in the private sector security is subjective and down to each organisation's perception of risk.  So it can be difficult for them to prioritise, as: “All clients have more cyber security requirements than budget.”

She added that with WannaCry, we learnt: “That it is possible [for the public sector to help] to limit the outbreak of a big attack by sharing information - but it did not help us learn about why it happened.”


Bachler observed: “We spend a lot of time trying to stop things happening, but we spend less time looking at what we actually do when something happens. We need to prioritise practice, preparing for the worst. It is definitely worth doing as you can't stop everything.”


Howe reiterated that here too: “Agile sharing of information helps tackle the impact of an incident and long term understanding of the threat.”

 

Emma W pointed out that the government (via the NCSC) puts out all sorts of advice, but it's difficult to measure its impact - but the aim is to get everyone on the same page.


Bachler, coming from an insurance background, explained the problems that industry has in trying to quantify how often these things [cyber-attacks/breaches] happen, telling the audience: “The insurance industry creates financial models for, say, a one in 200 event or a one in 2,000 event, and can price accordingly - but in cyber-security we don't know how likely a successful attack is to happen and how much it will cost. If nothing happens, you don't know if it is because your investment was good. There is nothing that says that X amount of investment will give you Y amount of cover.”


She added: “Many in the private sector don't fully understand the implications of cyber-security risk, so they may take an entrepreneurial risk, and there can be an unintended consequence.” As a result it is not clear if a company took a decision not to be protected as a calculated risk, or simply did not realise the potential consequences of the risk it took.

 

It was also pointed out that private companies will take action for their own protection, but their role is not to act for the greater good, as it is with government. Howe said: “Enterprise security does not equal national security, and so the country [government] has to dictate how companies will fit into that, after identifying it.” So regulation and legislation can be used to ensure suppliers and infrastructure that are in the private sector comply with public sector standards to protect the public.

 

So while companies need to protect their own assets for their own benefit, “government needs to judge if an enterprise's activity is relevant to national security, CNI etc, and inform them if they may be vulnerable to attack by a more sophisticated adversary than they realise.”


Bachler agreed that there is: “Definitely a role for regulation as companies can be naive about what they need.”


Though Emma W cautioned that we need to consider how regulation drives outcomes, as it does “not always drive the right outcomes”. This is because legislation is sometimes written quickly, yet legislators find it difficult to keep up with the pace of technological development. She suggested we need “people-centred security - with the emphasis on people, not technology.” She also observed that often we are calling for users to “prioritise security ahead of their job - which is not realistic.”


The complexity of the issue and the diverse range of stakeholders was cited as another reason for encouraging a diversity of intake into the sector, to represent those different interests and approaches.  But apart from an observation from Howe, when looking at the audience, that: “I didn't know there were this many women in cyber-security,” the focus remained on the public/private sector issue.

 

The conclusion was that in both spheres we need to work with people on a level that works for them - eg good enough passwords that work for them in their role. Emma W noted, “Most people do care and want to do things the right way and feel that responsibility and want security at home,” and by building on that we can help educate them about cyber-security.


Bots, trolls and warriors


Other speakers through the day included Andrea Little Limbago, chief social scientist at Endgame, covering warriors, trolls and bots - explaining how today's propagandists are increasingly leveraging AI and automation.

The current unpreparedness to cope with hacking of elections and fake news was summed up in an older quote when Limbago cited former Google CEO Eric Schmidt: “The internet is the first thing that humanity has built that humanity doesn't understand, the largest experiment in anarchy that we have ever had.”

 

And the conclusion was that we will have to build better defences: it is necessary to avoid fighting the last war, hence we must prepare for increasing adversarial information across bots, trolls and warriors. A future Artificial Intelligence ‘Sputnik moment' can be expected.


How to become a cyber-criminal 

 

Rashmi Knoles, field CTO EMEA, RSA Security, also began her presentation by observing that she is used to being the only woman at a cyber-security event, hence this was a novel experience.


She then went on to teach her audience how to be a cyber-criminal using the dark web, with a presentation on six steps to becoming a cyber-criminal. Without giving too much away, the steps entailed deciding your role or specialisation within the gang or organisation; learning or outsourcing the tech skills needed; buying the tools for the job; marketing the information, such as credentials; then cashing out.


Bitcoin figured highly in cashing out, but many of the dark web sites shown were remarkable for how they mimic the legitimate world, with different levels of courses for crime, from entry level to expert carder (the MSc of crime), as well as money-back offers, customer testimonials, loyalty-pricing models, and anything else that a savvy business might deploy to maximise revenue. These gang leaders clearly have the ability to be CEOs in the real world, but maybe they just can't be bothered with buying their product when they can steal it, paying tax when it can be avoided, and dealing with competitors when they can crush or kill them.


The above only represents half the day. SC unfortunately could not attend the afternoon and thus missed Denial of Trust: The New Attack, by Wendy Nather, director of Advisory CISOs, Duo Security.

 

Also missed was the panel session Securing the Unsecureable, with Amber Baldet, co-founder and CEO, Clovyr; Stephanie Edwards, security consultant, MWR InfoSecurity; Zoe Rose, ethical hacker, Baringa Partners; and Ade Adewunmi, industry consultant, Teradata.

 

Then Rebekah Brown, head of threat intelligence, Rapid7, on leveraging threat intel; and North Korea's IT sector and cyber-security exposed, by Andrea Berger, senior research associate and senior program manager, MIIS.
