Unfortunately and unsurprisingly, website breaches have become an everyday occurrence. In fact, hacked websites are now so common that typically only the biggest data breaches capture enough attention to make headlines. Experts have long known this was coming and, honestly, the prediction was easy.
All one had to do was look at the pervasiveness of web use in modern society and the amount of data and money being exchanged online, and then read any industry report about the volume of vulnerabilities exposed on the average website.
With this information in hand, the final magic ingredient is a motivated adversary willing to take advantage of those vulnerabilities.
Historically, it was difficult to predict or quantify the consequences of having little or no website security. Now, after countless breaches, the industry has a fairly good idea.
> See also: Dangerous liaisons: how the Ashley Madison hack ended the age of innocence in cybersecurity
Website breaches lead directly to fraud, identity theft, regulatory fines, brand damage, lawsuits, downtime, malware propagation and loss of customers. Whilst a victimised organisation may ultimately survive a cybercrime incident, the business disruption and losses are often severe.
Website security in 2015
WhiteHat Security’s annual security statistics report this year examined web vulnerabilities in more than 30,000 websites throughout 2014. The findings turned out to be far more serious than anticipated.
The results found that 86% of these websites had at least one serious vulnerability, whilst 56% had multiple serious vulnerabilities. On average, 61% of these vulnerabilities were resolved, but doing so took an average of 193 days from the first customer notification.
Application vulnerability likelihood has significantly changed in the last few years. In 2012, an application was most likely to have an information leakage (58% likelihood) or cross-site scripting (55%) vulnerability.
However, in 2014, applications were most likely to have insufficient transport layer protection (70%) or information leakage (56%).
The sharp rise in insufficient transport layer protection can be attributed largely to the discovery of zero-day vulnerabilities such as Heartbleed. Meanwhile, the likelihood of content spoofing, cross-site scripting and fingerprinting has declined sharply in recent years.
Content spoofing had a 33% likelihood in 2012 but only 26% in 2014, while fingerprinting and cross-site scripting have declined by 18% and 6% respectively.
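To make the transport-layer finding concrete, one quick spot check is to see which TLS protocol version a server actually negotiates. Below is a minimal sketch (not from the report; the hostname is illustrative) using Python's standard ssl module:

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect to host:port over TLS and return the negotiated protocol version."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. 'TLSv1.2' or 'TLSv1.3'

if __name__ == "__main__":
    # Hostname is illustrative; an old protocol such as SSLv3 or TLSv1.0
    # here would be one sign of insufficient transport layer protection.
    print(negotiated_tls_version("example.com"))
```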
In terms of a vulnerability’s window of exposure – the number of days an application has one or more serious vulnerabilities open during a given time period – the 2015 report found that 55% of retail trade sites, 50% of healthcare and social assistance sites, and 35% of finance and insurance sites were always vulnerable.
This means these sites had at least one serious vulnerability exposed every single day of the year. Conversely, only 16% of retail trade sites, 18% of healthcare and social assistance sites and 25% of finance/insurance sites had one or more serious vulnerabilities exposed for less than 30 days of the year.
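To illustrate how such a metric can be derived, the sketch below (with invented dates, not report data) counts the distinct days in a period on which at least one serious vulnerability was open:

```python
from datetime import date, timedelta

def window_of_exposure(vulns, period_start, period_end):
    """vulns: list of (opened, closed) date pairs; closed is None if still open.
    Returns the number of days on which at least one vulnerability was open."""
    exposed_days = set()
    for opened, closed in vulns:
        day = max(opened, period_start)
        end = min(closed or period_end, period_end)
        while day <= end:
            exposed_days.add(day)
            day += timedelta(days=1)
    return len(exposed_days)

# Invented example data: one vulnerability fixed in March, one never fixed.
vulns = [(date(2014, 1, 10), date(2014, 3, 1)),
         (date(2014, 6, 1), None)]
days = window_of_exposure(vulns, date(2014, 1, 1), date(2014, 12, 31))
print(f"Exposed on {days} of 365 days")  # an 'always vulnerable' site scores 365
```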
Educational services was identified as the best-performing industry, with the highest percentage of rarely vulnerable sites; arts, entertainment and recreation was the next best industry, with the next highest share of sites in the rarely vulnerable category.
Remediation is what counts
Website security is an ever-moving target. New websites are launched all the time, new code is released constantly and new web technologies are created and adopted every day. As a result, new attack techniques are frequently disclosed that can put every online business at risk.
In order to stay protected, enterprises must receive timely information about how they can most efficiently defend their websites and gain visibility into the performance of their security programmes.
While no true best practice exists, the key is to identify the security metrics that matter most to the organisation and to focus activity on fixing specific vulnerabilities. Remediation, more than anything else, is the hardest part of application security.
The first order of business is to determine which websites an organisation owns, and then to gather as much metadata about those websites as possible so they can be prioritised. From there, an application security metrics programme should be built on dynamic or static vulnerability assessment.
The programme should track the volume and type of vulnerabilities that exist, how long reported issues take to get fixed and the percentage that are actually getting fixed. With visibility through data, the answers to the problem become much clearer.
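As a rough illustration of what such a programme tracks, the sketch below computes vulnerability volume by class, remediation rate and average time-to-fix from a list of findings (the fields and figures are invented, not WhiteHat's schema):

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class Finding:
    vuln_class: str       # e.g. 'XSS', 'Information Leakage'
    reported: date
    fixed: date | None    # None while the issue is still open

def metrics(findings):
    """Return (volume by class, remediation rate, average days-to-fix)."""
    by_class = Counter(f.vuln_class for f in findings)
    fixed = [f for f in findings if f.fixed is not None]
    remediation_rate = len(fixed) / len(findings)
    avg_days_to_fix = (sum((f.fixed - f.reported).days for f in fixed)
                       / len(fixed)) if fixed else None
    return by_class, remediation_rate, avg_days_to_fix

# Invented example data.
findings = [Finding('XSS', date(2014, 2, 1), date(2014, 8, 13)),
            Finding('Information Leakage', date(2014, 3, 5), None)]
print(metrics(findings))
```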
Researchers from WhiteHat Security found that the best way to lower the average number of vulnerabilities, speed up time-to-fix and increase remediation rates is to feed vulnerability results back to the development teams through established bug tracking or mitigation systems.
This approach makes application security front-and-centre in a development group’s daily activity and creates an effective process to solve problems.
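A minimal sketch of that feedback loop, assuming a hypothetical bug-tracker REST endpoint (the URL, payload fields and authentication are illustrative, not any real product's API):

```python
import json
import urllib.request

TRACKER_URL = "https://tracker.example.com/api/issues"  # hypothetical endpoint

def file_ticket(finding: dict, token: str) -> None:
    """Push one scanner finding straight into the development team's backlog."""
    payload = {
        "title": f"[Security] {finding['vuln_class']} in {finding['url']}",
        "severity": finding["severity"],
        "description": finding["description"],
        "labels": ["security", "from-scanner"],
    }
    req = urllib.request.Request(
        TRACKER_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    urllib.request.urlopen(req)  # error handling omitted for brevity
```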
Average remediation rates for industries vary significantly from 16% for professional, scientific and technical services sites, to 35% for arts, entertainment and recreation sites. It is also interesting to note that organisations that are compliance-driven to remediate vulnerabilities have the lowest average number of vulnerabilities (12 per website) and the highest remediation rate (86%).
> See also: This security procedure takes five seconds and could save your business millions
Conversely and curiously, organisations driven to remediate by risk reduction have an average of 23 vulnerabilities per website and a remediation rate of just 18%. The sceptical theory is that compliance-driven programmes are incentivised to discover only the vulnerabilities they are legally required to look for, which is obviously a subset of the total.
To summarise: if you look for fewer vulnerabilities, you will find fewer. At the same time, compliance carries real corporate weight when it comes to fixing known issues, and that is likely what drives remediation rates up. Risk reduction, rightly or wrongly, often ends up classed as an accepted business risk, which ultimately drives remediation rates down.
Vulnerabilities are plentiful, they stay open for weeks or months, and typically only half get fixed. The websites and organisations that are more secure than others have a solid understanding of the performance of their software development lifecycle and have developed a security metrics programme that best reflects how to maintain security throughout.
Sourced from Jeremiah Grossman, Founder, WhiteHat Security