Will a line of code be your downfall?

What are the best methods of training to ensure your employees don't make the kind of mistakes that can lead to a serious, damaging breach?

A mistake or uninformed decision by a developer or IT employee can easily lead to a serious breach. Conscious of this, organizations have been investing in security training to prevent such mistakes, but are they delivering effective training?

October 2015. A 17-year-old discovers a flaw in a TalkTalk website that exposes more than 150,000 customers’ personal data, resulting in reputational damage and a hefty £400,000 fine. The culprit: a line of code. Writing that line of code correctly in the first place would not have been more expensive or difficult for the developers, nor would it have required more advanced technology than was already available to them. They simply did not know better.

With an ever-increasing number of attacks (the 2016 Verizon Data Breach Investigations Report included more than 100,000 security incidents across 82 countries), organizations have started to recognize the importance of providing technical security training to IT employees as a key piece of their security puzzle.

However, not all training programs are created equal, and organizations need to ensure that the training they give their staff achieves the desired outcomes rather than being forgotten or never applied. The type of training you choose will determine how effective it is at inspiring and empowering your employees to proactively take responsibility for the security of the applications and systems they build.

How, then, do you design training that causes meaningful and measurable changes in developers and IT teams? The ideas discussed in this article will help you evaluate what features to look for when considering security training for yourself or your organization. If you are up for the challenge, it will also provide you with sound, proven principles for structuring your own security training.

The guiding principles for security training

The two guiding principles of training at MWR are:

  • Derive the defense strategy from a deep understanding of the attackers’ mind-set and methodology.
  • Combine different activities to appeal to different learning styles. 

The first principle is self-evident if you think of a vulnerability as a bug. If the developers cannot understand or reproduce a bug, the chances of it being fixed effectively are low. Likewise, if we don’t understand how an attacker would look at our systems and exploit a vulnerability, we are less likely to be able to address that vulnerability effectively. This is exactly why in our training courses we focus on teaching offensive security techniques, so that students can fully understand the capabilities of modern attackers and therefore how best to defend against them.

The second principle reminds us that we learn in different ways, or styles. Training courses can leverage different types of activities to accommodate these different learning styles and deliver an engaging, lasting experience that functions as a catalyst for positive change.

The rest of this article summarizes some fundamental research on learning styles and presents MWR’s own interpretation of the learning cycle and how we apply it in our training courses.

How do we learn?

Many theories and models exist about how people, in particular adults, learn new concepts and skills. A good starting point is a well-known and widely accepted model published in the 1980s by educational theorist David Kolb. This “experiential learning” model is based on the idea that learning involves cycling through four different stages:

  • Experience.
  • Reflection.
  • Conceptualization.
  • Active experimentation.

This high-level concept will likely resonate: we all need to go through different stages to learn something new. At the same time, however, it is also apparent that not everybody learns in the same way. Different people may prefer to start from different stages and to spend the majority of their time on certain aspects rather than others. Some tend to start by reading relevant literature and documentation on the topic, so that they can get accustomed to its fundamental principles. Others may start directly by experimenting and prefer a hands-on approach – as they progress and encounter questions or issues they struggle with, they can refer back to the documentation.

Based on the observation that people learn in different ways, Peter Honey and Alan Mumford later developed a model that identifies four distinct learning styles:

  • Activist: focused on doing;
  • Theorist: focused on understanding theory behind actions using models and abstract concepts;
  • Pragmatist: focused on linking what they learn to their real world scenarios;
  • Reflector: focused on observing and analyzing what happened.

In order to go through the learning cycle and learn effectively, most people will be able to use all four learning styles, but they will normally have a natural preference for a subset of them.

Alongside these models is another interesting piece of the learning puzzle, represented by the VAK (Visual, Auditory and Kinesthetic) model. Visual learners prefer to absorb information by seeing pictures, diagrams and charts; auditory learners prefer to learn from a lecture or a group discussion; and kinesthetic learners prefer a hands-on approach where they can experiment practically. People will normally express a dominant style, but will be able to learn using all the other styles.

Applying the learning cycle to security training

It is important to note that these models tend to be academic and have their own weaknesses, mainly in that they divide people into categories, whereas in real life people are normally more flexible and can express a plethora of traits in different situations and environments. However, the main take-away point from this research is that people learn by going through a varied cycle of activities and, although they may have one or two preferred learning styles, effective learning requires them to use all the styles to some degree. We could simplify the matter by saying that every person has a theorist, activist, reflector and pragmatist side to them and that training needs to appeal to each of those sides to be as effective as possible.

Based on these observations, you may want to use your own version of Kolb’s learning cycle to make your training courses more effective. At MWR, we use this simple adaptation of Kolb’s model:

[Figure: MWR’s training cycle – New Concept → Guided Labs & Challenges → Reflection → Discussion & Conclusion]

Let’s take a look at each stage of this cycle with some examples taken from our training courses.

New Concept

At this stage, the instructor introduces a new concept, normally through a demo or an example. For instance, the instructor may introduce the key concepts around Cross-Site Scripting (XSS) by explaining a base scenario and, when possible, offering an example of the impact of that attack in real life. The instructor may then proceed by giving a quick demo of the vulnerability.
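To make the XSS scenario concrete, the snippet below is a minimal illustrative sketch (not taken from the PWD course material) of how a reflected XSS flaw arises: untrusted input is concatenated straight into an HTML response, so attacker-supplied markup reaches the browser intact.

```java
public class XssDemo {
    // Naive view rendering: the untrusted "name" parameter is concatenated
    // directly into the HTML response without any output encoding.
    static String renderGreeting(String name) {
        return "<html><body><h1>Hello, " + name + "!</h1></body></html>";
    }

    public static void main(String[] args) {
        String payload = "<script>alert(1)</script>";
        String page = renderGreeting(payload);
        // The attacker's markup survives unchanged, so a browser would
        // execute it in the victim's session.
        System.out.println(page.contains(payload)); // prints "true"
    }
}
```

A demo along these lines gives students a concrete artefact to attack in the guided labs that follow.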

This should not last more than 15-20 minutes. The aim of this initial stage is to prepare students for the upcoming lab activity and ensure they get the most out of it by:

  • Introducing the concept to people who have never heard of it;
  • Refreshing the memory of those who are already familiar with the information without boring them (i.e. keeping the explanation short and to the point).

This stage addresses the theorist learning style.

Guided Labs & Challenges

This is a key part of the training, addressing the activist side of learners, where participants get a chance to experiment with the new concept. It is important that the activities and labs provided throughout the training are meaningful, engaging and have clear, measurable, learning objectives.

For example, for our Proactive Web Defense (PWD) training lab we have created a realistic scenario modelled after what we have observed over the years in our Security Assurance and Incident Response engagements. The scenario explored in the exercises revolves around hacking a fully-fledged web portal for a fictitious e-commerce organization. This allows delegates to see vulnerabilities in context and fully understand their impact.

The guided labs section walks students through the common steps an attacker would take to discover and exploit a certain vulnerability. To ensure students are not simply following instructions, at the end of each lab they are required to perform some extra steps on their own to fully exploit the vulnerability, after which they obtain a flag that they can submit to our CTF (Capture The Flag) portal. The addition of this mini CTF throughout the training challenges delegates and helps them stay focused on an objective, while offering us a way to measure how well they are performing in the labs.


Reflection

After the exercise, the reflector aspect of the students is called upon, as they are given some time to think about what they have done and make notes on their own.

The reflection is typically guided by questions such as:

  • What happened? What was the result?
  • What is the root cause of the vulnerability?
  • What strategies could be employed to mitigate and/or detect exploitation of this vulnerability?
  • Do you know if this vulnerability is currently mitigated in your applications at all? If so, what mitigation strategies are you employing?

The answer to the last question is often surprising in that most IT employees discover they do not know whether the applications and systems they manage are vulnerable to a certain flaw. In this case, the training has been effective at raising awareness and planting a question in the delegates’ minds. Finding the answer to that question is part of the “Action Plan” that students will implement after the training. More on this later.

Discussion & Conclusion

In the last stage of the learning cycle, the instructor prompts people to share the outcome of their reflection. This is a great chance for delegates to discuss what they have learnt and share their experience. The role of the instructor is to:

  • provide clarification around doubts/misunderstandings;
  • consolidate and summarize the main take-away points;
  • offer additional information about caveats and corner cases;
  • point to further case studies, reference and reading material.

Finally, and most importantly, the instructor will help delegates draw conclusions about how to apply what they have learnt in their own environment. This normally results in delegates taking away not just knowledge, but also a “plan of action” with things to try out and explore further when they get back to their desks. This appeals to the pragmatist side of learners.

Returning to our PWD course, we can see an example of how this could unfold. Sample discussion points about Cross-Site Scripting would include:

  • Root cause analysis: trusting user input, lack of strict input validation, context confusion (developer treats input as data, browser treats it as metadata/tags).
  • Pros & cons of different mitigation strategies: output encoding vs blacklisting? What type of output encoding to perform based on context? Defense-in-depth measures: Content Security Policy and HTTPOnly flag.
  • High level case studies of libraries / frameworks that can be used to encode data for different contexts.
  • Additional information / corner cases: how to deal with cases where encoding is not an option (consider alternative markup languages, use libraries to sanitize untrusted HTML based on whitelists of allowed tags/attributes).
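As an illustration of the output-encoding discussion point, here is a hand-rolled encoder covering the HTML element-content context only. This is a teaching sketch of the technique: production code should instead use a vetted library, such as the OWASP Java Encoder, which provides separate methods for each output context.

```java
public class HtmlEncoder {
    // Minimal HTML-entity encoder for the element-content context only.
    // Teaching sketch: real applications should use a vetted library
    // (e.g. the OWASP Java Encoder) that offers per-context methods
    // for HTML bodies, attributes, JavaScript strings and URLs.
    static String encodeForHtml(String input) {
        StringBuilder out = new StringBuilder(input.length());
        for (char c : input.toCharArray()) {
            switch (c) {
                case '&':  out.append("&amp;");  break;
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#x27;"); break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(encodeForHtml("<script>alert(1)</script>"));
        // prints: &lt;script&gt;alert(1)&lt;/script&gt;
    }
}
```

Walking through such a sketch in the discussion helps delegates see why the right encoding depends on where in the page the untrusted data lands.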

The Action Plan

One final and important piece of the puzzle is the action plan that the learning cycle helps delegates come up with. Having an initial action plan empowers individuals to apply what they have learnt and start making positive changes that will help improve the overall security of their organization.  

In PWD, this is what an action plan put together by developers could look like after they go through the XSS module:

Sample Action Plan – XSS

  • Verify whether the templating language used to render the views in application X performs output encoding automatically. If not, adopt the OWASP Java Encoder library and document how and when to use it in the team’s coding guidelines.
  • Review source code to identify untrusted data included in views and check whether the encoding context is right (if not, use the OWASP Java Encoder Tag Library to perform the right encoding).
  • Review use of jQuery for XSS sinks, consulting an up-to-date list of currently known sink methods (e.g. html(), append()).
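The first items of such a plan can also be codified as a simple regression check. The sketch below is hypothetical: renderProfilePage stands in for the application's real view-rendering call, and the idea is to feed a canary payload through the view and verify that the raw markup never survives into the output.

```java
public class XssRegressionCheck {
    static final String CANARY = "<script>xss-canary</script>";

    // Hypothetical stand-in for the application's real view rendering;
    // here it HTML-encodes interpolated values, as a safe template
    // engine (or the OWASP Java Encoder) would.
    static String renderProfilePage(String displayName) {
        String encoded = displayName.replace("&", "&amp;")
                                    .replace("<", "&lt;")
                                    .replace(">", "&gt;");
        return "<p>" + encoded + "</p>";
    }

    // The page passes the check if the raw canary markup never survives.
    static boolean outputIsEncoded(String page) {
        return !page.contains(CANARY);
    }

    public static void main(String[] args) {
        System.out.println(outputIsEncoded(renderProfilePage(CANARY))); // prints "true"
    }
}
```

A check like this turns a one-off training insight into a guard that keeps running after the course is over.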


These are mostly short-term actions, as the XSS module focuses on a single vulnerability. Other modules of the course, such as the one dedicated to the SDLC, usually yield wider-spectrum actions, such as:

Sample Action Plan - SDLC

  • Establish/elect a security champion in each development team
  • Introduce non-functional security requirements in line with OWASP Application Security Verification Standard (ASVS)
  • Evaluate introduction of security automation into continuous integration pipeline
    • Contact vendors to get demos of static code analysis and vulnerability scanners
    • Integrate OWASP ZAP Scanner into Jenkins pipeline
    • Compare performance of different solutions on core projects and use the results to guide the final choice of adequate tooling



There is strong intuitive evidence that many vulnerabilities could be effectively mitigated simply by providing security training to employees. To maximize the effectiveness of such training, at MWR we strongly uphold two core principles:

  • Derive the defense strategy from a deep understanding of the attackers’ mind-set and methodology.
  • Combine different activities to appeal to different learning styles.

In our experience, a varied training approach which involves different activities (New Concept → Guided Labs & Challenges → Reflection → Discussion & Conclusion) and shows participants the world from the attacker’s perspective has proven very effective in changing their attitude towards security. The hands-on, scenario-based labs engage and challenge delegates, giving them a more in-depth understanding of the attackers’ mind-set and methodology: this inspires them to drive positive change in their everyday job and empowers them to ask the right questions and find the answers by themselves.


