Software Security - Application Security Verification Project

OWASP ASVS Source Code Review Project

In groups of about 4 students, you do a security code review of a small web application, using the OWASP ASVS (Application Security Verification Standard), available as ASVS 3.0.1 (PDF), and using static analysis tools to support the manual code review.


See here for the current list of groups!


We realise that this is throwing you in at the deep end. Some of you will have more experience with web applications, PHP, etc. than others. Finding security vulnerabilities in the application is less important than having a sensible & well-argued opinion about the ASVS, the static code analysis tools, and the quality and design of the code in the end.

Whether you find all or indeed any security vulnerabilities is not so important, so resist the temptation to get ideas from other groups. Indeed, we are hoping to use this experiment to get some empirical data on code reviews -- how many eyeballs make for secure code?


As a guide, the goal is that everyone spends around 30 hours in total for this project. You still have around 10 weeks till Xmas, so spend a morning or afternoon a week.

The first time you get together with your group, fill in the questionnaire [odt] [docx] together and email it to Erik Poll.

What to do

For the given application, check Verification Requirements V2 up to and incl. V9 and V11 in the OWASP ASVS 3.0.1 [PDF]. Begin with a Level 1 verification (i.e. Opportunistic); if you have time, move on to a Level 2 verification (i.e. Standard).

Some things you may run into:

What to report

At the end, you have to provide
  1. a report which gives a motivated verdict for all the verification requirements, and which provides some reflection on the whole process, including the ASVS and the use of static code analysis tools. This report MUST be in PDF and mention your names and group number on the front page. For TRU/e students from the TU/e: please list both your RU and TU/e student numbers.
  2. a summary of your verdict in this ASVS-checklist.xlsx, to allow us to quickly compare results when we discuss them in class.

A suggested organisation of the report is in the sections listed below, but feel free to diverge from this if you think that makes sense. In the organisation below, the section giving the verdict would be the longest, simply because it has to list all the ASVS security requirements. Describing the organisation or process of the review might only take half a page or so, and the reflection would probably be longer than that.

  1. Organisation
    Briefly describe the way you organised the review.

    E.g. did you split the work by files in the code, or by category of security requirements? Did you double-check important findings? Or did you use several pairs of eyeballs on the same code / security requirement, in the hope that more eyeballs spot more problems? (How) did you integrate the static code analysis tools into this process? Did you use other tools and methods?
    Have you tried to run the application? (If so, was this useful? Did you find that running the application helped you review the code, by understanding its functionality better? You might also want to discuss this in the Reflection section.)

  2. Verdict
    Give your judgement for each of the verification requirements, with a short motivation. If you need judgements beyond a simple pass or fail, feel free to introduce them, but -- as always -- do make sure your use of terminology is consistent.

    With respect to motivation: ideally you give a brief and concise justification, of say one short sentence, for your verdict on a verification requirement. If a verification requirement is very clear and it is straightforward how one would check it, it might not be worthwhile to write up anything. Conversely, sometimes it may be quite hard to argue that a verification requirement is met: the violation of a requirement is often easier to motivate (namely with a counterexample) than its satisfaction.

    If you resorted to sampling to judge some requirement (or group of requirements) or if you considered some aspects out of scope (e.g. because it is related to the configuration rather than the code), that would be something to mention too. You could also say that in Section 1, if that makes more sense.

  3. Reflection
    Reflect on the whole process, including the ASVS and the use of static code analysis tools.

  4. (optional) Appendix: vulnerabilities
    Instead of describing vulnerabilities that you came across as part of the motivation of your verdict in Section 2, you could also move the details of these vulnerabilities to an appendix, say in a numbered list, that you can then refer to from the 'Verdict' section.
At the end of the semester we'll have a discussion in class to compare results.

Tool support for the ASVS

  1. The Security Knowledge Framework, also available here, is a tool that supports using the ASVS in development, with pointers to more background knowledge, code samples, etc. It is geared more towards development than code reviews, but the pointers to background info on the ASVS may be useful. For more info, there is a screencast walking you through the online demo.

Pointers to static code scanning tools

Pointers to commercial code scanning tools will be emailed separately.
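To give an idea of what such scanners look for, here is a small hypothetical PHP fragment (the function names and database schema are made up for illustration, not taken from the application you will review) showing a classic SQL injection alongside the parameterised-query fix -- the kind of finding that is relevant to the ASVS input-handling requirements:

```php
<?php
// Hypothetical example of a finding a static code scanner would report.

// VULNERABLE: user input flows straight into the SQL string, so input like
// "' OR '1'='1" changes the meaning of the query (SQL injection).
function findUserVulnerable(PDO $db, string $name) {
    return $db->query("SELECT * FROM users WHERE name = '$name'");
}

// FIXED: a prepared statement keeps the query structure and the data
// separate, so the same input is treated as a plain string value.
function findUserSafe(PDO $db, string $name) {
    $stmt = $db->prepare("SELECT * FROM users WHERE name = ?");
    $stmt->execute([$name]);
    return $stmt;
}
```

Taint-style scanners flag the first function because an unsanitised parameter reaches a query sink; whether the tools you use report it this clearly is something worth commenting on in your Reflection section.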

Misc sources of information about PHP

Documentation generation tools

The tools below automatically generate some documentation and API information from source code, which might be useful for browsing the code. Of course, your favourite IDE for PHP might also have support for this.