How to Account for the Human Element in AppSec

James Chiappetta · Published in better appsec · 5 min read · Jan 6, 2021

An exploration of how cognitive biases and human behaviors impact the secure development of applications.

By: James Chiappetta

Disclaimer: The opinions stated here are my own, not necessarily those of my employer.

Background

There is no escaping the fact that humans play a role in both creating and solving problems in the world around us. Software and application development is no exception. Applications are designed and implemented by humans, which means the outcomes are directly influenced by human behavior and cognitive biases.

In the world of Cybersecurity, everyone seems to be chasing the shift-left approach of DevSecOps. One of the most valuable but most difficult tasks an AppSec team faces is ensuring Developers actively think about the security of the products they influence with their code. In this post, I will cover a few cognitive biases that impact how products and applications are developed, and how to counteract them.

Cognitive Biases

Most of us are aware of how human nature affects things like personal finance. I recently watched a video by James Jani about Escaping the Rat Race. He mentions three cognitive biases, which I would like to explore more deeply as they relate to secure software development:

  1. Ostrich Effect
  2. Hyperbolic Discounting
  3. Social Proof

Whether they’re aware of it or not, Developers and Product Managers (PMs) will exhibit these cognitive biases as they build products. This may actively weaken the security of their products and applications.

Understanding these biases, and how to counteract their potential negative outcomes, is key for an Application Security team that wants to ensure proper security is built into products.

Let’s dig into each cognitive bias in action and see how to mitigate some of the risk.

Ostrich Effect

Definition: Ignoring or refusing to believe inconvenient facts that point to issues or risks (e.g. an ostrich burying its head in the sand to pretend a bad situation doesn’t exist).

Potential Outcome: Developers/PMs believe that if vulnerabilities are not visible in the UI then attackers won’t find and exploit them.

Application to AppSec: A pentest uncovers vulnerabilities in a backend API that aren’t visible in the UI; Developers/PMs dismiss them because they believe they won’t be found or exploited.

Mitigation(s): AppSec shares real-world breach examples of attackers finding and exploiting exactly these kinds of vulnerabilities with the Developer community on a recurring basis, perhaps through a Slack channel, a regular training course, or an email distribution.
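
As a small illustration, here is a minimal sketch of what that recurring Slack share could look like. The webhook URL and the breach write-up links are hypothetical placeholders; the script just assumes a standard Slack incoming webhook and a hand-curated list, and could be run on a weekly schedule.

```python
# Minimal sketch: post a recurring "real-world breach example" to a Slack channel.
# Assumes a Slack incoming webhook URL (placeholder below) and a hand-curated
# list of breach write-ups about "hidden" backend/API vulnerabilities.
import random
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # hypothetical placeholder

BREACH_EXAMPLES = [
    # (short description, link) -- curate these from public post-mortems
    ("Attackers enumerated an unauthenticated backend API that was never exposed in the UI",
     "https://example.com/breach-writeup-1"),
    ("A hidden admin endpoint was found via forced browsing and used for data exfiltration",
     "https://example.com/breach-writeup-2"),
]

def post_weekly_example() -> None:
    description, link = random.choice(BREACH_EXAMPLES)
    message = (
        ":warning: *This week's real-world AppSec example*\n"
        f"{description}.\n"
        f"Read more: {link}\n"
        "_Reminder: if a pentest can find it, an attacker can too, even if it isn't in the UI._"
    )
    resp = requests.post(SLACK_WEBHOOK_URL, json={"text": message}, timeout=10)
    resp.raise_for_status()

if __name__ == "__main__":
    post_weekly_example()
```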

Hyperbolic Discounting

Definition: The tendency to favor a smaller short-term reward over a larger long-term reward (think: taking a smaller upfront lump-sum lotto payout instead of larger annual payouts over many years).

Potential Outcome: Developers/PMs skip security controls in order to meet a tight deadline.

Application to AppSec: Authentication is not added to the product before launch in order to hit the deadline.

Mitigation(s): AppSec joins the process of deciding product launch requirements and conducts a design review early to call out all necessary security controls. Be sure to think about how to scale this through automation and security champions to avoid bottlenecks!
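
One way to scale this without a human bottleneck is a lightweight automated gate in CI that checks whether the security controls called out in the design review are actually in place before launch. The manifest file name, format, and control names below are hypothetical; this is only a sketch of the idea, not a prescribed implementation.

```python
# Minimal sketch: fail a CI job if required security controls are not marked as done
# in a launch-requirements manifest. File name, format, and control names are
# hypothetical; adapt them to whatever your design review process actually produces.
import json
import sys

REQUIRED_CONTROLS = ["authentication", "authorization", "https_only", "audit_logging"]

def check_launch_requirements(path: str = "launch_requirements.json") -> int:
    with open(path) as f:
        manifest = json.load(f)  # e.g. {"authentication": true, "https_only": false, ...}

    missing = [c for c in REQUIRED_CONTROLS if not manifest.get(c, False)]
    if missing:
        print(f"Launch blocked: missing security controls: {', '.join(missing)}")
        print("Either implement them or file a documented, time-boxed risk acceptance.")
        return 1

    print("All required security controls are in place.")
    return 0

if __name__ == "__main__":
    sys.exit(check_launch_requirements())
```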

Pro tip: Devs/PMs must be made aware when they make decisions that push security down the line and introduce risk in favor of speed. However, AppSec must be open to compromise. Sometimes the potential business benefit outweighs the risk being introduced, but these situations can be tricky for an AppSec team to navigate. Allowing certain risks to be introduced requires careful consideration, full documentation of the risk, and usually a reasonable, agreed-upon timeline to remediate it. More on this in How AppSec Can Help Balance Product Usability With Security.

Social Proof

Definition: The tendency to think and act as those around us do, or to use the precedent of a bad practice as justification for making a bad decision.

Potential Outcome: Developers/PMs argue that because another team or product does something insecure, it is okay for them to do it too.

Application to AppSec: A new product launches without HTTPS because another product was able to launch without HTTPS.

Mitigation(s): AppSec helps Product and Developers build a secure design review process that brings in the attacker mindset through threat modeling. Relay to Devs/PMs that previous bad decisions do not justify current or future ones.
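
To make the HTTPS example above concrete, here is a minimal sketch of enforcing HTTPS as a baseline rather than a per-product choice, using a Flask app purely as an assumed example stack. In practice this kind of control is best placed in shared infrastructure (load balancer, ingress, or a common framework) so no product can launch without it.

```python
# Minimal sketch: enforce HTTPS at the application layer with Flask (assumed stack).
# Redirect any plain-HTTP request and set an HSTS header on every response.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def redirect_to_https():
    # Behind a proxy or load balancer you may need to trust X-Forwarded-Proto instead.
    if not request.is_secure:
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def set_hsts_header(response):
    # Tell browsers to only use HTTPS for this host for the next year.
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return response

@app.route("/")
def index():
    return "Hello over HTTPS!"

if __name__ == "__main__":
    # For local testing only; production TLS should terminate at your web server or LB.
    app.run()
```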

Being an Empathetic Partner

As an AppSec Engineer, it may be difficult to empathize with Software Developers and Product Managers when these cognitive biases show up. I implore all AppSec folks to try as hard as you can to be empathetic. I mention this practice in almost all of my posts because it is an important driver of building trust and a strong relationship with those who need your help.

Additionally, AppSec folks need to focus on helping teams build high quality, secure software without impacting speed or usability. That balance would be very difficult to strike with zero empathy.

Takeaways

  1. Be aware of the cognitive biases that influence day-to-day decisions and ensure your AppSec processes account for them.
  2. Human nature often plays a silent but noticeable role in the application security outcomes of a product. These behaviors can be acknowledged and mitigated with basic AppSec practices and empathetic partnership.
  3. Provide Developers the tools and knowledge to make good security decisions. You don’t want AppSec to be a bottleneck. Use automation, education, and empowerment to scale AppSec.
  4. Make sure there are solid security review practices in place that properly balance the needs of Product with Security. This is not an impossible task!

Words of Wisdom

It’s sometimes difficult to put yourself in someone else’s shoes, and I am not saying you have to in order to understand where another person is coming from. What I am saying is that we should acknowledge that we are all uniquely human and that securing software is often hard. That awareness of human limitations, coupled with the digital world we are building, is truly an opportunity to do some great work.

“No one cares how much you know, until they know how much you care” ― Theodore Roosevelt

Cognitive Biases Table

Application Security Cognitive Biases Table: a summary of the three biases above, with their definitions, potential outcomes, AppSec applications, and mitigations.

Contributions and Thanks

A special thanks to those who helped peer review and make this post as useful as it is: John Nichols, Luke Matarazzo, Tim Lam, and Kyle Suero.

A special thanks to you, the reader. I hope you benefited from it in some way and I want everyone to be successful at this. While these posts aren’t a silver bullet, I hope they get you started.

Please do follow my page if you enjoy this and my other posts. More to come!

