Build the security you expect

September 5, 2023

Instead of arguing with product builders about why security is important, reframe the conversation: suggest they build the security they would expect.

Explaining security is hard. Convincing someone they should drop other work to do security is even harder. In a weird way, making it personal makes the conversation more customer-focused by getting them to be the customer. Accepting adverse security trade-offs on your own behalf is much more difficult than arguing them away to a security schmuck. It’s also the difference between user-driven and shareholder-driven security design.

At Linktree, Bradley Shawyer created an amazing set of engineering principles to guide the day-to-day software development decisions every individual contributor faces. He asked me to contribute some for security. After many attempts, “build the security you expect” was the only one versatile enough for most situations yet simple enough not to require a book to understand.

The premise is simple: ‘should we encrypt that thing?’ becomes ‘would you expect that thing to be encrypted if it were your data?’.

Linktree has kindly agreed to allow me to republish the principle in full with some edits to make it more generally applicable and to remove confidential information.

The principle

Rationale

Our users can’t see inside our organization, but they do have expectations about our security and privacy posture, just as each of us has expectations of the platforms we use. Because security posture is so opaque and easy to ignore, we have to make a concerted effort to represent the user, as they can’t represent themselves.

The costs and risks of handling a security breach are significantly greater than those of securing the system in the first place, so we do our best to identify and mitigate risks as early as possible, ideally in the development cycle rather than in production environments.

By bringing security “left” and thinking about it from the beginning, we limit not only the financial impact but also the significant reputational impact that would be incurred in the event of a breach.

Implications

  • If you’re not sure whether the way you’re building something is secure, or you think there might be security risks, bring them up in existing advisory processes such as CISO Office Hours or Architecture Reviews.
  • If in doubt, ask! People at all points of the product delivery process must have the time and resources to develop and maintain a strong awareness of current security techniques and mechanisms, and how they should be applied.
  • Our security standards, policies and guidelines are readily accessible to everyone, as well as ongoing training and guidance programs. Take the time to review them.
  • Build upon the collective knowledge and expertise in the organization by building security into the platform where possible. Optimize globally, not locally.
  • All teams are expected to be responsible for the legal compliance and privacy implications of the data within their systems. This means that data is appropriately classified and secured, both in transit and at rest.
  • We run a bug-bounty program with security researchers from across the globe to unearth vulnerabilities and potential exploits of our systems. All vulnerabilities must be fixed within the defined SLA for their severity level.
  • We automatically scan our systems for vulnerabilities whenever we integrate changes to identify them as early in the development process as possible. This includes dependencies and first-party assets.
  • The security of our underlying infrastructure platform and the software we deploy to it (including any open source and 3rd party dependencies) must be considered. We monitor and review the security of our systems on an ongoing basis, as threats continually evolve.
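As a concrete illustration of the scan-on-integration point above, a CI job might look like the following. This is a hypothetical GitHub Actions sketch, not Linktree’s actual pipeline; the tool choice (Trivy) and job names are assumptions.

```yaml
# Hypothetical CI job: run on every integration so vulnerabilities in
# dependencies and first-party assets surface before deployment.
name: security-scan
on: [pull_request]
jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Scan the repository filesystem: dependency lockfiles,
      # container files, and first-party assets. A non-zero
      # exit code fails the build so findings block the merge.
      - uses: aquasecurity/trivy-action@master
        with:
          scan-type: fs
          exit-code: '1'
```

Failing the build (rather than merely reporting) is what makes the “as early in the development process as possible” part stick.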

When to contradict this principle

Different people have different expectations. Consider the Man [Person] on the Clapham omnibus and discuss with your teammates to get a sense of whether your expectations are reasonable. If you are an outlier, it may be worth contradicting the principle and adopting the team’s expectations.

Examples of this principle in practice

  1. My team is building a new service and we’re iterating on what personal data we will need to store and process.

    • ✅ You wouldn’t want Google to give more people access to your personal data than is absolutely necessary, so we shouldn’t either. Start by ingesting only the personal data we need and add more over time if it is required. Mask, anonymise, or aggregate personal data if at all possible and process it only if necessary. (Datensparsamkeit)
    • ❌ Ingest all the personal data we might need so we can go faster. Ultimately it’s okay if our service processes personal data or user generated content since we already use it elsewhere.
  2. I notice there’s something wrong with a new authentication feature. For a very small proportion of users, it doesn’t work and their accounts can be easily hacked. It’s going to take a lot of effort to fix and I’m probably going to be the one that has to do the work.

    • ❌ It’s only an edge case and it doesn’t affect many users. Put it on the backlog and we’ll look to fix it down the line some time.
    • ❌ If you were in that cohort of users, you would want to know you aren’t protected. Let the small cohort of users know and let them make up their minds on what to do.
    • ✅ If you were in that cohort of users, you would want to know that your data is just as important as everyone else’s. Talk it over with your team, prioritize a fix, and communicate the implications of investing time in the fix to your stakeholders.
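The Datensparsamkeit idea in the first example can be sketched in code. This is a minimal Python illustration, assuming an email field kept for support and a user identifier needed only for counting; the function names and masking scheme are mine, not Linktree’s implementation.

```python
import hashlib

def mask_email(email: str) -> str:
    """Keep just enough of an email for support or debugging; drop the rest."""
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}"

def pseudonymize(user_id: str, salt: str) -> str:
    """Replace a stable identifier with a salted hash so analytics can
    count distinct users without ever handling the raw identifier."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

# Ingest only what the feature needs; mask or pseudonymize the rest.
record = {
    "email": mask_email("jane.doe@example.com"),   # -> j***@example.com
    "user": pseudonymize("user-42", salt="per-environment-secret"),
}
```

The point is that the raw value never enters the new service at all, which is a stronger position than ingesting it and restricting access afterwards.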

Resources