Nothing is secure
It is the year 2024 and yet there are people who believe that such a thing as secure software exists. Customers that commission an app and expect it to be secure. Large, mid-sized, and small partners, freelancers, and developers that promise secure apps. End users that expect secure apps. I have no idea why anyone thinks that a secure app is possible. It is even more surprising when the app is accessible on the internet, or is even a cloud-native app. I can understand the end user perspective. That the app offered is secure, does not leak data, and protects private user data is something an end user can expect. Expect, OK. But demand? Even end users should be prepared to have their data leaked. Software is written by humans, and humans are not perfect.
The audacity to think that a developed app is secure: I have no idea how a customer or partner comes to think that they can demand or offer this. There are technical aspects that should make it clear that this is not possible. We thought that OpenSSL was secure, and then: Heartbleed. We thought that logging was best practice: Log4Shell. Aside from the tech stack, an app with a more or less modern architecture consists of several parts: frontend, backend, middleware, API, database, etc. Each of these is a potential security issue. Add implementation partners for each component and you get additional complexity on top of the already complex app architecture. Humans are not perfect, so there are many, many layers where a security issue can be introduced.
The question that everyone involved in an app should ask is not: is the app secure? Just accept that the chance that the app is secure is close to zero. Or, put differently: the chance that the app will need to be fixed due to an error or a security issue approaches 100%. The longer the app is around, the closer it gets to 100%.
Don’t act surprised when a security issue is found. Don’t be surprised as a customer that your app has a problem. Don’t be surprised as a partner or developer that the app you delivered contains a security issue. End users should only be surprised about the timing: if early, why so early? If only months or years after go-live, why so late? But never be surprised that a security issue was found. Never.
Rather: be prepared. Prepare the app before going live. Check the app for possible security issues. Have outside people check the app; these can be people from another team or from a different company. For apps intended to be used by an unknown number of end users, or even unknown users (aka: a public app), consider a pen test. Do this before setting the app live. After the go-live, schedule a security test every x months. Even when the app does not change, vulnerabilities might be discovered in the libraries it uses, introducing new attack vectors. Remember that apps are developed by humans. Review the code. Developers try their best, but they also learn with each app. Therefore: customers must have a process in place to ensure that when something is found, it can be reported – how doesn’t matter – and that actions are executed internally. It is the task of the customers to establish this process. It is their duty to ensure that the process works and is continuously tested and optimized. When something is found, no one should act surprised, be angry, sad, or upset. It happened. Time to execute Order 66. Follow the process strictly to solve the issue: inform the stakeholders, check what was leaked and how severe it is, take the service down, fix it, plan to get it back online, inform users, etc.
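The steps at the end of that process can be sketched as a strictly ordered runbook. A minimal illustration in Python, assuming the step names from the text above (real runbooks live in an incident-management tool, not a script):

```python
# Hypothetical runbook steps, taken from the process described above.
INCIDENT_STEPS = [
    "Inform stakeholders",
    "Assess what was leaked and how severe it is",
    "Take the affected service down",
    "Fix the issue",
    "Plan and execute bringing the service back online",
    "Inform affected users",
]

def run_incident(execute):
    """Walk the runbook in strict order; abort if any step fails.

    `execute` is a callable supplied by the team: it performs one step
    and returns True on success, False on failure.
    """
    for step in INCIDENT_STEPS:
        if not execute(step):
            raise RuntimeError(f"Step failed, escalate: {step}")
```

The point of encoding the order is the article's own argument: when an incident happens, nobody improvises; everyone follows the agreed sequence.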
The, let’s call it, SNAFU process needs to be trained. Not having it in place before setting a (public) app live is risky. Remember: SUSFU will be the new normal. There are billions of people on planet earth, several billion with internet access, millions with the skills to analyze apps, and still thousands and thousands who might want to cause havoc.
With that in mind, customers must act. It is not enough to just order an app. Customers not only own the app, they are also responsible for it. They can delegate certain tasks, but not the responsibility. This demands a change in partnership, not only with the developers (internal, external), but also with the partners. No: workbench. No: just deliver. No finger pointing, no hiding, no ignoring. Open communication is mandatory. Strong partnerships where everyone understands that actions must be taken fast and that processes must be followed stringently.
If something is found, everyone must know that, independent of their role, the customer in the end is responsible for the app. Everyone must work together with the goal of solving the problem. And, just as a reminder: the customer is important, but even more important, if not the most important stakeholder, is the user. Did the security issue involve user data? Personal data? Then everyone must work together to ensure that the users and their personal data are protected. You might think that this is easy. That this is normal. But no: customers should be happy when they have a partner that works together with them. That stands up and knows its role. That does not cover up its errors. That communicates transparently with its customer. That knows what to do and helps the customer establish the SUSFU process. Any company or partner that once went through a security issue and is open about it is a partner every customer should cherish. These kinds of partners are hard to find.
And after the issue is solved? Collect the lessons learned, improve your SUSFU process and count the days until the next security incident.