and the problem there (as I see it) is that they don't care about security, they care about passing their audit.
"Passing our audit" has been presented with measurable consequences (cannot sell to customers) and finite, well-defined actions (this is what the audit list looks like).
What I'd like (the goal of the follow-up article, coming soon) is to present the value of security in a way that makes the justification of the effort viable and palatable.
Those are great points. And what you're saying is why I used the "nobody cares about backups" analogy.
It's NOT that nobody cares about the results of security. It's that those results ("not losing our sales database") are often not presented clearly or coherently enough for the decision makers to recognize the value of the activity ("doing regular backups, paying for offsite storage, etc.").
No, I think I get you. My point was: unlike backups, security is formally defined as those results. It isn't just the decision makers but the technical professionals who don't get what security is. If you design a database, you probably care about the type of security (which is just secure coding/design) you said nobody cares about, but if you admin a database, then security is all about protecting the data in ways that will impact the business meaningfully. I.e., even if it contains meaningless data, an exposed DB on the internet can impact reputation and potential revenue. Or if it's a DoS attack, the availability of the service provided will be impacted (a security property).
To sum it up, what business people mean by "secure" in terms of computer information is: "The data we need for business has confidentiality, I can rely on its integrity, and it will be available when we need it for business reasons." They may not necessarily be concerned about quantifiable and/or short-term profits, but about appearances, morale, the ability to recruit new hires, and the ability to come up with new solutions/products better than the competition can, because the systems we use are reliable and secure, with fewer hoops to jump through because of "security theatrics".
I don't think this is true. The opposite, really. I think that we continue to present security as a "shift left" ("SHIT left") strategy, dumping the responsibility on devs without any framework for why they should care.
But if we built a culture and practice that low-security code is low-quality code, and made security issues a software defect like any other, it would get handled. Plenty of developers (and leads, and PMs) are fine with shipping low-security code, but would fight to the death if accused of shipping low-quality code.
"more on this" got pushed to the follow-up piece, which is coming soon. Sorry to keep you in suspence. I had to balance people's time to read with the length of the information I was sharing.