This is one of the reasons why I feel my job in security is so unfulfilling.
Almost nobody I work with really cares about getting it right from the start: designing comprehensive test suites, fuzzing, outright proving that things are secure, or choosing designs that rule out whole classes of error.
You get asked: please look at this gigantic piece of software. Maybe you get the source code; maybe it's written in Java or C#. Either way, you look at less than 1% of it, you either find something seriously wrong or you don't[0], and you report your findings. Maybe the vendor fixes it. Or the vendor doesn't care, and the business that commissioned the test after purchasing the software simply accepts the risk, perhaps adding a tissue-paper mitigation.
This approach seems so pointless that it's difficult to bother sometimes.
edit:
> #4) Hacking is Cool
I think it's worth separating unlawful access from security consultancy.
You don't learn nearly as much about securing a system if you work solely from the point of view of an engineer designing it to be secure. You get much better insight into how to design a secure system if you try to break in. Thinking like a bad actor, learning how exploitation works, and so on: these all strictly help.
[0]: It's crazy how often I find bare PKCS#7-padded AES in CBC mode. Bonus points if you either use a "passphrase" directly as the key, or hash it with some bare hash algorithm and slice the digest to supply both the key and the IV. Extra bonus points if you hard-code a "default" password/key and then never override it anywhere in the codebase.
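The derivation anti-pattern in that footnote can be sketched roughly as follows (a hypothetical Python illustration of the flaw, with an assumed function name; this is what *not* to do):

```python
import hashlib

def derive_key_iv(passphrase: str) -> tuple[bytes, bytes]:
    """The broken derivation described above: one unsalted hash of the
    passphrase, sliced to supply both the AES-128 key and the CBC IV."""
    digest = hashlib.sha256(passphrase.encode("utf-8")).digest()
    return digest[:16], digest[16:]  # first half = key, second half = IV

# Why this is weak:
#  * No salt and no key stretching, so the passphrase can be attacked
#    with precomputed tables or cheap brute force.
#  * The IV is a pure function of the passphrase, so every message
#    encrypted under the same passphrase reuses the same IV, and
#    identical plaintext prefixes yield identical ciphertext prefixes.
#  * Feeding this into AES-CBC with PKCS#7 padding and no MAC also
#    leaves a padding-oracle attack surface.
key, iv = derive_key_iv("default")  # the hard-coded "default" password
```

A safer baseline would be a salted, iterated KDF (e.g. PBKDF2 or scrypt) producing only the key, a fresh random IV per message, and an authenticated mode such as AES-GCM.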
It's very easy for a big organisation with a leadership that needs to show it is doing something to pass down a mandate that relies on throwing money at the problem for little tangible benefit. "Everything needs to be pen tested" is the sort of thing that sounds like the right thing to do if you don't understand the problem space; it's exactly as wrong as using lines of code as a productivity metric.
All it does is say, very expensively, "there are no obvious vulnerabilities". If it even manages that. What you want to be able to say is "there are obviously no vulnerabilities", but if you're having to strap that onto a pre-existing bucket of bugs, it's a complete rebuild. And nobody has time for that when there's an angry exec breathing down your neck asking why the product keeps getting hacked.
The fundamental problem is the feature factory model of software development. Treating software design and engineering as a cost to be minimised means that anything in the way of getting New Shiny Feature out of the door is Bad. And that approach, where you separate product design from software implementation, where you control what happens in the organisation with budgetary controls and the software delivery organisation is treated as subordinate because it is framed as pure cost, drives the behaviour you see.