Hacker News

This is a mostly terrible 19-year-old list.

Here is an example:

"Your software and systems should be secure by design and should have been designed with flaw-handling in mind"

Translation: If we lived in a perfect world, everything would be secure from the start.

This will never happen, so we need to rely on the find-and-patch technique, which has worked well for the companies that actually patch the vulnerabilities that are found and learn from their mistakes to improve future coding practices.

The other problem is that most systems are not static. It's not a matter of releasing a secure system and never updating it again: most applications and systems are updated frequently, which means new vulnerabilities will be introduced.



Steelmanning for a moment: I think what the author is trying to address is overly targeted "patches" to security vulnerabilities which fail to address the faulty design practices which led to the vulnerabilities. An example might be "fixing" cross-site scripting vulnerabilities in a web application by blocking requests containing keywords like "script" or "onclick".
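To make the steelman concrete, here is a hypothetical C# sketch (the class and method names are illustrative, not from any real system) contrasting the keyword-blocklist "patch" with the secure-by-design fix of contextual output encoding:

```csharp
using System;
using System.Net; // WebUtility.HtmlEncode

class XssDemo
{
    // The overly targeted "patch": reject requests containing scary keywords.
    public static bool BlocklistAllows(string input) =>
        !input.Contains("script", StringComparison.OrdinalIgnoreCase) &&
        !input.Contains("onclick", StringComparison.OrdinalIgnoreCase);

    static void Main()
    {
        // Sails straight past the blocklist, yet still injects active markup:
        string payload = "<img src=x onerror=alert(1)>";
        Console.WriteLine(BlocklistAllows(payload)); // True: blocklist fails

        // The design-level fix: encode output for the HTML context instead.
        Console.WriteLine(WebUtility.HtmlEncode(payload));
        // &lt;img src=x onerror=alert(1)&gt;
    }
}
```

The blocklist fails because it enumerates symptoms, while encoding addresses the actual flaw: untrusted data crossing into an HTML context unescaped.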


I agree. It's an example of how, if you say something dumb with enough confidence, a lot of people will think it's smart.


The CTO at my last company was like this.

In the same breath, he talked about how he wanted to build this “pristine” system with safety and fault tolerance as priorities, and how he wanted processes to communicate through raw pointers into shared memory, with multiple threads in each process reading and writing that block, because he didn’t like how chatty message queues are.

He also didn’t want to use a ring buffer, since he saw it as a kind of lock.


That sounds pretty deep in the weeds for a CTO. Was it a small company?


It was. I was employee number 10. The company had just started and was entirely bankrolled by that CTO.

The CTO sold a software company he bootstrapped in 2008 and afaik has been working as an exec since.

The CEO, a close friend of Mr CTO, said that the system was going to be Mr CTO’s career encore. (Read: they were very full of themselves)

The CIO quit 4 days before I started for, rumor has it, butting heads with the CTO.

Mr CTO ended up firing (with no warning) me and another dev who were vocal about his nonsense. (Out of 5 devs total)

A 3rd guy quit less than a month after.

That’s how my 2024 started.


I've had the CTO who was also a frustrated lock-free data structure kernel driver developer too.

Fun times.


I forgot to mention that we were building all this in C#, as mandated by Mr CTO.

He also couldn’t decide between Windows Server and some RHEL or Debian flavor.

I doubt this guy even knew what a kernel driver was.

He very transparently just discounted anything he didn’t already understand. After poorly explaining why he didn’t like ring buffers, he said we should take inspiration from some system his friend made.

We started reading over the system and it all hinged on a “CircularBuffer” class which was a ring buffer implementation.


Okay, that would be a normal amount of bonkers to suggest in C or another language with real pointers.

But in C#, that is a batshit insane thing to suggest. I'm not even sure it's legal in C# to take a pointer to an arbitrary address outside of your memory. That's... that's just not how this works. That's not how any of this works!


It is legal to do so. C# pointers == C pointers, and C# generics with struct arguments == Rust generics with struct (i.e. not Box<dyn Trait>) arguments; they are monomorphized in the same way.

All of the following works (in an unsafe context):

    byte* stack = stackalloc byte[128];            // stack memory
    byte* malloc = (byte*)NativeMemory.Alloc(128); // unmanaged heap
    byte[] array = new byte[128];                  // GC heap
    fixed (byte* gcheap = array)
    {
        // work with pinned object memory
    }
Additionally, all of the above can be unified with (ReadOnly)Span<byte>:

    var stack = (stackalloc byte[128]); // Span<byte>
    var literal = "Hello, World"u8; // ReadOnlySpan<byte>
    var malloc = NativeMemory.Alloc(128); // void*
    var wrapped = new Span<byte>(malloc, 128);
    var gcheap = new byte[128].AsSpan(); // Span<byte>
Such a span of bytes (or of any other T) can subsequently be passed to pretty much anything, e.g. int.Parse, Encoding.UTF8.GetString, socket.Send, RandomAccess.Write(fileHandle, buffer, offset), etc. It can also be sliced in a zero-cost way. Effectively, it is C#'s rendition of Rust's &[T]; C++ has pretty much the same thing and also names it std::span<T>.
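A small usage sketch of the slicing and span-accepting APIs described above (the record layout and ExtractId helper are made up for illustration):

```csharp
using System;
using System.Text;

class SpanDemo
{
    // Slices the "id=" field out of a utf8 record; the slice is zero-copy,
    // only Encoding.UTF8.GetString allocates.
    public static int ExtractId(ReadOnlySpan<byte> utf8Record)
    {
        ReadOnlySpan<byte> idBytes = utf8Record.Slice(3, 4); // bytes after "id="
        return int.Parse(Encoding.UTF8.GetString(idBytes));
    }

    static void Main()
    {
        // u8 literal: a ReadOnlySpan<byte> over static UTF-8 data, no allocation.
        ReadOnlySpan<byte> record = "id=1234;name=abc"u8;
        Console.WriteLine(ExtractId(record)); // 1234
    }
}
```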

Note that (ReadOnly)Span<T> internally is `ref T _reference` and `int _length`. `ref T` is a so-called "byref", a special type of pointer the GC is aware of: if it happens to point into object memory, it will be updated should that object be relocated by the GC. At the same time, a byref can also point to any non-GC-owned memory, like the stack or any unmanaged source (malloc, mmap, P/Invoke regular or reverse - think function pointers or C exports with AOT). This allows writing code that uses byref arithmetic, same as with pointers, but without having to pin the object, while retaining the ability to implement algorithms that match hand-tuned C++ (e.g. with SIMD) and serve all sources of sequential data.
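A minimal sketch of that byref mechanic, assuming MemoryMarshal.GetReference and Unsafe.Add (the Sum helper itself is illustrative): the same pointer-style walk serves both a GC-heap array and stack memory, with no pinning and no unsafe blocks.

```csharp
using System;
using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;

class ByrefDemo
{
    // Walks the span via a byref, the pattern hand-tuned library code uses.
    public static int Sum(ReadOnlySpan<int> data)
    {
        ref int p = ref MemoryMarshal.GetReference(data); // byref to first element
        int sum = 0;
        for (int i = 0; i < data.Length; i++)
            sum += Unsafe.Add(ref p, i); // byref arithmetic, GC-safe, no pinning
        return sum;
    }

    static void Main()
    {
        int[] gcHeap = { 1, 2, 3, 4 };       // GC-tracked source
        Span<int> stack = stackalloc int[4]; // stack source
        stack.Fill(5);
        Console.WriteLine(Sum(gcHeap)); // 10
        Console.WriteLine(Sum(stack));  // 20
    }
}
```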

C# is a language with strong low-level capabilities :)


I was surprised to learn c# could do all that, honestly.

Though when you’re doing that much hacking, a lot of the security features and syntax of C# get in the way


It's also outright stupid. For example, from the section about hacking:

> "Timid people could become criminals."

This fully misunderstands hacking, criminality, and human nature: criminals go where the money is; you don't need to be a Big Burly Wrestler to point a gun at someone and take all of their money at the nearest ATM, and you don't need to be Snerd the Nerd to know computers. It's a mix of idiocy straight out of the stupidest 1980s comedy films.

Also:

> "Remote computing freed criminals from the historic requirement of proximity to their crimes."

This is so blatantly stupid it barely bears refutation. What does this idiot think mail enables? We have Spanish Prisoner scams going back centuries, and that's the same scam as the one the 419 mugus are running.

Plus:

> Anonymity and freedom from personal victim confrontation increased the emotional ease of crime, i.e., the victim was only an inanimate computer, not a real person or enterprise.

Yeah, criminals will defraud you (or, you know, beat the shit out of you and threaten to kill you if you don't empty your bank accounts) just as easily if they can see your great, big round face. It doesn't matter. They're criminals.

Finally, this:

> Your software and systems should be secure by design and should have been designed with flaw-handling in mind.

"Just do it completely right the first time, idiot!" fails to be an actionable plan.


Though frequent updates mainly serve to hide unfit engineering practices and encourage unfit products.

The world is not static, but most things have patterns that need to be identified and handled, which takes time you don't have if you sprint from quick fix to quick fix of your MVP.


There's definitely a few worthwhile nuggets in there, but at least half of this reads like a cringey tirade you'd overhear at the tail end of the company holiday party from the toasted new helpdesk intern. I'm surprised to see it from a subject matter expert, that he kept it on his website for 20 years, and also that it was so heavily upvoted.


I really didn't think "write secure software" would be controversial, but here we are. How is the nihilist defeatism going? I'll get back to you after I clean up the fallout from having my data leaked yet again this week.


*Translation: If we didn't just pile dependencies upon dependencies, everything would be secure from the start.

Come on. The piss-poor security situation might have something to do with the fact that the vast majority of software is built upon dependencies the authors didn't even look at...

Making quality software seems to be a lost art now.


> Making quality software seems to be a lost art now

No, it is not. Lost, that is.

Not utilised enough...



No one is objecting to writing secure software, but saying "just do it" is big "draw the rest of the owl" energy. It's hard to do even for small-medium programs, nevermind enterprise-scale ones with 100+ different components all interacting with each other.


It's not controversial to write secure software.

Saying that that's how it should be is useless, since it's not instructive.

Don't fix implementation issues because that just papers over design issues? Great. Now we just need a team that never makes mistakes in design, and then a language that doesn't allow security issues outside business logic.


Doing security wrong often leads to much worse outcomes for data leakage than not doing it optimally. It's counterintuitive, but a lot of things in security are.


> This is a mostly terrible 19-year old list.

This is an excellent list that is two decades overdue, for some.

> software and systems should be secure by design

That should be obvious. But nobody gets rich except by adding features, so this needs to be said over and over again.

> This will never happen, so we need to utilize the find and patch technique,

Oh my giddy GAD! It is up to *us* to make this happen. Us. The find-and-patch technique does not work. Secure by design does work. The article had some good examples.

> Most applications/systems are updated frequently, which means new vulnerabilities will be introduced.

That is only true when we are not allowed to do our jobs. When we are able to act like responsible professionals we can build secure software.

The flaw in the professional approach is how to get over the fact that features sell now, for cash, and building securely adds (a small amount of) cost for no visual benefit

I do not have a magic wand for that one. But we could look to the practices of civil engineers. Bridges do collapse, but they are not as unreliable as software.


> The flaw in the professional approach is how to get over the fact that features sell now, for cash, and building securely adds (a small amount of) cost for no visual benefit

Because capitalism means management and shareholders only care about stuff that sells now, for cash.

> But we could look to the practices of civil engineers

If bridge-building projects were expected to produce profit, and indeed increasing profit over time, with civil engineers making new additions to the bridges to make them more exciting and profitable, they'd be in the same boat we are.



