Totally agree with this article: there are no immediate consequences to bad CPU-bound code. I'm seeing the consequences of this at my current position as a Senior Engineer at a startup, where a lot of things were written naively, pushing the burden onto cloud costs to keep pushing out features.
Something that really taught me to look for things like the "HDD Light" or fan speed was starting my career in embedded systems. 16-bit MCUs really let you know if you are trying to do too much on them. They also let you know if you toggle the wrong pin by going up in flames.
The disconnect between your fingers and what actually runs the code is becoming greater and greater for newer developers. It will be interesting to see how computing power keeps up with bad code (it has been doing a good job so far).
Programmers don't (can't?) dog food[1] their code because their development machines are Intel Xeon or AMD Threadripper monstrosities with abominable GPUs to match.
Most apparent are the web programmers, most of them assume everyone has 10gbit fiber connections with 16 core CPUs and 128GB of RAM to feed to Chrome. And then they wonder why their shit runs like shit in the real world.
> Most apparent are the web programmers, most of them assume everyone has 10gbit fiber connections with 16 core CPUs and 128GB of RAM to feed to Chrome
Facebook marketplace. I don't know how you can make a grid of images max out a 5800x, but they've managed it. A markedly inferior product to every one of the classified ad competitors they've squashed out of existence.
> their development machines are Intel Xeon or AMD Threadripper monstrosities with abominable GPUs to match.
Not all developers. In my entire career, I've never had a development system that was better than the average consumer machine. And, honestly, I wouldn't want one that was, for the exact reasons you state.
Are you certain about that? I'd wager that the average (median?) consumer machine these days is a smartphone, and not an especially high-end one at that.
Yes, I'm sure. I'm not counting smartphones in this at all, though, because I don't develop apps for smartphones anymore (it's not a market that interests me). By "average machine", I meant a budget laptop or tower.
I've never been terribly bothered by compile times in general. 90% of my work is with compiled languages, and (in most languages) incremental compilation does a great job of keeping compile times low. Even for large projects on slow machines, I rarely see compile times exceed even a minute. More usually, compiling just takes a few seconds.
Sure, you do occasionally need to do a full recompile, and if the project is very large then this can take a nontrivial amount of time, but that doesn't really happen often enough for me to be bothered about.
If the project is of really significant size, or if the compiler/language/code structure is such that incremental compilation can't give you significant gains (or isn't even possible), you may need more grunt of course.
Programmers developing webshits should really be forced to add 200ms to all their roundtrips. It’s worrying how many people assume everyone is 20ms from their “edge”, and openly advocate on HN for “new” technology that requires a roundtrip on every single state change, unwinding twenty years of client side advancements.
As someone who doesn't live in the US, I appreciate the nod. Unless the company has edges in the southern hemisphere, every single widget has a round trip of 200-300ms. Something like Jira, with its cascading React components, was a jumbling absurd mess until all the roundtrips completed, which could take 10+ seconds. A lifetime from a UX perspective.
That is one advantage of being a firmware engineer. In my case, this code runs on exactly one machine (usually) so whatever performance I get is very indicative of what the customer experience is like. My dev machines are usually just unfused versions of whatever hardware I'm working on.
I used to be a computer engineer working on embedded systems for automobiles. Quiescent current is what the normal power draw is called for these systems when the car is off. We worked very hard getting these numbers in spec, but it was hard to catch everything, especially in a case like this where the issue is probably software failing to put that module into its sleep state. This could be from bad code, or the CAN/LIN bus in your car is messed up. 99% of mechanics (and engineers) have no idea where to start with debugging these issues, and the answer will be "replace the module".
Do they literally do the whole hiring process themselves, or do they use a central hiring department to help? It sounds really inefficient to have every team duplicate hiring personnel, though of course you need the team to be in control of the process.
Do the teams organise any other company wide functions in that way? Hardware procurement, maybe?
There is a central HR department that creates job postings, employs recruiters and sourcers, and handles standardized onboarding processes, but conducting interview loops and making hiring decisions is handled by each team individually. As opposed to companies like Google, where as I understand it, you get hired and then find a specific position.
Before becoming a web developer I worked as an embedded systems dev for a few years. That experience left me with valuable skills in data structures (Everything is C), understanding how a processor works and executes instructions and hardcore debugging.
Not sure if I would recommend building a career in embedded but you will start learning transferable skills doing it in your free time.
Hey, maybe off-topic but I felt compelled to finally create a HN account to ask a question: would you mind sharing your thoughts and motivations behind your switch from embedded to web dev?
Currently, I am an embedded systems developer and I concur: I've learned a lot about data structures and how computers work because "everything is C". Especially true as I graduated as an EE with more interest in software. I've been enjoying my short career so far, and like the way my thinking has changed from all that learned experience. But I can't lie that I worry about whether I should be focusing on web tech and that embedded stuff will fall off. You not recommending building a career in this area, and actually making the switch yourself, intrigues me a lot.
If your concern is the ability to get jobs, I can see the point of that question. Keep in mind (I think) that it's increasingly difficult to pull off a full career of any kind of programming.
One thing that happened in embedded, and bear in mind that it ain't just low-end microcontroller stuff, is that it tends to be tightly bound to hardware. Hardware tends to be built in Asia (which I remember ramping up in the mid 1980's). Successful/higher volume products tend to be built in Asia. Engineering tends to follow manufacturing over time.
For fun, it's hard to beat specialty companies that build complicated gizmos with software in some sort of sexy business, but they are hard gigs to track down.
There were a few reasons to switch. I was working in the automotive industry where building anything is a race to the bottom. "How much cheaper can we make this than our competitor" was how design decisions were usually made. Software was the bottom of this list as manufacturing doesn't see it as a profit driver and mostly as an expense. I was in a pickle because I loved the projects I was working on... Ultrasonics, Radar combined with some sort of actuator or motor. Lots of FFTs, debugging boards, soldering, learning Altium and a great boss but I was always pulling nails to get money for projects. I knew I wanted to work somewhere else where the product was software / electrical hardware and I was having a hard time finding the right fit.
After talking with some friends who do webdev (Node, Python) I learned JS in a couple of weeks and got hired into a consulting firm where I worked for 6 months building a document search engine in C# on top of ElasticSearch. I now lead a team at a small firm that builds specialized search applications for different parts of the internet.
I now know of places that have the software culture (and software money) that do embedded systems well but I love what I do now with no shortage of cool problems to work on. Not sure if I would go back into embedded but I still dabble with Arduino's and Pi's.
> "But I can't lie that I worry about whether I should be focusing on web tech and that embedded stuff will fall off."
While they're never going to be as plentiful as web jobs, it seems unlikely embedded jobs are going to be decreasing anytime soon. The market for industrial and consumer IoT and wireless mobile devices just keeps on growing and growing. Unless you actually prefer webdev over embedded, there's no reason to jump ship and every reason not to waste your specialized education.
If anything, it's more likely that web development will hit a peak and then start declining. As people on HN are fond of proclaiming, much of a college education isn't needed to be a successful web developer (very much unlike embedded development). So every year, more and more people keep flooding into web development and that is bound to depress wages and demand at some point.
My educational background is more towards EE (ECE, focused on control theory) but my work background is more towards software and IT. I've been working on embedded software for a decade or so. If anything, it seems like the embedded industry is getting bigger rather than smaller, like my role in the industry will be around for a lot longer than I'll want to be in it.
People get quite invested in their physical things, and I think that's going to translate into ongoing software updates, and making interfaces between older things and new things (I'm currently working on an interface to old-school telephone gear, working in Rust). More things are being made with computers in them, those computers are getting more sophisticated, and people are caring more about software quality and security. And, as a sibling post mentions, there's always pressure to (re)engineer things to cost less or earn more money.
Sidetracking a bit, I think one of the best pieces of career advice is to be comfortable at the intersection of a couple different areas/fields. While it may be easy to find/train a developer or EE to some particular skill level (maybe not the best example given the closeness of these fields), it's much harder to get someone at that level in both fields. It sounds like you're already doing that by working as a developer but having an EE degree - you'll do well to stay sharp in both.
My first job out of college was doing embedded development and since then I've moved onto Mac and iOS development with a little bit of web frontend and backend work.
I have to say, the embedded stuff feels way more like you have control over the machine, I guess for obvious reasons. The tooling is specifically designed for your hardware, so what you need to do to achieve your goals is more obvious. I've found native app development similar to that. There's nothing quite like just focusing on a specific problem and plugging away at it and not having tools get in your way.
On the other hand, web development is kind of a chore in that there are so many tools, frameworks, and just different ways of doing things. For focused work, I think I prefer embedded development, but it's limited in the types of experience you can gain, for sure.
I've gone the opposite direction (kind of). I started learning web dev (self-taught) in the 90s: PHP, Ruby, Python, JS. Full stack, though I never got into the cloud era much, but I'm pretty confident I could build anything I set my mind to. A couple of years ago I started messing with Arduino, finally using C after years of using C-inspired things. It's much less fun for me: 90% seamless, but at times figuring out the simplest things feels like work, or the generally accepted solution is way more complex than I really want.

Around fall of 2020, with extra bandwidth from COVID WFH, I finally thought of a big project I wanted to pour some effort into (the excitement of making lights blink wore off fast). More difficult than the code is all the EE stuff, and actually having to engineer things together to create a project. In my case, I'm using stepper motors, which means stepper motor drivers and secondary power supplies. I'm also using air pumps, solenoid valves, linear actuators, limit switches, rail systems (OpenBuilds is awesome), all kinds of things. The breadboard wiring was insane. Then I built a PCB to organize some of it. I still have no idea when or if I should be using resistors, capacitors, etc.

I'm super unfamiliar with electronics, but I feel pretty confident at this point I can learn/solve anything on YouTube. Emphasis on YouTube: I Google for software and YouTube for hardware; that's a key distinction I've noticed. I need to see how other people are wiring things together, since I can't read a diagram at all. I'm pretty close to a working prototype and have somehow pieced it all together.
Glad you’re making headway and progress on your idea and glad you’re catching up on knowledge on youtube. My honest opinion is that you don’t need to know everything EE to get stuff done and if you indeed have a great idea that can potentially make some money, you can hire someone with experience in the field.
I work as a back-end developer, so I usually don't get to work at the "bare-metal" level. I did get to use a CPU profiler to isolate a lock contention issue with one of our services recently, and it was super fun.
But I have realized that I have some gaps in understanding how a processor works (bad idea not to pay attention in your Comp Arch class). I am trying to close that gap by reading up on computer architecture (Onur Mutlu's lectures).
Would working on embedded (in my spare time) help in understanding the processor at a deeper level? Any pointers on how I can accomplish that? I am well aware that modern processors are complicated beasts, far more complicated than embedded systems, so I'm not sure how transferable those skills are.
Are you more interested in understanding the processor itself, or low-level software, eg operating systems concepts and how that interacts with the hardware?
Embedded can certainly help with both, but I would go in the opposite direction from hobbyist-oriented systems like Arduino: learn to set everything up yourself. Bare metal programming, particularly with C and assembly, will require you to understand things like stack pointers, setting up your clocks, initialising hardware.
While 8-bit micros are simpler and you could start with one of those if you wanted (AVR is probably the most accessible), I'd probably go straight to a 32-bit Cortex-M4, get a cheap dev board like [0], and get the sample programs running and try to understand every part of those.
I found that AVR devices are more approachable microcontrollers for learning the concepts of setting up and using peripherals at the register level, and studying the part's datasheet. They are so much simpler than ARM chips, with far fewer peripherals and registers.
I think a good approach to this is to take working examples, look up the used registers in the datasheet, understand why setting up the peripherals that particular way accomplishes the goal of the example, and then start experimenting with changes to those registers.
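To make the register-level approach concrete, here's about the smallest complete example of that style, assuming an ATmega328P (the Uno-class AVR) and the avr-gcc toolchain; the register and bit names (`DDRB`, `PORTB`, `DDB5`, `PB5`) come straight from the part's datasheet, which is exactly the look-up exercise described above:

```c
#define F_CPU 16000000UL   /* assumed clock: 16 MHz, needed by _delay_ms */
#include <avr/io.h>
#include <util/delay.h>

int main(void) {
    DDRB |= (1 << DDB5);       /* data-direction register: PB5 as output */
    for (;;) {
        PORTB ^= (1 << PB5);   /* toggle the pin via the output register */
        _delay_ms(500);
    }
}
```

Once this makes sense, experimenting is exactly as the parent suggests: look up another peripheral (a timer, the UART) in the datasheet, set its registers up by hand, and see what changes.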