> TFA author - I have worked on low-cost embedded systems. (Tens/hundreds of megs of RAM, not KBs.)
I worked with kilobytes, small team, everyone sat together. No one was hoarding resources, because just to ship out the door we had to profile every single function for memory and power usage!
IMHO, hundreds of MB of RAM isn't embedded, it's just "somewhat constrained". :-D Regular tools run just fine, you can use higher-level languages and garbage collectors and such, you just have to be a little bit careful about things.
When you drop under a MB, but are still expected to have a fully modern UI, then things get interesting. (I wrote up a blog post about my experiences working in embedded @ https://meanderingthoughts.hashnode.dev/cooperative-multitas...)
> Researchers actually have a limited and smallish hardware budget, so academia is likely to come up with cost-saving ideas even when hardware performance grows very quickly.
Agreed, but I also think it is difficult to determine when forward progress is stymied by too many resources vs. too few!