I wonder how much is just EEs looking at SWE resumes and going "why would I pay that much for this?! writing code isn't that hard" I definitely get that vibe from some of the local hw-eng companies.
And they may not be wrong, but.. sorry, that's supply and demand. If I have to go write stupid NodeJS stuff to get paid decently, I guess I'll have to go do that.
I worked at a place once where one of the EEs who wrote firmware told me that algorithms and data structures were pointless because in the end it's just bits in a linear address space in RAM.
The industry has basically screwed itself. It's pretty typical for companies to consider embedded/firmware as EE work to be done in the gaps of the hardware schedule. EEs generally make bad programmers, which shouldn't be a surprise, as their background is usually not in software development; I similarly shouldn't be hired to do EE work. Because of this the code bases tend to be abysmal in quality.
The salary for these positions tends to be tied to EE salaries which for some reason are quite low. So it's hard to attract good talent willing to deal with the extremely poor code quality and all of the other extra challenges this field has on top of normal software challenges.
Since few software developers are attracted to this niche there's not a lot in terms of libraries or frameworks either, at least not in comparison to most other software ecosystems. I've had a start-up idea for a while now to really close that gap and make embedded development far more sane in terms of feature development and such, but I worry nobody would even bother to use it.
I've been in the embedded space for years now and I've been considering bailing because the problems just aren't worth the pay.
> one of the EEs who wrote firmware told me that algorithms and data structures were pointless because in the end it's just bits in a linear address space in RAM.
This is, of course, wrong. However, I think I understand where this EE was coming from.
At the end of the day, there's a minimal set of instructions necessary for a CPU to perform any given task. One could add two more variables to that: minimum time and minimum resources (generally understood to mean memory).
So, at least three optimization vectors: instructions, time and resources.
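To make the trade-off between those vectors concrete, here's a minimal C sketch (function and table names are mine, purely illustrative): two ways to count the set bits in a byte, one spending instructions and time, the other spending 256 bytes of memory for a single lookup.

```c
#include <assert.h>
#include <stdint.h>

/* Two implementations of the same task, optimized along different
 * vectors. popcount_loop spends instructions/time; popcount_table
 * spends 256 bytes of RAM/flash to answer in one lookup. */

static int popcount_loop(uint8_t v) {
    int n = 0;
    while (v) {        /* one iteration per set bit (Kernighan's trick) */
        v &= (uint8_t)(v - 1);
        n++;
    }
    return n;
}

static uint8_t popcount_lut[256];

static void build_lut(void) {
    for (int i = 0; i < 256; i++)
        popcount_lut[i] = (uint8_t)popcount_loop((uint8_t)i);
}

static int popcount_table(uint8_t v) {
    return popcount_lut[v];   /* O(1) time, 256 bytes of memory */
}
```

Neither version is "the" right one; which to ship depends entirely on which vector the problem constrains.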
Today's bloated software, where everything is layers upon layers of object-oriented code, truly is pointless from the perspective of a CPU solving a problem along a stated combination of the three vectors listed above.
The way I think of this is: OO exists to make the programmer's life easier, not because it is necessary.
I believe this statement to be 100% correct. OO isn't a requirement for solving any computational problem at all.
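As a small illustration (a sketch of my own, not anyone's production code): everything a simple non-virtual class provides can be spelled as a struct plus free functions, and the CPU sees essentially the same bits either way.

```c
#include <assert.h>
#include <stdint.h>

/* "OO" without OO: the "object" is a plain struct, and its "methods"
 * are plain functions that take the struct explicitly. This is roughly
 * what a C++ class with non-virtual methods compiles down to anyway. */

typedef struct {
    uint32_t count;
} Counter;

static void counter_init(Counter *c)            { c->count = 0; }
static void counter_increment(Counter *c)       { c->count++; }
static uint32_t counter_value(const Counter *c) { return c->count; }
```

The class syntax is a convenience for the programmer; it adds nothing the machine needs.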
Of course, this cannot be extended to algorithms. That part of the EE's claim is likely indefensible.
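Even granting the "it's just bits in a linear address space" view, the algorithm you run over those bits determines the cost. A quick sketch (my own toy code) over the same flat array:

```c
#include <assert.h>
#include <stddef.h>

/* Same bits, same linear address space, very different costs:
 * linear search touches O(n) elements, binary search O(log n).
 * Both return the index of `key`, or -1 if absent; the array must
 * be sorted for binary_search_idx. */

static int linear_search(const int *a, size_t n, int key) {
    for (size_t i = 0; i < n; i++)
        if (a[i] == key) return (int)i;
    return -1;
}

static int binary_search_idx(const int *a, size_t n, int key) {
    size_t lo = 0, hi = n;            /* half-open interval [lo, hi) */
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return (int)mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid;
    }
    return -1;
}
```

On a 6-element array the difference is noise; on a 100k-entry calibration table in a tight interrupt budget, it's the whole ballgame.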
How about data structures?
Some, I'd say. Again, if the data structure exists only to make the programmer's life easier, one could argue it is unnecessary or, at the very least, perhaps not optimal from the perspective of the three optimization vectors.
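That said, the two goals aren't always in tension. A fixed-capacity ring buffer (sketched below with made-up names, capacity chosen as a power of two so the index wrap is a cheap mask) is a data structure that serves both the programmer and all three vectors: no heap, O(1) operations, known footprint.

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

#define RB_CAP 8u  /* power of two: index wrap becomes a mask */

typedef struct {
    uint8_t  data[RB_CAP];
    uint32_t head;  /* next write position (free-running) */
    uint32_t tail;  /* next read position (free-running) */
} RingBuf;

static bool rb_push(RingBuf *rb, uint8_t byte) {
    if (rb->head - rb->tail == RB_CAP) return false;   /* full */
    rb->data[rb->head++ & (RB_CAP - 1)] = byte;
    return true;
}

static bool rb_pop(RingBuf *rb, uint8_t *out) {
    if (rb->head == rb->tail) return false;            /* empty */
    *out = rb->data[rb->tail++ & (RB_CAP - 1)];
    return true;
}
```

This idiom shows up constantly in UART and sensor drivers precisely because it's both convenient and near-optimal.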
It's nothing groundbreaking, although my idea alone wouldn't really help in the safety critical space.
If web development were like embedded development, every single company would be building their own web server, browser, and the protocol the two communicate over. It would take a phenomenal amount of time, and the actual end product, the website, would be rushed out the door at the very tail end of this massive development effort. The more the complexity of the website grows, the worse it gets. All of the features being sold to customers take a backseat to the foundational work that costs the company money, either through initial development or ongoing maintenance. Plus there's very little in the way of transferable skills, since everything tends to be bespoke from the ground up, which poses a problem when hiring.
In this analogy that base layer is really just hardware support. This is starting to change with projects like mbed, zephyr, etc. There's still a lot to be desired here and these realistically only work in a subset of the embedded space.
My idea comes in after this. Keeping with the analogy, consider it Ruby on Rails or NodeJS for the embedded world. Certainly not appropriate for all things, but a lot of what I have worked on professionally would benefit from this.
> one of the EEs who wrote firmware told me that algorithms and data structures were pointless because in the end it's just bits in a linear address space in RAM.
At a previous job, the project lead (mechanical) gave the embedded team (2 people) two weeks over Christmas to write the firmware for 3 boards (multi-element heater control, motor controller and move orchestrator with a custom BLDC setup, multi-sensor temperature probes), because the junior EE said "I can control a motor with an Arduino in 30 minutes." My only guess as to why such a disconnect from reality was possible is that the EE had an MIT degree while I'm self-taught, and that we had always delivered our firmware on time and without bugs.
I mean, it's the same phenomenon I've seen even in webdev where a PM or UX person who has produced a whole series of mocks then hands it off to the "programmers" and demands a short schedule because... well... they did all the hard stuff, right? You're just making it "go."
People naturally see their own hard work and skills as primary. I know enough about HW Eng and EE to know that it's actually really hard. That said, it doesn't have the same kind of emergent complexity problems that software has. Not to say that HW eng doesn't have such problems, but they're a different kind.
If you see the product as "the board", then the stuff that runs on the board, that can end up just seeming ancillary.
Oh, no, this was super common. When the Arduino (and, soon afterwards, the Pi) was launched, for several years about 20% of my time was spent explaining to higher-ups why there's a very wide gap to cross between a junior's "I can control a motor with an Arduino in 30 minutes" and "We can manufacture this and make a profit and you can safely ship it to customers".
Don't get me wrong, the Arduino is one of the best things that ever happened to engineering education. Back in college I had to save money for months to buy an entry-level development kit. But it made the non-technical part of my job exponentially harder.
Ha. Try telling a customer that even though he prototyped his machine with three Arduinos (he used three because he couldn't figure out how to do multitasking with just a single one...) in a couple of weeks, it will be a $100k project to spin up a custom circuit board and firmware to do the same thing. And no, we can't reuse the code he already wrote.
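For what it's worth, the multitasking that stumped him is a few lines of cooperative scheduling: instead of blocking in delay(), a superloop runs each task when its interval has elapsed. A desktop-testable sketch (in real firmware `now_ms` would come from a hardware timer tick, and `runs++` would be the actual sensor/motor/heater work):

```c
#include <assert.h>
#include <stdint.h>

/* Cooperative "multitasking" on one MCU: each task fires when its
 * interval has elapsed, so three jobs interleave on a single core
 * without any RTOS. Caller supplies the current millisecond count. */

typedef struct {
    uint32_t interval_ms;
    uint32_t last_ms;
    uint32_t runs;   /* stand-in for the task's real work */
} Task;

static void tick(Task *tasks, int n, uint32_t now_ms) {
    for (int i = 0; i < n; i++) {
        if (now_ms - tasks[i].last_ms >= tasks[i].interval_ms) {
            tasks[i].last_ms = now_ms;
            tasks[i].runs++;   /* e.g. read sensor, step motor, drive heater */
        }
    }
}
```

The unsigned subtraction also handles timer wraparound for free, which is exactly the kind of detail the "30 minutes with an Arduino" estimate never accounts for.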