As an EE turned "software engineer" this bothers me, a lot.
I like the EE part of it, but I prefer things that change more easily and are more "playful" (not to mention that today hardware is at the mercy of software, so you take the reference design and go with it)
But I've run into situations where I uncovered a HW bug (in the chip's reference board implementation, no less) that only manifested because of something specific in software (in the HDMI standard, or rather, things the standard inherited from the likes of VESA)
The software engineer sees ports/memory to be written to and doesn't know what happens behind that
The hardware engineer sees the "chip" and its connections but doesn't realise the rabbit hole goes deeper. "Ah, this is a simple USB device, only 8 pins." Now try communicating with it.
That is a VERY general statement; most software engineers who do hardware stuff also know what it takes to make it. You can't design a driver for a card without knowing everything about the card. And I'd say the same about hardware engineers: if you don't know how the software is going to run, how are you supposed to architect it? You can't make a piece of hardware without thinking about how the driver will work.
True, but those examples you cited are the minority.
Some software engineers working on drivers are distant from the hardware developers (especially in Linux), and even inside corporations there's a wall somewhere.
And of course, sometimes there's an abstraction between hardware and driver (usually through a firmware), commonly tied to a standard like USB mass storage, ATAPI, etc.
" You can't make a piece of hardware without thinking about how the driver will work."
Unfortunately I've had to work with some devices that had very hard requirements on the software (basically, response time), or you would have to add extra hardware to deal with it. In the second revision this problem was "fixed" by increasing a certain buffer size.
So yeah, sometimes hardware engineers don't think about that comprehensively enough.
Though I agree that it's probably too general a statement, I'm more with raverbashing on this one: while I now mainly do software, I have a strong hardware background, and it's more often than not baffling to see the approach of software-only engineers when they have to code across the software/hardware border. Then again, maybe I've only met some exceptions. I also have no idea if or how these topics are covered in a typical software engineering education.
Well, back when I was an MIT undergraduate, one of the core CS classes handed us a breadboard and a bunch of 74XX TTL chips, and we needed to construct a general-register and stack computer from that. (We did get some PC boards that gave us an ALU and a UART that plugged into the breadboard, as well as the ability to program an EPROM to implement the firmware, but none of this "here's an Arduino or Raspberry Pi".)
Maybe there's so much complexity in the software stack that we can't start CS majors from the hardware level any more, but I can't help thinking we've lost something as a result. These days, there are Java programmers who get that "deer in headlights" look when confronted with terms such as "cache line miss".
> These days, there are Java programmers who get that "deer in headlights" look when confronted with terms such as "cache line miss".
Forget that; most of these folks can't reason about a program that doesn't have automatic garbage collection. Even when they have direct experience with C or similar, I have asked recent grads how they imagine reference counting or malloc/free works, and they very often pull out GC-influenced magical thinking about "the system" reclaiming things under the covers.
lol, I see what you mean there. As long as the person you're talking to has a true engineering mind, he/she will be happy to learn about the subject. But unfortunately there are also those who start looking at you with eyes begging you to stop the hardware mumbo-jumbo talk and go back to software only. I don't really consider them true engineers.
It's 6.004, and it looks like they are still teaching it, which I was pleased to discover. Students are no longer asked to carry around the suitcase-sized breadboard "nerd kit" and they aren't using 74XX TTL chips any more. Now it's all done using simulators.
The name of the course is "Computation Structures", or, as MIT students would know it (since nearly everything at MIT is numbered, including buildings and departments), 6.004:
> You can't design a driver for a card without knowing everything about the card
Strongly disagree with that statement, though I sincerely wish it were true. My company manufactures hardware and does not provide a reference driver for any OS. We provide binary blobs and textual "guidelines".
For our hardware, driver authors operate without knowing any details beyond the interface.
So... they're actually not writing drivers for the hardware, but for the interface provided by your blob. I'm not sure I agree with the original statement, but I don't think your argument holds entirely.
It is always a reason to celebrate when one engineer successfully communicates with another of a different specialty. Big kudos to Intel for actually encouraging them to do so!
Props to Intel for hiring leading Linux developers and turning them loose.