So they took components from a previous system, didn't test them properly on the new one, and today we inherit a whole lot of regulation for medical equipment because of it. Not that it's a bad thing to ensure proper testing has occurred, but there are so many cases where this legally applies even to very peripheral changes (like, say, a UI change in a menu item of something not related to medical equipment but nevertheless under compliance).
> (like, say, a UI change in a menu item of something not related to medical equipment but nevertheless under compliance)
I think regulating UI changes in medical equipment (or in other devices that could potentially kill people) is very much justified: even if the software is nominally working as intended, an update that makes unexpected UI changes and ends up confusing the operator could lead to dangerous situations just as easily as a software bug.
I agree that this would be overblown for UI in non-critical places, but I don't have the impression there is any heavy-handed regulation in those areas (otherwise I'd expect to see fewer dark patterns and other "UX" redesigns that are clearly for engagement optimization and not for making any task easier for the user).
Could you give some examples of non-medical situations where regulation gets in the way of UI redesigns?
Then you haven't spent any time at all searching. Software-induced plane crashes have been happening for decades, with the extremely high-profile case of the 737 MAX being quite recent.
Honestly I don't think the 737 MAX was a software failure so much as a weird intersection of failures in:
- regulation (FAA incentivized not modifying the primary frame/wings),
- aerospace engineering (weight too high for existing control surfaces, necessitating a weird control surface on the tail),
- systems engineering (specifying that MCAS activate based on a single sensor out of two, so one sensor failure could trigger it, instead of requiring 2-out-of-3 voting or similar),
- and training. Procedures for the new MCAS system weren't drilled into pilots, in order to perpetuate the perception that this was still the same 737 and didn't need in-depth FAA review. In fact, Boeing managers and the FAA removed references to MCAS from Boeing's flight crew operations manual (FCOM).
The software didn't seem to have a bug, or even to be particularly poorly designed for the intended system. I'm not sure I'd find the software engineers at fault: they followed the specification given by the systems engineers, and it would have looked reasonable to a software engineer. The intended overall system was fucking atrocious, however. I'd absolutely lay the blame for the 737 MAX incidents on the business managers, aerospace engineers, and especially the systems engineers.
MCAS involved a good amount of software, but it wasn't the software which caused the failure.
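To make the systems-engineering point concrete, here is a minimal sketch of the 2-out-of-3 sensor voting the comment alludes to. The function name, tolerance, and readings are all illustrative assumptions for the example; this is not Boeing's actual logic.

```python
# Illustrative 2-out-of-3 voter over redundant angle-of-attack readings.
# A hypothetical sketch; tolerance and values are invented for the example.

def vote_2oo3(readings, tolerance=2.0):
    """Return the median reading if at least two of the three sensors
    agree with it within `tolerance`; return None (inhibit actuation)
    otherwise."""
    assert len(readings) == 3
    median = sorted(readings)[1]
    agreeing = sum(1 for r in readings if abs(r - median) <= tolerance)
    return median if agreeing >= 2 else None

# One faulty sensor (75.0) is outvoted by the two good ones:
print(vote_2oo3([5.1, 4.9, 75.0]))   # -> 5.1
# No quorum: all three disagree, so the voter inhibits actuation:
print(vote_2oo3([0.0, 10.0, 20.0]))  # -> None
```

The contrast with the design the commenter criticizes is that a single-sensor trigger has no "outvoted" branch at all: one bad reading drives the actuator directly.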