As someone with a disability (quadriplegia) who types and codes with one finger, I find it appalling that Nuance, Apple, and Google haven't opened up their speech recognition systems through even a rudimentary API. Such an API would allow innovation that would _directly_ improve the lives of me and many other disabled people, whether the disability is RSI or something worse.
It was a shock to me to discover that the livelihood and happiness of so many people depend on a dubiously reliable unofficial API that was hacked into Dragon years ago and has been lovingly preserved ever since, just below the radar. It feels like being critically dependent on Windows 95.