I had avoided the terms "sub-standard" and "sub-par" for the same reason that I avoid the term "industry standard practice": being widely used does not make anything intrinsically good.
I'm a contract programmer, and while I'm willing to admit that I might be very, very wrong here, I have about five years' experience to the contrary.
It's not that my project managers have failed, or the companies have failed, or (hopefully) that I have failed: Software development is an expensive process, especially with contract programmers or consultants like myself.
It requires a lot of expensive programmer (and designer, and PM) time to bring a product "to market", and for most companies I've worked with, once it ticks the boxes on the functional spec and does everything it's supposed to do, it gets the green light.
Now, I'm sure programmers who care about their projects know the urge to add just a little more polish to an application, and the pragmatic among us will stop themselves.
If you're working to a quotation, you are not getting paid to add polish. If you're working on time and materials, someone else is paying for that polish, and if the polish doesn't bring in revenue equal to (or better than) what you spent adding it, then you are literally wrong to do it.
Paying programmers by the feature, or by the thousand lines of code, is akin to paying writers by the word. We live in a world of penny-dreadful software.
At this point, some people would break into song about how great OSS is. (That's not good, hackers, that's not good ...) OSS in some cases suffers from a different problem: people scratch their own itch.
This leads to absolutely top-notch development environments, programming languages, and text editors, because these are the tools of the trade. Programmers use them every day, and years of use by legions of programmers have worn away the burrs on the popular ones.
My theory, which might be wrong, is that somewhere there exists a balance between outside measurement (users, and eventually, money) and aesthetics (abstractions, good code, corners rounded and not cut).
I've been reading Hackers & Painters (it shows, doesn't it?), which raises issues similar to this. Given a target user base that has a choice to use something else and something to gain, I find it easy to design a user-centric application; under very harsh monetary and time constraints, I find it hard.
Ergo: the only way to develop quality software is to work for yourself, and ultimately be the one responsible for whether users prefer your product to a competitor's. It's your money you're spending on the development, but it's also you doing the work required to really make it shine, and you reaping the rewards.
I've developed a lot of intranet applications; when there is no choice, users will put up with most usability failings.
In fact, I've had users overload fields rather than ask for extra ones, something I completely fail to understand. (For example, entering "Hotel Name 5 Star" instead of asking me to add a "Star Rating" field to the system.) See "Five Worlds" from Joel on Software on why internal software sucks.
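To make the overloaded-field anti-pattern concrete, here is a minimal sketch (the hotel name, field names, and regex are hypothetical, not from any real system): when the rating is crammed into the name field, every downstream consumer has to scrape it back out, whereas a dedicated field keeps the data structured from the start.

```python
import re

# The anti-pattern: the user crams a star rating into the
# hotel-name field because the form has nowhere else to put it.
overloaded = "Grand Plaza 5 Star"

# Downstream code is then forced to parse the rating back out
# with a fragile pattern match.
match = re.search(r"(\d)\s*Star$", overloaded)
name = overloaded[: match.start()].strip() if match else overloaded
rating = int(match.group(1)) if match else None

# With a dedicated field, no scraping is needed and the data
# stays unambiguous.
hotel = {"name": "Grand Plaza", "star_rating": 5}
```

The fragility is the point: any hotel whose actual name ends in "Star" breaks the regex, which is exactly the kind of defect a proper "Star Rating" field avoids.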