The Big Takeaway
The point of the previous post is that the use of programming languages wasn’t a planned choice or an intentional selection of the best possible option; we just sort of grew into them.
The “hot” technology of the 1700s was the punched card, used to control looms in the textile mills and later refined into the Jacquard loom. The sequential nature of weaving was similar enough to mechanically calculating numerical results that the cards made a good enough solution. A pre-existing technology was adopted and modified instead of methodically analyzing the requirements and defining a best solution.
Similarly, the use of numeric computer instructions led to the adoption of human-language abbreviations to stand for the numeric operations to be performed by the computer programs. This worked well enough that pre-existing human language was adopted and modified to describe the sequence of events to be performed in computer programs.
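The idea is simple enough to sketch in a few lines: each human-readable abbreviation is nothing more than a stand-in for a numeric operation. The mnemonics and opcode numbers below are invented for illustration; they don't correspond to any real instruction set.

```python
# Toy "assembler": maps human-language mnemonics to numeric instructions.
# The mnemonics and opcode values are invented for this example, not those
# of any real CPU.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(lines):
    """Translate 'MNEMONIC operand' lines into (opcode, operand) pairs."""
    program = []
    for line in lines:
        parts = line.split()
        op = OPCODES[parts[0]]                      # mnemonic -> number
        arg = int(parts[1]) if len(parts) > 1 else 0
        program.append((op, arg))
    return program

source = ["LOAD 7", "ADD 5", "STORE 12", "HALT"]
print(assemble(source))  # [(1, 7), (2, 5), (3, 12), (255, 0)]
```

The programmer writes `ADD 5` instead of memorizing that operation 2 means "add"; the translation back to numbers is purely mechanical.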
Basing programming languages on human language was a huge improvement initially, and the approach has generally been successful ever since. One line of programming code could generate many machine instructions, significantly multiplying the amount of working code a developer could produce in a single day.
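You can see this expansion directly using Python's standard `dis` module as a stand-in for machine instructions: a single one-line expression compiles to several lower-level instructions. (The exact opcode names and counts vary by Python version.)

```python
import dis

# One line of high-level code expands into multiple lower-level instructions.
# Python bytecode is used here as a stand-in for machine instructions.
code = compile("a + b * c", "<demo>", "eval")
instructions = list(dis.get_instructions(code))

for ins in instructions:
    print(ins.opname)          # e.g. LOAD_NAME, BINARY_OP, RETURN_VALUE, ...

print(len(instructions))       # one expression, several instructions
```

A compiler for a real high-level language does the same thing at a much larger scale, which is where the productivity multiplier comes from.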
But the earliest languages tended to be either too cryptic or too verbose. They also had the side effect of encouraging developers to attempt larger and more ambitious projects. As those projects reached the practical limits of the earliest languages, newer and more capable programming languages were developed to overcome those limitations.
And as our programs grew, our ambitions grew even larger, requiring new languages with newer features grafted on. Careful work has to be done when adding these new features so they don’t “step on” the existing functionality of the language. Ideas from one language propagate into others, until today we have a vast, roiling sea of languages.
And yet, with all the combinations and permutations and possible approaches that have been tried with the hundreds of programming languages now available, we still have the problems that we had in the beginning:
- most development projects run late, taking longer to develop than was originally estimated; some take so long that they end up being canceled
- most programs spend as much time being tested and debugged as it took to originally code them
- most programs require as much time for support (fixes, changes, updates, etc.) during their lifetime of use as it took to originally code and debug them
It seems likely that if changing the programming language was going to work, it would have worked by now.
Perhaps what we should have learned from programming languages is that sometimes a “good enough” solution isn’t the right long-term solution.