Programming Is Too Complicated
2024-09-20
Here's how I see it: everything is complicated.
Some humans dig into how something works and gain an appreciation of it. That "something" is chosen in an essentially ad hoc manner - basically, it has to capture the person's interest. Then they try to explain how it works in simpler terminology. When their terminology is ill-suited to the problem space, the result is "accidental complexity" - a complex explanation layered on top of an already complex phenomenon.
All of so-called "computer science" arises from the idea of using simple step-wise sequences to abstractly describe the functioning of machines. This idea / notation-for-description breaks down at the edges, where describing complex interactions in a step-wise sequencing model doesn't work so well. The result is workarounds, e.g. attempting to describe truly asynchronous phenomena using only synchronous notations, which gives us callbacks, promises, etc., etc. A complex phenomenon can't be described using only one perspective, e.g. only a synchronous view. Richard Feynman intuited this - he invented Feynman diagrams to think about and describe an aspect of physics that didn't lend itself to character-based typesetting. Only later was the description re-cast in character form, after the problem had first been described clearly in non-character form.
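As a concrete illustration of that workaround treadmill, here is a minimal sketch (all function names are invented for this example): the same step-wise, synchronous syntax is bent to express a value that arrives "later", first with a callback, then with a promise.

```javascript
// Two notational workarounds for describing one asynchronous event
// (a value that arrives "later") inside step-wise, synchronous syntax.
// The fetchValue* names are hypothetical, invented for this sketch.

// Workaround 1: callbacks - the "next step" is smuggled in as an argument.
function fetchValueWithCallback(callback) {
  setImmediate(() => callback(null, 42)); // simulate a value arriving later
}

// Workaround 2: promises - the "next step" is chained on afterwards.
function fetchValueWithPromise() {
  return new Promise((resolve) => setImmediate(() => resolve(42)));
}

// Both describe the same phenomenon; neither reads as plain
// step-wise prose, because the phenomenon isn't step-wise.
fetchValueWithCallback((err, v) => console.log("callback got", v));
fetchValueWithPromise().then((v) => console.log("promise got", v));
```

Neither notation is the phenomenon itself - each is a character-sequence encoding of something inherently non-sequential.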
We're clearly at some edge now. We find it hard to describe internet things, we find it hard to describe robots, we find it hard to describe the workings of LLMs. We're managing... but the explanations are becoming more and more complicated. Complication on top of complication.
The idea of using step-wise sequencing worked for the problems of the 1950s, but ain't working so well for the problems of 2024. With the benefit of hindsight, we can see that step-wise notation started failing us, so we applied band-aids in the form of cave-man-level "visual programming", i.e. indentation of character sequences that helped humans but was not required by machines, the use of the Gutenberg characters "{ ... }" instead of rectangles and ellipses, etc., etc.
If we insist on using sequences of characters, we should, at the least, adopt the idea that "syntax is cheap" and build many syntaxes to describe and choreograph the many aspects of real problems. For example, Prolog has a wonderful syntax for describing exhaustive search, but that same syntax ain't so hot for describing problems of report generation and output formatting. In fact, Javascript does that kind of thing better than Prolog (yoiks!).
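A minimal sketch of what "syntax is cheap" could look like, assuming we stay inside one host language (every name here is invented for illustration): one small notation for the exhaustive-search aspect, another for the report-formatting aspect, composed into a single program.

```javascript
// Two small notations, each suited to one aspect of a problem,
// composed in one program. All names here are hypothetical.

// Aspect 1: exhaustive search (the kind of thing Prolog's syntax
// is good at), expressed as a generator over all pairs that
// satisfy a constraint.
function* solutions(xs, ys, constraint) {
  for (const x of xs)
    for (const y of ys)
      if (constraint(x, y)) yield [x, y];
}

// Aspect 2: report generation (the kind of thing Javascript's
// template literals are good at).
function report(pairs) {
  return pairs.map(([x, y]) => `${x} + ${y} = ${x + y}`).join("\n");
}

// Composition: the search notation feeds the formatting notation.
const pairs = [...solutions([1, 2, 3], [1, 2, 3], (x, y) => x + y === 4)];
console.log(report(pairs));
// Prints one line per solution, e.g. "1 + 3 = 4"
```

The point is not these particular helpers - it's that each aspect got the notation it deserved, instead of one general-purpose syntax straining to cover both.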
The idea of creating general purpose programming languages is a loser. We need to be able to compose solutions to problems using many syntaxes, each syntax geared toward specific aspects of the problem domain. In 1950, there was one throbbing red problem - how to control / describe the operation of a single machine. In 2024, the problem is different - how to control / describe a collection of single machines. In 1950, it was difficult to whip up new syntaxes. In 2024, it is easy to whip up new syntaxes. In fact, in 2024, we aren't even constrained to using only Gutenberg character sequences, as we were in 1950 due to hardware limitations.
See Also
References: https://guitarvydas.github.io/2024/01/06/References.html
Blog: https://guitarvydas.github.io
Videos: https://www.youtube.com/@programmingsimplicity2980
Discord: https://discord.gg/qtTAdxxU
Leanpub: [WIP] https://leanpub.com/u/paul-tarvydas
Gumroad: https://tarvydas.gumroad.com
Twitter: @paul_tarvydas

