My belief: massive synchrony leads to a dead end. It is not worth propagating the concept of synchronous programming languages beyond what has already been achieved, other than for pure research. The argument from inertia - doing more of the same because “we’ve always done it this way” and because “rapid innovation will be ignored by most people, anyway” - would be scuttled if it were truly, deeply believed that programming via massive synchronization leads to a dead end.
The currently popular, overriding belief is: the idea of massive synchrony can't possibly be wrong.
Most researchers and developers believe, deep down in their bones, that functional programming, synchronous programming languages, and computational thinking are almost “there”, and that just the addition of a few more baubles will make the concept(s) perfect. Some famous smart-guy once said something about expecting different results…
I haven't managed, yet, to convince the majority that this approach is a dead end.
After all, LLMs appear to be a huge leap forward in technology, but…
LLMs ain’t Engineering. The results are not repeatable.
LLMs employ massive concurrency. LLMs are just big games of Plinko that fire lots of weighted nodes at once, in parallel.
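To make the “fire lots of weighted nodes at once” picture concrete, here is a minimal sketch of my own (not taken from any actual LLM codebase): one layer of weighted nodes, where each output is just a weighted sum of the same inputs. No node’s result depends on any other node’s result, so all of the nodes can fire at the same time.

```typescript
// Illustrative sketch only -- the function and the numbers are invented for this post.
// One "layer" of weighted nodes: each output is an independent weighted sum
// of the same inputs, so nothing forces the nodes to be evaluated one-at-a-time.
function fireLayer(weights: number[][], inputs: number[]): number[] {
  // Each node depends only on `inputs` and its own row of weights.
  return weights.map(nodeWeights =>
    nodeWeights.reduce((sum, w, i) => sum + w * inputs[i], 0)
  );
}

// Three nodes, two inputs.
console.log(fireLayer([[1, 2], [3, 0], [0, 4]], [10, 5]));
// -> [ 20, 30, 20 ]
```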
The real advance in LLMs is their access to the New Library of Alexandria, i.e. huge volumes of data on the internet.
Advances in LLMs are expected to come from better hardware, not from better software.
Observation: In 1950, electronics involved (a) massive parallelism due to asynchronous propagation of analog signals, and (b) vacuum tubes. There was inertia for doing more things this way. Then this inertia was overturned: (a) massive synchrony took over, (b) transistors replaced vacuum tubes, (c) digital-thinking swamped out analog-thinking.
Observation: Massive synchrony in software has caused a lot of grief, but researchers and developers choose to ignore the facts and just paper the failures over. A very obvious incident was the Mars Pathfinder fiasco - a priority-inversion bug rooted in mutex-based synchronization of shared memory. A papered-over incident was the invention of JavaScript callbacks - a poor choice for expressing async concurrency - but researchers and developers soldiered on anyway in the belief that the idea of massive synchrony can't possibly be wrong.
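To illustrate why callbacks are an awkward notation for this, here is a toy example of my own (the step names readConfig / openDevice / sendCommand are invented, not taken from any real project): three dependent asynchronous steps written in Node-callback style. The control flow nests rightward and every level repeats its own error check - the programmer ends up hand-encoding control flow that a direct, top-to-bottom notation would give for free.

```typescript
// Toy example -- the three "steps" are invented stubs that just complete
// on the next tick; the point is the shape of the calling code below.
type Callback<T> = (err: Error | null, value?: T) => void;

function readConfig(cb: Callback<string>): void {
  setTimeout(() => cb(null, "config"), 0);
}
function openDevice(cfg: string, cb: Callback<number>): void {
  setTimeout(() => cb(null, cfg.length), 0);
}
function sendCommand(dev: number, cb: Callback<void>): void {
  setTimeout(() => cb(null), 0);
}

// Three sequential steps expressed with callbacks: the nesting deepens with
// every step, and each level repeats its own error handling.
readConfig((err, cfg) => {
  if (err || cfg === undefined) return console.error(err);
  openDevice(cfg, (err, dev) => {
    if (err || dev === undefined) return console.error(err);
    sendCommand(dev, (err) => {
      if (err) return console.error(err);
      console.log("done");
    });
  });
});
```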
Observation: Moore's Law applies to electronics, wherein async concurrency reigns. Moore's Law, though, does not apply to software, wherein over-synchronization reigns.
Observation: Inertia of the then-popular belief structure propagated Ptolemaic cosmology while ignoring hard observations, until Galilean cosmology overturned that belief structure. There were high amounts of inertia behind the Ptolemaic system, but something caused us to dump the inertia and go with the Galilean system. Does the Sun go around the Earth, or does the Earth go around the Sun? Which belief is better? Define "better". Humanity thrived for thousands of years under the belief in Ptolemaic cosmology. It's been only a few hundred years since Galilean cosmology took hold. Define "better".
Are we currently only adding band-aids to a Ptolemaic Software Cosmology? How can we tell if this is what is going on? What are some fundamental beliefs that steer our current thought patterns?
Belief: Kids should learn to think computationally and to learn synchronization.
Belief: Art, music, and freedom from synchronization are not as good as synchronization and computational thinking.
Belief: It is popular to believe that non-programmers want to control their own computers. I observe, though, that most people don't really care to do that. I do want to control my computer, but most of my friends don't care much about that. They just want faster horses; they don't wish to know how to breed them.
Observation: I dumped Android and Linux and switched to Apple-everything. Apple concentrates on delivering a coherent UX across devices, whereas Google and Linux simply provide a tidal wave of disconnected options. Linux is - schizophrenically - trying to copy Apple desktop ideas while also providing walls of options for building servers. Google is trying to copy Apple UX ideas, but mostly after the fact.
Belief: DIY is better than non-DIY. Free is better than pay-walled. Corporations are fundamentally dirty. Hence, Linux is better than MacOS.
Observation: DIY is cheaper in raw $'s than non-DIY but contains a high hidden cost that is generally not accounted for.
Belief: If we can measure it, then we believe that we fully understand it. If we can't measure it, then we believe that it is safe to ignore it. I used to believe that CDs were better than vinyl until I sat down and did my own A/B comparison. The difference is there, but we can't measure it, nor do we have precise language for talking about the difference. The difference ain't what I thought it would be. My engineering teachers believed that it was safe to ignore the difference and that it was safe not to strive to measure it.
Innovation in software has not come from advances in software, but from advances in hardware: miniaturization, packing more CPU power and more memory into smaller spaces, etc.
Software innovators are stuck in the 1950s and need a kick in their butts. Software innovators continue to fiddle with 1950s technology in 2024, generally ignoring end-users. End-users draw diagrams on napkins and on whiteboards. Software innovators are holding end-users back by disregarding end-users’ penchant for using whiteboards, and this disregard is due only to old-fashioned technical concerns. Every office has computers and every office has whiteboards. Which device(s) do end-users jump to when they have fresh ideas? Which device(s) do end-users use when performing rote work?
See Also
References: https://guitarvydas.github.io/2024/01/06/References.html
Blog: https://guitarvydas.github.io
Videos: https://www.youtube.com/@programmingsimplicity2980
Discord: https://discord.gg/qtTAdxxU
Leanpub: [WIP] https://leanpub.com/u/paul-tarvydas
Gumroad: https://tarvydas.gumroad.com
Twitter: @paul_tarvydas