Synchronous Execution of Code
2024-11-11
Let’s consider a simple function - in mathematical notation and in programming language notation.
Mathematical Notation
I think of mathematical notation as being FTL - Faster Than Light.
One can replace a function with another function in no time at all - instantaneously.
f(g(a,b)), where g(a,b) = a+b
= f((a+b)), this takes 0 nanoseconds to rewrite
= f(a+b), this takes 0 nanoseconds to rewrite
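The substitution above can also be checked mechanically. A minimal Python sketch (f and g are hypothetical stand-ins, assuming f is pure), showing that replacing g(a,b) by its body never changes the result:

```python
# Hypothetical pure functions standing in for f and g.
def g(a, b):
    return a + b

def f(x):
    return x * 10  # any pure function works here

a, b = 3, 4

# Substitution: f(g(a,b)) rewrites to f(a+b) without changing the meaning.
assert f(g(a, b)) == f(a + b)
print(f(g(a, b)))  # 70
```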
Obviously, 0 nanoseconds is not true in practice, since people are limited in how fast they can write on paper, but, theoretically, there is nothing standing in the way of performing the above replacements instantaneously.
Programming Notation
...
f(g(a,b))
...
def g(a,b) {return a+b}
...
Let’s say, for the sake of this example, that each low-level action within a CPU takes 1 nanosecond. [This isn’t strictly true in real CPUs, but, is a good-enough approximation for discussing the main point of this article].
The above functions are implemented as SUBROUTINEs:
PUSH b --> 1 nsec
PUSH a --> 1 nsec
CALL g --> 2 nsec + the time it takes to run g
POP --> 1 nsec
POP --> 1 nsec
PUSH result_register --> 1 nsec
CALL f --> 2 nsec + how long it takes to run f
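The PUSH/CALL/POP shape above is visible in real implementations, too. As a sketch, CPython's dis module shows that f(g(a,b)) compiles down to explicit call instructions (f and g here are hypothetical stand-ins; exact opcode names vary by Python version, so we just count CALL-family opcodes):

```python
import dis

def g(a, b):
    return a + b

def f(x):
    return x

def caller(a, b):
    return f(g(a, b))

# CALL-family opcodes: one group for g, one for f.
calls = [i for i in dis.get_instructions(caller) if "CALL" in i.opname]
print(len(calls))  # at least 2 call instructions, one per subroutine
```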
Which, further, boils down into:
PUSH b --> 1 nsec
PUSH a --> 1 nsec
PUSH resume_address --> 1 nsec
MUTATE instruction_pointer_register (IP = address-of g) --> 1 nsec
g:
LOAD stack+1 into R0 --> 1 nsec
LOAD stack+2 into R1 --> 1 nsec
ADD R0,R1,into result_register --> 1 nsec
;; RET
LOAD stack+0 into temp_register --> 1 nsec
POP --> 1 nsec
MUTATE instruction_pointer_register (IP = contents of temp_register) --> 1 nsec
;; resume point, continue f
POP --> 1 nsec
POP --> 1 nsec
PUSH result_register --> 1 nsec
CALL f --> 2 nsec + ??? nsec
Clearly, this does not take 0 nanoseconds. It is not instantaneous and never will be, regardless of how fast you make CPUs run. If we make CPUs 1000x faster, the above nanoseconds become picoseconds. Faster, but, still not instantaneous. The synchronous nature of the above code prevents us from ever reaching instantaneity.
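The non-zero cost is directly measurable. A hedged sketch using Python's timeit (absolute numbers depend on the machine; the only point being made is that the time is never zero):

```python
import timeit

def g(a, b):
    return a + b

def f(x):
    return x

# Nested subroutine calls, as in f(g(a,b)).
t_call = timeit.timeit(lambda: f(g(3, 4)), number=100_000)

print(f"100,000 nested calls took {t_call:.6f} s -- not 0 nanoseconds")
```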
Functions vs. Subroutines
As shown above, subroutines do not work like mathematical functions.
It is possible to use subroutines to implement calculators for mathematical functions, but, CPUs can be used for more than just that single purpose.
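One concrete way the two differ: a subroutine may carry state between calls, so calling it twice with the same arguments can yield different answers — something a mathematical function cannot do. A minimal Python sketch (tick is a hypothetical name):

```python
count = 0

def tick():
    """A subroutine, not a mathematical function: it mutates state."""
    global count
    count += 1
    return count

print(tick())  # 1
print(tick())  # 2 -- same call, different result
```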
I would say that using CPUs to implement mathematical function calculators is a form of Compute-ing Science. This is not, though, How-CPUs-Work Science. If we conflate the word “computer” with the concept of RePRogrammable-Electronic-Machine, we might begin to think that Compute-ing Science is about programming RePREMs, but, as shown above, it’s not.
Compute-ing Science is a valid avenue for pure research, but, its results should not be blindly mapped onto programming-writ-large. Compute-ing Science concerns itself only with a small slice of programming-in-general, i.e. how to use RePREMs to automate manipulation of mathematical functions.
Saying that function-based programming is the ONLY kind of programming is kinda like saying PROLOG is the only kind of programming, or, FORTH is the only kind of programming, or, Smalltalk is the only kind of programming, or, ...
One might ask whether FORTRAN is on a more interesting track. FORTRAN has SUBROUTINEs and it has FUNCTIONs. FORTRAN needs improvement, but, maybe throwing away the concept of SUBROUTINEs in modern FP-based languages is too extreme? Opinion: I think that we should set both aside and consider something much more modern as the basis for programming “languages”, i.e. something better than Gutenberg-inspired textual type-setting notation.
It would be nice-to-have a mathematical function that maps FTL functional notation into the digital domain, kinda like FFTs that map analog electrical signals into the digital domain.
See Also
References: https://guitarvydas.github.io/2024/01/06/References.html
Blog: https://guitarvydas.github.io
Videos: https://www.youtube.com/@programmingsimplicity2980
Discord: https://discord.gg/qtTAdxxU
Leanpub: [WIP] https://leanpub.com/u/paul-tarvydas
Gumroad: https://tarvydas.gumroad.com
Twitter: @paul_tarvydas

