…and all of that is “a bit” of a storm in a teacup.
Why? Because the questionable use of simple(ton) examples has probably occurred before:
- no doubt there were various snippets of assembly code which purported to show the superiority of assembly over that “new-fangled” Fortran and LISP stuff;
- there probably were analogous snippets of code showing the superiority of `goto` over structured programming;
- and one can only imagine all the snippets of code purporting to show the superiority of manual memory ~~manglement~~ management over GC!
They are all examples of “storms in teacups”…the teacups found in child’s-toy dollhouses!
Getting back to the actual topic of this thread:
Mutation.
While *The Anatomy of Programming Languages* also nominates sequencing as being a problem, the two features have very different consequences:

- arbitrary sequencing but no mutation: confluence;
- arbitrary mutation but no sequencing: nondeterminism!
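The second point can be sketched in OCaml itself, which leaves the evaluation order of tuple components (and function arguments) unspecified: two mutating subexpressions with no explicit sequencing between them may run in either order. (The names below are illustrative only.)

```ocaml
(* Mutation without explicit sequencing: OCaml does not specify the
   evaluation order of the two tuple components below, so the two
   mutating subexpressions may run in either order. *)
let r = ref 0

let pair =
  ( (r := !r + 1; !r),   (* increments r, then reads it *)
    (r := !r * 2; !r) )  (* doubles r, then reads it *)

(* Left-to-right evaluation yields (1, 2); right-to-left yields (1, 0).
   Which one you observe depends on the compiler, not the program text. *)
let () = Printf.printf "(%d, %d)\n" (fst pair) (snd pair)
```

Either result is a legal outcome of the same source program — exactly the nondeterminism the bullet point describes.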
Standard ML and OCaml both illustrate this point: at first glance, definitions in either language bear little resemblance to the rambling sequences of statements found in the likes of Pascal or C. But that lack of visible sequencing is irrelevant: thanks to the ready access to mutation both provide, Standard ML and OCaml are also imperative languages.
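A minimal OCaml sketch of that point: once `ref` cells are in play, a definition reads exactly like sequenced, mutating Pascal or C code.

```ocaml
(* An imperative-style counter in OCaml: a mutable ref cell,
   assignment with :=, dereference with !, and ; for sequencing.
   Structurally this is the same program one would write in Pascal or C. *)
let counter = ref 0

let next () =
  counter := !counter + 1;  (* mutate the shared cell *)
  !counter                  (* then read the new value *)

let () =
  let a = next () in        (* let-bindings pin down the order here *)
  let b = next () in
  let c = next () in
  Printf.printf "%d %d %d\n" a b c   (* prints: 1 2 3 *)
```

The surface syntax is expression-oriented, but the program's meaning depends entirely on the order in which `counter` is mutated — which is what makes it imperative.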
Alright, so what about Haskell and its seemingly inexplicable interfaces?
…for something better (or a proof that none exists, analogous to that for the Entscheidungsproblem).