OK, so counting from after the ‘FTP changes’ brouhaha: which of those breaking changes has `base` perpetrated? And how would a reorganised `base` have avoided the breakages, or at least minimised the impact?
@Ericson2314 seems to be throwing in the kitchen sink. To pick one point as an example:
- Classes like `Num` with no clear laws.
Has `base` made breaking changes to `Num`? Indeed, has `base` made any visible changes to `Num`? I rather thought `Num` is the same now as in Haskell 98. So don’t mix up complaints about breaking changes with complaints about (what turned out to be, with a great deal of hindsight) poor design. In the early 1990s, when the `Prelude` was developed, who’d have expected typeclasses would be so wildly powerful, or that they’d connect to Category Theory?
As I mentioned, `Num` (and `Integral`, `Fractional`) are pretty much baked into the definition and syntax of Haskell. Nothing’s stopping you creating a whole bunch of other numerical classes and operators with all the nice properties.
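To illustrate: here’s a minimal sketch of such a lawful class, living alongside `Num` rather than replacing it. The class name `AddMonoid` and the operator `(.+.)` are invented for this example; nothing here is in `base`.

```haskell
-- A sketch of a lawful additive class, kept apart from Num.
-- Laws (documented, not enforced): (.+.) is associative,
-- and zero is its left and right identity.
class AddMonoid a where
  zero  :: a
  (.+.) :: a -> a -> a

infixl 6 .+.

-- Int addition satisfies the laws (modulo wrap-around).
instance AddMonoid Int where
  zero  = 0
  (.+.) = (+)

-- List concatenation satisfies them exactly.
instance AddMonoid [x] where
  zero  = []
  (.+.) = (++)
```

Since `(.+.)` is a fresh name, this coexists with the standard `Prelude` without any hiding or clashes.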
- Partiality, especially in methods like `*1` in `Foldable` and with `Enum`
Has `base` changed methods/functions to be more partial? Again, don’t mix up complaints about breakages with complaints about a design you don’t like. And again, there’s nothing stopping you creating `safeHead`, etc. (I’m not going to defend the design choices in `Foldable`; but any more changes to it had better have an enormous benefit/cost ratio.)
A usage of `head` or `(!!)` is not necessarily unsafe: it may be surrounded by checks that prevent it ever being called unsafely. Yes, it’s unfortunate those aren’t type-safe checks, in a language which vaunts the benefits of type safety. “Well-typed programs can’t go wrong” is bunkum.
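And the safe variants really are a few lines each; a sketch (the names `safeHead` and `safeIndex` are my own, though `safeHead` is just `Data.Maybe.listToMaybe` under another name):

```haskell
-- Total counterparts of head and (!!): return Nothing instead of
-- throwing on the empty list or an out-of-range index.
safeHead :: [a] -> Maybe a
safeHead []      = Nothing
safeHead (x : _) = Just x

safeIndex :: [a] -> Int -> Maybe a
safeIndex xs n
  | n < 0     = Nothing
  | otherwise = go xs n
  where
    go []      _ = Nothing
    go (x : _) 0 = Just x
    go (_ : t) k = go t (k - 1)
```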
I think a better approach is through education: stop teaching newbies so much about lists (including `String`) and so little about appropriate datatype design, including especially other off-the-shelf recursive data structures.
So is this (below) what you want? And is it the opinion of you all:
- Reorganise `base` so a program can (for example) exclude `Num` and all its Pomps; then
- instead import a theoretically-pure `ShinyNum` using all the same class and operator ids; and
- otherwise use the standard `Prelude`.
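For what it’s worth, the import half of that scheme is expressible today; a sketch, where `ShinyNum` is purely hypothetical:

```haskell
-- Hypothetical sketch: hide Num and friends from the Prelude, then
-- take the same names from a (non-existent) replacement module.
import Prelude hiding (Num (..), Integral (..), Fractional (..))
import ShinyNum (Num (..), Integral (..), Fractional (..))
```

Note that numeric literals would still desugar via `base`’s `fromInteger` unless `RebindableSyntax` is also switched on, which is part of why I say these classes are baked into the syntax of the language.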
How about the GHC wired-in modules for implementation stuff like arithmetic on pointers and indexes? Are those also to use `ShinyNum`? How about modules like `Vector` with `Int` indexes; or `Data.Set` with a `size :: Int` embedded in every node and `(Num.+)` to calculate it? Checking for numeric over/underflow or index-out-of-array comes at a computational cost. Those modules already make limited checks, enough to avoid `IllMemRef`s. `ShinyNum` would duplicate that work for programs that already don’t throw those exceptions. Who’s then responsible for addressing the performance degradation?
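To make that cost concrete, here is a minimal sketch of what overflow-checked `Int` addition would have to do on every call (the name `checkedAdd` is invented; plain `(+)` pays for none of these comparisons):

```haskell
-- Overflow-checked addition for Int: every call pays for one or two
-- comparisons and a branch that ordinary (+) skips entirely.
checkedAdd :: Int -> Int -> Maybe Int
checkedAdd x y
  | y > 0 && x > maxBound - y = Nothing  -- sum would overflow
  | y < 0 && x < minBound - y = Nothing  -- sum would underflow
  | otherwise                 = Just (x + y)
```

Multiply that by every size bump inside `Data.Set` and every index calculation inside `Vector`, and the duplication of work is not hypothetical.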