Sure. But I usually tried -XStrict on a program that already had bangs in the necessary places (accumulators, fields). Adding more bangs usually makes things worse.
The opposite can also be true: -XStrict on things like head . sort can turn linear time/space into log-linear.
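A minimal sketch of that point, using force from the deepseq package to stand in for the kind of whole-structure forcing that strict code tends to do (lazyMin and strictMin are hypothetical names):

import Data.List (sort)
import Control.DeepSeq (force)  -- from the deepseq package

-- Data.List.sort is lazy enough that demanding only the head
-- does roughly linear work.
lazyMin :: [Int] -> Int
lazyMin xs = head (sort xs)

-- Forcing the whole sorted list first throws that away: O(n log n).
strictMin :: [Int] -> Int
strictMin xs = head (force (sort xs))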
-XStrictData can also force much more data to be evaluated and kept around, putting extra pressure on the GC. In general, -XStrict is a rather dangerous sledgehammer that doesn’t always work (never in my case).
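As an illustration (Report and mkReport are made-up names): StrictData makes every field strict, so a field is computed when the record is built even if nothing ever reads it:

{-# LANGUAGE StrictData #-}

data Report = Report
  { raw     :: [Int]
  , summary :: Int   -- under StrictData this behaves like !Int
  }

-- sum xs is evaluated here, even if summary is never used.
mkReport :: [Int] -> Report
mkReport xs = Report xs (sum xs)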
Sorry, perhaps I used the wrong word (although a pun on instruction reordering was intended). I meant the ability to evaluate where it’s needed, not where the bang is:
import Debug.Trace (trace)
import System.Random (randomRIO)

main :: IO ()
main = do
  x <- randomRIO (0, 9) :: IO Int
  let a = trace "a" x
      b = trace "b" x
  if x < 5 then
    print a
  else
    print b
Both a and b will always be evaluated if you add -XStrict.
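Under -XStrict the let bindings above behave as if they carried bang patterns, so both traces fire no matter which branch is taken. Roughly:

  let !a = trace "a" x  -- forced before the if: always prints "a"
      !b = trace "b" x  -- forced before the if: always prints "b"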
Superfluous bangs tend to slow things down for the same reason (they force evaluation that isn’t needed):
go acc [] = acc
go !acc (x:xs) ...
-- is frequently faster than
go !acc [] = acc
go !acc (x:xs) ...
--------------
go !acc rarelyUsedArg (x:xs) ...
-- is frequently faster than
go !acc !rarelyUsedArg (x:xs) ...
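To make the first sketch concrete, here is a complete version of that pattern (sumLen is a hypothetical name): the bangs sit only on the cons clause, where the accumulators are actually rebuilt, and the base case simply returns values that the recursive clause has already forced:

{-# LANGUAGE BangPatterns #-}

sumLen :: [Int] -> (Int, Int)
sumLen = go 0 0
  where
    -- No bangs needed: the accumulators were forced on every recursive step.
    go s n []       = (s, n)
    -- Bangs only where new thunks would otherwise pile up.
    go !s !n (x:xs) = go (s + x) (n + 1) xs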
You can see the source of Text.HTML.TagSoup.Fast. The code is more than a decade old and rather ugly; I would write it with pattern synonyms now. But you can notice the lack of bangs in the non-recursive cases, and it helped the code perform better, at least back then.
I have a feeling (I cannot prove it) that bangs interfere with the strictness analyzer and lead to suboptimal code generation. GHC knows better when to evaluate once it’s sure that evaluation is necessary; bangs force it to evaluate in the wrong places (and maybe multiple times).
Bangs should be used sparingly, to give GHC a hint that you don’t want to return a lazy accumulator, and let the strictness analyzer do the rest.