A research application that my lab has created seems to hit a limit at 1TB of memory use and fails with “out of memory”. The machines I am running it on all have more than 1TB of physical memory. I have checked ulimit, and both virtual memory and max memory size are “unlimited”. I was wondering whether the “out of memory” is triggered when the physical memory required reaches the virtual memory reserved (1TB). I have looked, but have not found, an option to increase the amount of virtual/physical memory available to an application (as was increased in ghc-8.0). The application is used for research on large genomic data sets.
Using “top” to watch memory consumption, I can see it rise to 1TB and then fail.
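For reference, a stripped-down sketch of the failure mode (hypothetical; not our actual application) is simply to hold onto ever more pinned heap until allocation fails:

```haskell
-- Hypothetical repro sketch: keep 1 GiB ByteStrings alive until the RTS
-- can no longer allocate. Large ByteStrings live in pinned Haskell heap,
-- so on a machine with more than 1TB of RAM this should fail at roughly
-- the 1TB mark rather than at the physical memory limit.
import Control.Exception (evaluate)
import qualified Data.ByteString as BS

main :: IO ()
main = go 0 []
  where
    go :: Int -> [BS.ByteString] -> IO ()
    go n acc = do
        -- evaluate forces the allocation to happen now, not lazily
        chunk <- evaluate (BS.replicate (1024 * 1024 * 1024) 0)
        print n  -- rough progress: GiB currently held
        go (n + 1) (chunk : acc)
```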
One scenario could be that the virtual memory reservation is made at startup at 1TB, and even though the other limits are unlimited, that reservation then acts like a system limit.
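If that is the case, the reservation should be visible the moment the process starts, before it allocates anything. A quick Linux-only sketch to check:

```haskell
-- Sketch (assumes Linux /proc): print this process's virtual
-- address-space size at startup. If the 1TB-reservation hypothesis is
-- right, a GHC-compiled binary reports roughly 1TB of VmSize here
-- before doing any real work.
import Data.List (isPrefixOf)

main :: IO ()
main = do
    status <- readFile "/proc/self/status"
    mapM_ putStrLn (filter ("VmSize" `isPrefixOf`) (lines status))
```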
Indeed, I was wondering when someone would hit this limit. The problem is that GHC uses a static 1 TB address space reservation for its heap. Sadly, there is currently no way to change the size of this reservation at runtime. However, I believe that a patch like the following would suffice to raise the limit (but sadly I have no means of testing this):
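(Sketch only, against the initMBlocks() code in rts/sm/MBlock.c of recent GHCs; the exact location may differ in other versions.)

```diff
--- a/rts/sm/MBlock.c
+++ b/rts/sm/MBlock.c
@@ void initMBlocks(void)
 #if defined(aarch64_HOST_ARCH)
         size = (W_)1 << 38; // 1/4 TByte
 #else
-        size = (W_)1 << 40; // 1 TByte
+        size = (W_)1 << 42; // 4 TByte
 #endif
```

Note that the reservation size is baked into the RTS at build time, so GHC (or at least its RTS) must be rebuilt for the change to take effect.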
Good idea. ghc-9.8.1 built fine, but I get a GHC panic when compiling my app (this does not happen with the pre-built ghc-9.8.1 or with ghc-9.9.20240101):
```
ghc: panic! (the 'impossible' happened)
  GHC version 9.8.1:
        ModOrigin: hidden module redefined
        x: unusable module
        y: unusable module
  Call stack:
    CallStack (from HasCallStack):
      callStackDoc, called at compiler/GHC/Utils/Panic.hs:191:37 in ghc-9.8.1-inplace:GHC.Utils.Panic
      pprPanic, called at compiler/GHC/Unit/State.hs:239:14 in ghc-9.8.1-inplace:GHC.Unit.State
    CallStack (from HasCallStack):
      panic, called at compiler/GHC/Utils/Error.hs:503:29 in ghc-9.8.1-inplace:GHC.Utils.Error
```