Change to 1TB VIRT allocation makes it impossible to core-dump Haskell programs

GHC 8.0.2 on Linux changed the memory allocator to reserve 1 TB of virtual address space at startup (#9706 (closed)).
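For context, the effect can be reproduced outside of GHC. The following is a minimal sketch (not GHC's actual RTS code, which is C) of a two-step scheme: reserve a huge range with PROT_NONE and MAP_NORESERVE, which inflates VIRT without touching RSS. The flag constants are the x86-64 Linux values.

```python
import ctypes

# Reserve address space only; no physical pages are committed.
libc = ctypes.CDLL(None, use_errno=True)
libc.mmap.restype = ctypes.c_void_p
libc.mmap.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int,
                      ctypes.c_int, ctypes.c_int, ctypes.c_long]

PROT_NONE     = 0x0
MAP_PRIVATE   = 0x02
MAP_ANONYMOUS = 0x20      # x86-64 Linux value
MAP_NORESERVE = 0x4000    # x86-64 Linux value
MAP_FAILED    = ctypes.c_void_p(-1).value

ONE_TB = 1 << 40
addr = libc.mmap(None, ONE_TB, PROT_NONE,
                 MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0)
ok = addr != MAP_FAILED

# VmSize (VIRT) now includes the 1 TB reservation, while resident
# memory stays tiny because no page in the range is ever touched.
with open("/proc/self/status") as f:
    vmsize_kb = next(int(line.split()[1])
                     for line in f if line.startswith("VmSize"))
print(ok, vmsize_kb)
```

A naive core dumper that walks every mapping by address range will try to write out all of this reserved-but-untouched space.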

I now have a production Haskell program stuck in a loop on another machine and would like to debug where it is hanging, so I attached with gdb -p and ran generate-core-file.

But dumping the core takes forever; I Ctrl-C'd it when the file reached 140 GB (my machine has only 64 GB of RAM, by the way). After the Ctrl-C, the core file's size was reported as 1.1 TB on the file system (it is probably a sparse file now).
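Whether the file really is sparse can be checked by comparing its apparent size with the disk space it actually occupies (`ls -l` vs. `du`, or `st_size` vs. `st_blocks` from stat(2), where `st_blocks` counts 512-byte units). A small sketch, using an artificial sparse file in a temp directory:

```python
import os
import tempfile

def sparse_report(path):
    """Return (apparent size, bytes actually allocated on disk).
    st_blocks is in 512-byte units per stat(2); a file whose
    blocks * 512 is far below st_size is sparse."""
    st = os.stat(path)
    return st.st_size, st.st_blocks * 512

# Demonstrate with an artificial hole: seek 1 MiB forward, write one byte.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.seek((1 << 20) - 1)
    f.write(b"\0")

apparent, on_disk = sparse_report(path)
print(apparent, on_disk)
os.remove(path)
```

Run against the 1.1 TB core, a large gap between the two numbers would confirm the sparse-file guess.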

Is there a workaround for this?

For example, if I could dump only the resident (actually allocated) pages, that would probably help.
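One existing knob in this direction is Linux's /proc/&lt;pid&gt;/coredump_filter (see core(5)), a hex bitmask selecting which mapping types end up in a kernel-produced dump; recent gdb versions can honor it for generate-core-file as well (the `set use-coredump-filter on` setting). A sketch for reading and setting it:

```python
import os

# Bits, per core(5): 0 = anonymous private, 1 = anonymous shared,
# 2 = file-backed private, 3 = file-backed shared, 4 = ELF headers,
# 5/6 = hugetlb private/shared.
def read_coredump_filter(pid):
    with open(f"/proc/{pid}/coredump_filter") as f:
        return int(f.read(), 16)

def write_coredump_filter(pid, mask):
    # Only works for processes you own; takes effect for future dumps.
    with open(f"/proc/{pid}/coredump_filter", "w") as f:
        f.write(f"{mask:#x}")

mask = read_coredump_filter(os.getpid())
print(f"{mask:#x}")
```

The caveat is that this filters by mapping *type*, and GHC's 1 TB reservation is anonymous private, the same class as the live Haskell heap, so clearing bit 0 would discard the heap too; it does not directly give "resident pages only".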

Trac metadata
Version: 8.0.2
Type: Bug
TypeOfFailure: OtherFailure
Priority: normal
Resolution: Unresolved
Component: Runtime System
Test case:
Differential revisions:
BlockedBy:
Related:
Blocking:
CC: nh2
Operating system:
Architecture: