internal error: allocation of 1048608 bytes too large

I'm getting this error when trying to use the ghc-datasize package from Hackage to calculate the in-memory size of a data structure. The full error is:

load-test: internal error: allocation of 1048608 bytes too large (GHC
should have complained at compile-time)
    (GHC version 8.0.1 for x86_64_unknown_linux)
    Please report this as a GHC bug:  http://www.haskell.org/ghc/reportabug

However, this message is bogus because this is not a static data structure known at compile time, but data loaded from disk. Currently, my test program just loads this complex data structure (about 27 megabytes of CSV with embedded JSON on disk) into memory and then calls recursiveSize from GHC.DataSize on it. I get this same error message with both ghc-7.10.3 and ghc-8.0.1.

If I remove the call to recursiveSize there is no error.

If I Control.DeepSeq.force the data structures before calling recursiveSize there is no error.
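For reference, the working variant looks roughly like this. This is a minimal sketch: `Row` and `loadData` are hypothetical stand-ins for my actual CSV/JSON loader, not the real code.

```haskell
{-# LANGUAGE DeriveGeneric, DeriveAnyClass #-}

import Control.DeepSeq (NFData, force)
import Control.Exception (evaluate)
import GHC.DataSize (recursiveSize)  -- from the ghc-datasize package
import GHC.Generics (Generic)

-- Hypothetical stand-in for the real record parsed from the CSV/JSON data.
data Row = Row { rowName :: String, rowValue :: Double }
  deriving (Generic, NFData)

main :: IO ()
main = do
  rows <- loadData "data.csv"
  -- Without this force/evaluate step, recursiveSize aborts the RTS with
  -- "internal error: allocation of 1048608 bytes too large";
  -- fully evaluating the structure first makes the error go away.
  rows' <- evaluate (force rows)
  size  <- recursiveSize rows'
  print size

-- Hypothetical loader; the real one parses ~27 MB of CSV with embedded JSON.
loadData :: FilePath -> IO [Row]
loadData = undefined
```

The only difference from the failing program is the `evaluate (force rows)` line, which suggests the crash is triggered when recursiveSize walks a structure still containing unevaluated thunks.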

So, who is to blame? ghc-datasize is just Haskell code that calls public APIs, so I think the blame lies with GHC.

I'll work on getting a small reproducible test case.

Trac metadata
Version: 8.0.1
Type: Bug
TypeOfFailure: OtherFailure
Priority: normal
Resolution: Unresolved
Component: Runtime System
Test case:
Differential revisions:
BlockedBy:
Related:
Blocking:
CC: simonmar
Operating system:
Architecture: