Glasgow Haskell Compiler / GHC · Issues · #12492 · Closed
Issue created Aug 15, 2016 by erikd@trac-erikd

internal error: allocation of 1048608 bytes too large

I'm getting this error when trying to use the ghc-datasize package from Hackage to calculate the in-memory size of a data structure. The full error is:

load-test: internal error: allocation of 1048608 bytes too large (GHC
should have complained at compile-time)
    (GHC version 8.0.1 for x86_64_unknown_linux)
    Please report this as a GHC bug:  http://www.haskell.org/ghc/reportabug

However, this message is bogus: the value is not a static data structure known at compile time, but data loaded from disk. Currently, my test program just loads this complex data structure (about 27 megabytes of CSV with embedded JSON on disk) into memory and then calls recursiveSize from GHC.DataSize on it. I get the same error message with both ghc-7.10.3 and ghc-8.0.1.
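For context, the failing program has roughly this shape (a hypothetical sketch, not the actual test case: the file name, the use of lazy ByteString, and the simple line split are stand-ins for the real CSV/JSON parsing):

```haskell
-- Hypothetical sketch of the shape of the failing program; the real
-- test case parses ~27 MB of CSV with embedded JSON into a richer type.
module Main (main) where

import qualified Data.ByteString.Lazy.Char8 as LBS
import GHC.DataSize (recursiveSize)

main :: IO ()
main = do
  -- Lazily load and split the file; at this point the structure is
  -- largely a graph of unevaluated thunks.
  rows <- LBS.lines <$> LBS.readFile "data.csv"
  -- In the real program, walking this closure graph with recursiveSize
  -- aborts with "internal error: allocation of 1048608 bytes too large".
  size <- recursiveSize rows
  print size
```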

If I remove the call to recursiveSize, there is no error.

If I Control.DeepSeq.force the data structures before calling recursiveSize, there is no error.
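For reference, the workaround looks roughly like this (again only a sketch; evaluate and force are the standard base/deepseq APIs, and the IO Word result type assumes the current ghc-datasize signature for recursiveSize):

```haskell
import Control.DeepSeq (NFData, force)
import Control.Exception (evaluate)
import GHC.DataSize (recursiveSize)

-- Fully evaluate the value before measuring it. With no thunks left,
-- recursiveSize completes without tripping the RTS internal error.
sizeOfForced :: NFData a => a -> IO Word
sizeOfForced x = do
  x' <- evaluate (force x)
  recursiveSize x'
```

(If your ghc-datasize version exposes recursiveSizeNF, it does essentially the same thing internally.)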

So, who is to blame? ghc-datasize is just Haskell code that calls public APIs, so I think the blame lies with GHC.

I'll work on getting a small reproducible test case.

Trac metadata
Trac field              Value
Version                 8.0.1
Type                    Bug
TypeOfFailure           OtherFailure
Priority                normal
Resolution              Unresolved
Component               Runtime System
Test case
Differential revisions
BlockedBy
Related
Blocking
CC                      simonmar
Operating system
Architecture