Commit 0c114c65 authored by Sylvain Henry, committed by Marge Bot

Handle large ARR_WORDS in heap census (fix #17572)

A heap census can be performed with a non-profiling RTS. With a non-profiling
RTS we don't zero the superfluous bytes of shrunk arrays, so we need to
handle this case specifically to avoid a crash.

Revert part of a586b33f
parent fad866e0
Pipeline #13867 failed with stages in 537 minutes and 58 seconds
@@ -1011,6 +1011,22 @@ heapCensusChain( Census *census, bdescr *bd )
    p = bd->start;

    // When we shrink a large ARR_WORDS, we do not adjust the free pointer
    // of the associated block descriptor, thus introducing slop at the end
    // of the object. This slop remains after GC, violating the assumption
    // of the loop below that all slop has been eliminated (#11627).
    // The slop isn't always zeroed (e.g. in non-profiling mode).
    // Consequently, we handle large ARR_WORDS objects as a special case.
    if (bd->flags & BF_LARGE
        && get_itbl((StgClosure *)p)->type == ARR_WORDS) {
        size = arr_words_sizeW((StgArrBytes *)p);
        prim = true;
        heapProfObject(census, (StgClosure *)p, size, prim);
        continue;
    }

    while (p < bd->free) {
        info = get_itbl((const StgClosure *)p);
        prim = false;
{-# LANGUAGE UnboxedTuples, MagicHash, BlockArguments #-}

import GHC.Exts
import GHC.Types

doSomething :: Word -> IO Word
doSomething (W# x) = IO \s ->
   case newByteArray# 7096# s of -- we need a large ByteArray#
      (# s, mba #) -> case shrinkMutableByteArray# mba 7020# s of -- shrunk
         s -> case unsafeFreezeByteArray# mba s of
            (# s, ba #) -> (# s, W# (indexWordArray# ba 18#) #)

main :: IO ()
main = do
   xs <- mapM doSomething [0..300000] -- we need enough elements (to trigger a GC maybe?)
   print (length xs)
@@ -149,3 +149,5 @@ test('T15897',
makefile_test, ['T15897'])
test('T17572', [], compile_and_run, [''])