Figure out why bytes allocated is so sensitive on 32-bit platforms
The bytes allocated metric seems to be much more variable on 32-bit platforms than on 64-bit platforms (see, e.g., !3309). This is particularly true of the haddock.* tests. Why is this?
Some possible causes include:
- Perhaps we are seeing overflow in a 32-bit accumulator or difference computation somewhere in the accounting codepath? (See the sketch after this list.)
- The fact that pointers are larger on 64-bit platforms means that objects are larger; that being said, we track relative metric changes, so this should in principle not matter.
- On 64-bit platforms we use the two-step allocator; however, it's hard to imagine how this would affect bytes allocated. Moreover, if this were the cause, we would also expect to see these fluctuations on Windows, where we didn't use two-step reservation until recently.
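
To illustrate the first hypothesis, here is a minimal sketch of what a 32-bit wrap-around would look like, assuming (hypothetically) that some step of the accounting truncated the count to 32 bits; the allocation figures are invented for illustration:

```haskell
import Data.Word (Word32, Word64)

-- If the true byte count were accumulated (or a difference taken) in a
-- 32-bit word, the reported figure would be the true total modulo 2^32.
truncateTo32 :: Word64 -> Word32
truncateTo32 = fromIntegral

main :: IO ()
main = do
  let runA = 25769000000 :: Word64  -- just below a multiple of 2^32
      runB = 25770000000 :: Word64  -- ~0.004% more, just above it
  -- The true relative change is tiny, but the truncated figures differ
  -- wildly because runB has wrapped around one more time than runA.
  print (truncateTo32 runA, truncateTo32 runB)
```

If something like this were happening, we would expect the variability to show up mainly in tests whose total allocation is large relative to 2^32, which would be consistent with haddock.* being the worst offenders.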