IO performance regression in 7.2 vs 7.0
The following program:
main = interact id
runs about a third slower in 7.2.1 compared to 7.0.3 (and current 7.3 is about the same as 7.2.1).
$ ls -l stuff
-rw-rw-r-- 1 simonmar GHC 128464904 2011-10-07 14:59 stuff
$ ./cat <stuff >/dev/null +RTS -s
   8,357,007,912 bytes allocated in the heap
     793,616,736 bytes copied during GC
          83,984 bytes maximum residency (1 sample(s))
          24,936 bytes maximum slop
               1 MB total memory in use (0 MB lost due to fragmentation)

                                    Tot time (elapsed)  Avg pause  Max pause
  Gen  0     15966 colls,     0 par    0.94s    0.94s     0.0001s    0.0001s
  Gen  1         1 colls,     0 par    0.00s    0.00s     0.0004s    0.0004s

  INIT    time    0.00s  (  0.00s elapsed)
  MUT     time    5.71s  (  5.71s elapsed)
  GC      time    0.94s  (  0.94s elapsed)
  EXIT    time    0.00s  (  0.00s elapsed)
  Total   time    6.65s  (  6.65s elapsed)
and with 7.0.3:
$ ./cat <stuff >/dev/null +RTS -s
./cat +RTS -s
   5,299,570,680 bytes allocated in the heap
     250,350,336 bytes copied during GC
          53,232 bytes maximum residency (1 sample(s))
          30,776 bytes maximum slop
               1 MB total memory in use (0 MB lost due to fragmentation)

  Generation 0: 10086 collections,     0 parallel,  0.50s,  0.50s elapsed
  Generation 1:     1 collections,     0 parallel,  0.00s,  0.00s elapsed

  INIT  time    0.00s  (  0.00s elapsed)
  MUT   time    4.38s  (  4.38s elapsed)
  GC    time    0.50s  (  0.50s elapsed)
  EXIT  time    0.00s  (  0.00s elapsed)
  Total time    4.88s  (  4.88s elapsed)
I strongly suspect this has to do with the changes to the encoding machinery in 7.2.1.
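One way to probe that hypothesis (a hypothetical experiment, not something done in this ticket) is to pin both handles to a fixed single-byte encoding, taking the locale's usual UTF-8 decoder out of the picture, and compare timings against the plain `interact id` version:

```haskell
import System.IO

-- Hypothetical variant of the benchmark: force latin1 on both handles
-- so any cost specific to the locale (UTF-8) encoding path disappears.
-- If this closes the gap with 7.0.3, the encoding machinery is implicated.
main :: IO ()
main = do
  hSetEncoding stdin  latin1
  hSetEncoding stdout latin1
  interact id
```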
In binary mode, the slowdown is even worse. The program:
import System.IO
main = do
  hSetBinaryMode stdin True
  hSetBinaryMode stdout True
  getContents >>= putStr
With 7.0.3 this runs in 3.35s; with 7.2.1 it takes 5.49s, a slowdown of about 64%.
After fixing this, we need to add a performance regression test.
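One shape such a measurement could take (a rough sketch only; GHC's testsuite has its own performance-test framework and would not use ad-hoc timing like this; `timeIt` is a hypothetical helper):

```haskell
import Control.Exception (evaluate)
import System.CPUTime

-- Time an IO action in seconds.  getCPUTime reports picoseconds.
timeIt :: IO a -> IO (a, Double)
timeIt act = do
  t0 <- getCPUTime
  r  <- act
  t1 <- getCPUTime
  return (r, fromIntegral (t1 - t0) / 1e12)

main :: IO ()
main = do
  -- Stand-in workload; a real regression test would push a large file
  -- through `interact id` and compare against a recorded baseline.
  (n, secs) <- timeIt (evaluate (length (replicate 1000000 'x')))
  print n
  print (secs >= 0)
```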