Commit 9a29b65b authored by Gabor Greif

Suppress duplicate .T files

As per http://stackoverflow.com/questions/7961363/removing-duplicates-in-lists,
use the set() function to remove duplicates from the list of .T files returned by findTFiles().

I am using
$ python3 --version
Python 3.5.1

and, strangely, findTFiles() returns some .T files twice:

-- BEFORE
Found 376 .T files...
...

====> Scanning ../../libraries/array/tests/all.T
====> Scanning ../../libraries/array/tests/all.T
*** framework failure for T2120(duplicate) There are multiple tests with this name
*** framework failure for largeArray(duplicate) There are multiple tests with this name
*** framework failure for array001(duplicate) There are multiple tests with this name
*** framework failure for T9220(duplicate) There are multiple tests with this name
*** framework failure for T229(duplicate) There are multiple tests with this name
...

-- AFTER
Found 365 .T files...
...
====> Scanning ../../libraries/array/tests/all.T
...

Even more strangely, 'find' begs to differ:
$ find libraries testsuite/tests -name "*.T" | sort | uniq | wc -l
368
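
For illustration, here is a minimal sketch (not part of the commit) of what the change does: wrapping the result in set() collapses duplicate entries, at the cost of the original ordering. The findTFiles stub and the paths below are hypothetical stand-ins, not the real testsuite driver code.

    def findTFiles(rootdirs):
        # Stand-in for the real driver function, which walks rootdirs
        # and may yield the same all.T file more than once.
        yield '../../libraries/array/tests/all.T'
        yield '../../libraries/array/tests/all.T'
        yield '../../libraries/base/tests/all.T'

    t_files_list = list(findTFiles(['.']))  # duplicate kept: 3 entries
    t_files_set = set(findTFiles(['.']))    # duplicate dropped: 2 entries

    print('Found', len(t_files_list), '.T files...')  # Found 3 .T files...
    print('Found', len(t_files_set), '.T files...')   # Found 2 .T files...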
parent 574abb71
@@ -255,7 +255,7 @@ print('Timeout is ' + str(config.timeout))
 if config.rootdirs == []:
     config.rootdirs = ['.']
-t_files = list(findTFiles(config.rootdirs))
+t_files = set(findTFiles(config.rootdirs))
 print('Found', len(t_files), '.T files...')