test.sh: double VSize limits to prevent frequent test suite failure

Especially the DiagnosticContext_test seems to hit the
previously set limits regularly, which is somewhat strange.
This commit is contained in:
Fischlurch 2013-10-13 02:50:04 +02:00
parent 08cae2617d
commit 843d75ac2a
2 changed files with 9 additions and 6 deletions


@@ -40,7 +40,7 @@ namespace test{
   namespace { // private test setup...
     /* WARNING: memory hungry */
-    const uint NUM_THREADS = 40;
+    const uint NUM_THREADS = 100;
     const uint MAX_RAND = 100*1000;
     inline bool
@@ -148,9 +148,12 @@ namespace test{
        * take a snapshot of the full ContextStack and then unwind.
        * Thus the captured numbers must form a decreasing sequence
        * of odd values.
-       * @warning this test case causes memory pressure.
-       *          The reason seems to be the frequent re-allocations
-       *          of the vector used to take the snapshots
+       * @warning this test case seems to cause memory pressure.
+       *          When running the test suite with VSize limit 500MB,
+       *          we frequently got aborts even with 40 threads.
+       *          This is surprising, since all of the lists
+       *          generated in the individual threads are
+       *          of size below 20 elements.
        */
       void
       verify_heavilyParallelUsage()


@@ -83,10 +83,10 @@ LOGSUPPRESS='^\(\*\*[0-9]*\*\* \)\?[0-9]\{10,\}[:!] \(TRACE\|INFO\|NOTICE\|WARNI
 #config
 LIMIT_CPU=5
 LIMIT_TIME=10
-LIMIT_VSZ=524288
+LIMIT_VSZ=1048576
 LIMIT_VG_CPU=20
 LIMIT_VG_TIME=30
-LIMIT_VG_VSZ=524288
+LIMIT_VG_VSZ=1048576
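For context, a minimal sketch of how a per-test VSize cap like LIMIT_VSZ is typically enforced in a shell test runner: via the `ulimit -v` builtin, which takes the limit in KiB (so 524288 is 512 MiB, the "500MB" mentioned in the comment, and 1048576 is the doubled 1 GiB). Whether this particular test.sh actually uses `ulimit -v` is an assumption; the snippet only demonstrates the mechanism. Running the cap inside a subshell keeps the parent shell unlimited.

```shell
#!/bin/sh
# Assumed mechanism: cap virtual memory (VSize) for one test invocation.
LIMIT_VSZ=1048576   # doubled limit from this commit, in KiB (= 1 GiB)

(
  ulimit -v "$LIMIT_VSZ"   # cap applies to this subshell and its children only
  ulimit -v                # report the active cap (prints 1048576)
  # ./run_single_test would go here; it aborts on allocation past the cap
)
```

A memory-hungry test then fails with an allocation error instead of exhausting the machine, which is why the limit has to track the tests' real peak usage.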