[vsipl++] [patch] FIR Filter bank benchmark

Jules Bergmann jules at codesourcery.com
Mon Apr 3 15:18:03 UTC 2006


Don McCoy wrote:
> Jules Bergmann wrote:

>>
>> Instead of declaring 'test' to be a local matrix, can you instead 
>> declare a global results matrix, use the local view of that matrix 
>> here, and then check the local portion below?
>>
> 
> 
> It seemed easiest to pass around yet another matrix and keep all the 
> declarations in one place.  That satisfies the above, but I'm not sure 
> that is what you meant.

That sounds fine.




> 
> On that note, however, I found that I can construct data sets that will 
>  fail the data comparison.  I.e. when the output data set contains 
> zeroes and the fast convolution algorithm is used, the 'view_equal' 
> check will fail.  The reason for this is that the method we use for
> comparing values looks at the relative error of, say, a and b, as
> (a-b)/a.  This works well for most small values, but it evaluates to 1
> when b == 0 (and the relative error check returns false for anything
> over 1e-4).
> 
> The example set that causes this error results from leaving the imaginary 
> portion of the inputs set to zero and setting the filter coefficients to 
> all ones.  Doing this results in outputs where the imaginary portion is 
> exactly zero, but the fast conv algorithm will produce numbers on the 
> order of 1e-6, which are only a few bits off of zero (for floats) but 
> will fail the current comparison check.
> 
> In any case, this is a separate topic and I don't think it affects 
> whether or not we check this in at this time.  Sound ok?

That is fine.  In the future, we may want to use 'error_db' to compare 
the results.
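
The idea there is to measure the difference in dB relative to the peak
of the reference data, so an output element that should be exactly zero
but comes back on the order of 1e-6 no longer looks like a relative
error of 1.  A rough sketch of that kind of check (the name and
signature here are illustrative, not the actual 'error_db' helper,
which works on whole views):

#include <cmath>
#include <cstddef>

// Sketch only: error between a reference and a computed result, in dB,
// measured against the peak magnitude of the reference rather than
// element-by-element.
inline double
error_db(float const* ref, float const* chk, std::size_t n)
{
  double refmax = 0.0;  // peak squared magnitude of the reference
  double sumsq  = 0.0;  // accumulated squared difference
  for (std::size_t i = 0; i < n; ++i)
  {
    double r = ref[i];
    double c = chk[i];
    if (r * r > refmax) refmax = r * r;
    sumsq += (r - c) * (r - c);
  }
  if (sumsq == 0.0)  return -201.0;  // exact match: report a floor value
  if (refmax == 0.0) refmax = 1.0;   // all-zero reference: absolute error
  return 10.0 * std::log10(sumsq / (refmax * n));
}

// e.g. accept a run when error_db(expected, actual, size) < -100.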

> 
> 
> One other item is worth bringing up though.  There was an error in the 
> previous patch for the not-from-file case where the output (now 
> 'expected') vectors were not seeded with the correct answers.  I believe 
> I let this one in by failing to compile and test in debug mode.  Given 
> that assert() does nothing in optimized builds, the data checks will not 
> be run in the configuration the benchmark will normally use.
> 
> Perhaps it would be nice to insert something that prints a warning if 
> the call to view_equal() fails, then passes the result to assert(), 
> which will halt execution in the debug case (handy) but allow it to 
> continue in the optimized case (ok since a warning was printed?).  Better 
> warnings could be printed from within the view_equal() function, e.g. 
> ones that show the location of the error and the values that failed 
> the comparison.
> 
> In addition, perhaps we always want to check the data.  It is done 
> outside the timing loop, so it doesn't affect the timing results, but it 
> does slow the overall execution.  Suggestions?

Is 'test_assert()' officially available for the benchmarks?  It is not 
disabled by -DNDEBUG.  I started using it in the benchmarks instead of 
the regular 'assert()' for that reason.
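
For illustration, a test_assert-style check boils down to something
like the following (just a sketch; the harness's actual definition may
differ), which is why it still fires in optimized builds:

#include <cstdio>
#include <cstdlib>

// Sketch of an assert-style macro that is not compiled out by -DNDEBUG,
// so data checks still run in optimized benchmark builds.
#define test_assert(expr)                                       \
  do                                                            \
  {                                                             \
    if (!(expr))                                                \
    {                                                           \
      std::fprintf(stderr, "TEST ASSERT FAILED: %s (%s:%d)\n",  \
                   #expr, __FILE__, __LINE__);                  \
      std::abort();                                             \
    }                                                           \
  } while (0)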

For view_equal, we could add a 'verbose' flag that would cause an error 
message to be printed if a miscompare is detected.  view_equal() would 
still return true/false, so it could be used in an assert() or 
test_assert().
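
Something along these lines, as a rough sketch (the names and the
element test are illustrative, not the existing helper, and the
element check is shown for real floats only):

#include <cmath>
#include <cstddef>
#include <iostream>

// Element check based on the relative-error test described above
// (anything over 1e-4 is rejected).
inline bool
almost_equal(float a, float b)
{
  if (a == b) return true;
  return std::abs((a - b) / a) < 1e-4f;
}

// Compare two same-sized matrix views; when 'verbose' is set, report
// the location and values of the first miscompare.  The bool return
// value still allows wrapping the call in assert() or test_assert().
template <typename View1, typename View2>
bool
view_equal(View1 const& a, View2 const& b, bool verbose = false)
{
  for (std::size_t i = 0; i < a.size(0); ++i)
    for (std::size_t j = 0; j < a.size(1); ++j)
      if (!almost_equal(a.get(i, j), b.get(i, j)))
      {
        if (verbose)
          std::cerr << "view_equal: miscompare at (" << i << ", " << j
                    << "): " << a.get(i, j) << " != " << b.get(i, j)
                    << std::endl;
        return false;
      }
  return true;
}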



> 
> 
> Ok to check in with these changes?
> 

Yes, these look good.  Please check them in.

			thanks,
			-- Jules


-- 
Jules Bergmann
CodeSourcery
jules at codesourcery.com
(650) 331-3385 x705


