HowToWritePerfTests

Version 5 (Kirill Kornyakov, 2013-12-05 09:23 am)

h1. How to Write Perf Tests

Let's consider the following example:

<pre>
using namespace std;
using namespace cv;
using namespace perf;

/* 1. Define parameter type and test fixture */
typedef std::tr1::tuple<Size, MatType, MatDepth> Size_MatType_OutMatDepth_t;
typedef perf::TestBaseWithParam<Size_MatType_OutMatDepth_t> Size_MatType_OutMatDepth;

/* 2. Declare the testsuite */
PERF_TEST_P( Size_MatType_OutMatDepth, integral1,
    testing::Combine(
        testing::Values( TYPICAL_MAT_SIZES ),
        testing::Values( CV_8UC1, CV_8UC4 ),
        testing::Values( CV_32S, CV_32F, CV_64F )
    )
)
{
    /* 3. Get actual test parameters */
    Size size   = std::tr1::get<0>(GetParam());
    int matType = std::tr1::get<1>(GetParam());
    int sdepth  = std::tr1::get<2>(GetParam());

    /* 4. Allocate and initialize arguments for tested function */
    Mat src(size, matType);
    Mat sum(size, sdepth);

    /* 5. Manifest your expectations about this test */
    declare.in(src, WARMUP_RNG).out(sum);

    /* 6. Collect the samples! */
    TEST_CYCLE(100) { integral(src, sum, sdepth); }

    /* 7. Add simple regression check */
    SANITY_CHECK(sum);
}
</pre>

h2. Define parameter type and test fixture

Each parameterized test requires a test fixture class in order to accept parameters. We also recommend creating an alias for the parameter type itself (using a typedef).
The recommended naming style for the fixture class is simply an enumeration of the argument types:

<pre>
typedef std::tr1::tuple<cv::Size, perf::MatType, perf::MatDepth> Size_MatType_OutMatDepth_t;
typedef perf::TestBaseWithParam<Size_MatType_OutMatDepth_t> Size_MatType_OutMatDepth;
</pre>

The performance testing framework provides a couple of useful wrappers for constants such as Mat type and Mat depth. Their names are MatType and MatDepth. Use them in your test fixture definitions to get nicely printed parameter values in the final performance reports.

There are also two helper macros to define printable wrappers for any enumeration used as a test parameter - CV_ENUM and CV_FLAGS. The first is intended for simple enumerations or sets of defines; the second should be used when the members of the enumeration are flags (i.e. can be combined with the | operator).

Here is a usage example:

<pre>
CV_ENUM(ReduceOp, CV_REDUCE_SUM, CV_REDUCE_AVG, CV_REDUCE_MAX, CV_REDUCE_MIN)
CV_FLAGS(NormType, NORM_INF, NORM_L1, NORM_L2, NORM_TYPE_MASK, NORM_RELATIVE, NORM_MINMAX)
</pre>
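
The wrapped type can then be used as an ordinary test parameter. Below is a minimal sketch built on the ReduceOp wrapper from the example above; the fixture and suite names are made up for illustration:

<pre>
typedef std::tr1::tuple<cv::Size, ReduceOp> Size_ReduceOp_t;
typedef perf::TestBaseWithParam<Size_ReduceOp_t> Size_ReduceOp;

PERF_TEST_P( Size_ReduceOp, reduce_rows,
    testing::Combine(
        testing::Values( TYPICAL_MAT_SIZES ),
        testing::Values( CV_REDUCE_SUM, CV_REDUCE_AVG )
    )
)
{
    Size size    = std::tr1::get<0>(GetParam());
    int reduceOp = std::tr1::get<1>(GetParam());

    Mat src(size, CV_32FC1);
    Mat dst(1, size.width, CV_32FC1);
    declare.in(src, WARMUP_RNG).out(dst);

    /* reduce each column to a single row using the parameterized operation */
    TEST_CYCLE(100) { reduce(src, dst, 0, reduceOp); }

    SANITY_CHECK(dst);
}
</pre>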

h2. Declare the testsuite

The performance testing framework provides a special macro to declare parameterized performance tests:

<pre>
PERF_TEST_P(fixture, name, params)
</pre>

The declaration of a value-parameterized performance test looks very similar to googletest's TEST_P macro. However, it is different: it actually combines TEST_P and INSTANTIATE_TEST_CASE_P into a single construction.
* It is possible to use one test fixture for several test suites, but each suite needs a unique name (see the sketch below).
* The name argument can be any valid C++ identifier; usually it contains the name of the tested function.
* The params argument can be any list of parameters supported by the googletest library.
TYPICAL_MAT_SIZES, used in our example, is a predefined list of the most typical image sizes for performance tests. It is defined roughly as follows (the actual definition may differ slightly):

<pre>
#define TYPICAL_MAT_SIZES ::perf::szVGA, ::perf::sz720p, ::perf::sz1080p, ::perf::szODD
</pre>
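
For example, the fixture from the introduction can back a second suite with its own name and parameter set. This is only a hypothetical sketch; the suite name integral_sqsum and its parameter values are made up for illustration:

<pre>
PERF_TEST_P( Size_MatType_OutMatDepth, integral_sqsum,
    testing::Combine(
        testing::Values( TYPICAL_MAT_SIZES ),
        testing::Values( CV_8UC1 ),
        testing::Values( CV_32S )
    )
)
{
    Size size   = std::tr1::get<0>(GetParam());
    int matType = std::tr1::get<1>(GetParam());
    int sdepth  = std::tr1::get<2>(GetParam());

    Mat src(size, matType);
    Mat sum(size, sdepth);
    Mat sqsum(size, CV_64F);
    declare.in(src, WARMUP_RNG).out(sum, sqsum);

    /* benchmark the overload that also computes the squared sum */
    TEST_CYCLE(100) { integral(src, sum, sqsum, sdepth); }

    SANITY_CHECK(sum);
    SANITY_CHECK(sqsum);
}
</pre>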

h2. Get actual test parameters

The first block of a well-written test should get all the test parameters. All constants used in the test body should also be defined in this section.

If you need an input file for your test, use the getDataPath method to get the actual location of the file (this function uses the OPENCV_TEST_DATA_PATH environment variable to locate the test data):

<pre>
std::string cascadePath = getDataPath("cv/cascadeandhog/cascades/lbpcascade_frontalface.xml");
</pre>

If you need a temporary output file, use the cv::tempfile() function from the core module to generate a unique temporary path.
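
For example (a minimal sketch; the ".xml" suffix is only illustrative):

<pre>
std::string outPath = cv::tempfile(".xml");
</pre>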

h2. Allocate and initialize arguments for tested function

The second block inside the test is responsible for data initialization. Allocate and initialize all input data here and prepare the structures for the output of the tested function.
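
In the integral example above this block simply allocates the source and destination matrices. When a test needs real image content rather than random data, the input is typically loaded from the test data path instead; a sketch, with an illustrative file name and flags:

<pre>
Mat src = imread(getDataPath("cv/shared/lena.png"), IMREAD_GRAYSCALE);
Mat dst(src.size(), CV_8UC1);
</pre>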

h2. Manifest your expectations about this test

This section is used to inform the performance testing framework about your test. The entry point for all declarations is the declare member inherited from the ::perf::TestBase class.

The following options are available:

Input arguments declaration:

<pre>
 declare.in(arg1 [, arg2 [, arg3 [, arg4]]] [, warmup_type]);
</pre>

Use this declaration to inform the framework about the input arguments of the tested function. It will collect some information about them, and this information will be available for further analysis. The method accepts 1-4 arguments of any type compatible with cv::InputArray; the last optional parameter is used to warm up/initialize the test data before the run. Available values are:
* WARMUP_READ
* WARMUP_WRITE
* WARMUP_RNG
* WARMUP_NONE

Output argument declaration:

<pre>
 declare.out(arg1 [, arg2 [, arg3 [, arg4]]] [, warmup_type]);
</pre>

Use this declaration to inform the framework about the output arguments of the tested function. It will collect some information about them, and this information will be available for further analysis. The method accepts 1-4 arguments of any type compatible with cv::InputArray; the last optional parameter is used to warm up/initialize the test data before the run. The values of this parameter are the same as for the declare.in method.

Desired number of iterations:

<pre>
 declare.iterations(n);
</pre>

Use this declaration to specify how many samples you want to collect. The actual number of collected samples can be less than the number set by this method if the test reaches its time limit.

Time limit:

<pre>
 declare.time(seconds);
</pre>

This declaration can be used to specify a time limit for the whole test run. This is a soft bound, so the working function will not be interrupted even if it exceeds the limit, but sampling will be stopped as soon as the tested function returns. By default this limit is set to 3 seconds on desktop and 6 seconds on Android devices.

And it is possible to chain all these declarations:

<pre>
declare.in(a, b, WARMUP_RNG).in(c)
       .out(dst)
       .time(0.5)
       .iterations(100);
</pre>

h2. Collect the samples!

This is the heart of the test: the main sampling loop.

There are three recommended ways of writing this loop:

The simplest loop iterates until the termination criteria are reached:

<pre>
 SIMPLE_TEST_CYCLE()
 {
     threshold(src, dst, 100.0, 255.0, THRESH_BINARY);
 }
</pre>

A loop with the desired number of iterations:

<pre>
 TEST_CYCLE(100)
 {
     threshold(src, dst, 100.0, 255.0, THRESH_BINARY);
 }
</pre>

This is exactly equivalent to:

<pre>
 declare.iterations(100);
 SIMPLE_TEST_CYCLE()
 {
     threshold(src, dst, 100.0, 255.0, THRESH_BINARY);
 }
</pre>

A custom loop, where you can place additional cleanup or initialization code on every iteration:

<pre>
 while(next())
 {
     res.clear();

     startTimer();
     {//only this block is measured
         cc.detectMultiScale(img, res, 1.1, 3, 0, minSize);
     }
     stopTimer();
 }
</pre>

h2. Add simple regression check

The final touch is adding a simple regression check to your test:

<pre>
SANITY_CHECK(sum [, epsilon]);
</pre>

This command does not do a full regression check, but it provides a fast and simple way to ensure that the values returned by the tested function are adequate.
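
For results that are not expected to be bit-exact (e.g. floating-point output), a tolerance can be passed as the second argument. A minimal sketch (the value 1e-6 is only illustrative):

<pre>
SANITY_CHECK(sum, 1e-6);
</pre>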