Chebyshev
Unit testing for scientific software
benchmark_context Class Reference

Benchmark module context, handling benchmark requests concurrently.
#include <benchmark.h>
Public Member Functions

void setup(std::string moduleName, int argc = 0, const char **argv = nullptr)
    Setup the benchmark environment.

void terminate(bool exit = false)
    Terminate the benchmarking environment.

benchmark_context(std::string moduleName, int argc = 0, const char **argv = nullptr)
    Default constructor setting up the context.

~benchmark_context()
    Terminate the benchmark module.

benchmark_context(const benchmark_context &other)
    Custom copy constructor to avoid copying std::mutex.

benchmark_context &operator=(const benchmark_context &other)
    Custom assignment operator to avoid copying std::mutex.

template<typename InputType = double, typename Function>
void benchmark(const std::string &name, Function func, const std::vector<InputType> &input, unsigned int runs = 0, bool quiet = false)
    Run a benchmark on a generic function, with the given input vector.

template<typename InputType = double, typename Function>
void benchmark(const std::string &name, Function func, benchmark_options<InputType> opt)
    Run a benchmark on a generic function, with the given options.

template<typename InputType = double, typename Function>
void benchmark(const std::string &name, Function func, InputGenerator<InputType> inputGenerator, unsigned int runs = 0, unsigned int iterations = 0, bool quiet = false)
    Run a benchmark on a generic function, with the given input generator.

void wait_results()
    Wait for all concurrent benchmarks to finish execution.

std::vector<benchmark_result> get_benchmark(const std::string &name)
    Get the list of benchmark results associated to the given name or label.

benchmark_result get_benchmark(const std::string &name, unsigned int i)
    Get the benchmark result associated to the given name or label and index.
Public Attributes

benchmark_settings settings
    Settings for the benchmark context.

std::shared_ptr<output::output_context> output
    Output module settings for the context, dynamically allocated and possibly shared between multiple contexts.

std::shared_ptr<random::random_context> random
    Random module settings for the context, dynamically allocated and possibly shared between multiple contexts.
Detailed Description

Benchmark module context, handling benchmark requests concurrently.
void benchmark(const std::string &name, Function func, benchmark_options<InputType> opt)

Run a benchmark on a generic function, with the given options. The result is registered inside benchmarkResults.

Parameters:
    name  The name of the test case.
    func  The function to benchmark.
    opt   The benchmark options.
void benchmark(const std::string &name, Function func, const std::vector<InputType> &input, unsigned int runs = 0, bool quiet = false)

Run a benchmark on a generic function, with the given input vector. The result is registered inside benchmarkResults.

Parameters:
    name   The name of the test case.
    func   The function to benchmark.
    input  The vector of input values (InputType must correspond to the argument type of func, but may be any POD or aggregate data type, such as std::tuple).
    runs   The number of runs with the same input (defaults to settings.defaultRuns).
void benchmark(const std::string &name, Function func, InputGenerator<InputType> inputGenerator, unsigned int runs = 0, unsigned int iterations = 0, bool quiet = false)

Run a benchmark on a generic function, with the given input generator. The result is registered inside benchmarkResults.

Parameters:
    name            The name of the test case.
    func            The function to benchmark.
    runs            The number of runs with the same input.
    iterations      The number of iterations of the function.
    inputGenerator  The input generator to use.
std::vector<benchmark_result> get_benchmark(const std::string &name)

Get the list of benchmark results associated to the given name or label.
benchmark_result get_benchmark(const std::string &name, unsigned int i)

Get the benchmark result associated to the given name or label and index.
void setup(std::string moduleName, int argc = 0, const char **argv = nullptr)

Setup the benchmark environment.

Parameters:
    moduleName  Name of the module under test.
    argc        The number of command line arguments.
    argv        A list of C-style strings containing the command line arguments.
void terminate(bool exit = false)

Terminate the benchmarking environment. If benchmarks have been run, their results will be printed.

Parameters:
    exit  Whether to exit after terminating the module.