asv_runner.benchmarks.time#

Module Contents#

Classes#

TimeBenchmark

Represents a single benchmark for timing.

Data#

API#

asv_runner.benchmarks.time.wall_timer#

None

class asv_runner.benchmarks.time.TimeBenchmark(name, func, attr_sources)#

Bases: asv_runner.benchmarks._base.Benchmark

Represents a single benchmark for timing.

This class inherits from Benchmark and is specialized for timing benchmarks.

Attributes

name_regex (re.Pattern)

Regular expression that matches the names of timing benchmarks.

rounds (int)

Number of rounds to execute the benchmark.

repeat (int)

Number of times the code will be repeated during each round.

min_run_count (int)

Minimum number of runs required for the benchmark.

number (int)

The number argument passed to timeit.Timer.timeit, specifying how many times the benchmarked statement is executed per sample. If zero, a suitable value is estimated automatically.

sample_time (float)

The target time for each sample.

warmup_time (float)

The time spent warming up the benchmark.

timer (callable)

The timer to use; defaults to timeit.default_timer.
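The attributes above mirror the class-level options a user can set on a timing benchmark in an asv benchmark suite. A minimal sketch, assuming asv's `time_*` naming convention; the class name, method body, and concrete values are illustrative:

```python
import timeit


class TimeRangeSum:
    # Class-level attributes that TimeBenchmark picks up via attr_sources.
    number = 0               # 0 means "determine automatically from sample_time"
    repeat = (1, 10, 20.0)   # (min runs, max runs, max total seconds)
    min_run_count = 2
    warmup_time = 0.1
    timer = timeit.default_timer  # wall-clock timer

    def setup(self):
        # Runs before timing; its cost is not included in the samples.
        self.data = list(range(10_000))

    def time_sum(self):
        # The body being timed; the return value is discarded.
        sum(self.data)
```

Benchmark discovery matches the `time_` prefix against `name_regex`, and the class attributes override the defaults documented above.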

Initialization

Initialize a new instance of TimeBenchmark.

Parameters

name (str)

The name of the benchmark.

func (callable)

The function to benchmark.

attr_sources (list)

A list of objects from which to draw attributes.

name_regex#

None

_load_vars()#

Loads benchmark variables from attribute sources.

do_setup()#

Execute the setup method and load variables.

_get_timer(*param)#

Get a timeit.Timer for the current benchmark.
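The docs only say that `_get_timer` returns a `timeit.Timer` for the current benchmark. A hedged sketch of how such a helper could wrap a parameterized function (the helper name and structure are assumptions, not the actual implementation) relies on the fact that `timeit.Timer` accepts a zero-argument callable as its statement:

```python
import timeit
from functools import partial


def get_timer_sketch(func, *param, timer=timeit.default_timer):
    # Bind the benchmark parameters, then time the resulting
    # zero-argument call with the configured timer.
    bound = partial(func, *param)
    return timeit.Timer(stmt=bound, timer=timer)
```

Passing a callable as `stmt` avoids string `exec` and keeps the benchmark's own namespace intact.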

run(*param)#

Run the benchmark with the given parameters.

Parameters

param (tuple)

The parameters to pass to the benchmark function.

Returns

result (dict)

A dictionary with the benchmark results. It contains the samples taken and the number of times the function was called in each sample.

Notes

The benchmark timing method is designed to adaptively find an optimal number of function executions to time based on the estimated performance. This number is then used for the final timings.

The warmup time is determined based on the Python interpreter in use. PyPy and GraalPython need longer warmup times due to their JIT compilers. For CPython, a short warmup time is used to account for transient effects such as OS scheduling.

The repeat attribute specifies how many times to run the function for timing. It can be an integer, meaning the function is run that many times, or a tuple of three values, specifying the minimum number of runs, the maximum number of runs, and the maximum total time to spend on runs.

After obtaining the timing samples, each sample is divided by the number of function executions to get the average time per function call, and these values are returned as the “samples” in the result.
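The per-call division described above can be illustrated directly with the standard `timeit` module (the statement and counts here are hypothetical, not asv_runner's internal code):

```python
import timeit

number = 1000
t = timeit.Timer(stmt="sum(range(100))")

# Each raw sample is the TOTAL time for `number` executions...
raw_samples = t.repeat(repeat=5, number=number)

# ...so dividing by `number` yields the average time per call,
# which is what run() reports as "samples".
samples = [s / number for s in raw_samples]
```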

benchmark_timing(timer, min_repeat, max_repeat, max_time, warmup_time, number, min_run_count)#

Benchmark the timing of the function execution.

Parameters

timer (timeit.Timer)

The timer to use for the benchmarking.

min_repeat (int)

The minimum number of times to repeat the function execution.

max_repeat (int)

The maximum number of times to repeat the function execution.

max_time (float)

The maximum total time to spend on the benchmarking.

warmup_time (float)

The time spent warming up the benchmark.

number (int)

The number of times the benchmarked statement is executed per timing sample; if zero, a suitable value is estimated automatically.

min_run_count (int)

The minimum number of runs required for the benchmark.

Returns

result (tuple)

A tuple with the samples taken and the number of times the function was called in each sample.

Notes

The too_slow internal function is used to stop taking samples when certain limits are exceeded. These limits are the minimum run count, the minimum repeat count, and the maximum time.

If number is zero, a suitable number of function executions is estimated, and the system is warmed up at the same time.

If the warmup time is greater than zero, a warmup phase is initiated where the function is called repeatedly until the warmup time has passed.

After these initial steps, the function execution times are sampled and appended to the samples list, stopping when the maximum repeat count is reached or when the too_slow check signals that sampling should stop.
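The steps in these notes can be sketched as a simplified sampling loop. The `too_slow` closure, the doubling estimate for `number`, and the threshold constant are illustrative approximations of the behavior described above, not the exact implementation:

```python
import timeit


def benchmark_timing_sketch(timer, min_repeat, max_repeat, max_time,
                            warmup_time, number, min_run_count):
    start = timeit.default_timer()
    samples = []

    def too_slow(run_count):
        # Stop only once the minimum run and repeat counts are
        # satisfied AND the total time budget is exhausted.
        if run_count < min_run_count or len(samples) < min_repeat:
            return False
        return timeit.default_timer() > start + max_time

    if number == 0:
        # Estimate how many calls fit in the target sample time;
        # this doubling search also warms the system up.
        number = 1
        while timer.timeit(number) < 0.01:  # illustrative sample_time
            number *= 2

    # Warmup phase: call repeatedly until warmup_time has passed.
    while timeit.default_timer() < start + warmup_time:
        timer.timeit(number)

    run_count = 0
    while len(samples) < max_repeat:
        samples.append(timer.timeit(number))
        run_count += 1
        if too_slow(run_count):
            break
    return samples, number
```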

asv_runner.benchmarks.time.export_as_benchmark#

None