NOTE: Starting from version 0.3.0 of the library:
- The library runtime is published to Maven Central and is no longer published to Bintray.
- The Gradle plugin is published to the Gradle Plugin Portal.
- The Gradle plugin id has changed to `org.jetbrains.kotlinx.benchmark`.
- The library runtime artifact id has changed to `kotlinx-benchmark-runtime`.

NOTE: When Kotlin 1.5.0 or newer is used, make sure the `kotlin-gradle-plugin` is pulled from Maven Central, not from the Gradle Plugin Portal. For more information see https://github.com/Kotlin/kotlinx-benchmark/issues/42
kotlinx.benchmark is a toolkit for running benchmarks for multiplatform code written in Kotlin, targeting the following supported platforms: JVM and JavaScript.

Technically it can also run on the Native target, but the current implementation does not produce correct measurements in many cases for native benchmarks, so using this library for native benchmarks is not recommended yet. See the issue for more information.

If you're familiar with JMH, this toolkit is very similar and uses JMH under the hood to run benchmarks on the JVM.
Requirements

- Gradle 6.8 or newer
- Kotlin 1.4.30 or newer
Gradle plugin
Use the plugin in your `build.gradle` file:

```groovy
plugins {
    id 'org.jetbrains.kotlinx.benchmark' version '0.3.1'
}
```
For Kotlin/JS, specify building the `nodejs` flavour:

```groovy
kotlin {
    js {
        nodejs()
        …
    }
}
```
For Kotlin/JVM code, add the `allopen` plugin to make JMH happy. Alternatively, make all benchmark classes and methods `open`.

For example, if you annotated each of your benchmark classes with `@State(Scope.Benchmark)`:

```kotlin
@State(Scope.Benchmark)
class Benchmark {
    …
}
```
and added the following code to your `build.gradle`:

```groovy
plugins {
    id 'org.jetbrains.kotlin.plugin.allopen'
}

allOpen {
    annotation("org.openjdk.jmh.annotations.State")
}
```

then you don't have to make benchmark classes and methods `open`.
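Putting the pieces together, here is a minimal sketch of what a complete benchmark class can look like. It assumes a multiplatform setup where the annotations come from the `kotlinx.benchmark` package of the runtime described below (in a JVM-only project the same annotations come from `org.openjdk.jmh.annotations`); the class and property names are purely illustrative.

```kotlin
import kotlinx.benchmark.*

@State(Scope.Benchmark)
class SumBenchmark {
    private var numbers = listOf<Int>()

    @Setup
    fun prepare() {
        // Prepare the input once, outside of the measured code.
        numbers = List(1_000) { it }
    }

    @Benchmark
    fun sumNumbers(): Int {
        // Return the result so the computation cannot be eliminated as dead code.
        return numbers.sum()
    }
}
```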
Runtime Library
You need the runtime library, which provides the annotations and the code that runs benchmarks.

Enable Maven Central for dependency lookup:

```groovy
repositories {
    mavenCentral()
}
```
Add the runtime to the dependencies of the platform source set, e.g.:

```groovy
kotlin {
    sourceSets {
        commonMain {
            dependencies {
                implementation("org.jetbrains.kotlinx:kotlinx-benchmark-runtime:0.3.1")
            }
        }
    }
}
```
Configuration
In the `build.gradle` file, create a `benchmark` section, and inside it add a `targets` section. In this section, register all targets you want to run benchmarks from. Example for a multiplatform project:

```groovy
benchmark {
    targets {
        register("jvm")
        register("js")
        register("native")
    }
}
```
This package can also be used for Java and Kotlin/JVM projects. Register a Java source set as a target:

```groovy
benchmark {
    targets {
        register("main")
    }
}
```
To configure benchmarks and create multiple profiles, create a `configurations` section in the `benchmark` block and place options inside. The toolkit creates the `main` configuration by default, and you can create as many additional configurations as you need.

```groovy
benchmark {
    configurations {
        main {
            // configure default configuration
        }
        smoke {
            // create and configure "smoke" configuration, e.g. with several fast benchmarks to quickly check
            // if code changes result in something very wrong, or very right.
        }
    }
}
```
Available configuration options:

- `iterations` – number of measuring iterations
- `warmups` – number of warm-up iterations
- `iterationTime` – time to run each iteration (measuring and warm-up)
- `iterationTimeUnit` – time unit for `iterationTime` (default is seconds)
- `outputTimeUnit` – time unit for results output
- `mode` – "thrpt" for measuring operations per time, or "avgt" for measuring time per operation
- `include("…")` – regular expression to include benchmarks with fully qualified names matching it, as a substring
- `exclude("…")` – regular expression to exclude benchmarks with fully qualified names matching it, as a substring
- `param("name", "value1", "value2")` – specify a parameter for a public mutable property `name` annotated with `@Param` (see the sketch below)
- `reportFormat` – format of the report, can be `json` (default), `csv`, `scsv` or `text`

Time units can be NANOSECONDS, MICROSECONDS, MILLISECONDS, SECONDS, MINUTES, or their short variants such as "ms" or "ns".
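As an illustration of the `param` option, here is a minimal sketch of the kind of property it targets. It assumes the annotations come from the `kotlinx.benchmark` package and uses an invented `size` parameter; a configuration would then supply values with `param("size", "100", "1000")`.

```kotlin
import kotlinx.benchmark.*

@State(Scope.Benchmark)
class ParamBenchmark {
    // Public mutable property matched by param("size", ...) in a configuration.
    @Param("10", "100")
    var size: Int = 0

    private var data = listOf<Int>()

    @Setup
    fun prepare() {
        data = List(size) { it }
    }

    @Benchmark
    fun sum(): Int = data.sum()
}
```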
Example:

```groovy
benchmark {
    // Create configurations
    configurations {
        main { // main configuration is created automatically, but you can change its defaults
            warmups = 20 // number of warmup iterations
            iterations = 10 // number of iterations
            iterationTime = 3 // time in seconds per iteration
        }
        smoke {
            warmups = 5 // number of warmup iterations
            iterations = 3 // number of iterations
            iterationTime = 500 // time per iteration, in the unit specified below
            iterationTimeUnit = "ms" // time unit for iterationTime, default is seconds
        }
    }

    // Setup targets
    targets {
        // This one matches the compilation base name, e.g. 'jvm', 'jvmTest', etc.
        register("jvm") {
            jmhVersion = "1.21" // available only for JVM compilations & Java source sets
        }
        register("js") {
            // Note that benchmark.js uses a different approach with minTime & maxTime and runs benchmarks
            // until results are stable. We estimate minTime as iterationTime and maxTime as iterationTime*iterations.
        }
        register("native")
    }
}
```
Separate source sets for benchmarks
Often you want to have benchmarks in the same project but separated from the main code, much like tests. Here is how:
Define a source set:

```groovy
sourceSets {
    benchmarks
}
```
Propagate dependencies and output from the `main` source set:

```groovy
dependencies {
    benchmarksCompile sourceSets.main.output + sourceSets.main.runtimeClasspath
}
```
You can also add output and compileClasspath from `sourceSets.test` in the same way if you want to reuse some of the test infrastructure.
Register the `benchmarks` source set:

```groovy
benchmark {
    targets {
        register("benchmarks")
    }
}
```
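With the default Gradle layout, benchmark code for this source set would typically live under `src/benchmarks/kotlin`. The file path and class below are illustrative only; they use the JMH annotations since this setup targets a JVM source set, and the explicit `open` modifiers are there as discussed in the plugin section above (not needed if you configured `allopen`).

```kotlin
// src/benchmarks/kotlin/ListBenchmark.kt
import org.openjdk.jmh.annotations.*

@State(Scope.Benchmark)
open class ListBenchmark {
    private val data = List(1_000) { it }

    @Benchmark
    open fun filterEven(): List<Int> = data.filter { it % 2 == 0 }
}
```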
Examples
The project contains an examples subproject that demonstrates using the library.