hive / hivemind · Merge requests · !352: Changed methodology for benchmarks
Merged: Dariusz Kędzierski requested to merge dk-benchmarks-ver-2 into develop (4 years ago). Commits: 3, Changes: 5.
Usage of pytest-benchmarks is dropped, since it calls tests sequentially and takes a very long time to complete; the original tavern tests are used instead.
Time measurement is done via the durations switch passed to pytest and tavern.
During benchmarking, validate_response.py is disabled using a newly introduced env variable.
Each run creates a JUnit XML file with timing data.
After all runs, the data from these files are combined and a report file is generated.
Tests above the threshold are marked in red.
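The env-variable gate described above can be sketched as follows. The variable name TAVERN_DISABLE_COMPARATOR comes from the shell script in this MR; the function body is a hypothetical illustration of how validate_response.py might short-circuit, not the actual implementation.

```python
import os

def validate_response(response, expected):
    """Compare an API response against the expected payload.

    Hypothetical sketch: when TAVERN_DISABLE_COMPARATOR is set to "true"
    (as done by ci_start_api_benchmarks.sh), skip the comparison entirely
    so that benchmark runs measure only request/response time.
    """
    if os.environ.get("TAVERN_DISABLE_COMPARATOR", "").lower() == "true":
        return True  # benchmarking mode: accept any response
    return response == expected  # assumed strict comparison for normal runs
```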
5 files changed: +144 −13
scripts/ci_start_api_benchmarks.sh (new file, mode 100755): +20 −0
#!/bin/bash
set -e

pip3 install tox --user

export HIVEMIND_ADDRESS=$1
export HIVEMIND_PORT=$2
export TAVERN_DISABLE_COMPARATOR=true

echo "Attempting to start benchmarks on hivemind instance listening on: $HIVEMIND_ADDRESS port: $HIVEMIND_PORT"

ITERATIONS=$3
for (( i=0; i<$ITERATIONS; i++ ))
do
  echo "About to run iteration $i"
  tox -e tavern-benchmark -- -W ignore::pytest.PytestDeprecationWarning -n auto --junitxml=../../../../benchmarks-$i.xml
  echo "Done!"
done
./scripts/xml_report_parser.py . ./tests/tests_api/hivemind/tavern
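The last line of the script hands the per-iteration benchmarks-$i.xml files to xml_report_parser.py, which is not shown in this diff. A minimal sketch of the same idea, collecting per-test durations from the JUnit XML files and flagging slow tests in red, might look like this (the file pattern, output format, and 1-second threshold are all assumptions):

```python
import glob
import xml.etree.ElementTree as ET
from collections import defaultdict

THRESHOLD = 1.0  # seconds; assumed cut-off for marking a test red

def combine_junit_reports(pattern="benchmarks-*.xml"):
    """Collect per-test durations from every JUnit XML file matching the
    pattern and return {test_name: [time_run0, time_run1, ...]}."""
    times = defaultdict(list)
    for path in sorted(glob.glob(pattern)):
        root = ET.parse(path).getroot()
        # <testcase> elements may sit directly under <testsuite> or be
        # nested in <testsuites>; iter() finds them at any depth.
        for case in root.iter("testcase"):
            name = f"{case.get('classname')}.{case.get('name')}"
            times[name].append(float(case.get("time", 0.0)))
    return times

def render_report(times):
    """Emit one HTML table row per test, coloring slow tests red."""
    rows = []
    for name, values in sorted(times.items()):
        worst = max(values)  # mark by the slowest run across iterations
        color = "red" if worst > THRESHOLD else "black"
        rows.append(
            f'<tr style="color:{color}"><td>{name}</td>'
            f"<td>{worst:.3f}s</td></tr>"
        )
    return "<table>\n" + "\n".join(rows) + "\n</table>"
```

The per-test `time` attribute is part of the standard JUnit XML format that pytest's --junitxml switch emits, which is why no extra instrumentation is needed inside the tests themselves.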