Iperf test definition comparison

This page compares the test definitions from Fuego and Linaro for the iperf test.

Differences

  • Fuego only runs ...
  • Linaro runs ...

High Level Assumptions

  • Fuego does not disturb the system
    • if something is installed, it is removed by default
    • if something is started, it is stopped
  • Fuego assumes you can run another test upon completion of one test
  • Linaro assumes a clean install that will be replaced on the next test
    • things can be modified (packages installed, then forgotten about)
  • Fuego treats the system like a final product, which is immutable
  • Linaro treats the system like a development system, which is mutable

Preparation

Building

  • Fuego cross-builds the test software
  • Linaro does not build the software

Pre-requisites

  • Fuego checks for cross-compiler variables (see the sketch after this list)
  • Linaro checks for the root account
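
As an illustration, here is a minimal sketch in shell of what these checks can look like. The Fuego function name (test_pre_check) and the Linaro check_root/error_msg helpers come from the field comparison table below; the variable name being checked and the helper arguments are assumptions for this example, not code taken from either test.

# Fuego side (fuego_test.sh, sketch): fail early if the cross-compile
# environment is not set up; assert_define aborts if the variable is undefined
function test_pre_check {
    assert_define CC
}

# Linaro side (inline in the test script, sketch): abort unless running as root,
# using the check_root/error_msg helpers referenced in the field comparison below
! check_root && error_msg "This test must be run as root"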

Alterations

  • Linaro can install packages required by the test on the board (see the sketch after this list)
  • Fuego deploys the cross-built test software to the board
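
The Fuego side of this is the test_deploy function in the fuego_test.sh source below (a single put of the cross-built binary). For contrast, here is a hedged sketch of the Linaro-style alteration, using the install_deps helper asked about in the questions section; the helper's arguments and the package name are assumptions for illustration.

# Linaro-style alteration (sketch): install the packages the test needs directly
# on the board; SKIP_INSTALL is assumed to let the user opt out of the alteration
pkgs="iperf"
install_deps "${pkgs}" "${SKIP_INSTALL}"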

Execution

  • Linaro runs a separate test invocation for each test case
  • Fuego runs all of the test cases in a single invocation
  • Factorization of the test is different (see the sketch after this list)
    • dependency checks, alterations, test execution, and parsing are done on the board for Linaro
    • dependency checks, test execution, and parsing are done on the host for Fuego
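
To make the factorization concrete, here is a short sketch of where the pieces run. The Fuego lines are adapted from the fuego_test.sh source below and execute on the host, driving the board through Fuego's cmd/report helpers; the Linaro lines are an assumption of a board-side run step that records its result locally with lava-test-case (the test-case name, variable, and log file are illustrative).

# Fuego (runs on the host): cmd and report push work to the board and pull the log back
cmd "killall -SIGKILL iperf 2>/dev/null; exit 0"
report "cd $BOARD_TESTDIR/fuego.$TESTDIR; ./iperf -c $srv -t 15" \
    $BOARD_TESTDIR/fuego.$TESTDIR/${TESTDIR}.log

# Linaro (runs on the board, sketch): the test script itself executes the program
# and reports the outcome in place
./iperf -c "${SERVER}" -t 15 > iperf.log
lava-test-case iperf-tcp-tx --result pass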

Parsing

  • Linaro parses the output of each test case on the target using awk (see the sketch after this list)
  • Fuego parses the combined output on the host using Python (parser.py)
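
For the Linaro side, here is a one-line sketch of on-target parsing with awk, pulling the bandwidth figure out of an iperf result line like the ones quoted in parser.py's comments below; the log file name and the field position are assumptions about the output format.

# On-target parsing (sketch): extract the Mbits/sec value from lines such as
#   [  3]  0.0-15.0 sec   117 MBytes  65.4 Mbits/sec
grep "Mbits/sec" iperf.log | awk '{print $(NF-1)}'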

Results

  • output is different

Presentation

  • Linaro does not include presentation control for the test results in the test definition; Fuego does, via chart_config.json (shown below)

Metadata

  • Fuego specifies the author, license, and gitrepo for the test program
  • Linaro specifies the devices for the test to run on
  • Linaro specifies distros where test can run

Questions

  • Linaro install_deps: does this also install the package itself (with the iperf binary)?
  • Linaro: what does send-to-lava.sh do?


Field comparisons

Field items
Fuego item | Fuego use | Linaro item | Linaro use | Notes
fuego_test.sh:test_pre_check | check required test pre-requisites (none in this test) | sysbench.sh: ! check_root && error_msg | check required test pre-requisites | Linaro code is inline in the test script
fuego_test.sh:test_build | cross-build the test program from tar | sysbench.sh:install_sysbench | download and build the test program | -
- | - | sysbench.sh:install_sysbench | install required packages for build | Linaro has different build and dependency info per distro; Fuego has no notion of installing auxiliary packages on the board
fuego_test.sh:test_deploy | put test program on the board | sysbench.sh:install_sysbench | install test program on board (locally) | -
fuego_test.sh:test_run | instructions to execute the test program on the board | sysbench.yaml:run:steps | instructions to execute the test program on the board | -
parser.py | code to parse the test program log | sysbench.sh:general_parser and awk lines | code to parse the test program log | Linaro parsing is done on the board
spec.json | indicates values for test variables (none for this test) | sysbench.sh:NUM_THREADS= | indicates values for test variables | Linaro options are read on the command line of the test script
test.yaml:fuego_package | indicates type/format of test | sysbench.yaml:metadata:format | indicates type/format of test | -
test.yaml:name | name of test | sysbench.yaml:metadata:name | name of test | similar
test.yaml:description | description of test | sysbench.yaml:metadata:description | description of test | similar
test.yaml:license/author/version | test program information | - | - | informational data
test.yaml:maintainer | maintainer of this Fuego test | sysbench.yaml:metadata:maintainer | maintainer of this Linaro test | similar
test.yaml:fuego_release | Fuego revision of this test | - | - | -
test.yaml:type | type of test | sysbench.yaml:metadata:scope | type of test? | -
- | - | sysbench.yaml:metadata:os | OSes that this test can run on | Linaro only?
- | - | sysbench.yaml:metadata:devices | devices that this test can run on | Linaro only? (Fuego board selection is done by the user when creating jobs for boards?)
test.yaml:tags | tags for this test | - | - | Fuego only?
test.yaml:params | test variable names, values, options (none in this test) | sysbench.yaml:params | test variable names and values | similar
test.yaml:gitrepo | upstream git repository for the test program | - | - | Fuego only?
test.yaml:data_files | manifest used for packaging the test | - | - | Fuego only?

Fuego source

fuego_test.sh

tarball=iperf-2.0.5.tar.gz

function test_build {
    # get updated config.sub and config.guess files, so configure
    # doesn't reject new toolchains
    cp /usr/share/misc/config.{sub,guess} .
    ./configure --host=$HOST --build=`./config.guess`
    sed -i -e "s|#define bool int|//#define bool int|g" config.h
    make config.h
    sed -i -e "s/#define HAVE_MALLOC 0/#define HAVE_MALLOC 1/g" -e "s/#define malloc rpl_malloc/\/\* #undef malloc \*\//g" config.h
    sed -i -e '/HEADERS\(\)/ a\#include "gnu_getopt.h"' src/Settings.cpp
    make
}

function test_deploy {
	put src/iperf  $BOARD_TESTDIR/fuego.$TESTDIR/
}

function test_run {
	cmd "killall -SIGKILL iperf 2>/dev/null; exit 0"

	# Start iperf server on Jenkins host
	iperf_exec=`which iperf`

	if [ -z $iperf_exec ];
	then 
	 echo "ERROR: Cannot find iperf"
	 false
	else
	 $iperf_exec -s &
	fi

	assert_define BENCHMARK_IPERF_SRV

	if [ "$BENCHMARK_IPERF_SRV" = "default" ]; then
	  srv=$SRV_IP
	else
	  srv=$BENCHMARK_IPERF_SRV
	fi
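	# Run a one-way (client-to-server) test, then a bidirectional (-d) test;
	# both outputs go into a single log that parser.py reads on the host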

	report "cd $BOARD_TESTDIR/fuego.$TESTDIR; ./iperf -c $srv -t 15; ./iperf -c $srv -d -t 15" $BOARD_TESTDIR/fuego.$TESTDIR/${TESTDIR}.log
}

function test_cleanup {
	kill_procs iperf
}

parser.py

#!/usr/bin/python

import os, re, sys
import common as plib

#------------------------------------------------------------
#Client connecting to 10.90.101.49, TCP port 5001
#TCP window size: 16.0 KByte (default)
#------------------------------------------------------------
#[  3] local 10.90.100.60 port 38868 connected with 10.90.101.49 port 5001
#[ ID] Interval       Transfer     Bandwidth
#[  3]  0.0-15.0 sec   117 MBytes  65.4 Mbits/sec
#------------------------------------------------------------
#Server listening on TCP port 5001
#TCP window size: 85.3 KByte (default)
#------------------------------------------------------------
#------------------------------------------------------------
#Client connecting to 10.90.101.49, TCP port 5001
#TCP window size: 21.1 KByte (default)
#------------------------------------------------------------
#[  5] local 10.90.100.60 port 38869 connected with 10.90.101.49 port 5001
#[  4] local 10.90.100.60 port 5001 connected with 10.90.101.49 port 40772
#[ ID] Interval       Transfer     Bandwidth
#[  5]  0.0-15.0 sec  99.9 MBytes  55.7 Mbits/sec
#[  4]  0.0-15.2 sec  50.8 MBytes  28.0 Mbits/sec

# The following was also possible in the past for tx test:
#[  3]  0.0- 3.7 sec  9743717424271204 bits  0.00 (null)s/sec

ref_section_pat = "^\[[\w\d_ ./]+.[gle]{2}\]"
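# The pattern below spans the combined log: the first capture is the one-way (tx)
# bandwidth from the first client run; the next two are the bandwidths reported by
# the bidirectional (-d) run, stored as tcp.bi_tx and tcp.bi_rx below.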
cur_search_pat = re.compile("^.* ([\d.]+) Mbits/sec\n.*\n.*\n.*\n.*\n.*\n.*\n.*\n.*\n.*\n.*\n.*\n.* ([\d.]+) Mbits/sec\n.* ([\d.]+) Mbits/sec", re.MULTILINE)

cur_dict = {}
pat_result = plib.parse(cur_search_pat)
if pat_result:
        for item in pat_result:
                #print item
                cur_dict["tcp.tx"] = item[0]
                cur_dict["tcp.bi_tx"] = item[1]
                cur_dict["tcp.bi_rx"] = item[2]

if "tcp.tx" in cur_dict:
        sys.exit(plib.process_data(ref_section_pat, cur_dict, 's', 'Rate, MB/s'))
else:
        print "Fuego error reason: could not parse measured bandwidth"

spec.json

{
    "testName": "Benchmark.iperf",
    "specs": {
        "default": {
            "SRV":"default"
        }
    }
}

chart_config.json

{
    "iperf":["tcp"]
}

test.yaml

None provided.

Linaro source

iperf.sh

None provided.

iperf.yaml

None provided.