
labgrid survey response

labgrid survey response provided by Jan Lübbe

diagram - missing element

Regarding the diagram, one aspect of labgrid that is not represented there is interactive access to boards in the lab during development (build a system image locally, log in and debug, run a test suite from the command line).
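
For illustration, a minimal sketch of that interactive access via the Python API (the config file name, target name and command are assumptions; the calls follow the labgrid documentation):

 # Sketch: interactive board access from a development machine.
 from labgrid import Environment

 env = Environment("local.yaml")            # yaml environment config
 target = env.get_target("main")            # the DUT defined in the config
 shell = target.get_driver("ShellDriver")   # brings up the console and logs in
 stdout, stderr, returncode = shell.run("uname -a")
 print("\n".join(stdout))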

Survey Questions

  • What is the name of your test framework? labgrid

Which of the aspects below of the CI loop does your test framework perform?

Does your test framework:

source code access

  • access source code repositories for the software under test? No. It normally tests complete system images built by a build system like OE, ptxdist or buildroot.
  • access source code repositories for the test software? Provided by the user, usually Jenkins.
  • include the source for the test software? No. labgrid installs no software on the DUT; the test software needs to be provided in the image.
  • provide interfaces for developers to perform code reviews? No.
  • detect that the software under test has a new version? No. Handled by Jenkins.
    • if so, how? (e.g. polling a repository, a git hook, scanning a mail list, etc.) Polling or hook via Jenkins
  • detect that the test software has a new version? Polling or hook via Jenkins

test definitions

Does your test system:

  • have a test definition repository? No. Created individually per project.
    • if so, what data format or language is used (e.g. yaml, json, shell script)? yaml, Python scripts with pytest
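
For illustration, such a test definition is an ordinary pytest case using the fixtures from the labgrid plugin; a minimal sketch (the command and assertion are assumptions), run with something like "pytest --lg-env local.yaml":

 # Sketch: the "target" fixture is provided by the labgrid pytest plugin.
 def test_hostname(target):
     shell = target.get_driver("ShellDriver")
     stdout = shell.run_check("hostname")   # raises if the command fails
     assert stdout and stdout[0] != ""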

Does your test definition include:

  • source code (or source code location)? Yes, but not the SW running on the DUT
  • dependency information? Yes, for HW features (such as camera or board variant); see the feature sketch after this list.
  • execution instructions? Yes.
  • command line variants? Yes, as part of the test cases.
  • environment variants? Yes, as part of the test cases.
  • setup instructions? Yes. Called "Strategy" in labgrid.
  • cleanup instructions? No. Test setup can/should cope with arbitrary DUT state.
    • if anything else, please describe:
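
As referenced above, hardware-feature dependencies can be expressed with labgrid's pytest marker; a sketch, assuming a board that declares a "camera" feature in its environment yaml (the feature name and command are assumptions):

 # Sketch: this test only runs if "camera" is listed under the board's
 # features in the environment yaml; otherwise it is skipped, not failed.
 import pytest

 @pytest.mark.lg_feature("camera")
 def test_camera_present(target):
     shell = target.get_driver("ShellDriver")
     shell.run_check("v4l2-ctl --list-devices")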

Does your test system:

  • provide a set of existing tests? No.
    • if so, how many?

build management

Does your test system:

  • build the software under test (e.g. the kernel)? No. Handled by build system under Jenkins.
  • build the test software? No. Built as part of the image by the build system.
  • build other software (such as the distro, libraries, firmware)? No.
  • support cross-compilation? Indirectly, as above.
  • require a toolchain or build system for the SUT? Yes, but not a specific one.
  • require a toolchain or build system for the test software? Yes.
  • come with pre-built toolchains? No.
  • store the build artifacts for generated software? No. Handled by Jenkins.
    • in what format is the build metadata stored (e.g. json)? Jenkins XML?
    • are the build artifacts stored as raw files or in a database? Raw files on Jenkins master, accessible by the Jenkins API.
      • if a database, what database?

Test scheduling/management

Does your test system:

  • check that dependencies are met before a test is run? No. Test cases with missing dependencies are skipped.
  • schedule the test for the DUT? Yes.
    • select an appropriate individual DUT based on SUT or test attributes? No. (Not yet ;)
    • reserve the DUT? Yes.
    • release the DUT? Yes.
  • install the software under test to the DUT? Yes, under control of the selected Strategy (see the sketch after this list).
  • install required packages before a test is run? Yes, as part of the image.
  • require a particular bootloader on the DUT? (e.g. grub, uboot, etc.) No. Drivers exist for U-Boot and Barebox.
  • deploy the test program to the DUT? No.
  • prepare the test environment on the DUT? Yes. Handled by the Strategy.
  • start a monitor (another process to collect data) on the DUT? No. (Not yet)
  • start a monitor on external equipment? Yes. Logic analyzer, oscilloscope, camera
  • initiate the test on the DUT? Yes.
  • clean up the test environment on the DUT? Optionally, when done by the Strategy.
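
As referenced above, install and prepare steps run under a Strategy; a minimal sketch of a testsuite asking the Strategy for a defined state (the state name "shell" follows the common labgrid strategies; deploying the image is assumed to be configured in the strategy):

 # Sketch: the "strategy" fixture (labgrid pytest plugin) transitions the
 # DUT through power-on, bootloader and login into a defined state.
 def test_boots_to_shell(strategy, target):
     strategy.transition("shell")
     shell = target.get_driver("ShellDriver")
     shell.run_check("true")   # the DUT reached an interactive shell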

DUT control

Does your test system:

  • store board configuration data? Yes.
    • in what format? yaml (see the example after this list)
  • store external equipment configuration data? Yes.
    • in what format? yaml
  • power cycle the DUT? Yes.
  • monitor the power usage during a run? Optionally.
  • gather a kernel trace during a run? Only as part of normal console output.
  • claim other hardware resources or machines (other than the DUT) for use during a test? Yes.
  • reserve a board for interactive use (i.e. remove it from automated testing)? Yes. This is a primary use case for labgrid.
  • provide a web-based control interface for the lab? No.
  • provide a CLI control interface for the lab? Yes. "labgrid-client"
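
For illustration, a minimal board configuration in this yaml format, loaded via the Python API (the serial port, prompts and names are assumptions; the resource and driver keys follow the labgrid documentation):

 # Sketch: a minimal labgrid environment config and how it is loaded.
 import textwrap
 from labgrid import Environment

 CONFIG = textwrap.dedent(r"""
     targets:
       main:
         resources:
           RawSerialPort:
             port: /dev/ttyUSB0
         drivers:
           SerialDriver: {}
           ShellDriver:
             prompt: 'root@\w+:[^ ]+ '
             login_prompt: ' login: '
             username: 'root'
 """)

 with open("local.yaml", "w") as f:
     f.write(CONFIG)

 env = Environment("local.yaml")
 target = env.get_target("main")

In a shared lab the same board is driven through the coordinator instead, e.g. "labgrid-client -p <place> power cycle" from the CLI.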

Run artifact handling

Does your test system:

  • store run artifacts? No. Handled by Jenkins.
    • in what format?
  • put the run meta-data in a database? No. Handled by Jenkins.
    • if so, which database?
  • parse the test logs for results? Done as part of each test case (see the sketch after this list).
  • convert data from test logs into a unified format? Yes. Handled by pytest, can contain additional benchmark results.
    • if so, what is the format? junit-XML
  • evaluate pass criteria for a test (e.g. ignored results, counts or thresholds)? Yes. Handled by pytest in each test case.
  • do you have a common set of result names? (e.g. pass, fail, skip, etc.) Yes.
    • if so, what are they? pass, fail, error, skip, xfail (provided by pytest)
  • How is run data collected from the DUT?
    • e.g. by pushing from the DUT, or pulling from a server? Pulled from the DUT (serial, ssh, ...)
  • How is run data collected from external equipment? Pulled during test execution
  • Is external equipment data parsed?
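
As referenced above, log parsing happens inside each test case; a sketch (the DUT-side command and its output format are assumptions). Running the suite with the standard pytest option "--junitxml=results.xml" then produces the unified junit-XML report:

 # Sketch: pull output from the DUT over the console and parse it into a
 # pass/fail result; command and output format are assumptions.
 def test_selftest(target):
     shell = target.get_driver("ShellDriver")
     stdout = shell.run_check("/usr/sbin/board-selftest")
     assert any("RESULT: PASS" in line for line in stdout)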

User interface

Does your test system:

  • have a visualization system? No.
  • show build artifacts to users? No. Handled by Jenkins.
  • show run artifacts to users? Yes, with the CLI; otherwise handled by Jenkins.
  • do you have a common set of result colors? No.
    • if so, what are they?
  • generate reports for test runs? Yes, via pytest.
  • notify users of test results by e-mail? No. Handled by Jenkins.
  • can you query (aggregate and filter) the build meta-data? No.
  • can you query (aggregate and filter) the run meta-data? No.
  • what language or data format is used for online results presentation? (e.g. HTML, Javascript, xml, etc.) Handled by Jenkins via XML.
  • what language or data format is used for reports? (e.g. PDF, excel, etc.) junit-XML, HTML
  • does your test system have a CLI control tool? Indirectly.
    • what is it called? pytest

Languages:

Examples: json, python, yaml, C, javascript, etc.

  • what is the base language of your test framework core? Python 3

What languages or data formats is the user required to learn? yaml-based configuration for DUT access (as opposed to those used internally)

Can a user do the following with your test framework:

  • manually request that a test be executed (independent of a CI trigger)? Yes.
  • see the results of recent tests? Yes.
  • set the pass criteria for a test? Yes.
    • set the threshold value for a benchmark test? Yes (see the sketch after this list).
    • set the list of testcase results to ignore? No.
  • provide a rating for a test? (e.g. give it 4 stars out of 5) No.
  • customize a test? Yes.
    • alter the command line for the test program? Yes.
    • alter the environment of the test program? Yes.
    • specify to skip a testcase? Yes.
    • set a new expected value for a test? Yes.
    • edit the test program source? Not the part running on the DUT.
  • customize the notification criteria? No. Handled by Jenkins.
    • customize the notification mechanism (e.g. e-mail, text)? No. Handled by Jenkins.
  • generate a custom report for a set of runs? No.
  • save the report parameters to generate the same report in the future? No.
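
As referenced above, a benchmark threshold is just a pass criterion inside the test case; a sketch (the metric, command and limit are assumptions; time-since-boot stands in for a real benchmark here):

 # Sketch: a benchmark test with a user-set threshold.
 BOOT_TIME_LIMIT_S = 10.0   # pass criterion, tuned per project

 def test_boot_time(target):
     shell = target.get_driver("ShellDriver")
     stdout = shell.run_check("cut -d ' ' -f1 /proc/uptime")
     assert float(stdout[0]) < BOOT_TIME_LIMIT_S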

Requirements

Does your test framework:

  • require minimum software on the DUT? Yes.
    • If so, what? (e.g. POSIX shell or some other interpreter, specific libraries, command line tools, etc.) A POSIX shell.
  • require minimum hardware on the DUT (e.g. memory)? Usually a serial console and power control.
  • require agent software on the DUT? (e.g. extra software besides production software) No.
    • If so, what agent?
  • is there optional agent software or libraries for the DUT? No.
  • require external hardware in your labs? Power control, serial console (USB or ethernet), optionally many others.

APIs

Does your test framework:

  • use existing APIs or data formats to interact within itself, or with 3rd-party modules? Python modules.
  • have a published API for any of its sub-module interactions (any of the lines in the diagram)? No.
    • Please provide a link or links to the APIs.

Sorry - this is kind of open-ended...

  • What is the nature of the APIs you currently use?

Are they:

    • RPCs? Yes, between parts of labgrid. We use WAMP/Autobahn with the crossbar.io router.
    • Unix-style? (command line invocation, while grabbing sub-tool output) Yes. For ssh, rsync, imx-usb-loader, ...
    • compiled libraries? No.
    • interpreter modules or libraries? Many Python modules: onewire, modbus, snmp, xena, pyudev, xmodem, ...
    • web-based APIs? Yes, for some power switches.
    • something else?

Relationship to other software:

  • what major components does your test framework use? (e.g. Jenkins, MongoDB, Squad, Lava, etc.) pytest, Jenkins
  • does your test framework interoperate with other test frameworks or software? Yes.
    • which ones? pytest (so any pytest plugins can be used)

Overview

Please list the major components of your test system:

  • exporter: provides access to the DUT and additional test HW
  • coordinator: monitors exported resources and stores lab configuration/state
  • client: CLI to control a lab and DUTs
  • python API: used by pytest testsuites to access the DUTs (see the sketch below)
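
For illustration of how these fit together: the exporter publishes a board's resources, the coordinator tracks them as a named "place", and a testsuite reaches the board by naming that place in its environment config. A sketch (the place name is an assumption; the place must be acquired first, e.g. "labgrid-client -p example-board acquire"):

 # Sketch: the same environment config as for local use, but with the
 # physical resources replaced by a coordinator-managed RemotePlace;
 # it is loaded with labgrid.Environment exactly like a local config.
 REMOTE_CONFIG = r"""
 targets:
   main:
     resources:
       RemotePlace:
         name: example-board
     drivers:
       SerialDriver: {}
       ShellDriver:
         prompt: 'root@\w+:[^ ]+ '
         login_prompt: ' login: '
         username: 'root'
 """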

Glossary Notes

Here is a glossary of terms. Please indicate if your system uses different terms for these concepts. Also, please suggest any terms or concepts that are missing.

  • Dependency - indicates a pre-requisite that must be filled in order for a test to run (e.g. must have root access, must have 100 meg of memory, some program must be installed, etc.)
 These are called "features" in labgrid.

Missing is the concept of a collection of similar DUTs, used when scheduling a test on any one of several equivalent boards.

Additional Data

ELCE 2017 Talk "Automation beyond Testing and Embedded System Validation" by Jan Lübbe: slides (PRE-trunk-ELCE-Automation-beyond-Testing.pdf), video (https://www.youtube.com/watch?v=S0EJJM5bVUY)