[OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

Wang, Yang (Young) yang.wang at windriver.com
Tue Aug 21 15:04:34 UTC 2018


Hi All,

I'm working on this ticket:
https://bugzilla.yoctoproject.org/show_bug.cgi?id=12372

As far as I know, the following are all true nowadays:
- Ptest needs to be run on real hardware and it takes a few hours to finish
- Ptest can be run within OEQA, and it can also be run independently
- LAVA is a good open source test framework which:
   - can manage both real hardware and different kinds of simulators as test devices
   - provides a well managed logging system and test reports
 
How can Ptest be run automatically? I think running it with LAVA is a good solution, but ...
 
LAVA runs as a server which manages the test jobs submitted to it; here is a typical LAVA job:
https://staging.validation.linaro.org/scheduler/job/231942/definition
As you can see, it defines the device type, the test images which will be used, the test cases, and a lot of other things.
 
So the typical automatic way to run a test through LAVA is to write a script which takes a LAVA job template, replaces the images with the expected ones, and then submits it to LAVA through a command, for example:
$ lava-tool submit-job http://<user>@<lava-server> x86_64_job_oeqa-ptest.yaml
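As a rough sketch, the templating step of such a script could look like the following. The template contents, placeholder names, and image URLs here are all hypothetical, not taken from an actual LAVA job definition:

```python
# Sketch: fill image placeholders in a LAVA job template before submission.
# The template text, placeholder keys, and URLs below are hypothetical examples.

def render_job_template(template_text, images):
    """Replace {{KEY}} placeholders in the job YAML with the given image URLs."""
    for key, url in images.items():
        template_text = template_text.replace("{{%s}}" % key, url)
    return template_text

template = """\
device_type: qemu
actions:
- deploy:
    images:
      kernel:
        url: {{KERNEL_URL}}
      rootfs:
        url: {{ROOTFS_URL}}
"""

rendered = render_job_template(template, {
    "KERNEL_URL": "http://example.com/builds/bzImage",
    "ROOTFS_URL": "http://example.com/builds/core-image-sato.ext4",
})

# The rendered YAML would then be written to a file and submitted with
# "lava-tool submit-job", as in the command above.
print(rendered)
```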
 
This command will return a job id (take #231942 as an example), and then the script can get all logs and reports based on the LAVA server address and this job id, for example:
- execution log: https://staging.validation.linaro.org/scheduler/job/231942
- test report: https://staging.validation.linaro.org/results/231942/0_smoke-tests
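Given the returned job id, the script could construct those URLs itself. A minimal sketch, assuming the URL patterns shown above (the server address and the result suite name "0_smoke-tests" are just the examples from this mail):

```python
# Sketch: build the execution-log and test-report URLs for a LAVA job id,
# following the URL patterns shown above. Suite name is an example value.

def lava_job_urls(server, job_id, suite="0_smoke-tests"):
    """Return the execution-log and test-report URLs for a given job id."""
    return {
        "log": "%s/scheduler/job/%d" % (server, job_id),
        "report": "%s/results/%d/%s" % (server, job_id, suite),
    }

urls = lava_job_urls("https://staging.validation.linaro.org", 231942)
```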
 
So, as far as I can tell, it may not be appropriate to integrate the LAVA tests into a bitbake command the way we do with a simple test harness: LAVA is an advanced test framework and it already manages all jobs submitted to it well.
 
Please comment if you have a better idea about this ticket.

Regards,
-Yang Wang
