[OE-core] [PATCH] image: add mechanism to run QA checks on the image once it's built

Joshua G Lock joshua.g.lock at linux.intel.com
Tue Jun 7 20:48:31 UTC 2016


On Tue, 2016-06-07 at 08:48 -0700, Christopher Larson wrote:
> On Tue, Jun 7, 2016 at 7:50 AM, Joshua Lock <joshua.g.lock at intel.com> wrote:
> > Add a mechanism to run QA checks on a constructed image once it's
> > complete. All checks will be run with any one failure resulting in
> > a failed build.
> > 
> > QA checks should be bitbake functions which throw a
> > NotImplementedError when the QA check fails, with any error
> > messages passed to the exception.
> > 
> > Specify which checks to run by adding them to IMAGE_QA_COMMANDS.
> > 
> > i.e.
> > 
> > IMAGE_QA_COMMANDS += " \
> >     image_check_everything_ok \
> > "
> > 
> > python image_check_everything_ok () {
> >     raise NotImplementedError('This check always fails')
> > }
> > 
> > This code is based heavily on the configuration upgrade code in
> > sanity.bbclass.
> > 
> > [YOCTO #9448]
> > 
> > Signed-off-by: Joshua Lock <joshua.g.lock at intel.com>
> > 
> 
> 
> What's the behavior if your qa function fails with a different
> exception? What if a user writes a shell qa check function, what's
> the behavior? Also, this seems like overloading the purpose of
> NotImplementedError. IMO it'd be cleaner to either use a custom
> exception or re-use python unit testing bits / use assert rather than
> subverting this one to a different purpose.
> 
Exceptions other than NotImplementedError result in a backtrace.

I copied the Python exception pattern using NotImplementedError from the
configuration upgrade code in sanity.bbclass, because I want to replicate
that same behaviour: the checks are able to pass a reason back to the task
which calls them.
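
For illustration, the calling side of that pattern would look roughly like
the sketch below. This is untested; the do_image_qa name and the addtask
ordering are placeholders, and depending on the bitbake version the
pythonexception=True argument to bb.build.exec_func may be needed so that
the check's exception isn't wrapped before we can catch it.

python do_image_qa () {
    qa_cmds = (d.getVar('IMAGE_QA_COMMANDS', True) or '').split()
    qa_errors = []

    for cmd in qa_cmds:
        try:
            # run the check; pythonexception=True (where available) lets
            # the NotImplementedError raised by the check propagate here
            bb.build.exec_func(cmd, d, pythonexception=True)
        except NotImplementedError as e:
            # the check passes its reason back via the exception message
            qa_errors.append('%s: %s' % (cmd, str(e)))

    # any single failure fails the build, but every check gets to run first
    if qa_errors:
        bb.fatal('Image QA failed:\n%s' % '\n'.join(qa_errors))
}

# placeholder ordering, purely for illustration
addtask do_image_qa after do_image_complete before do_build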
Shell functions are a good point, not least because I suspect many checks
could be much more concise in sh.

I'll go back to the drawing board and look for a more generic solution that
supports both sh and Python check functions.
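
Just to give an idea of the direction (untested, and the shell check below
is entirely made up): bb.build.exec_func() can run either kind of function,
so a shell check could signal failure simply by failing the function, e.g.
via bbfatal, while a python check keeps raising an exception:

# hypothetical shell check: the function failing (here via bbfatal)
# would be treated as a QA failure by the calling task
image_check_rootfs_not_empty () {
    if [ -z "$(ls -A ${IMAGE_ROOTFS})" ]; then
        bbfatal "Image QA: ${IMAGE_ROOTFS} is empty"
    fi
}

# python checks can keep signalling failure by raising an exception
python image_check_everything_ok () {
    raise NotImplementedError('This check always fails')
}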
Thanks for the review,
Joshua

