[RTEMS Project] #2963: Add a testsuite top level configuration file that is common to all tests.
RTEMS trac
trac at rtems.org
Fri Mar 31 07:05:39 UTC 2017
#2963: Add a testsuite top level configuration file that is common to all tests.
-------------------------+------------------------------
Reporter: Chris Johns | Owner: joel.sherrill@…
Type: defect | Status: assigned
Priority: normal | Milestone: 4.12.0
Component: testing | Version: 4.12
Severity: normal | Resolution:
Keywords: testing |
-------------------------+------------------------------
Comment (by Chris Johns):
Replying to [comment:3 Sebastian Huber]:
> Replying to [comment:2 Chris Johns]:
> >
> > I do not think adding a marker to 500+ files to state that `fileio`
> > is a `user-input` test that will never complete is a good approach.
> > Maybe global is not a great word; maybe common is better. We need
> > accurate data to determine the results of tests.
>
> Why 500+ files? It's just one:
>
> {{{
> diff --git a/testsuites/samples/fileio/init.c b/testsuites/samples/fileio/init.c
> index 07ec2c6..68942e8 100644
> --- a/testsuites/samples/fileio/init.c
> +++ b/testsuites/samples/fileio/init.c
> @@ -34,6 +34,7 @@
> #include <rtems/nvdisk-sram.h>
> #include <rtems/shell.h>
>
> +/* FANCY TEST COMMENT: user-input */
> const char rtems_test_name[] = "FILE I/O";
>
> #if FILEIO_BUILD
> }}}
>
Sure, I thought you were, but there is no standard for this, plus how
does the comment get to `rtems-test`? I am leveraging the
`expected-fail` mechanism to handle this. That needs to be external to
the test. All I am doing is collecting these things into a common place
and a common framework.
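For illustration, the "common place" described above could look roughly like the sketch below: a single external table of per-test states that a runner consumes, instead of a comment in each test's source. The file format, state names, and function names here are hypothetical assumptions, not the actual `rtems-test` configuration:

```python
# Hypothetical sketch: one testsuite-wide table of test states kept
# external to the tests, in the spirit of the expected-fail mechanism.
# The format and state names below are illustrative assumptions only.

TEST_STATES = """
fileio: user-input
top: user-input
termios: user-input
monitor: user-input
spcontext01: expected-fail
"""

def parse_states(text):
    """Parse 'name: state' lines into a dict, skipping blanks and comments."""
    states = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        name, _, state = line.partition(':')
        states[name.strip()] = state.strip()
    return states

def classify(test, raw_result, states):
    """Map a raw pass/fail result to a reported category using the
    common state table."""
    state = states.get(test)
    if state == 'user-input':
        return 'user-input'   # needs an operator; never completes unattended
    if state == 'expected-fail':
        return 'expected-fail' if raw_result == 'fail' else 'unexpected-pass'
    return raw_result

states = parse_states(TEST_STATES)
print(classify('fileio', 'fail', states))       # prints user-input
print(classify('spcontext01', 'fail', states))  # prints expected-fail
print(classify('hello', 'pass', states))        # prints pass
```

The point of keeping the table external is that the runner can report accurate categories (as in the summary below) without rebuilding or editing any test source.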
> >
> > It is similar to the work you have been doing to have a common
> > linkercmd file wherever possible. It is the same thing, or are you
> > saying we should create a separate linker command file for every BSP
> > as well? ;)
> >
> > Look at the results with a work-in-progress `rtems-test` for
> > `erc32-run`:
> >
> > {{{
> > Passed: 546
> > Failed: 1
> > User Input: 4
> > Expected Fail: 0
> > Indeterminate: 0
> > Timeout: 6
> > Invalid: 1
> > ------------------
> > Total: 558
> >
> > Failures:
> > spcontext01.exe
> > User Input:
> > fileio.exe
> > top.exe
> > termios.exe
> > monitor.exe
> > Timeouts:
> > jffs2_fssymlink.exe
> > mrfs_fserror.exe
> > dhrystone.exe
> > fsdosfsformat01.exe
> > imfs_fsrdwr.exe
> > whetstone.exe
> > Invalid:
> > minimum.exe
> > Average test time: 0:00:00.481800
> > Testing time : 0:04:28.844749
> > }}}
> >
> > Note, the benchmark tests have broken parallel testing because of
> > the time they now take.
>
> On my host these benchmark tests ran in less than 3 minutes.
All cores fully loaded?
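To illustrate the parallel-testing point: with a fixed worker pool, total wall time is bounded below by the longest single test, so a multi-minute benchmark dominates a run whose average test time is under half a second. A minimal scheduling sketch (the durations and worker count are illustrative, loosely based on the figures above):

```python
# Sketch only: why one long benchmark hurts a parallel test run.
# Durations and worker count are illustrative assumptions.

def wall_time(durations, workers):
    """Greedy longest-first schedule; returns the approximate wall time
    as the load of the busiest worker."""
    loads = [0.0] * workers
    for d in sorted(durations, reverse=True):
        loads[loads.index(min(loads))] += d
    return max(loads)

# ~550 quick tests at ~0.5 s each, then add two long benchmarks.
quick = [0.5] * 550
with_benchmarks = quick + [180.0, 150.0]

print(wall_time(quick, 8))            # prints 34.5
print(wall_time(with_benchmarks, 8))  # prints 180.0
```

With eight workers the quick tests finish in about half a minute, but adding a single three-minute benchmark pins the wall time to that benchmark regardless of how many cores are loaded, which is the scheduling problem behind the question above.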
--
Ticket URL: <http://devel.rtems.org/ticket/2963#comment:4>
RTEMS Project <http://www.rtems.org/>