    <div class="moz-cite-prefix">On 8/8/19 3:33 PM, Joel Sherrill wrote:<br>
    </div>
    <blockquote type="cite"
cite="mid:CAF9ehCWFC3XtuMbGp8uCO5Wu0LnuZ-hOCb6JXdMJs+-KBBkTyA@mail.gmail.com">
      <meta http-equiv="content-type" content="text/html; charset=UTF-8">
      <div dir="ltr">Hi
        <div><br>
        </div>
        <div>If you are subscribed to the build@ mailing list, then you
          saw the flurry of test</div>
        <div>results from over night. I built every variant and ran the
          test suite with RTEMS </div>
        <div>debug on and off.  Here are some observations:</div>
        <div><br>
        </div>
        <div>+ rv64imafd only has one test pass<br>
        </div>
        <div>+ rv64_iamd_medany only has one test pass<br>
        </div>
        <div>+ Generally speaking, 17-19 tests failed or timed out on
          every variant with <br>
        </div>
        <div>   551-553 passing. It would be great for someone to mark
          the tests in the </div>
        <div>   tcfg files as expected fails.</div>
        <div><br>
        </div>
        <div>Hopefully this gives someone incentive to look into the
          failures.</div>
        <div><br>
        </div>
        <div>I would also run them on qemu but I don't think we have an
          RSB recipe for a </div>
        <div>Qemu with RISC-V support.</div>
      </div>
    </blockquote>
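
On marking them: if I read the test configuration right, the BSP .tcfg
files take one directive per line, so each known failure should only need
an expected-fail entry. A rough sketch (the test names here are just
placeholders; the real list has to come from the logs on build@):

  # <variant>.tcfg -- mark tests known to fail on this variant so the
  # tester reports them as expected failures (test names are placeholders)
  expected-fail: sptest01
  expected-fail: psxtest02
  # a test can also be kept from running entirely:
  exclude: smptest03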

I don't know about RV64, but most RV32 tests pass on sis using the
griscv BSP:

$ rtems-test --rtems-bsp=riscv-sis riscv-rtems5/c/griscv/testsuites --log=all.txt

Passed:        633
Failed:          0
User Input:      5
Expected Fail:   0
Indeterminate:   0
Benchmark:       3
Timeout:         2
Invalid:         0
Wrong Version:   0
Wrong Build:     0
Wrong Tools:     0
------------------
Total:         643
User Input:
 monitor.exe
 termios.exe
 top.exe
 capture.exe
 fileio.exe
Benchmark:
 whetstone.exe
 dhrystone.exe
 linpack.exe
Timeouts:
 crypt01.exe
 smpmrsp01.exe
Average test time: 0:00:00.354592
Testing time     : 0:03:48.002439

crypt01 would succeed if the timeout limit were a bit longer. smpmrsp01
never terminates; this is most likely a simulator issue. The remaining
tests, including SMP, pass as expected.
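
If memory serves, rtems-test has a --timeout option (in seconds), so the
crypt01 timeout should go away with something like:

  $ rtems-test --rtems-bsp=riscv-sis --timeout=300 \
      riscv-rtems5/c/griscv/testsuites --log=all.txt

(the 300 is only an illustrative value, not something I have verified).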

Jiri.
    <blockquote type="cite"
cite="mid:CAF9ehCWFC3XtuMbGp8uCO5Wu0LnuZ-hOCb6JXdMJs+-KBBkTyA@mail.gmail.com">
      <div dir="ltr">
        <div><br>
        </div>
        <div>--joel</div>
      </div>
      <br>
      <fieldset class="mimeAttachmentHeader"></fieldset>
      <pre class="moz-quote-pre" wrap="">_______________________________________________
devel mailing list
<a class="moz-txt-link-abbreviated" href="mailto:devel@rtems.org">devel@rtems.org</a>
<a class="moz-txt-link-freetext" href="http://lists.rtems.org/mailman/listinfo/devel">http://lists.rtems.org/mailman/listinfo/devel</a></pre>
    </blockquote>
  </body>
</html>