RFC: Value of New Section on Tools Build Time Expectations
joel at rtems.org
Mon Oct 22 14:18:27 UTC 2018
On Sun, Oct 21, 2018 at 6:59 PM Chris Johns <chrisj at rtems.org> wrote:
> On 22/10/2018 09:11, Joel Sherrill wrote:
> > On Sun, Oct 21, 2018 at 2:16 PM Christian Mauderer <list at c-mauderer.de
> > <mailto:list at c-mauderer.de>> wrote:
> > Am 21.10.18 um 19:07 schrieb Joel Sherrill:
> > > Hi
> > >
> > > I am in the middle of reconfiguring my old (~2013 i7) laptop as a
> > > remote workstation for home. Between helping students and doing two
> > > Kick Starts in the past 6 weeks, VM configuration, disk space
> > > expectations, and time required to build the tools seem to be a topic
> > > that needs to be addressed. Under-configured VMs don't finish the build
> > > or take a LONG time.
> My only concern with VMs is promoting the idea of needing a VM for RTEMS
> development. A lot of work has gone into making the tools native across a
> number of hosts and native tools is the best solution for a user. I hope we
> encourage and teach using native tools before a VM.
I am sorry if it seemed the emphasis was on VMs. My intent was to include
build times for sparc-rtems5 with the source pre-downloaded on a variety
of host environments and classes of computer. The time varies a lot.
Yes, there would be some VM advice, but it would be secondary to the idea
that if you build on a Pi or an i3 with a 5400 RPM laptop drive, expect it to
take a long time. Plus even on a fast machine, all I can say is that Cygwin is
slow; I can't give an estimate.
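Gathering those numbers is mostly a matter of timing the set builder run; a minimal sketch, assuming an RTEMS Source Builder checkout in the standard layout (the prefix path is illustrative):

```shell
# Time a sparc-rtems5 tool set build. Source is assumed to be already
# downloaded, so the result reflects build time rather than network speed.
cd rtems-source-builder/rtems
time ../source-builder/sb-set-builder \
    --prefix=$HOME/development/rtems/5 5/rtems-sparc
```

Running the same command on different hosts gives directly comparable wall-clock times.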
> I have used native RTEMS Windows and Mac tools for development. It can mean
> learning some different work flows but in the end it is all rather boringly
> similar. The most difficult bit on Windows is debugging and the solution
> tends to be remote tcp to a GDB server elsewhere. Who else has used Windows
> or Mac for development?
I did include times for MSYS2 and Cygwin in my list. I don't have a Mac.
> > > I am proposing that I gather performance and configuration notes for
> > > building SPARC tools after downloading source on a few host setups.
> What about a link to the builds mailing list archive with something about the
> values to look for? Could the host memory and CPU type be added to the
> reports?
That won't help the people with problems because anyone who posts to that
list has (1) a fast machine and (2) has tuned it. I used 8 and 12 core
machines with SSDs to report those. I doubt they are representative of what a
GCI student uses.
> > >
> > > + Laptop 2013 i7: Centos on VM
> > > + Laptop 2013 i7: MSYS2
> > > + Laptop 2013 i7: Cygwin
> Are there any cygwin build results posted to builds?
I can include that in my next build sweep. Part of my plan was just to build
a lot and gather the same information, then report it so people could look at
it alongside some advice and know what to expect.
I mention my 2013 i7 because even though it is old, it really is not that
different in performance from a new low-to-mid end laptop.
> > > + Laptop 2017 i7: Same three
> > > + 8 core Xeon Centos native
> > > + 12 core i7 Fedora native
> I would prefer we did not gather and publish in the documentation data
> that is hard to keep consistent, could be misleading and is often out of
> date as soon as it is published. I remember removing some stuff like this
> when I moved the docs to Sphinx as the data was over ten years old. I cannot
> find it now in the old doco.
OK. I will see if I can generalize and maybe make a blog post.
> I am fine with documenting the amount of disk space needed to build a tool
> set and then a more general comment that the more cores, memory and faster
> disks you use the faster the build will be. My Windows builds are on striped
> disks.
My point exactly. The core developers build on machines that are not
representative of the average user's.
> For Windows you can document that the POSIX layer for the shell etc adds
> overhead, plus virus checking slows the build down so the build directory
> should be marked as a directory not to check. I have no idea how cygwin and
> Windows Defender interact but I suspect it will slow things down by a lot
> and excluding it would help.
> On Windows does the virus scanner scan the VM disk file in real-time or
> are the VM's smart enough to have that file excluded?
I don't know. I have seen us have to disable virus scanners on some machines
here at OAR but it tends to be on the Windows VMs themselves.
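For anyone who wants to try the exclusion, a sketch using the stock Defender cmdlets; the build tree path is hypothetical and this must run from an elevated (Administrator) PowerShell prompt:

```shell
# From an elevated PowerShell prompt: exclude a (hypothetical) RTEMS build
# tree from Windows Defender real-time scanning.
Add-MpPreference -ExclusionPath 'C:\opt\rtems-build'

# List the current exclusions to confirm it took effect.
Get-MpPreference | Select-Object -ExpandProperty ExclusionPath
```

The same approach should work for a VM's disk image file if the hypervisor does not already exclude it.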
> > > On the 2017 7th generation i7, differences in VM configuration can
> > > result in the build time for sparc tools almost tripling.
> > >
> > > Does this sound useful or confusing? I know it is potentially
> > > volatile information. But my old 3rd generation i7 CPU benchmarks
> > > are comparable to an i5 that is much newer. Plus my old i7 has an SSD
> > > which many newer i3/i5 laptops do not.
> > >
> > > Feedback appreciated.
> > >
> > > --joel
> > >
> > Hello Joel,
> > In my experience, the biggest difference in the build time is the number
> > of cores (in a VM or on a real machine). The processor generation doesn't
> > seem to have that much influence. But I never measured exact numbers.
> > I only mention the processor generation because we don't tend to have i3
> > or i5 CPUs available but students and users do. My 6-year old i7
> > benchmarks like a newer i5. But often i5's don't come with SSDs so they
> > can suffer even more.
> > Number of cores and RAM are the two big factors. As is making sure you
> > have enough disk space allocated to avoid turning the entire thing into
> > an exercise in frustration.
> Agreed, the RSB has been updated recently to report usage.
And this helps. I just want to give advice based on that before someone
sets up a VM or partitions a disk.
> > It might be a good idea to add some rough numbers somewhere (maybe
> > in the RSB manual) so that a new user knows what to expect for the
> > build time.
> Hmm, User manual instead?
I think it should be there. I would like the RSB manual to have "internals"
or "developer" in the title. Using it should not be in it.
> I am happy for general guidelines or instructions on improving the
> experience for new users with new systems, for example with the latest patch
> I pushed over the weekend, running a set builder command with `--dry-run`
> will let you know if the Python libraries are installed before anything is
> built. I am not sure about the cost/benefit for any specific detail such as
> build times and host processor types.
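That pre-flight check might look like this, assuming an RSB checkout in the usual layout (the prefix and build set name are illustrative):

```shell
# sb-check verifies the required host tools and packages are present.
cd rtems-source-builder/rtems
../source-builder/sb-check

# A dry run evaluates the build set, including the Python library checks,
# without building anything.
../source-builder/sb-set-builder --dry-run \
    --prefix=$HOME/development/rtems/5 5/rtems-sparc
```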
You haven't spent a day helping a room full of people try to build the tools
and finding out that many fail due to lack of disk space or take forever due
to underconfigured VMs. I am not saying we should recommend VMs, just
that we should admit people use them and give advice.
Sometimes I have a 50+% failure rate and end up with people resizing disks
or reloading VMs.
> Let's not forget building the tools should be done once for a project and
> not something you do each week.
I agree. But it is the first thing you do and that's the first impression.
As the old saying goes, first impressions count.
> > For example roughly 4 to 6 hours on a single core or about 1/2 to 3/4 of
> > an hour on an 8 core Linux system. It might be interesting to have
> > some rough numbers for other commonly used systems too (like MSYS2,
> > Cygwin, FreeBSD or MacOS).
> The builds mailing list has real values ...
> For Windows I have ...
> I stopped the build because I am fixing some more places where long paths
> are being used, which is why the arm build broke ...
> > My 7th generation i7 (Dell from last fall) is ~35 minutes for SPARC as
> > I tune my VM.
> A fast i7 macbook pro (PCIe SSD, 32G RAM, APFS) and native tools is
> around ... for a bfin. The Mac posts to builds are from a Mac Mini with less
> performance ... and the bfin is 17mins. I use the bfin to test the RSB cause
> it is fast to build.
Good for smoke tests. Yet all our Getting Started is for sparc and that's a
bigger build. Looks like an hour from the same build run.
That's ~2x a Centos VM on my laptop. So it varies a lot.
We also had someone on the gci@ mailing list who seemed to take days for
the build to complete.
> > A student in a Kick Start with the same laptop turned that into
> > a 90 minute build by having 1 core and less RAM. So even with a fast
> > machine the guidance on the VM is important.
> How about "Give it everything you have!" and "Don't try and play video
> games!" :)
> > Ignoring those who pick tiny virtual HD sizes and then can't even
> > complete the build no matter how long they wait.
> We should document this. I have had real problems with VirtualBox and
> shared host disks. Last time I tried, about 12 months ago, it did not work.
AFAIK you can't successfully build on a mounted share drive. That's why it is
important to allocate enough disk space to the native filesystem for the
virtual OS.
> > I don't think that it is useful to compare processor generations. That
> > would be information that would have to be updated on a regular basis
> > to have any use. I would only add some (few) examples.
> > The CPU generation wasn't the point. Just that the older one is slower.
> > New generation ones are often only 20% faster than the old one. Just a
> > reference point.
> > If you find any big influences besides number of cores (you mentioned VM
> > settings), it might be worth adding a general section with tips
> > for speeding up the build process.
> > Not failing is the biggest one. I recommend downloading all source first
> > since that sometimes fails, doublechecking host packages, and Chris and I
> > moved gdb before gcc/newlib in the tools bset so users would fail on that
> > early rather than last.
> It is what happens when we spend a couple of days commuting from Oakland to
> Hayward in the Bay area. :)
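The download-all-source-first advice maps onto an RSB option for exactly that; a sketch (build set name illustrative):

```shell
# Fetch every source tarball and patch for the sparc tool set up front.
# Network failures then surface before the long build starts, not in the
# middle of it.
cd rtems-source-builder/rtems
../source-builder/sb-set-builder --source-only-download 5/rtems-sparc
```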
> And the latest patch has code to find Python.h and libpython<M><m>.* ..
> > That's about it beyond VM tuning and expectations. If you have an i3
> > with a 5400 RPM slow laptop drive, it is going to take a while. We say
> > 30-45 minutes and we all have nice computers with tuned VMs.
> > And Windows builds are WAY slower and I can't even give you an estimate
> > at this point. I just walk away.
> Building the tools is slower but you can get the overhead to be just the
> Cygwin/MSYS overhead and not much more. A build of libbsd with native
> tools should be fast and in my experience it is. To me this is more
> important than the tools build time and we should not lose sight of this.
> My hope is users spend more time building applications than tools.
Me too but they have to finish the tools. :)
> > So this was just "here's what we know about what to expect and what you
> > can do to help".
> Seems like a good idea.
That's all I was trying to capture. Build times seem to vary by a large
factor between core developers and users, especially students with lower end
computers. We want folks to succeed.