I've recently been experimenting with different toolchains (different
versions of GCC, anyway) based on the build instructions Rob Emanuele posted a little while ago. The bare instructions he provides work, I believe, but I've since customized them to include some of the optimizations that are in the eLua docs for GCC compilation (I've attached the Makefile), trying to get things as small as possible. For it to work with eLua (at least for LM3S), the linker script needs to be updated to support builds for an ARM EABI target (the eagle_mmc branch has the updates needed). I think what I have now should actually be able to build for Thumb-2, Thumb, and ARM targets, since it has multilib enabled, but I don't have any other targets to test :-)

One interesting thing I've noticed in all of this, however, is that I tried using my build script with both GCC 4.3.3 and with the latest CodeSourcery G++ Lite sources, and they generate somewhat different code. Aside from some changes in instruction choices, I've noticed that the CodeSourcery version generates code that is 5-10% smaller than regular GCC with otherwise exactly the same options. I'm not sure if this is because of changes to GCC since CodeSourcery did their last branch (they merge patches back to GCC, but sometimes it takes a while), or if they are changes/optimizations CodeSourcery have made themselves. In any case, it is pretty impressive :-)

One minor downside is that after starting up eLua, collectgarbage("count") seems to start out around 6kB instead of 5.7kB, but that seems like a reasonable tradeoff given that it can shave 10-20kB off of the flash size. I've not done a lot of additional memory usage testing, so I'm not sure if there are other areas where things are better or worse :-)

The same general instructions apply from Rob's post from Feb 4: grab gcc, newlib, binutils and gdb and extract them in the same directory. Make sure the dirs are named gcc, newlib, binutils and gdb. The same basic approach works with CodeSourcery.
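To make the layout concrete, here is a minimal sketch of those steps. The version numbers, target triplet and install prefix below are illustrative placeholders, not the exact values from Rob's post:

```shell
#!/bin/sh
# Sketch only: versions and prefix are examples, not Rob's exact values.
TARGET=arm-elf                  # or arm-none-eabi for an EABI toolchain
PREFIX="$HOME/cross-$TARGET"

# After extracting each tarball, rename the directories to the bare names
# gcc/ newlib/ binutils/ gdb/ side by side in the same directory:
for pkg in gcc newlib binutils gdb; do
  echo "mv ${pkg}-<version> $pkg"
done

# Each package is then configured along the lines of:
echo "configure --target=$TARGET --prefix=$PREFIX --enable-multilib"
```

The `--enable-multilib` flag is what allows one toolchain to produce libraries for ARM, Thumb and Thumb-2 variants, as mentioned above.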
--
James Snyder
Biomedical Engineering
Northwestern University
jbsnyder at fanplastic.org
http://fanplastic.org/key.txt
ph: (847) 644-2322
|
I finally got both the ARM Cortex and i386 toolchains up, successfully built
eLua with both and booted my own i386 build on the PC. Very impressive, Bogdan!

Some notes:

Should I be worried about warning messages during these toolchain builds (NB not eLua - that builds very clean)? There seem to be rather a lot of quite scary-sounding warnings, and I am more used to source code that disables any warnings that should be ignored.

There is an error in the "Building GCC for i386" page: in Step 3: Newlib, in the "build" command, "-fdata-sections-DPREFER_SIZE_OVER_SPEED" should be "-fdata-sections -D__PREFER_SIZE_OVER_SPEED".

I was wondering if there is any reason (technical, legal, spiritual) why we should not simply post archives of the "cross-xxxx" folders of tested toolchains? I know it would spoil a lot of good clean fun, but it would also save a lot of time!

Thanks everyone who helped!

- John
|
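For reference, a sketch of what the corrected newlib flags from John's fix look like. The point is only the space between the two flags and the double underscore in the macro name; the target triplet and surrounding flags here are illustrative placeholders, not the full command from the eLua docs:

```shell
#!/bin/sh
# Illustrative only: note the space before -D and the leading double
# underscore in __PREFER_SIZE_OVER_SPEED. Target and flags are placeholders.
TARGET=i686-elf
CFLAGS_FOR_TARGET="-ffunction-sections -fdata-sections -D__PREFER_SIZE_OVER_SPEED"
echo "make all CC_FOR_TARGET=${TARGET}-gcc CFLAGS_FOR_TARGET=\"$CFLAGS_FOR_TARGET\""
```

Running the two flags together, as in the typo, would make GCC silently treat the whole string as one unknown option rather than defining the macro.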
> One interesting thing I've noticed in all of this, however, is
> that I tried using my build script with both GCC 4.3.3 and with
> the latest CodeSourcery G++ Lite sources, and they generate
> somewhat different code. Aside from some changes in instruction
> choices, I've noticed that the CodeSourcery version generates code
> that is 5-10% smaller than regular GCC with otherwise exactly the
> same options.

The CodeSourcery 2008q3 release is based on GCC 4.3.2, so there are differences from the GCC 4.3.3 released by GNU. They do make changes to the baseline releases that take a few months to go into the general GNU distribution, so the GCC 4.3.2 provided by CodeSourcery probably includes changes that are not part of the baseline GCC 4.3.3. Since ARM Holdings funds CodeSourcery's maintenance of the GNU toolchain for ARM, they put significant resources into supporting ARM targets.

For production work, it is generally better to use a cross toolchain that is tested and updated. You certainly tend to lose less sleep over compiler or linker issues. The CodeSourcery G++ Lite 2008q3 for ARM EABI is currently on build #66 using GCC 4.3.2 (i.e. 66 builds with that same compiler version). That means that they have made at least 66 changes to the general GNU distribution. It is unlikely that on your first build you will get a toolchain that is as reliable as one with 66 builds behind it. Their installed base is also relatively large, so they get much more feedback. Most people who do embedded development for a living have had incidents where they spent hours debugging a problem that turned out to be a compiler or linker bug (hence the sleep-loss argument). Using a tested distribution reduces the likelihood of that.

CodeSourcery G++ Lite is free, so cost is not an issue. Yet redistribution requires an agreement with CodeSourcery, so by definition it is not freely redistributable. If you need a freely redistributable GNU toolchain for ARM, one alternative is devkitARM. It is not as well tested and supported as CodeSourcery, but it is updated periodically, can be freely redistributed and has builds for the more popular platforms (Linux, Windows and Mac OS X).

If code size is important for your project, the IAR EWARM compiler has significant advantages over GCC. The reduction depends on the project, but it is common to see code size reductions of 10 to 20% with production applications. In some cases that could allow you to use a microcontroller with a smaller flash. If you use newlib, the reduction is generally larger because the IAR C library for ARM is heavily optimized for that platform.

Regards,
Jesus Alvarez
|
On Feb 20, 2009, at 1:18 PM, Jesus Alvarez wrote:

> CodeSourcery G++ Lite is free so cost is not an issue. Yet redistribution
> requires an agreement with CodeSourcery so by definition it is not freely
> redistributable. If you need a freely redistributable GNU toolchain for ARM,
> one alternative is devkitARM. It is not as well tested and supported as
> CodeSourcery but it is updated periodically, can be freely redistributed and
> has builds for the more popular targets (Linux, Windows and Mac OS X).

Is that source redistribution or binary redistribution that is restricted? Or is it both?

I didn't need to patch it at all to get things working for me, and it builds without any problems on my Mac. I like having sources for my toolchain as well as a reliable method for building them, so that I'm not stuck if someone decides to stop releasing binaries for some platform, or left waiting if there's a bugfix and the binaries don't appear for all platforms. I'll check out devkitARM. It looks like they have binaries for a bunch of different platforms, and that the sources are up as well.

> If code size is important for your project, the IAR EWARM compiler has
> significant advantages over GCC. The reduction depends on the project but it
> is common to see code size reductions of 10 to 20% with production
> applications. In some cases that could allow you to use a microcontroller
> with a smaller flash. If you use newlib, the reduction is generally more
> because the IAR C library for ARM is very optimized for that platform.

I've heard some good things about them. Unfortunately there's a pretty high price tag associated with using IAR's tools unless you only need 32k.

Thanks for the insight on these different options :-) Do you have any thoughts on the EABI vs non-EABI stuff? I assume that when you build GCC targeting arm-elf, as in the instructions given for eLua, those are not EABI binaries. I know that EABI is meant to allow software built by different toolchains to interoperate. I'm not sure if this really matters for eLua, except if someone wanted to try and distribute binary-only modules or something for it. Are there any other advantages/disadvantages to building things that use EABI?

--
James Snyder
Biomedical Engineering
Northwestern University
jbsnyder at fanplastic.org
http://fanplastic.org/key.txt
ph: (847) 644-2322
|
> Is that source redistribution or binary redistribution that is restricted?
> Or is it both?

Their binaries can't be freely redistributed without an agreement. You can get them from their CDs or download them from their web site, but you can't include them with your application or post them on your web site. I don't know if their sources can be freely redistributed. They certainly can't restrict redistribution of the baseline GNU sources because of the licensing. One thing they do explicitly allow is downloading their source distribution, building binaries and redistributing those. Some ARM hardware and software vendors do this. One restriction is that the resulting binaries can't include any reference to CodeSourcery (e.g. in --version output or elsewhere).

> I've heard some good things about them. Unfortunately there's a pretty
> high price tag associated with using IAR's tools unless you only need 32k.

University or volunteer projects generally can't justify purchasing commercial tools. It is difficult to justify paying for tools when you are not paid for your work. Some commercial projects do justify IAR tools, particularly those where a smaller code size or reduced debugging time results in lower project costs. If you do embedded ARM development for a living, IAR can be worthwhile.

> Do you have any thoughts on the EABI vs non-EABI stuff?

There are some advantages in EABI. Since the object format is well defined, you can link and debug libraries built with different compilers. Software interrupts (swi in assembler) and structure packing are more efficient. Floating point can be significantly better, even on ARMs without an FPU. There's a floating point comparison in "Why ARM's EABI matters": http://www.linuxdevices.com/articles/AT5920399313.html

The EABI advantages may not be significant to many eLua projects, but there is little incentive in staying with the old ABI. All current ARM compilers (IAR, Keil, GCC and its derivatives) default to EABI.

Regards,
Jesus Alvarez
|
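As an aside, one quick way to see which ABI an existing ARM binary was built for is to look at the ELF header flags with readelf. This is an illustrative sketch; "firmware.elf" is a placeholder for your own eLua image:

```shell
#!/bin/sh
# Illustrative check: readelf reports the ARM ABI version in the ELF header.
# "firmware.elf" is a placeholder path, not a file from this thread.
ELF="${1:-firmware.elf}"
if [ -f "$ELF" ]; then
  readelf -h "$ELF" | grep Flags
  # EABI builds report something like: Flags: 0x5000002, Version5 EABI ...
  # old-ABI (arm-elf) builds lack the "Version5 EABI" flag.
else
  echo "no ELF at $ELF"
fi
```

This can be handy when mixing prebuilt libraries with your own objects, since the linker will reject (or miscombine) objects built for different ABIs.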
> There are some advantages in EABI. Since the object format is well defined,
> you can link and debug libraries built with different compilers. Software
> interrupts (swi in assembler) and structure packing are more efficient.
> Floating point can be significantly better, even on ARMs without an FPU.
> There's a floating point comparison in "Why ARM's EABI matters":
> http://www.linuxdevices.com/articles/AT5920399313.html

Thank you very much for this! I finally understood what the deal is with this EABI stuff. As for the floating point issue, I'm still trying to understand how a difference in ABIs alone can lead to such a significant difference in performance.

> The EABI advantages may not be significant to many eLua projects.

The floating point performance might be significant, as Lua uses a double number type by default, which means that ALL operations on numbers are done with doubles (even when you think you're doing integer operations). We'll probably need to do some benchmarking soon :), but for now this alone is a good reason to enable EABI (and thus CodeSourcery's toolchains) for regular ARM targets too, not just for Cortex.

Best,
Bogdan
|