C++11 vs. Tigerbrew #1163
All good questions! As far as C++ goes, we're largely just limited by what GCC versions can build on Tiger. We're not yet at the point of being limited by C++11 support, because I haven't gotten GCC 7 and newer working yet. (The ~18-hour build time on my PowerMac G5 makes the cycle of fixing GCC builds a bit pokey, which is why I haven't gotten around to it yet...) It's my expectation that as C++11 and newer become mandatory, we'll be leaning more and more heavily on Tigerbrew-provided GCCs rather than anything supplied with the OS.

I don't expect Go to ever work. It's never had official 32-bit PowerPC support, and all the unofficial attempts at adding it have stalled out - not to mention that Go drops Mac OS version support aggressively, meaning even if we had PPC32 support, the Mac-specific stuff would still be broken on anything older than Mac OS 11.

As far as universal binary support goes... yeah, that's tricky, huh? My feeling is that we can keep supporting it when building with the system compiler, but will have to start getting aggressive about telling the user it can't work if they're building with a non-Apple compiler. Make sense to you?
I know that getting C++11 support is not in itself an insurmountable obstacle. C++11 is nominally useable via any of the five existing recipes (GCC 7 builds fine, or at least appears to build fine, using the existing recipe).

I am more concerned with the structural design of Tigerbrew as it interacts with (1) the increasing requirement for a newer compiler to build current software, and (2) the needs (during construction) of fat binaries made using said newer compilers. Even using stock compilers, there are a startling number of libraries that inherently can’t be built as universal binaries using `-arch` flags alone. In general, they are those which test for the size of an int, a long, or a pointer during configuration, but several also exist which generate non-code files – usually headers – that vary between machine architectures.

Given this, it doesn’t seem reasonable to me to just throw up our hands and say, “sorry, it can’t be done!” if someone has a newer compiler. I've made it work for every recipe I’ve yet encountered, one way or the other, using the stock compilers; and I don’t think it becomes unattainable just because a different approach is needed all the time vs. some of the time. Since the “do it twice and then use `lipo`” I propose that, instead, the
I lost track of one other detail in there: What is the intent of the --c++11 option? It seems like a thing that would be required by formulæ that need it, rather than something the user might or might not choose to employ, so I'm more than slightly confused as to its existence as an install option.
relevant to the issue of Go support: |
This is a question about how Tigerbrew is designed. I'm noticing a clear division within the histories of software packages -- at some point, most begin requiring a compiler with C++11 support, meaning that they can't thereafter be built with Apple GCC or anything else contemporary to Tiger or Leopard. This seems like the kind of thing best handled as a requirement. Meanwhile, Tigerbrew offers a C++11 /option/ on some formulæ, but all it seems to do is verify whether the selected compiler can process that flavour of C++, without then doing anything with that information.

The question, well, set of questions: What is the design rationale behind this approach? What is the longer-term plan for how to handle this split in software expectations -- pre-C++11 (that can be built using native compilers, with `--universal` done via `-arch` flags) vs. post-C++11 (that requires a newer compiler, with `--universal` requiring multiple compilation runs and `lipo`)? For the latter, I can see two paths: Either have parallel sets of packages (one set being as up to date as feasible, the other consisting of a semi-fossilized inventory of the last versions that will build on native compilers), or else assume the basic handful of packages required to have a functioning Tigerbrew includes a newer compiler (GCC 10 is the last series that can supposedly be built without already having C++11 support).

Somewhat tied in to this is Go support. GccGo has never supported Darwin (OS X), while the standalone Go compiler got going long after Apple switched to Intel, and I don't believe it ever overtly supported PPC Darwin - but it might be easier to patch. I couldn't tell you one way or the other.

Also somewhat tied in to this is the nature of universal binaries. With a later compiler, setting the environment to support universal binaries is no longer effective because `-arch` flags are not supported, so one must explicitly run the compilation twice and use `lipo`. In this environment, it makes the most sense to have `install` configure itself for a single selected architecture, determined by calling a utility function, and then have the build system call `install` twice with different assumptions, calling `lipo` transparently. Having got fixated for a time on making every library I install be `--universal`, I've already evolved a set of common idioms and support routines that do this in cases where `-arch` flags don't work in the first place, and it's pretty straightforward. The only complicating factors are in packages like `gmp` that have non-binaries that vary with the architecture, and those too are straightforward if tedious to handle.