
Re: OS integrated compiler

Posted: Thu May 28, 2009 10:25 pm
by NReed
Well, in my dream operating system, all applications would be managed through the package manager. So moving installed apps would just be a matter of syncing the package managers, so that the new machine re-installs all the old apps. Obviously this demands that it be done from the start and can't be done for Mac OS X (it would be nearly impossible to get every Mac OS X app into a repo), but it certainly is a potential idea for a new OS. How would you compare this system to a universal binary?

Re: OS integrated compiler

Posted: Fri May 29, 2009 12:28 am
by Colonel Kernel
NReed wrote:Well, in my dream operating system, all applications would be managed through the package manager. So moving installed apps would just be a matter of syncing the package managers, so that the new machine re-installs all the old apps. Obviously this demands that it be done from the start and can't be done for Mac OS X (it would be nearly impossible to get every Mac OS X app into a repo), but it certainly is a potential idea for a new OS. How would you compare this system to a universal binary?
Package management and universal binaries are orthogonal -- you can have either one without the other one. If you don't have universal binaries, you necessarily have more packages (one for each target architecture) just like in most OSes today. Universal binaries just make deployment much easier, whether it's being done via a package manager or drag-and-drop.

Think about how much time and money this saves Apple. They can just crank out one OS X DVD for everyone that will work on every supported Mac, be it 32-bit, 64-bit, PPC, or Intel. There is only one set of files on that DVD -- no separate sub-directories for each architecture to manage. Once the system is installed, again there is only one folder structure. No need to implement and maintain crazy tricks like c:\windows\syswow64 pretending to be c:\windows\system32 and all that cruft. It just makes things way simpler.

IMO the executable format and loader are exactly the right level of abstraction at which to deal with such architecture differences. It's too bad other OSes don't support such a mechanism for unmanaged code.
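
To make that concrete, here's a rough sketch of what a loader or tool sees when it opens a universal ("fat") binary: a small header listing one slice per architecture, and only the matching slice gets mapped. This is just a toy reader written from memory of <mach-o/fat.h> (the structs are redeclared so it builds anywhere), not Apple's actual loader code, and it assumes a little-endian host for the byte swapping:

[code]
/* Toy reader for Mach-O "fat" (universal) binaries.
 * Not the real loader -- just enough to show that one file carries
 * one slice per architecture and the right one is picked at load time. */
#include <stdio.h>
#include <stdint.h>

#define FAT_MAGIC 0xcafebabeu   /* stored big-endian on disk */

struct fat_header { uint32_t magic; uint32_t nfat_arch; };
struct fat_arch   { uint32_t cputype; uint32_t cpusubtype;
                    uint32_t offset;  uint32_t size; uint32_t align; };

/* Convert a big-endian (file order) word to host order.
 * Assumes a little-endian host for simplicity. */
static uint32_t be32(uint32_t v)
{
    return (v >> 24) | ((v >> 8) & 0xff00) | ((v << 8) & 0xff0000) | (v << 24);
}

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    struct fat_header h;
    if (fread(&h, sizeof h, 1, f) != 1 || be32(h.magic) != FAT_MAGIC) {
        printf("not a fat binary (single-architecture image)\n");
        fclose(f);
        return 0;
    }

    uint32_t n = be32(h.nfat_arch);
    printf("%u architecture slice(s):\n", n);
    for (uint32_t i = 0; i < n; i++) {
        struct fat_arch a;
        if (fread(&a, sizeof a, 1, f) != 1) break;
        /* The kernel/dyld would compare cputype against the running CPU
         * and map only the matching slice at the given file offset. */
        printf("  cputype=0x%08x offset=%u size=%u\n",
               be32(a.cputype), be32(a.offset), be32(a.size));
    }
    fclose(f);
    return 0;
}
[/code]

Point it at any universal binary and you'll see the slices listed side by side -- one file on disk, one path, no parallel directory trees.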

Re: OS integrated compiler

Posted: Fri May 29, 2009 5:40 am
by UbarDPS
Windows needs those directories for backward compatibility.

Windows on Windows has existed since NT 3.1 to provide backward compatibility with Win16 (and now, on Win64, with Win32).

The only 16-bit applications that don't run on Win32 are those that depend on DOS drivers or access the hardware directly.

WoW on Win64 provides the same function for 32-bit applications.

It's quite a bit more elegant than the approach Apple uses to run Classic applications.

Universal Binaries are a waste of disk space, especially when lots of vendors use them: you have something built to run on 3 architectures, but you only use one.

Apple had to make Universal Binaries because they completely switched platforms. Windows never really did that.

x64 is backward compatible; it's just important to make sure applications are isolated from system files they cannot use and may overwrite (i.e. the installation programs).

Re: OS integrated compiler

Posted: Fri May 29, 2009 8:37 am
by Troy Martin
UbarDPS wrote:Universal Binaries are a waste of disk space, especially when lots of vendors use them: you have something built to run on 3 architectures, but you only use one.
That's not the point, though. The point is to compile once, test on multiple architectures, and ship everything on one disc. Fewer discs are produced, which matters if one architecture dies a few months after release: you won't have piles of extra discs going to waste (or to collectors).

Re: OS integrated compiler

Posted: Fri May 29, 2009 8:43 am
by Colonel Kernel
UbarDPS wrote:Windows needs those directories for backward compatibility.
Yes, that's my point. It's a lot of extra complexity that shouldn't have been necessary.
UbarDPS wrote:It's quite a bit more elegant than the approach Apple uses to run Classic applications.
Apple used virtualization to run Classic apps on the PPC versions of OS X. Microsoft is using virtualization to implement "XP mode" in Windows 7. This is off-topic though, as it is more about running apps for an older and much different version of the OS on the current version. UBs don't help with that.
UbarDPS wrote:Universal Binaries are a waste of disk space, especially when lots of vendors use them: you have something built to run on 3 architectures, but you only use one.
It's a tradeoff, but IMO a good one. Disk space is cheap. User and developer time is not.
UbarDPS wrote:Apple had to make Universal Binaries because they completely switched platforms. Windows never really did that.
Sure it did. NT was originally available for x86, Alpha, MIPS, and PPC. More recently, the first 64-bit version of Windows was for Itanium.
UbarDPS wrote:x64 is backward compatible
Yes, so why is it that Win32 and Win64 binaries are incompatible?

Consider how DLL loading works in Windows... Let's say a 64-bit process wants to load some system library. It has to use the 64-bit version of the library. More than likely, it was originally a 32-bit program that was ported to 64-bit, so it is probably going to look for the library with its old name, and in the original directory location (e.g.: c:\windows\system32\foobar.dll). Now you need these two parallel directory trees, one with 32-bit binaries and one with 64-bit binaries, and you need to re-direct all accesses from 32-bit processes.
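
To see the redirection in action, here's a small sketch (illustrative only, not how the loader itself is implemented): build it once as a 32-bit binary and once as 64-bit on an x64 system and compare the output. The paths and API names are the standard Win32 ones; the rest is just my guess at a minimal demo.

[code]
/* Sketch (Windows-only): shows the parallel System32 / SysWOW64 trees
 * that WOW64 maintains for 32-bit processes.  Compile as 32-bit and as
 * 64-bit and compare the output.  Illustrative only. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    char sysdir[MAX_PATH], wowdir[MAX_PATH];
    BOOL isWow64 = FALSE;

    /* Are we a 32-bit process running on 64-bit Windows? */
    IsWow64Process(GetCurrentProcess(), &isWow64);
    printf("running under WOW64: %s\n", isWow64 ? "yes" : "no");

    /* The "system directory" as this process sees it.  A 32-bit process
     * on x64 is told "System32", but its file accesses to that path are
     * silently redirected to SysWOW64. */
    GetSystemDirectoryA(sysdir, sizeof sysdir);
    printf("GetSystemDirectory:      %s\n", sysdir);

    /* The actual 32-bit directory on a 64-bit system. */
    if (GetSystemWow64DirectoryA(wowdir, sizeof wowdir))
        printf("GetSystemWow64Directory: %s\n", wowdir);

    if (isWow64) {
        /* A 32-bit process can temporarily opt out of the redirection
         * to reach the genuine 64-bit System32. */
        PVOID old = NULL;
        if (Wow64DisableWow64FsRedirection(&old)) {
            HANDLE h = CreateFileA("C:\\Windows\\System32\\kernel32.dll",
                                   GENERIC_READ, FILE_SHARE_READ, NULL,
                                   OPEN_EXISTING, 0, NULL);
            printf("opened 64-bit kernel32.dll: %s\n",
                   h != INVALID_HANDLE_VALUE ? "yes" : "no");
            if (h != INVALID_HANDLE_VALUE) CloseHandle(h);
            Wow64RevertWow64FsRedirection(old);
        }
    }
    return 0;
}
[/code]

The 32-bit build reports System32 as its system directory even though its file accesses actually land in SysWOW64 -- exactly the parallel-tree bookkeeping I'm complaining about.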

On Mac OS X, a 64-bit process loads exactly the same library that a 32-bit process would, because the library is a universal binary that contains both the 32-bit and 64-bit versions of the code. No parallel directories, no re-direction of file system requests, no registry shadowing, etc. All of that stuff is a hack around failing to solve this problem in the right part of the system.

This is why .NET assemblies can target "AnyCPU" -- because the .NET guys realized that the old way of doing things was a hack (plus, with IL that gets JITted, it would be crazy not to do it this way).

Re: OS integrated compiler

Posted: Fri May 29, 2009 1:13 pm
by Love4Boobies
Colonel Kernel wrote:
Love4Boobies wrote:As for the binaries in Mac OS, no, I had no clue. Thanks for the tip. I would expect it to be some byte code that gets translated before being executed. It would waste less memory.
Actually, no memory gets wasted -- only disk space. The loader only loads sections of the binary that are compiled for the current architecture.
Sorry, that's what I meant to say. In Romanian we often use the term "memorie" for both RAM and disk space, and I guess that slipped.

Re: OS integrated compiler

Posted: Wed Jun 17, 2009 10:33 pm
by stephenj
man lipo

And I agree with Colonel: in most programs, the bulk of the space shouldn't be taken up by the binaries anyway.

Re: OS integrated compiler

Posted: Wed Jun 17, 2009 10:40 pm
by Firestryke31
First, honestly, I was almost afraid to click that link (man lipo? Ew. They could have picked a better name for that program).

Second, this is a somewhat old thread, but since it's back: I use Monolingual to handle that. It also cleans out non-system language packs (its actual purpose), which take up most of the space. I hear Snow Leopard will compress the localization stuff, but I don't remember if that was ever confirmed. It's a good idea, though...