Microsoft has been fairly good at allowing older binaries to run on newer systems.
Apple is pretty annoying in this regard. There's a lot of software that doesn't work on OS versions only 5 or so years old.
A lot of software doesn't need to change, to be honest. Microsoft Word, for example. Word processing: you sit down and type stuff, maybe change the font once or twice. I guess the collaborative features are nice, being able to edit the same document with others.
It would be fun to use an older machine and see how productive you can be with the old software too!
Been hit with that: I needed to run Chromium v49 to be able to remote-debug some TVs with old Opera TV SDKs. The version I had stopped working, and several versions that I tried crashed when using the Chromium DevTools. I ended up having to use a Windows virtual machine.
I wish I could find the video on YouTube again, a demonstration of a collaborative text editor from around 1960. I've been looking for it a number of times just this year.
It's a very interesting show: how they solved remote displays by filming the lab screens with commercial cameras and sending the feed to the users' screens, how the mouse worked, the five-button keyboard, and so on.
Not to be snarky, but if there is a real need to do this, it's pretty trivial with VirtualBox or DOSBox.
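As a sketch of the DOSBox route (the directory and executable names here are hypothetical), the [autoexec] section of dosbox.conf can mount a host folder as a DOS drive and launch the old binary automatically:

```ini
; dosbox.conf sketch -- commands here run every time the emulator starts.
; Paths and the program name are hypothetical placeholders.
[autoexec]
mount c ~/old-dos-apps
c:
OLDAPP.EXE
```

The same commands can also be typed interactively at the DOSBox prompt, which is handy for one-off runs before committing them to the config file.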
Those applications from 20 years ago running in emulators will work far better in 20 more years than apps from today that stop working due to remote-service dependencies designed to force vendor lock-in.
It is endlessly amusing to me that the more tightly integrated the cloud services get to conventional computing tasks, the more likely we will end up with Vernor Vinge style programmer archaeologists from A Deepness in the Sky...
When I worked for a NASA contractor doing sounding rocket telemetry, the main telemetry stack programming software was a Turbo C program from 1987-1990 (TDP502.exe, on the odd chance that any of the maybe 50 other people on the planet who have ever used it see this). It works just fine in DOSBox, at least to create files. We still needed an actual older PC with an ISA slot to handle the hardware that TDP knew how to control, but for configuration tasks, Windows + DOSBox + a USB 3.5" floppy drive meant I could do things on an actual modern system.
So yeah, you're right, emulation saves the day in many cases. And I felt like a programmer-archaeologist using DOS to launch something into space in the 2010s...
Can you imagine how massive the field of software archeology will actually need to be to capture an understanding of the human experience of software development in the "distant past"?
Take any given product or system. What percentage of the software involved in implementing the system was written x years in the past? What's gonna happen to that distribution in 10 or 100 years?
I wonder how inevitable it is that this percentage of ancient software in virtually every system will just keep growing over time, until the systems of the future are tiny layers built on top of 1000-year-old, impenetrable old-growth forests that might as well have been written by aliens for all their understandability...
Virtualizing Windows isn't very hard, even back to something like Windows 95.
On the other hand, only OS X 10.7+ is really easy to run in a VM; 10.5 and 10.6 only allow it for the server versions, and anything before 10.5 isn't really compatible with virtualization at all. That's 2007, so OS X lets you virtualize back about 13 years, while with Windows you can go back almost 30 years. People even have Win 3.1 running in VMware.
This is probably due to the fact that there isn't PowerPC virtualization software, but if you need to run OS X software from before 2007, you're basically out of luck.
You can also virtualize Windows from just about any OS you can imagine (Mac, Linux, Windows, etc.), while OS X virtualization has a hard requirement of running on Mac hardware.
There seems to be a misconception that you can only run 10.5 and later in a VM, but you can actually run OSX 10.4 Tiger fairly easily. This is the non server version. [1]
I was able to import almost everything from my old PPC computers. It's not completely virtualized because it's using Rosetta, and it can't run Classic OS apps. But it is still extremely useful, and way faster than my PPC computers ever were.
>OSX virtualization has a hard requirement for running on Mac hardware.
If you aren't a stickler for Apple's terms of service (if you're doing this for business purposes, I suggest you should be), you can use a tool called macOS unlocker to patch VMWare Workstation to run macOS VMs. Runs great, though all VMWare products can only render display output for macOS in software mode.
Run a shady binary with no identifiable author or website, as administrator, so it can modify VMware binaries? A rather... curious approach, but for some reason common on Windows among e.g. gamers.
I've run MacOS in VirtualBox, IIRC without shady patches, though that was probably on Linux.
I have literally never heard of a "gamer" running shady binaries with administrator privilege in my entire life. Maybe you're thinking of the hacker culture of the 80s, but gamers today use launchers to manage downloading, installation and setup of software. Maybe you're thinking of software pirates using scene software as keygens or DRM-defeaters. I suppose that's common among kids who don't buy things (but I don't believe those tools run as admin).
It may be more common on Windows, but I would counter that since Windows is basically free and runs on anything from a Raspberry Pi up, the vast majority of "hacky" stuff happens on Windows and Linux. Mac users buy very, very expensive hardware to do very specific tasks, and "hacking around" is often not a good enough justification for the most expensive personal computers money can buy.
I would also suggest that it is in the Linux world where running random binaries as root is most common. Found some random repo that claims it's a fork of a good one with a bug fix? Build it and run it!
If the current version of OS X were backwards compatible with 10.0 - 10.4, it would still need both a PPC emulator and a 68K emulator, since OS 9 still had 68K code.
So if Apple kept "25 years" of backwards compatibility, would they have been better off bundling a 68K and PPC emulator? Why stop there? Should they have kept compatibility with the Apple //e as well?
Someone else was complaining that they didn’t keep FireWire. Should modern Macs come with ADB ports?
Obviously not, but that doesn't prove that there isn't value to having backwards compatibility. Sometimes you just want something to run and not have to touch or change it for a long time.
A 20-year-old machine that's critical to a factory can run off a serial cable plugged into an expansion card, running software written in the '90s that will still run on Windows 10. Nobody in their right mind would decide to write that same software on a Mac.
Well, given where all of the PC manufacturers that were around in 1990 ended up, compared to the revenue and profit of just the Mac division, it seems like Apple didn't make a bad business decision in not prioritizing backwards compatibility.
If you compare where Apple is and where Microsoft is also, it doesn’t seem like chasing enterprise PC sales was as good of a long term bet as going after the consumer market....
> So if Apple kept “25 years” of backwards compatibility, should they have been better off bundling a 68K and PPC emulator? Why stop there? They should have kept compatibility with the Apple //e and also bundled a 68K emulator?
I don't think it's unreasonable that Apple hasn't done so, but neither do I think doing so would be unreasonable. Archive.org can emulate Apple II's in your browser, I'm sure Apple could add an equivalent feature to MacOS if that were something they cared to do. They obviously don't, and that's their prerogative.
I have a Windows 10 PC with a PCI (not express) slot that I installed a Firewire card in last year to use 15 year old software still available from Sony's website to rip a stack of Digital8 home movies.
I tackled that project about two years ago. Asked around and a friend had an old laptop with FW port, so I installed Ubuntu on it and copied all my old Video8 and Digital8 Tapes.
Compared to what Apple does for privacy, nobody else is doing a damn thing. Privacy is, by a very significant margin, the most important metric.
True enough. Though to be fair the last new version of a Win16 OS shipped 26 years ago, and Win32 became the standard API in consumer products 24 years ago. There are degrees of worry here. Software of the vintage you're talking about was contemporary with System 7, and the closest ancestor to current OS X was called "NextStep 3.3".
The point upthread was that genuinely useful stuff gets retired just a few years after release in the Apple world, and I think that's broadly true. It's true with hardware too -- professional audio people are stuck with truckloads of firewire hardware that they can't use with their new laptops, for example.
Apple shipped the last 32-bit Mac in 2006, over 10 years before 32-bit software support was dropped. There were plenty of FireWire to Thunderbolt adapters.
No the closest ancestor to MacOS X is System 7. There were Carbon APIs until last year. A poster up thread said they could use an emulator. There are 68K Mac emulators available too.
AppleScript for instance is a System 7 technology - not a NextStep technology.
No, MacOS X when it was originally released had parts from NextStep and parts ported from Classic MacOS, including QuickDraw, AppleScript, QuickTime, some audio frameworks, etc.
The entire Carbon API was a port of classic MacOS APIs to make porting from classic MacOS to OS X easier.
MacOS X was a combination of both. That was the whole brouhaha over why Apple ported the Carbon APIs to OS X: major developers like Adobe and Microsoft insisted on it.
That's not to mention that the first 5 versions of MacOS X had an entire OS 9 emulator built in.
To take the analogy to the extreme: MacOS X had two parents, Classic MacOS and NextStep.
I would disagree; most of what was brought over from Classic OS was ported and adapted out of necessity, and was short-lived. OS X was an entirely new operating system that ported some frameworks and software but wasn't backwards compatible. Were it so, they wouldn't have needed to provide an emulator.
I think you're just supporting the original assertion that Apple does not support things for very long. Does Software written for OS X v10.1 run on Catalina today without using 3rd party tools or emulators? Software written for Windows 95 still runs on Windows 10.
Sounds to me more like the ported programs were short lived - and IMO, in that they are not entirely wrong.
Sure, Carbon and Rosetta certainly were no mean feat, and the drastic PPC/x86 break is something Microsoft never really had to deal with (heh, the biggest problem trying to run a PPC/MIPS/Alpha based NT application today is actually finding one :) ).
But Apple never went to the same lengths as Microsoft regarding backwards compatibility, and while Carbon and Rosetta immensely eased the transition, the continuity definitely wasn't comparable and it was never transparent to the developers (and in Apple's defense, this was never their intention and they always were quite open about it.)
For one, Rosetta (and thus PPC compatibility) was dropped with Lion in 2011, so no amount of Carbon would help 10.1 applications after that.
And even with Rosetta, each release, especially after Tiger, came with quite a list of API changes and deprecations (with the whole of Carbon declared obsolete in 2012), and an increasingly long list of high-profile software that would not run anymore and required an update or upgrade. And while Microsoft did a lot even to prevent and/or work around issues with notorious software (hello Adobe! :) ), Apple was far less willing to do so.
I mean, just as an example - I can run Photoshop 6.0 (from 2000) on Windows 10 (certainly no thanks to Adobe), but no chance for PS 7.0 even on Leopard...
PPC to x86 was possibly the smoothest transition I've seen in my lifetime; for most it was just a recompile, and I'm convinced it was only as smooth as it was because of the shit-show transition to OS X.
Apple announced its plans to move to OS X in 1997 and said they'd ship an emulator, Blue Box, to run classic apps. That was met with a resounding "no" from the community.
Carbon was never supposed to exist; the Classic APIs were not memory safe, didn't support threads, and had a lot of other issues. Apple wanted a clean break in the form of Cocoa, but the community said no. So Apple came up with Carbon, which was sort of a port of the Classic APIs to OS X, but because the two operating systems were so different it wasn't anywhere close to a 1:1 copy and required developers to port to it.
Since its inception, Apple wanted Carbon dead; it required them to rewrite core parts of OpenStep in C, and they had to maintain those alongside their Obj-C equivalents. It took them 12 years to get to the point where they felt comfortable killing it off and almost 20 years before they actually could.
> Can you run the PPC version of any Windows NT apps?
Developing for PPC on NT was much like targeting x86 and PPC on OS X: it was mostly a recompile unless the app used assembly. You can't run the PPC version of an NT app on modern hardware, just as you can't run the PPC version of an OS X app on modern MacOS.
The difference though is that PPC on NT never took off, so there's something like 4 or 5 apps for NT versus the thousands or hundreds of thousands for OS X.
I haven't forgotten anything, I just fail to see the relevance to this discussion. (68k? Really? That one's been dead for 14 years. And what is with you and NT on PPC? You really want to start comparing a 25 year old, short-lived, ultra-niche side version no one bought or even wrote software for with the "mainline"?)
I think you missed the entire point of my posting, i.e. that even outside the architecture changes, long-term compatibility was never even near the same level (and a different arch was often not even the culprit). Carbon being available doesn't help you a bit when old software still doesn't work.
If you are complaining that you can’t run 25 year old Mac software on an x86 Mac, the only option is for Apple to ship MacOS with a 68K emulator and a PPC emulator. The first version of MacOS that ran natively on x86 came out in 2006.
Yes I realize that PPC Macs came out in 1994. But they required a 68K emulator because even parts of MacOS were 68K.
There were three major breaking changes in MacOS history:
- If you bought the x86 version of software in 2006, it could potentially work until 2019, when Apple dropped 32-bit support.
- If you bought the first version of OS X PPC software in 2001, it could potentially run until July 2011, with the release of 10.7.
- If you bought a classic MacOS app, it could pessimistically run from 1992, with the release of System 7, to 2006, with the introduction of the first x86 Macs.
"Carbon was an important part of Apple's strategy for bringing Mac OS X to market, offering a path for quick porting of existing software applications, as well as a means of shipping applications that would run on either Mac OS X or the classic Mac OS. As the market has increasingly moved to the Cocoa-based frameworks, especially after the release of iOS, the need for a porting library was diluted. Apple did not create a 64-bit version of Carbon while updating their other frameworks in the 2007 time-frame, and eventually deprecated the entire API in OS X 10.8 Mountain Lion, which was released on July 24, 2012. Carbon was officially discontinued and removed entirely with the release of macOS 10.15 Catalina."
I think you are confusing "supported" with EoL. Adobe was pissed because there was originally talk of doing a 64-bit Carbon, and it never shipped, so they had to move their entire app over.
The main point is that Windows would never stop that API from "existing" in some manner, unlike Apple.
This is just a difference in how both companies view themselves. While Apple claims "it just works", that isn't quite true in some of the cases we have seen. Microsoft has actually done a far better job of this.
I know someone who worked on the Visual Studio team. They literally had 100-200 servers that would run overnight with each build, guaranteeing that the software would install and run on every single permutation of Windows on an array of hardware.
I've only heard complaints from Silverlight and Windows Phone/Mobile developers anecdotally.
From a web perspective (and my experience), .NET Framework 2/4 -> Core is actually not a big changeover outside of the views (probably better if you switched to MVC).
The Windows Phone apps I built are dead now, but that isn't a matter of APIs no longer being supported, but an entire platform going under.
As a macOS user, I had one operating system update kill external GPU w/ Nvidia cards (that sucked) and another update kill 32 bit apps (that one isn't a big one for me personally). All on the same computer.
The entire ASP.Net Core and Entity Framework architecture was changed and is not compatible. Not to mention all of the legacy third-party .Net Framework-only packages that don't work.
Microsoft also completely abandoned Windows CE/Compact Framework while there were plenty of companies that had deployed thousands of $1200-$2000 ruggedized devices for field services work.
> The entire ASP.Net Core and Entity Framework architecture was changed and is not compatible.
There's been a lot of confusion, due in no small part to Microsoft's branding and communication, but what you said is not at all accurate if not intentionally misleading.
What's been known as .NET for the last 20 years is now called ".NET Framework"; this is not unlike how OS X is now retroactively called MacOS. ".NET Core" is an entirely new framework that just happened to be compatible with ".NET Framework", but as time goes on the two have diverged.
> Not to mention all of the legacy third party .Net Framework only third party packages that don’t work.
".NET Framework" and ".NET Core" are similar to Cocoa and Cocoa Touch in the sense that you can write code that will compile under both AND you can write code for either that will be incompatible with the other. In fact I maintain a half dozen packages that are compatible with both.
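As a sketch of what that dual-compatible code looks like in practice (the project file below is hypothetical, though net48 and netcoreapp3.1 are real target framework monikers), a single library project can compile the same source for both runtimes:

```xml
<!-- Hypothetical library .csproj: one codebase, built for both runtimes. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- net48 targets .NET Framework 4.8; netcoreapp3.1 targets .NET Core -->
    <TargetFrameworks>net48;netcoreapp3.1</TargetFrameworks>
  </PropertyGroup>
</Project>
```

Code that sticks to APIs common to both (roughly, .NET Standard) builds cleanly for both targets; anything platform-specific breaks one leg of the build, which is exactly the "compatible with one but not the other" case described above.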
> Microsoft also completely abandoned Windows CE/Compact Framework while there were plenty of companies that had deployed thousands of $1200-$2000 ruggedized devices for field services work.
Microsoft didn't "abandon" Windows CE; it stopped development on it 6 years ago as it was largely dead, and Microsoft offers many pathways off of Windows CE. The CF intentionally runs on platforms other than CE, such that any apps written for the CF will just work elsewhere. AND they still support CE and CF to this day; they just don't maintain or develop new versions of them.
> What's been know as .NET for the last 20 years is now called ".NET Framework", this is not unlike how OS X is now called MacOS retroactively. ".NET Core" is an entirely new framework that just happened to be compatible with ".NET Framework" but as time goes on the two have diverged.
The two weren't initially slated to diverge at all. .Net Framework and .Net Core were supposed to be separate implementations of ".Net Standard". In fact, you could originally create ASP.Net Core and EF Core apps that ran on top of .Net Framework.
> ".NET Framework" and ".NET Core" are similar to Cocoa and Cocoa Touch in the sense that you can write code that will compile under both AND you can write code for either that will be incompatible with the other. In fact I maintain a half dozen packages that are compatible with both.
Which will not be the case for long since MS has stated that no new features will come to .Net Framework.
> Microsoft didn't "abandon" Windows CE, it stopped development for it 6 years ago as it was largely dead and Microsoft offers many pathways off of Windows CE. The CF actually runs on platforms other than CE intentionally such that any apps written for the CF will just work elsewhere. AND they still support CE and CF to this day, they just don't maintain or develop new versions of them.
Which is also not true. The last version of Visual Studio that supported the Compact Framework was VS 2008. It was far from dead in the enterprise by 2010 or even 2012. Companies were still relying on CF to run on their $1200-$2000 ruggedized field service devices; they had deployed literally thousands of devices in the field. I know, because I was developing on VS 2008 until 2011 just to support them.
I mean devices like these that cost $1300 each. I deployed software for a few companies that had thousands of Intermec and ruggedized Motorola devices.
> The two weren’t initially slated to diverge at all. .Net Framework and .Net Core were suppose to be separate implementations of “.Net Standard”.
Uh... no. Hard fucking no. .NET Standard is the commonalities between Core and Framework. Core and Framework were NEVER the same or intended to be the same.
Framework is all of the legacy Windows specific Libraries for things like the File System, Active Directory, etc.
Core is intended to be platform agnostic and cross platform.
And yet you can still run .NET 1.0 apps on Win10, and this isn't changing in the foreseeable future.
Hell, you can run VB6 apps on Win10 - it even ships the runtime! - and the remaining hold-outs in that developer community have been complaining about abandonment for two whole decades now.
> The .NET Framework 1.1 is not supported on the Windows 8, Windows 8.1, Windows Server 2012, Windows Server 2012 R2, or the Windows 10 operating systems. In some cases, the .NET Framework 1.1 is specifically identified as required for an app to run. In those cases, you should contact your independent software vendor (ISV) to have the app upgraded to run on the .NET Framework 3.5 SP1 or later version. For additional information, see Migrating from the .NET Framework 1.1.
Which would be no different from a macOS app hard-coding a check for 10.3 and not working if you have anything newer. Neither says that the app _couldn't_ run, just that a badly thought-out gate prevents it.
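One common way such a gate goes wrong is comparing version strings lexicographically, which silently breaks once a component reaches two digits. A small sketch (the function names are made up, not from any real installer):

```python
# Hypothetical version gates illustrating why a hard-coded check
# can wrongly reject a newer OS release.

def naive_gate(os_version: str) -> bool:
    # Broken: lexicographic string comparison, so "10.10" < "10.3"
    return os_version >= "10.3"

def proper_gate(os_version: str) -> bool:
    # Correct: compare numeric version components as tuples
    return tuple(int(p) for p in os_version.split(".")) >= (10, 3)

print(naive_gate("10.10"))   # the bad gate rejects a newer OS: False
print(proper_gate("10.10"))  # True
```

The naive check passes on 10.3 through 10.9 and then mysteriously fails on 10.10, the kind of "works until it doesn't" gate both vendors' old installers have been bitten by.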
> There must be enough apps that don’t run that MS thought to call it out.
The callout exists because Microsoft takes a different approach to support from Apple's. Microsoft provides support material for all of its legacy and deprecated software, as well as the ability to download and install it. So it's important to identify and track incompatibilities between them.
When Apple moves on, the past is whitewashed over, and when support stops they forget it ever happened.
And so the mystery of why a 32bit version of Windows 10 still exists is solved.
What's mildly annoying is that much of the early 32bit Windows software came packaged in 16 bit installers. Office 97 would be such a breeze on modern hardware.
Office 97 can be installed on 64-bit Windows 10 with the original installer. I did it just last month and it runs without any problems... and it is fast.
There are special workarounds, many of the old installers run a small piece of 16 bit code which doesn’t work in 64 bit Windows but because it’s so common Windows just runs a replacement version.
It failed the last time I tried, which must have been on 7. It would be a strong example of Microsoft dedicating resources to compatibility if they added that for 10 or in a patch update. (Both their resources and the users': there's a crazy amount of checking for necessary compatibility hacks going on whenever an executable is started.)
There are some third-party implementations of NTVDM that allow running 16-bit DOS and Win16 apps directly on Win64. Although DosBox is still the easiest route, and "good enough" in practice.