The Next Series: Part 1 – Simulated Past
I have decided to write a few posts discussing what’s next in the world of computing. The idea is to look at the progression so far and see what makes sense to research and improve next. This simple formula can be surprisingly effective, thanks to the collective “creationism” that defines the computing industry. In other words, we create our future (just like in real life) by focusing research efforts (aka: wishes) on particular things. Thus, we can foresee what the future will bring simply by assuming that our research pans out.
And now, the actual point. For a while now, everyone has agreed that much of our collective progress has been hindered by the infamous “backwards compatibility” syndrome. The basic dilemma for engineers, not just of operating systems but of many core technologies, was: “do we make it faster and break old software, or do we keep it slow but break nothing?” Traditionally, Microsoft preferred not to fully simulate old software environments, probably because the simulation would have been too slow to run older software properly.
But now, with faster CPUs and virtualization support built into the chip itself, they can finally move forward and just run that old Windows XP stuff inside “Virtual PC” – a full computer simulator running a “real” copy of Windows XP SP3. This “magic” finally allows developers to Revolutionize, as we often like to do.
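As an aside on what “virtualization on the chip level” means in practice: a hypervisor like Virtual PC can check whether the CPU advertises hardware virtualization extensions (Intel VT-x shows up as the `vmx` flag, AMD-V as `svm`) before relying on them. Here is a minimal, hypothetical Python sketch of that flag check; the function name and the abbreviated sample string (in the Linux `/proc/cpuinfo` style) are my own, for illustration only:

```python
def has_hw_virtualization(cpuinfo_text: str) -> bool:
    """Return True if the CPU flags advertise Intel VT-x ('vmx') or AMD-V ('svm')."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # The flags line is "flags : <space-separated feature names>"
            flags = line.split(":", 1)[1].split()
            return "vmx" in flags or "svm" in flags
    return False

# Abbreviated, illustrative flag line -- not a real CPU's full output
sample = "flags\t\t: fpu vme de pse tsc msr pae mce vmx sse2"
print(has_hw_virtualization(sample))  # True
```

On Windows the same information comes from CPUID rather than a text file, but the idea is identical: the chip itself tells software whether full-speed virtualization is available.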
This magic officially comes to Windows 7 now, though, as you all know, I already used Virtual PC in Vista, happily running Windows 98 and some 1995-era software in it without a hitch. Even in Windows 7 it feels like an “add-on” to me; hopefully in Windows 8 and beyond the engineers will fully realize the potential of “simulating” the past and allow themselves to Rethink every layer of the Operating System! The result will be prettier, faster, better technology. But where would we want to take it? Stay tuned – we’ll discuss that in the next parts of the series!
[via Within Windows blog]