I just wrote a long rant on the Windows 7 Virtualization forum and I thought I’d publish it here:
Windows Server 2008 R2 with high-end graphics:
does anything work besides VMWare?
http://social.technet.microsoft.com/Forums/en-US/w7itprovirt/thread/b8aa814a-356d-4fa1-ad80-c4f7d363fd86
A sample virtual machine: Mac OS (Leopard) running inside Win Vista

For different reasons, my main development workstation has always been a Windows Server OS over the years (Windows 2000 Server, Windows 2003 Server and Windows 2008 Server).
Last Friday I downloaded and installed Windows Server 2008 R2 x64. I created a new partition and installed it without problems.
Unfortunately, just like with Windows Server 2008, I cannot use Hyper-V in Windows Server 2008 R2. Since this is my main workstation I have a high-end graphics card from NVIDIA - I enjoy Aero, and now Aero Peek, etc... But as soon as I installed the Hyper-V role, the machine's performance went down the drain: high CPU, unresponsiveness, etc... This is not something new - the same thing happened in Windows Server 2008 and it has not yet been fixed in Windows Server 2008 R2. There is a Microsoft Support KB article that explains the reason (something related to "flushes" from the graphics driver), but it offers no solution.
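In case you want to put numbers on it yourself: the rough sketch below (plain Python with the third-party psutil package - not any official tool, and the sampling window is arbitrary) measures overall CPU for about a minute and then lists the busiest processes, so you can compare the box with and without the Hyper-V role enabled.

# Rough diagnostic sketch (assumes the third-party psutil package is
# installed: pip install psutil). Sample overall CPU for about a minute,
# then list the busiest processes, so runs with and without the Hyper-V
# role can be compared.
import time
import psutil

SAMPLES = 12        # roughly one minute at 5-second intervals
INTERVAL = 5.0

readings = [psutil.cpu_percent(interval=INTERVAL) for _ in range(SAMPLES)]
print("avg CPU: %.1f%%   max CPU: %.1f%%"
      % (sum(readings) / len(readings), max(readings)))

# Prime the per-process counters, wait, then read them.
procs = list(psutil.process_iter(["name"]))
for p in procs:
    try:
        p.cpu_percent(interval=None)   # first call only starts the counter
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass
time.sleep(5)

usage = []
for p in procs:
    try:
        usage.append((p.cpu_percent(interval=None), p.info["name"]))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

print("Busiest processes:")
for cpu, name in sorted(usage, reverse=True)[:5]:
    print("%6.1f%%  %s" % (cpu, name))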
Ok, I told myself - Microsoft decided not to give this a high priority. Let me then try the new "Windows Virtual PC" - the one that makes "Windows XP Mode" work so well in Windows 7. Unfortunately, when I attempted to install the latest "Windows Virtual PC", I got an error saying that Windows Server 2008 R2 is not supported.
All right, I said, let me go back to my trusted Virtual PC 2007 SP1 - it has worked so well in Windows Server 2008. I was able to install Virtual PC 2007 SP1 on Windows Server 2008 R2. Unfortunately, its virtual network driver broke my real network adapter. If I unchecked the Virtual Machine Network Services option in my physical adapter's properties, the adapter worked again. I tried the fix of deleting the virtual network component, rebooting and adding it again, but that did not work - the moment I added it back my real network adapter stopped working.
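If you run into the same thing, here's the kind of quick check I run after toggling that binding. It's a rough Python sketch that just shells out to the built-in "netsh interface show interface" command and flags adapters that are enabled but not connected; the parsing assumes the default English output, so treat it as a sketch, not a tool.

# Sketch: list the interfaces Windows sees and flag any that are enabled
# but not connected (what my physical NIC looked like once the virtual
# network component came back). Uses the built-in
# "netsh interface show interface" command and assumes its default
# English output format.
import subprocess

output = subprocess.check_output(
    ["netsh", "interface", "show", "interface"],
    universal_newlines=True,
)
print(output)

for line in output.splitlines():
    parts = line.split()
    # Columns: Admin State, State, Type, Interface Name
    if len(parts) >= 4 and parts[0] in ("Enabled", "Disabled"):
        admin_state, state, name = parts[0], parts[1], " ".join(parts[3:])
        if admin_state == "Enabled" and state.lower() != "connected":
            print("Possible problem adapter: %s (state: %s)" % (name, state))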
So I'm left with no way to run my VHD in Windows Server 2008 R2.
I'm back using the free VMWare Server 2.x - I like Virtual PC 2007, Windows Virtual PC and Hyper-V better, but you leave me no choice.

By the way, if I cannot use VHDs in my development environment, I will not recommend we use them in our Production environment. VMWare is what we'll use until Microsoft starts giving a bit of support to developers who need to run the server OS as their main workstation.
Yep, I mean it - I can’t recommend something I can’t use. So VMWare is what we’ll be using for running our development, testing and production virtual machines.
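For what it's worth, before carrying the existing VMs over it helps to know what kind of VHDs you're dealing with (fixed, dynamic or differencing) and how big they claim to be. The footer layout is documented in Microsoft's Virtual Hard Disk Image Format Specification - the last 512 bytes of the file, big-endian fields, starting with the "conectix" cookie - so a few lines of script can report it. The sketch below is based on that spec; the sample file name is made up.

# Quick-and-dirty VHD footer reader, based on Microsoft's published
# Virtual Hard Disk Image Format Specification: the footer is the last
# 512 bytes of the file, all fields big-endian, starting with the ASCII
# cookie "conectix". Sketch only - error handling is minimal.
import struct
import sys

DISK_TYPES = {0: "none", 2: "fixed", 3: "dynamic", 4: "differencing"}

def read_vhd_footer(path):
    with open(path, "rb") as f:
        f.seek(-512, 2)              # footer lives in the last 512 bytes
        footer = f.read(512)

    # cookie(8) features(4) version(4) data_offset(8) timestamp(4)
    # creator_app(4) creator_ver(4) creator_os(4) original_size(8)
    # current_size(8) geometry(4) disk_type(4) checksum(4) uuid(16) saved(1)
    fields = struct.unpack(">8s I I Q I 4s I 4s Q Q I I I 16s B", footer[:85])
    cookie, current_size, disk_type = fields[0], fields[9], fields[11]

    if cookie != b"conectix":
        raise ValueError("%s does not look like a VHD (bad cookie)" % path)

    print("%s: %s disk, %.1f GB virtual size"
          % (path, DISK_TYPES.get(disk_type, "unknown"), current_size / 2.0**30))

if __name__ == "__main__":
    for vhd in sys.argv[1:] or ["dev-server.vhd"]:   # sample name is hypothetical
        read_vhd_footer(vhd)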
Update: here’s a bit more info on how performance sucks when you have a high-end graphics card and you enable the Hyper-V role. It has been a known issue since Hyper-V was first released in Windows Server 2008 – check these articles:
Bad Performance with High-End Graphics and Hyper-V
http://blogs.msdn.com/virtual_pc_guy/archive/2009/01/07/bad-performance-with-high-end-graphics-and-hyper-v.aspx
Video performance may decrease when a Windows Server 2008-based computer has the Hyper-V role enabled and an accelerated display adapter installed
http://support.microsoft.com/kb/961661
Unfortunately, the folks who prioritize features for Hyper-V apparently don't think this is a big issue, even though hundreds of people have validated the bug on Microsoft Connect.
It's their loss, we'll stick with VMWare until they fix it.