Monday, January 28, 2013 10:44 AM
I'm running some tests to see how much difference a GPU makes to playing full-screen HD video from a Server 2012 Remote Desktop Session Host using the new RDP 8.0/RemoteFX features.
The hardware is an HP Z220 workstation with an E3-1245 Xeon CPU and a Quadro 4000 GPU. I ran the tests both with the GPU present and with it physically removed, expecting to see lower CPU usage when it was present because the RemoteFX video encoding would be offloaded to the GPU.
In practice it seemed to make little difference, even though the GPU appeared to be doing a reasonable amount of work when it was present.
Details plus graphs are on my blog here:
Scroll to the bottom for the graphs.
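For anyone who wants to reproduce the measurement side of this, a minimal Python sketch along the following lines (assuming the third-party psutil package is installed) will log overall CPU usage once a second to a CSV file that can be graphed afterwards:

```python
import csv
import time

import psutil  # third-party; assumed installed via "pip install psutil"

SAMPLE_SECONDS = 120  # how long to record (cover the whole playback)
INTERVAL = 1.0        # seconds between samples

def log_cpu_usage(path):
    """Sample total CPU utilisation once per INTERVAL and write it to CSV."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "cpu_percent"])
        psutil.cpu_percent(interval=None)  # prime the internal counters
        start = time.monotonic()
        while (elapsed := time.monotonic() - start) < SAMPLE_SECONDS:
            # cpu_percent(interval=INTERVAL) blocks for INTERVAL seconds
            # and returns the average utilisation over that window.
            writer.writerow([round(elapsed, 1),
                             psutil.cpu_percent(interval=INTERVAL)])

if __name__ == "__main__":
    log_cpu_usage("cpu_usage.csv")  # start this, then play the video in-session
```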
Can anyone explain:
a) Is my testing valid? If not, what have I done wrong?
b) If the testing is valid, what's going on? I was hoping that a GPU would allow higher user density due to lower CPU usage.
c) Anything else that's relevant: perhaps my understanding of this aspect of RemoteFX is wrong or flawed, or my expectations are wrong (and if so, why)?
Thanks very much in advance.
Monday, January 28, 2013 1:27 PM (Moderator)
In your tests, did you enable "Use the hardware default graphics adapter for all Remote Desktop Services sessions" in the server's local group policy?
You are running RDSH directly on the physical hardware and not in a VM, correct?
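As a sanity check, you can also confirm the policy actually took effect by looking at the registry. My understanding (an assumption worth verifying on your own box, e.g. against gpresult output) is that enabling it sets bEnumerateHWBeforeSW to 1 under the Terminal Services policy key; a minimal Python sketch, assuming that mapping holds:

```python
import winreg  # standard library on Windows

# Assumed location of the "Use the hardware default graphics adapter for
# all Remote Desktop Services sessions" policy; verify against gpresult.
KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"
VALUE_NAME = "bEnumerateHWBeforeSW"

def hw_gpu_policy_enabled():
    """Return True if the policy value is present and set to 1."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _ = winreg.QueryValueEx(key, VALUE_NAME)
            return value == 1
    except FileNotFoundError:
        return False  # key/value absent: policy is Not Configured

if __name__ == "__main__":
    print("Hardware GPU policy enabled:", hw_gpu_policy_enabled())
```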
Monday, January 28, 2013 2:21 PM
Yes to both: that policy option is set, and I'm running on the hardware directly.
Sunday, March 03, 2013 1:49 PM
Any news or updates on this? We see the same in our environment.