
Posted

We have a server running as a Terminal Server on 2003. We've noticed that when Flash is used over RDP, it kills the server, driving CPU usage up massively. 4 or 5 users watching Flash videos maxes it out. Is there a way around this, or do we have to use a different method? The server also doubles as a VMware ESX host, which is capable of running VMware clients. Is this the way forward?

Posted

It is an 'Access 24' system, giving students and staff access to the school network from home.

Posted

Yeah, we've done experiments on it. When this happens, there are no VM sessions active, only TS. Usage goes up to 50% when one Flash video is being watched, and up to 100% when three or more are going. It drops back down to minimal levels when the Flash videos stop.

Posted

We've also tested it on a different server with TS installed, which doesn't have any form of VMware installed at all. We get the same results.

  • 3 weeks later...
Posted

Typically in this environment you'll want something that accelerates multimedia performance. Flash over RDP on the WAN is excruciating. Citrix has (or had?) a feature called SpeedScreen Multimedia Acceleration, which redirects media playback to the client instead of rendering it on the server. Microsoft has its App-V project as well.
