Level: Intermediate to Advanced
When planning to deploy Windows desktops and applications in modern cloud and mobility environments, acceptable user experience is an important success factor. Unfortunately, traditional benchmarking parameters, such as frame rates and system performance counters, don't fully capture the user experience as perceived on a remote client. Factors such as client capabilities, media redirection, changing network conditions, compression artefacts, media asynchrony and remote UI response delays introduce significant new challenges.
Join RDS MVP Benny Tritsch in this session about successfully benchmarking user experience performance in remote sessions and virtual desktops. He introduces you to a working set of acceptance criteria and a proven test methodology you can adapt for your own RDS, VDI or SaaS environment. Get expert guidance on which components are required to build your own remote UX performance test lab. In hands-on exercises and demos you will learn how to define effective test scenarios and metrics, build test control and automation scripts, collect performance and telemetry data, create screen recordings of test sequences and convert raw results into great reports.
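The "convert raw results into great reports" step can be sketched in a few lines. This is a minimal illustration, not the session's actual tooling: the scenario label, the sample values and the choice of metrics (median, 95th percentile, maximum of UI response delays) are all hypothetical assumptions.

```python
import statistics

def percentile(samples, pct):
    """Nearest-rank percentile of a sorted copy of the samples."""
    ordered = sorted(samples)
    rank = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[rank]

def summarize(label, response_times_ms):
    """Reduce raw UI response-time samples to report-ready metrics."""
    return {
        "scenario": label,
        "median_ms": statistics.median(response_times_ms),
        "p95_ms": percentile(response_times_ms, 95),
        "max_ms": max(response_times_ms),
    }

# Hypothetical samples: UI response delays (ms) from one recorded test run.
lan_run = [38, 41, 40, 45, 39, 52, 44, 43, 47, 120]
print(summarize("LAN baseline", lan_run))
```

Reporting percentiles rather than averages matters here because a single slow outlier (like the 120 ms sample above) is exactly what a user notices in a remote session.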
One of the session highlights is the visual comparison of virtual desktop and RemoteApp UX performance under different network conditions introduced by a WAN emulator. Examples from an archive of several thousand recorded test-session videos collected over recent years will show you the difference between good and bad user experience in VDI and cloud environments.
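The WAN emulation described above can be approximated with standard Linux traffic control if no dedicated appliance is available. This is a configuration sketch only; the interface name (eth0) and the specific delay, loss and rate values are illustrative assumptions, not values from the session.

```shell
# Sketch: emulate a constrained WAN link on a Linux gateway with tc/netem.
# Interface name and impairment values below are examples only.

# Add 100 ms latency with 20 ms jitter and 0.5% packet loss to outbound traffic
sudo tc qdisc add dev eth0 root netem delay 100ms 20ms loss 0.5%

# Tighten the link further by adding a bandwidth cap to the same qdisc
sudo tc qdisc change dev eth0 root netem delay 100ms 20ms loss 0.5% rate 2mbit

# Remove the emulation when the test run is complete
sudo tc qdisc del dev eth0 root
```

Running the same scripted test scenario once per impairment profile, and recording the screen each time, yields directly comparable videos of the kind the session demonstrates.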
You will learn:
- How to benchmark remote user session and virtual desktop performance
- About test criteria and success factors when benchmarking VDI user experience
- How to build your own remote UX test lab