X to photon latency measurements in software
X TO PHOTON
• Definition
• Challenges
• Solution
• Results
PROBLEM DEFINITION
• The VR world presents unique problems arising from system latencies
• The most noticeable impact is the latency from a physical event to its display ├
• Gauging the effect involves measuring the latency between two human sensory events
• Physical measurement involves complicated equipment with precise positioning and orientation ⏚
• Hard to automate and reproduce the precision and positioning ⏀
• Hard to define a constant and consistent workload ⏉
SOFTWARE MEASUREMENT CHALLENGES
SIMPLIFIED X-TO-PHOTON FRAME JOURNEY: PIPELINE LATENCY CAUSES
• Hardware event (e.g. motion, camera sensor): arbitrations/clock stretching, scheduling, non-aligned locks …
• Processing blocks (e.g. sensor fusion, ISP blocks): post- and pre-processing on frames involves complex, high-volume calculations, often pushed off to the GPU and thus affected by submit, context-switching, and shader latencies, compounded by CPU latencies such as scheduling and fence acquire/release imbalances …
• Frame compositor (typically app + OS render components): typically a consumer of an external event and a producer of a derived display frame, responsible for deciding the display frame elements based on the input event. Typically involves techniques like time warp, aligning frame production with vsync, front-buffer rendering … GPU/CPU factors as above affect latencies.
• Display driver: the end point for software. The driver is typically responsible for picking up the composited display frame and scanning it out to the display. Latency beyond this point is hardware panel response, color/pixel illumination latencies, etc., which are more complex to measure …
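The stage-by-stage journey above suggests a simple accounting model: capture a timestamp at each stage boundary and difference adjacent ones to get a per-stage breakdown. A minimal sketch of that idea; the type, field names, and example timestamps are illustrative assumptions, not taken from the actual tooling:

```python
from dataclasses import dataclass

@dataclass
class FrameTrace:
    """Hypothetical per-frame timestamps (seconds) at each boundary."""
    event_ts: float       # hardware event (motion / camera sensor)
    processed_ts: float   # processing blocks done (sensor fusion, ISP)
    composited_ts: float  # frame compositor output ready
    scanout_ts: float     # display driver hands the frame to scan-out

def stage_breakdown(t: FrameTrace) -> dict:
    """Per-stage latency deltas plus the end-to-end total."""
    return {
        "processing": t.processed_ts - t.event_ts,
        "composition": t.composited_ts - t.processed_ts,
        "display_driver": t.scanout_ts - t.composited_ts,
        "end_to_end": t.scanout_ts - t.event_ts,
    }

# Example frame: event at t=0, scanned out 16 ms later
trace = FrameTrace(event_ts=0.000, processed_ts=0.004,
                   composited_ts=0.012, scanout_ts=0.016)
breakdown = stage_breakdown(trace)
```

Graphing these deltas per frame over a run is what produces the per-stage latency graphs mentioned later in the deck.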
APPROACH TO SOLUTION
• Technically, the same data flows across multiple stages ⏉
• Generally, physical events get timestamped while being formed into a frame
• This event timestamp is then the key for identifying that frame at the various pipeline stages
• current time @ stage – frame time = frame-to-current-stage latency
• The only challenge is then to figure out a way to embed and persist this timestamp
across the multiple stages as well as at the end point ⏋
• With tesseract’s dependence on the SDK, the render function of the SDK is used to embed
the frame timestamp onto the color buffer
• The display driver then reads out this embedded timestamp just before scan-out
• With traces at multiple places, it is now possible to use tools to generate visual latency
graphs at individual stages in addition to end to end ├
• Strategy adopted successfully for the pass-through camera; WIP for motion to photon ⏀
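The embed-and-read-back scheme above can be sketched in miniature. This assumes an RGBA8 color buffer modeled as a plain bytearray and a 64-bit nanosecond timestamp packed into the first two pixels; the real SDK render function and driver-side readout are not shown in the deck, so the pixel layout and every name here are hypothetical:

```python
import struct

def embed_timestamp(buffer: bytearray, event_ts_ns: int) -> None:
    """Pack the 64-bit event timestamp into the first 8 bytes
    (two RGBA8 pixels) of the color buffer."""
    buffer[:8] = struct.pack("<Q", event_ts_ns)

def read_timestamp(buffer: bytearray) -> int:
    """Driver-side decode of the embedded timestamp, done just
    before scan-out."""
    return struct.unpack("<Q", bytes(buffer[:8]))[0]

# 640x480 RGBA8 frame; hardware event timestamped at t = 1 ms
frame = bytearray(640 * 480 * 4)
embed_timestamp(frame, 1_000_000)

# latency = current time @ stage - frame time ("now" is 17 ms here)
latency_ns = 17_000_000 - read_timestamp(frame)
```

Because the timestamp rides inside the frame itself, it survives every intermediate stage without any side-channel bookkeeping, which is the point of the approach.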
TYPICAL AR PIPELINE
MOTION TO APP LATENCY BREAKDOWN GRAPHS


Editor's Notes

  • #4: ├ When the user sees the effect of a physical event (X to photon), where X is another physical event. ⏚ E.g., a high-speed camera or an optical sensor is almost always involved at the display end as the receiver. ⏀ Gives rise to a high percentage of run-to-run and precision variation; e.g., the photo sensor needs to be in a particular position on the display side. ⏉ E.g., how does one define a workload which produces constant motion and a constant, consistently oriented output on the display?
  • #5: ├ Reacting to the physical event if required, or simply passing it along. ⏃ Potentially grouped together by buffers of data (e.g. display and camera). ⏀ E.g., the camera keeps producing frames asynchronously to the display pipeline. ⏊ E.g., a missed vsync might end up with the app showing a relatively older camera frame, increasing latency for one frame with no guarantee of compensation in the future.
  • #6: Typical bus communication problems while communicating with the hardware sensor.
  • #7: ⏉ E.g., the same camera frame (or sensor frame) finally reaches the display in some form or the other, based on the use case performing the frame transformation. ⏋ In the case of X to display, the end point is the display driver. ├ https://jirasw.nvidia.com/browse/AV-446 for discussions and required patches. ⏀ Pass-through camera latency checks added to sanity (https://jirasw.nvidia.com/browse/AV-484) along with motion to app (https://jirasw/browse/AV-430); motion to display is WIP.