1. Introduction
In recent years, visualization of gaming telemetry data has become
increasingly important, enabling players to make data-driven decisions based
on performance and behavior analysis.
Our first quarter goal was to develop real-time visualization tools that allow
players and coaches to collect and analyze actual gameplay data.
To achieve this, we developed two applications:
1. A desktop application for real-time 3D visualization
2. A web application for post-match analysis
This presentation describes the overall technical architecture and challenges
faced during development.
2. Project Overview
Desktop Application (real-time visualization)
Architecture
PySide6 + PyQtGraph + OpenGL
Primary functions
- UDP Packet Processing
- Data Logging + Cloud Upload
- Real-time Visualization
Web Application (post-analysis)
Architecture
Streamlit + Firebase Admin Services + Google Cloud Storage
Primary functions
- User authentication
- Search specific data from Cloud Storage
- Data Analysis
Data Flow
The gaming device sends telemetry UDP packets to the desktop application,
which stores JSON files in Google Cloud Storage. The web application lists
and loads those JSON files for post-analysis, and the desktop application
retrieves stored JSON to compare with real-time data.
3. Effective Development Environment
Desktop Application
The project originated from the community's existing
discovery of UDP packet accessibility from a gaming
device. Building upon this known capability, we
focused on creating a comprehensive visualization tool
with which players can analyze their own data.
We selected PySide6 as our development framework,
leveraging its robust features for real-time data
processing and visualization. This choice enabled rapid
development while maintaining high performance, all
within a pure Python environment.
System Architecture
- Network layer: UDP socket → decryption → binary data processor
- Application layer: thread manager routes processed telemetry, stores data, and handles user event signals
- Graphics layer: real-time visualization and UI state updates
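The binary-data-processor stage can be sketched as follows. The packet layout here is a hypothetical example for illustration only, not the device's actual format:

```python
import struct

# Hypothetical packet layout (an assumption, not the device's real format):
# little-endian uint32 sequence number followed by four float32 values
# (x, y, z position and a temperature reading).
PACKET_FMT = "<I4f"
PACKET_SIZE = struct.calcsize(PACKET_FMT)  # 20 bytes

def decode_packet(payload: bytes) -> dict:
    """Turn one raw (already decrypted) UDP payload into a telemetry record."""
    seq, x, y, z, temp = struct.unpack(PACKET_FMT, payload[:PACKET_SIZE])
    return {"seq": seq, "pos": (x, y, z), "temp": temp}
```

In the real application, payloads arrive on a bound UDP socket (`socket.socket(socket.AF_INET, socket.SOCK_DGRAM)`) and pass through the decryption step before reaching this decoder.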
4. 3D modeling visualization
Our application transforms telemetry data into 3D
motion using OpenGL through PyQtGraph, which will
be replaced by VisPy.
The visualization engine implements quaternion-based
rotations and matrix transformations for precise object
movement while providing multiple camera control
options.
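The quaternion-based rotation step can be sketched with NumPy as below; the function name is ours, not taken from the actual codebase:

```python
import numpy as np

def quat_to_matrix(q: np.ndarray) -> np.ndarray:
    """Convert a quaternion (w, x, y, z) into a 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)  # normalize to a unit quaternion
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
```

For example, a 90° rotation about the z-axis, q = (cos 45°, 0, 0, sin 45°), maps the x-axis onto the y-axis.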
Desktop Application
Physics Visualization
Matrix calculations translate complex motion
data into smooth visual transitions, with
temperature-based color mapping.
Camera controls
Multiple viewing angles and position tracking enhance data
interpretation through intuitive camera controls.
5. Thread Management
Thread separation is crucial for our application since it
must handle continuous data streams with
simultaneous processing, visualization, and logging.
This separation improved the stability of GUI rendering
and avoided possible loss of logging data.
Desktop Application
Thread architecture: the main thread handles user events and sends
start/stop signals to the packet-process thread and the data-logging
thread, which communicate back via signals.
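In the actual application these are PySide6 QThreads communicating via Qt signals; the stdlib sketch below illustrates the same producer/consumer separation using queues, with a sentinel standing in for the stop signal (all names are ours):

```python
import queue
import threading

_SENTINEL = None  # stands in for the main thread's stop signal

def packet_worker(pkt_q: queue.Queue, log_q: queue.Queue) -> None:
    """Packet-process thread: decode raw packets, hand results to the logger."""
    while True:
        raw = pkt_q.get()
        if raw is _SENTINEL:
            log_q.put(_SENTINEL)  # propagate shutdown downstream
            break
        log_q.put({"decoded": raw})

def logging_worker(log_q: queue.Queue, records: list) -> None:
    """Data-logging thread: collect records (in the real app, write JSON / upload)."""
    while True:
        rec = log_q.get()
        if rec is _SENTINEL:
            break
        records.append(rec)

def run_pipeline(packets):
    """Main thread: start workers, feed packets, signal stop, and join."""
    pkt_q, log_q, records = queue.Queue(), queue.Queue(), []
    t1 = threading.Thread(target=packet_worker, args=(pkt_q, log_q))
    t2 = threading.Thread(target=logging_worker, args=(log_q, records))
    t1.start()
    t2.start()
    for p in packets:
        pkt_q.put(p)
    pkt_q.put(_SENTINEL)  # stop signal from the main (GUI) thread
    t1.join()
    t2.join()
    return records
```

Because each stage owns a queue, the GUI thread never blocks on packet decoding or disk I/O, which is the property the slide describes.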
6. Quick Prototyping of Data Analysis Dashboard
Web Application
Our first quarter focused on getting ideas and insights from players
by using real data, leading us to choose Streamlit for its simple
backend development, deployment environment, and user
authentication capabilities.
The framework allowed rapid development, enabling multiple
stakeholders to access and utilize the application with real data.
Architecture
Streamlit + Firebase Admin Services + Google Cloud Storage
Primary functions
- User authentication
- Search specific data from Cloud Storage
- Data Analysis
7. Data Manipulation & Flow
Our application combines three Python modules, Pandas, NumPy, and SciPy, each handling specific aspects of
telemetry data processing, with visualization by Plotly.
Web Application
Data Processing
Plotly
The processed data is visualized through interactive
plots such as scatter plots, line charts, and path
visualizations. Users can explore temporal data
sequences, compare multiple datasets, and analyze
performance patterns.
Interactive Analysis
Pandas + NumPy + SciPy
Raw JSON data is transformed into a DataFrame and
manipulated with Pandas, NumPy, and SciPy. This
structured data includes position metrics,
performance indicators, and identified key points
through signal processing.
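A minimal sketch of this pipeline, with a made-up five-sample JSON payload standing in for real telemetry (field names and values are invented):

```python
import json

import numpy as np
import pandas as pd
from scipy.signal import find_peaks

# Invented sample payload in the shape the desktop app might dump.
SAMPLE_JSON = json.dumps([
    {"t": 0.0, "x": 0.0, "speed": 10.0},
    {"t": 0.1, "x": 1.0, "speed": 30.0},
    {"t": 0.2, "x": 2.1, "speed": 12.0},
    {"t": 0.3, "x": 3.0, "speed": 35.0},
    {"t": 0.4, "x": 4.2, "speed": 11.0},
])

def process(raw_json: str) -> pd.DataFrame:
    """JSON -> DataFrame, with a derived metric and signal-processed key points."""
    df = pd.DataFrame(json.loads(raw_json))
    df["dx"] = np.gradient(df["x"], df["t"])  # NumPy: derived position metric
    peaks, _ = find_peaks(df["speed"])        # SciPy: identify key points
    df["is_peak"] = False
    df.loc[peaks, "is_peak"] = True
    return df
```

The resulting DataFrame columns feed directly into Plotly scatter, line, and path figures.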
8. Advantages of using Object Storage for our JSON files
Object Storage maintains data in its raw format, eliminating the need for data
transformation. It is a suitable option for our project since the desktop
application dumps telemetry data directly as JSON.
The filename format follows username_identifier_date.json. Search
functionality is implemented by utilizing Google Cloud Storage's list_blobs API
with prefixes, where prefixes can be either username or username_identifier.
This simple approach is sufficient for our current data volume and usage
patterns.
Prefix-based Search
Web Application
Users can search by username and identifier, for
instance "alice" and "002", which in this example
lists two matching files.
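The prefix search can be sketched as below; the sketch mimics the matching locally over an in-memory list of invented filenames, while in production the same prefix is passed to Google Cloud Storage's `list_blobs`:

```python
# Hypothetical filenames following the username_identifier_date.json
# format (all names invented for illustration).
FILES = [
    "alice_001_20240105.json",
    "alice_002_20240106.json",
    "alice_002_20240107.json",
    "bob_001_20240105.json",
]

def search_blobs(filenames, username, identifier=None):
    """Mimic list_blobs(prefix=...) locally: the prefix is either
    the username alone or username_identifier."""
    prefix = username if identifier is None else f"{username}_{identifier}"
    return [name for name in filenames if name.startswith(prefix)]
```

With the `google-cloud-storage` client this becomes `client.list_blobs(bucket, prefix=prefix)`, returning blob objects instead of strings.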
9. Challenges
Performance Impact of Page Refresh
Streamlit's automatic page refresh mechanism poses
performance challenges in our application. When a user
modifies any parameter, the entire page reloads instead of
updating individual components. This full-page refresh affects
user experience, especially when interacting with our multiple
interactive visualizations. As a temporary measure, we
reduced data points for visualization using pandas DataFrame
slicing (df.iloc), though a more fundamental solution will be
needed.
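The temporary mitigation can be sketched as a stride slice over `df.iloc` that caps the rows handed to each figure (the helper name and default cap are our own):

```python
import math

import pandas as pd

def downsample(df: pd.DataFrame, max_points: int = 1000) -> pd.DataFrame:
    """Keep at most max_points rows by taking every step-th row."""
    if len(df) <= max_points:
        return df
    step = math.ceil(len(df) / max_points)
    return df.iloc[::step]
```

Pairing this with Streamlit's `@st.cache_data` on the data-loading step also avoids re-reading files from Cloud Storage on every rerun.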
Web Application
From Self-Service to Guided Analysis
We are shifting from providing self-service dashboards to
delivering focused analysis reports with regular user meetings.
While our initial plan was to develop comprehensive tools for
users' self-analysis, feedback showed users preferred analysis
of specific scenarios based on their individual needs. For the
next quarter, rather than building an all-purpose dashboard,
we're adopting a market-in approach focused on targeted
analysis based on user feedback. This shift allows us to utilize
existing tools like Jupyter for quick visualizations rather than
developing a complete dashboard.