OptiX01: Tutorial 01 (7.0-7.5)
(Work in progress…)
To check which GPU is available on a Linux machine:

    sudo lshw -C video
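If the NVIDIA driver is already installed, nvidia-smi is a quicker check and also reports the driver version and the highest CUDA version it supports:

    # lists the detected NVIDIA GPU(s), the driver version,
    # and the supported CUDA version
    nvidia-smi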
(*** I keep updating this page in my free time; some of the information here may be misleading, incorrect, or already obsolete.)
I have tried to write some instructions of my own on top of the original NVIDIA Falcor documentation (v6.0, 7.0, and 8.0), since the original Falcor documentation is often out of date and many things have changed since it was written. This post is roughly equivalent to Falcor's README.
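For context, here is a minimal sketch of the usual setup flow, assuming the current layout of the upstream repository; the exact setup script names differ between Falcor versions, so check the README of the tag you actually check out:

    # clone the official NVIDIA Falcor repository
    git clone https://github.com/NVIDIAGameWorks/Falcor.git
    cd Falcor

    # on Windows, recent versions ship a setup script that fetches
    # dependencies and generates a Visual Studio solution
    # (the script name varies by version, e.g. setup_vs2022.bat)
    setup_vs2022.bat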
I have divided the instructions into three sections.
This is a short tutorial on enabling the Variable Rate Shading (VRS) feature for an HMD using Unity. I was working on this project around 2022 (I can't remember exactly), so as of this writing (2025), many of the plugins and SDKs mentioned here may be deprecated or changed. Nonetheless, it is always good to have a set of instructions in case someone is limited to working with a legacy version. This tutorial is OpenVR-based. One big change since then is that eye interaction (gaze position) has shifted entirely to OpenXR, so it is advisable to start with OpenXR instead. You can still follow the complete instructions for eye-tracking-based interaction; in that case, just ignore the VRS part. Whenever I can manage the time, I will write an updated version here covering VR rendering with Unity 6.0+.
At the time of writing (October 2025), there are few publicly available rendering engines that support real-time stereo path tracing. Slow convergence on the pixel-dense displays used in VR is what holds real-time stereo path tracers back. However, with improvements in RT cores and GPU architectures, I am noticing a recent trend toward physically accurate light simulation in real-time rendering, e.g., in games. So, with time, real-time stereo path tracing (along with other advanced global illumination algorithms) will surely become practical once we overcome the frame-rate constraint. Physically accurate light simulation will minimize the difference between the real and the virtual world and extend the path to true immersion. Currently, 90 fps has become the golden standard for VR, though higher is always better for immersion.
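To put the frame-rate constraint in numbers: at 90 fps the renderer has only 1000/90 ≈ 11.1 ms per frame. As a back-of-the-envelope example, assuming a hypothetical HMD with 2000 × 2000 pixels per eye, tracing just one camera path per pixel already means 2 × 2000 × 2000 × 90 = 720 million paths per second, before counting the multiple bounces and samples per pixel a path tracer needs to converge.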
Tamashii is an excellent scientific rendering framework under development by the Department of Computer Graphics at TU Wien. Its main goal is to simplify the creation of research applications in the field of computer graphics by providing a fundamental structure in the form of libraries implementing resource loading, input handling, user interface creation, a complete rendering framework, and graphics API abstraction. The backend graphics API is Vulkan. However, Tamashii only supports single-display systems and has not been modified for VR.