OptiX01: Tutorial 01 (7.0-7.5)
(this tutorial is a work in progress…)
# list the display/GPU hardware that the system detects
sudo lshw -C video
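If an NVIDIA GPU and its proprietary driver are already installed, nvidia-smi gives another quick confirmation of what the system sees:

# report the GPU model, driver version, and current utilization
nvidia-smi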
(***I keep updating this page in my free time; some information may be misleading, incorrect, or already obsolete.)
I have tried to write some instructions of my own on top of the original NVIDIA Falcor documentation (v6.0, 7.0, and 8.0), since the original Falcor documentation is often outdated and many things have changed since it was written. This post is equivalent to Falcor's README doc.
I have divided the instructions into three sections:
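Before going through those sections in detail, a minimal checkout-and-build sketch is shown below; the setup script, preset, and build-directory names are assumptions that change between Falcor versions, so always follow the README of the exact version you check out.

# fetch the public Falcor sources
git clone https://github.com/NVIDIAGameWorks/Falcor.git
cd Falcor

# Windows: run the bundled setup script, then open the generated Visual Studio
# solution (the script name depends on the Falcor/Visual Studio version)
setup_vs2022.bat

# or configure and build through CMake presets (list them first, since the
# preset and build-directory names are version-dependent)
cmake --list-presets
cmake --preset windows-ninja-msvc
cmake --build build/windows-ninja-msvc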
This is a short tutorial on how to enable the VRS feature for an HMD using Unity3D. However, as of 2025, many of the plugins and SDKs mentioned here may be deprecated, and eye-gaze interaction has shifted to OpenXR. This tutorial is OpenVR-based, so it may be better to start from OpenXR instead.
At the time of writing (October 2025), there are few publicly available rendering engines that support real-time stereo path tracing. Slow convergence on the pixel-dense displays used in VR is what holds back a real-time stereo path tracer. However, with improvements in RT cores and GPU architectures, I am noticing a clear trend toward physically accurate light simulation in real-time rendering, e.g., in games. With time, real-time stereo path tracing (along with other advanced global illumination algorithms) will become practical once the frame-rate constraint is overcome. Physically accurate light simulation will minimize the difference between the real and virtual worlds and bring us closer to true immersion. Currently, 90 fps has become the gold standard for VR, and higher is better for immersion.
Tamashii is an excellent scientific rendering framework under development by the Department of Computer Graphics at TU Wien. Its main goal is to simplify the creation of research applications in computer graphics by providing a fundamental structure in the form of libraries implementing resource loading, input handling, user-interface creation, a complete rendering framework, and graphics API abstraction. The backend graphics API is Vulkan. However, Tamashii only supports single-display systems and has not been modified for VR.