In VR, eye tracking technology can quickly and accurately measure the direction of the user's gaze. It is often paired with foveated rendering to reduce the performance demands VR places on hardware. But while these two technologies complement each other, eye tracking has many more uses in VR.
Eye tracking was considered an out-of-reach technology for many years. Yet judging from the current development of the industry, it has advanced rapidly in accuracy and robustness, and developers are getting more and more hardware support.
At present, eye tracking providers such as Tobii have produced several eye tracking devices, and Qualcomm has brought Tobii's technology into its VRDK developer kit. 7invensun (Qixin Yiwei) is already shipping its aGlass module for the Vive headset, and Oculus has recently demonstrated an eye-tracking VR prototype. Even Apple has joined in: it was previously reported to have acquired SMI, a well-known eye-tracking company, and to be planning to put the related patents to use.
With this momentum, we should see eye tracking become a standard component of VR headsets within just a few years.
Foveated Rendering (Gaze Point Rendering)
Foveated rendering was created to reduce the GPU computing power required to render VR scenes. Its name derives from the word "fovea," a small pit at the center of the human retina. The fovea is densely packed with light-sensitive cells, providing high-resolution vision at the center of our visual field. Peripheral vision, by contrast, is poor at capturing detail and color but better suited to detecting motion.
The high-resolution area of human vision is actually very small, only a few degrees across at the center of the visual field. The difference in resolving power between the fovea and the rest of the retina is substantial; people consistently overestimate their visual abilities because the brain does a great deal of unconscious processing.
Foveated rendering exploits this visual characteristic: the scene is presented at high resolution in the area seen by the fovea, while in the peripheral field of view the level of detail can be greatly reduced. This concentrates most of the rendering performance on the most useful details while saving resources everywhere else.
Of course, this requires eye tracking: the system must quickly and accurately identify the user's gaze center before it can render around that point.
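The idea can be sketched in a few lines: pick a render-resolution tier for each pixel based on its angular distance (eccentricity) from the gaze direction. The two thresholds and three tiers below are illustrative assumptions, not values from any real headset.

```python
import math

# Illustrative shading tiers: angular eccentricity from the gaze point
# determines the resolution scale. Thresholds are assumptions for demo only.
FOVEA_DEG = 5.0   # full resolution inside ~5 degrees of the gaze point
MID_DEG = 20.0    # reduced resolution in the near periphery

def resolution_scale(pixel_dir, gaze_dir):
    """Return a render-resolution scale for a pixel, given unit view vectors."""
    dot = max(-1.0, min(1.0, sum(p * g for p, g in zip(pixel_dir, gaze_dir))))
    eccentricity = math.degrees(math.acos(dot))
    if eccentricity <= FOVEA_DEG:
        return 1.0    # foveal region: full detail
    if eccentricity <= MID_DEG:
        return 0.5    # near periphery: half resolution
    return 0.25       # far periphery: quarter resolution

# A pixel straight along the gaze direction renders at full resolution:
print(resolution_scale((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # → 1.0
```

Real implementations apply this per-tile on the GPU (e.g. via variable-rate shading) rather than per pixel on the CPU, but the selection logic is the same.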
Automatic Detection and Adjustment
In addition to detecting eye movements, eye tracking can also serve as a biometric identifier. When different people put on the same VR headset, the system can instantly recognize who is wearing it and load that user's custom environment, content library, game progress, and settings.
Eye tracking can also be used to accurately measure IPD, the interpupillary distance. Many people do not know their own interpupillary distance, yet it matters greatly in VR: the headset needs to move its lenses and display to the right positions to deliver the best image.
Eye tracking can measure each user's interpupillary distance in real time; the headset software can then automatically adjust the lens positions, or warn the user that their IPD is beyond the range the hardware supports.
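A minimal sketch of that flow: compute the IPD as the distance between the tracked pupil centers, then either set the lens separation to match or flag an out-of-range user. The supported range below is an assumption for illustration, not any vendor's specification.

```python
import math

# Assumed mechanical lens-travel range of a hypothetical headset (mm).
LENS_MIN_MM = 58.0
LENS_MAX_MM = 72.0

def measure_ipd(left_pupil, right_pupil):
    """Euclidean distance between 3D pupil centers, in millimetres."""
    return math.dist(left_pupil, right_pupil)

def lens_target(ipd_mm):
    """Return the lens separation to set, or None if out of supported range."""
    if LENS_MIN_MM <= ipd_mm <= LENS_MAX_MM:
        return ipd_mm   # match the lens separation to the user's IPD
    return None         # caller should warn the user instead

ipd = measure_ipd((-31.5, 0.0, 0.0), (31.5, 0.0, 0.0))
print(ipd, lens_target(ipd))  # → 63.0 63.0
```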
Varifocal Displays
The optical systems used in today's VR headsets are still quite simple and do not support an important function of human vision: dynamic focusing. The display in a VR headset always sits at the same optical distance from our eyes, even when the stereoscopic depth of objects differs, which leads to a problem known as the vergence-accommodation conflict.
To solve this problem, varifocal display technology has been proposed, which can dynamically change the focal distance. There are many ways to do so; the simplest is to change the focal depth by moving the lens position.
Building such a varifocal display also requires eye tracking. By tracing the user's line of sight into the virtual scene, the system can calculate the focal plane, then send this information to the display, which adjusts its focal distance to match where the user is looking.
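One common way to estimate that focal plane is from vergence: when both eyes fixate the same point, the angle between the two gaze rays plus the IPD gives the fixation depth by simple trigonometry. The sketch below assumes a symmetric gaze on the midline; real systems intersect the two tracked gaze rays in 3D.

```python
import math

def vergence_depth(ipd_m, vergence_angle_deg):
    """Depth (metres) of the fixation point for a symmetric, on-axis gaze.

    Each eye sits ipd/2 from the midline and rotates inward by half the
    vergence angle, so depth = (ipd / 2) / tan(vergence_angle / 2).
    """
    half_angle = math.radians(vergence_angle_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# With a 63 mm IPD and a 3.6-degree vergence angle, the eyes are
# fixating roughly one metre away:
print(round(vergence_depth(0.063, 3.6), 2))  # → 1.0
```

The display (or a movable lens) would then be driven toward this depth each frame, with filtering to avoid chasing saccades.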
Applied effectively, varifocal display technology can resolve the vergence-accommodation conflict. And even before varifocal displays reach VR headsets, eye tracking can be used to simulate depth of field, so that objects outside the focal plane appear blurred.
Foveated Displays
Although foveated rendering allocates rendering resources across different visual areas, a similar effect can be achieved by changing the physical arrangement of pixels. Rather than merely varying the rendering detail across parts of a single display, why not increase the pixel density only at the center of the user's field of view?
Until now, VR headsets have pursued ever-higher resolutions, which is not only costly but also runs into challenges as screen pixel counts approach retinal resolution. A foveated display instead uses eye tracking data to move a small high-resolution display to wherever the user is looking, letting the headset deliver high perceived resolution without massively increasing the full-field pixel count. It may even enable a larger field of view than a single display could provide.
Varjo is a company dedicated to developing such a foveated display system. It places a small, high-resolution display in front of a larger, conventional screen, so users get both a wide field of view and high-resolution imagery in the central area.
Currently Varjo's prototype cannot physically move the central high-resolution display, but the company says it has developed other methods to ensure the high-resolution area always stays at the center of the user's line of sight.
More Realistic Avatars
At present, the avatars in many social applications are not lifelike enough. Although some support rich character settings and personalization options, they are still far from ideal, particularly in the flexibility of the avatar's head and body movements.
Although many avatars support blinking, gazing, and other anthropomorphic movements, these are all preset animations. This lack of realism keeps people at a distance in VR social applications.
In the real world, eye contact plays a vital role when communicating with others; meeting someone's gaze is both a courteous and a necessary act. The same holds in VR. Applying eye tracking to a VR avatar allows your true state to be reflected in the avatar in real time.
In this way, actions such as blinking, rolling the eyes, and shifting gaze can all be reproduced in VR. Even emotions that can be read from the eyes, such as happiness, surprise, or sadness, can be reflected on the avatar's face.
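In practice, driving an avatar this way usually means mapping the tracked eye state onto blendshape weights on the avatar's face rig. The field names, normalization angle, and weight ranges below are assumptions for illustration, not any engine's API.

```python
# Hypothetical mapping from tracked eye state to avatar blendshape weights.
# eyelid_openness is in [0, 1]; gaze angles are degrees relative to forward.
def eye_blendshapes(eyelid_openness, gaze_yaw_deg, gaze_pitch_deg):
    clamp = lambda v: max(0.0, min(1.0, v))
    return {
        "blink": clamp(1.0 - eyelid_openness),      # closed lids drive blink
        "look_left": clamp(-gaze_yaw_deg / 30.0),   # normalize by ~30 degrees
        "look_right": clamp(gaze_yaw_deg / 30.0),
        "look_up": clamp(gaze_pitch_deg / 30.0),
        "look_down": clamp(-gaze_pitch_deg / 30.0),
    }

# Eyes mostly closed, glancing 15 degrees to the right:
print(eye_blendshapes(0.2, 15.0, 0.0))
```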
Deep Interaction and Intention Analysis
In fact, eye tracking can also analyze a player's intentions by collecting their fixations, and even drive interactions directly. For example, in a horror game, when you look at the doorknob of the door ahead, the door could open automatically or trigger some other interaction. There are many similar examples.
We know that VR games at this stage rely mostly on controller-based interaction. Usually, when the player enters the next area (a gate, corridor, corner, and so on), the next scripted interaction is triggered, for example a zombie jumping out at a fixed spot.
Adding gaze-based interactions to a game or application can make the experience more natural. In a horror game, for instance, only the direction of the player's gaze could be lit, leaving other areas dark to heighten visual immersion.
Tobii, a provider of eye-tracking hardware and software, told us that eye tracking can also help players control objects in the virtual environment. In one VR game demo, throwing objects with a traditional controller behaves differently from throwing in the real world: objects may drop short or fly high, landing nowhere near where you intended.
In the demo, the gray ball on the left shows the trajectory without eye tracking, while the red ball on the right shows the trajectory with the controller plus eye tracking. The latter clearly lands exactly where intended. This works by using the point you are looking at when you toss the ball: the system automatically aims for that spot, so the throwing motion requires no precise aiming (an unfriendly experience in VR anyway). This can greatly improve throwing-based gameplay in VR.
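The core of such gaze-assisted throwing can be sketched as solving projectile motion backwards: instead of using the controller's raw release velocity, compute the launch speed that lands the ball on the gaze target. This is a simplified illustration (flat ground, fixed 45-degree launch angle, no air resistance), not Tobii's actual algorithm.

```python
import math

G = 9.81            # gravitational acceleration, m/s^2
LAUNCH_DEG = 45.0   # assume a fixed 45-degree throw elevation

def assisted_throw_speed(horizontal_range_m):
    """Launch speed so a 45-degree throw covers the given flat-ground range.

    From projectile motion: range = v^2 * sin(2*theta) / g,
    so v = sqrt(range * g / sin(2*theta)).
    """
    theta = math.radians(LAUNCH_DEG)
    return math.sqrt(horizontal_range_m * G / math.sin(2.0 * theta))

# Landing a ball on a gaze target 10 m away needs roughly a 9.9 m/s throw:
print(round(assisted_throw_speed(10.0), 2))  # → 9.9
```

A game would blend this corrected velocity with the player's actual hand motion so the throw still feels physical.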
In addition, eye tracking can be applied at the analysis level. By collecting the points players look at or pay the most attention to in a game, a heat map can be formed, which helps developers better understand their application.
So what are these heat maps good for? Taking games as an example, they can help developers understand whether players discover in-game interactions, whether attention is being distracted by irrelevant factors, and where players focus their attention, all of which can be used to improve the game's interactions.
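Building such a heat map is conceptually simple: bucket fixation points into a grid and count them. The sketch below uses normalized screen coordinates and an arbitrary 4x4 grid; real tooling would use finer grids and Gaussian smoothing.

```python
GRID = 4  # coarse grid for illustration; real heat maps are much finer

def build_heatmap(fixations):
    """Count fixations per grid cell; coordinates are in [0, 1)."""
    heat = [[0] * GRID for _ in range(GRID)]
    for x, y in fixations:
        col = min(int(x * GRID), GRID - 1)  # clamp x == 1.0 into last cell
        row = min(int(y * GRID), GRID - 1)
        heat[row][col] += 1
    return heat

# Three fixations clustered near the top-left, one near the bottom-right:
heat = build_heatmap([(0.1, 0.1), (0.15, 0.05), (0.2, 0.1), (0.9, 0.9)])
print(heat)  # cell (0, 0) holds 3 fixations, cell (3, 3) holds 1
```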
Flexible Manipulation and Input
Eye tracking is also helpful for active input, allowing players to complete operations quickly with a glance. VR offers a large field of view, but many UI interfaces are unfriendly, having often been built to traditional game-interface conventions.
For example, selecting an icon may require swinging the controller all the way from the upper-left corner to the other side of the view. With eye tracking, the selection highlight can jump instantly to wherever the player looks, after which a single button press confirms the choice. This greatly reduces the number of steps and is more direct than the virtual laser pointer of a VR controller.
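A gaze-then-click scheme like this boils down to finding the widget nearest the gaze direction, within some angular snap threshold. The menu layout and the 3-degree threshold below are assumptions for illustration.

```python
import math

THRESHOLD_DEG = 3.0  # assumed snap radius around each widget center

def preselect(gaze_deg, widgets):
    """gaze_deg: (yaw, pitch) in degrees; widgets: name -> (yaw, pitch).

    Returns the name of the widget nearest the gaze, or None if none is
    within the threshold. A button press would then confirm the selection.
    """
    best, best_dist = None, THRESHOLD_DEG
    for name, (w_yaw, w_pitch) in widgets.items():
        dist = math.hypot(gaze_deg[0] - w_yaw, gaze_deg[1] - w_pitch)
        if dist <= best_dist:
            best, best_dist = name, dist
    return best

menu = {"play": (-10.0, 0.0), "settings": (0.0, 0.0), "quit": (10.0, 0.0)}
print(preselect((0.5, -0.5), menu))  # → settings
```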
Health Care and Medical Research
Eye tracking also helps in health care and medical research. Companies such as SyncThink have begun using the eye-tracking function of VR headsets to detect concussions, improving the efficiency of on-site diagnosis.
Medical researchers can also collect data through eye tracking devices, for example to understand the role of eye contact in autism, or to use eye-tracking headsets to observe where a pianist looks while playing.
In summary, eye tracking is an important technology with great potential, one expected to transform interaction in VR games. It is foreseeable that more and more VR headsets will support eye tracking, and ultimately all of them may ship with this feature.
Source: 7tin
http://www.7tin.cn/news/111937.html