- Sony Interactive Entertainment recently published a patent for an eye-tracking technology to allow players to make choices in multiplayer narrative video games without using buttons or joysticks.
- Players would select dialogue or actions in the video game simply by looking at them for a pre-determined amount of time, with the game tracking their eye movements and gestures to make the selection.
- The system would also provide audio-visual content in response to a player’s selection, which could come from an external device associated with a character in the video game.
- The in-game environment would also display text associated with the player’s selection at the updated location where the player’s eyes have moved, allowing for seamless gameplay.
- The patent also specifies that the selection from the second set of options may be based on the matching emotion, further enhancing the player’s ability to make choices in the video game based on their natural reactions.
Sony Interactive Entertainment recently received a patent titled “Using gaze tracking to effect player choices in multi player interactive narratives,” filed in January 2022 under the name of Sony Interactive Entertainment Inc. The patent, published last month, describes an eye-tracking technology for multiplayer narrative video games that lets players control actions performed by specific characters, such as dialogue options. A player makes a choice either by focusing on one of the selections shown on their display for a set amount of time, or by looking at a selection and performing an action or gesture. The method identifies the object the player focuses on, sends an indication to the corresponding video game device, such as a console, to display a set of selections, and then receives data corresponding to the player’s selection. The system then provides the user with audio or visual content associated with that selection.
“The present disclosure is directed to controlling outcomes in a game that includes multiple different users playing respective roles of specific virtual characters. The multiple different users may be present at a same physical location, or the different users may be located at different physical locations when movement of their eyes is tracked. Here, a user may choose one of a set of provided selections by simply looking at the chosen selection for a period of time or by looking at the chosen selection and performing an action or gesture,” reads the abstract for the patent. “This functionality allows multiple different users to control actions performed by different specific characters via an online multiplayer system. Depending on what a first user looks at, that first user or a second user may be provided with a corresponding set of selections or audio/visual content via respective gaming devices operated by the first and the second user.”
Simply put, the system uses eye-tracking technology to identify what a player looks at and sends that information to other players in the multiplayer video game. This allows players to interact with each other and the in-game world without complex controllers or physical actions; players could use eye movements alone to choose and control their in-game characters. Conventionally, players have relied on buttons and joysticks to control their characters and interact with the in-game world, which can be difficult for those with limited input options, particularly players with disabilities. With the eye-tracking system, players can make selections and interact with the in-game environment using their eyes, a more natural and intuitive input method. This could enhance the gaming experience and make it accessible to a wider range of players.
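The core selection mechanism described above can be illustrated with a short sketch. This is not Sony's implementation, just a minimal dwell-time selector under assumed names: a hypothetical tracker produces `(timestamp, option_id)` gaze samples, and a choice is confirmed once the player's gaze rests on one option for a pre-determined duration.

```python
# Minimal sketch of dwell-time gaze selection. The sample format and the
# DWELL_SECONDS threshold are illustrative assumptions, not from the patent.

DWELL_SECONDS = 1.5  # pre-determined focus time needed to confirm a choice


def select_by_gaze(samples, dwell=DWELL_SECONDS):
    """Return the first option the player fixates on for `dwell` seconds.

    `samples` is an iterable of (timestamp, option_id) pairs, where
    option_id is None when the gaze is not on any selectable option.
    """
    current, started = None, None
    for timestamp, option_id in samples:
        if option_id != current:            # gaze moved to a new target
            current, started = option_id, timestamp
        if current is not None and timestamp - started >= dwell:
            return current                  # dwell threshold met: confirm
    return None                             # no option held long enough


# Example: the player glances at "attack", then settles on "talk".
stream = [
    (0.0, "attack"), (0.5, "attack"),
    (1.0, "talk"), (2.0, "talk"), (2.6, "talk"),
]
print(select_by_gaze(stream))  # -> talk
```

A real tracker would stream samples continuously and reset the timer whenever the gaze leaves all selectable targets, but the confirm-after-dwell logic is the same.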
According to the patent’s claims, the system described controls a virtual environment using “gaze tracking.” When a player looks at a virtual object, a command is sent to another player’s device to display a set of options related to that object. The first player can select an option, and the second player’s device sends back data indicating the selection. The first player is then provided with audio-visual content based on the selection, which may include three-dimensional sound initially associated with a direction, giving the player additional context or immersion. The audio content provided to the first player can include words spoken by a character in the in-game environment with whom the first player’s character has had a conversation. Alternatively, instead of the audio content being spoken by the second character, it may be spoken by the first character in response to the first player’s selection.
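That claimed back-and-forth between two devices can be sketched as a simple message exchange. Everything here is invented for illustration (the option and content tables, the message names, and the shared-list “channel” standing in for the multiplayer link); real devices would communicate over a network.

```python
# Hedged sketch of the cross-device flow in the claims: player 1's gaze at
# an object makes player 2's device show options, and the chosen option
# comes back with associated audio/visual content. All names are assumed.

OPTIONS = {"npc_merchant": ["greet", "trade", "leave"]}
CONTENT = {
    "greet": "audio: the merchant says hello",
    "trade": "video: the shop interface opens",
    "leave": "audio: the merchant sighs",
}


def on_player_one_gaze(gazed_object, channel):
    """Player 1 looks at a virtual object: indicate options to player 2."""
    channel.append(("show_options", OPTIONS[gazed_object]))


def on_player_two_selection(channel, pick):
    """Player 2's device displays the options and sends back a choice."""
    kind, options = channel.pop()
    assert kind == "show_options"
    channel.append(("selected", options[pick]))


def on_player_one_receive(channel):
    """Player 1 receives the selection and the associated A/V content."""
    kind, choice = channel.pop()
    assert kind == "selected"
    return CONTENT[choice]


channel = []                              # stand-in for the multiplayer link
on_player_one_gaze("npc_merchant", channel)
on_player_two_selection(channel, pick=1)
print(on_player_one_receive(channel))     # -> video: the shop interface opens
```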
Hence, the virtual environment may respond to the first player’s eye movements: when the first player looks in a particular direction, an image of a third character may be displayed, and audio-visual content related to that character may be provided to the first player. The patent specifies that this content, such as a sound or video clip, may come from a third device, such as a smartphone or tablet, associated with that character rather than from the video game itself. Essentially, the system can incorporate external media sources into the gameplay experience to enhance the narrative and immersion for the player. Furthermore, the patent discusses how the virtual environment can show text related to the player’s selection. When the player looks at something in the game for a certain time (with a longer gaze indicating stronger interest in the selection), the video game registers the selection and shows information about the selected object at the new location where the player is looking. This lets the player interact with the video game without buttons or a joystick.
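The text-placement behavior, showing selection text at the spot the player's eyes have moved to, and reading longer gazes as stronger interest, can be sketched in a few lines. The function name, the interest threshold, and the returned dictionary are all assumptions for illustration, not details from the patent.

```python
# Illustrative sketch: place selection text at the player's updated gaze
# position, with longer dwell read as stronger interest. The 2.0-second
# threshold and the field names are invented for this example.

def place_selection_text(selection, gaze_xy, dwell_seconds):
    interest = "strong" if dwell_seconds >= 2.0 else "casual"
    return {
        "text": f"{selection} ({interest} interest)",
        "position": gaze_xy,   # draw where the eyes have moved to
    }


label = place_selection_text("rusty key", gaze_xy=(412, 230), dwell_seconds=2.4)
print(label["text"], "at", label["position"])
# -> rusty key (strong interest) at (412, 230)
```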
The patent also describes a system in which players can use eye tracking to communicate with other players in the game. For instance, a player could glance at another player’s character to initiate a conversation or interact with them. Eye movements may not be the only way to control actions performed by specific characters: the patent also mentions gestures such as blinks, which must include distinct eye closings and openings, as a way to make a selection. The pupil size of either player can also be detected and used to trigger actions in the in-game environment. This introduces the possibility of players making selections based on their facial expressions, which can be matched to one or more pre-determined emotions. Furthermore, the patent mentions that the selection from the second set of options may be based on the matching emotion. Specifically, if a player feels a certain emotion, such as happiness or sadness, they can select options matching it; players who feel happy, for instance, may be more likely to choose options that reflect positive outcomes. This suggests the system can use eye tracking to detect players’ emotions and provide options that match them.
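Matching a detected expression to a pre-determined emotion and then filtering the second option set could look something like the sketch below. The emotion labels, the tone tags, and the option list are all invented for illustration; the patent only states that the selection may be based on the matching emotion.

```python
# Hedged sketch: map a detected emotion to a tone and return only the
# options from the second set whose tone matches. All data here is assumed.

EMOTION_TONES = {"happy": "positive", "sad": "somber", "neutral": "plain"}

SECOND_OPTIONS = [
    ("Cheer the crowd on", "positive"),
    ("Mourn the fallen knight", "somber"),
    ("Ask for directions", "plain"),
]


def options_for_emotion(detected_emotion):
    """Return only the options whose tone matches the detected emotion."""
    tone = EMOTION_TONES.get(detected_emotion, "plain")  # fall back to plain
    return [text for text, tag in SECOND_OPTIONS if tag == tone]


print(options_for_emotion("happy"))   # -> ['Cheer the crowd on']
```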
It’s worth noting that just because Sony Interactive Entertainment published the patent doesn’t necessarily mean it will use the technology in a commercial product; this eye-tracking technology may never make it to market. While it’s unclear how (or if) Sony Interactive Entertainment will implement this technology in PlayStation consoles, it could have major implications for the future of gaming. It could enhance the gaming experience for players and make multiplayer narrative games more accessible to those with physical disabilities who may have difficulty using traditional controllers. It seems Sony Interactive Entertainment is exploring new ways to innovate in the gaming space by making video games more accessible and inclusive for players worldwide.
What do you think about this? Do tell us your opinions in the comments below!