The lights are on
I'm in my senior year of high school right now, and I've been taking a psychology course. One of the units we completed a couple of months ago was based on learning how the brain processes the signals for our senses. Of course, I knew beforehand that we use both of our eyes when looking around in everyday life, but as we learned about how the eyes process things, an idea came to me. When we play first-person shooters, or look through the eyes of a character in any other game, we're only looking through one single camera, are we not? While games have been getting more and more realistic over time, I realize now that there's one major way they're still inaccurate: vision.
Use both of them.
I looked online to see if I could find anything about whether first-person shooters actually use binocular vision, but my search ended up fruitless. If some games do use it, they aren't showing it. When I say that I want to see binocular vision, I have two reasons. One, to mimic how a human actually sees, and to see the result of that. And two, to create new, interesting experiences from a first-person perspective that people may not have thought about before. Keep in mind that this is me coming from a purely creative standpoint; I'm no technical genius, and the farthest I've gone into game-making is making a game based on the War of 1812 in RPG Maker VX Ace. So, for any of you actual developers out there, don't say "braaah you can't do that", please. ;)
The way the brain processes visual information is that the rods and cones within the retina convert light from a person's environment into neural signals. This is called transduction, and the signals are then sent through the optic nerve to the visual cortex of the brain. I left out a couple of stages, such as the ganglion and bipolar cells, but those don't really matter for the purposes of this blog. What does matter is that each eye does this process, and the visual cortex takes these signals and puts them together to form the proximal stimulus, or the image that a person sees through their eyes. A person's environment is called the distal stimulus, or the actual image of reality. Of course, our eyes aren't perfect, and that's why we have things like blind spots and glares.
I understand that implementing a system where the player "sees" from both eyes would not change normal first-person shooters that much. In reality, it would probably just be a hassle for developers, and could possibly lead to eye strain on the player's part if it wasn't done correctly. One potential use, however, could be a system where the player closes one of the character's eyes in order to limit distractions and concentrate on one field of vision. Another interesting idea could be for a game like Amnesia, where the player's character begins to lose their sanity upon seeing a monster. What if the player could have their character cover one eye and run past the monster as a way to avoid losing sanity? Not the most unique idea in the book, but I think that developers could come up with some unique ways of dealing with true monocular vision, which is seeing with only one eye. I know you don't lose much of your field of view with one eye...but hey, there are ways to make it interesting.
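To give a rough sense of that "eye close" mechanic, here's a minimal sketch of how a game might narrow the camera's horizontal field of view when one eye is shut. The function name and the exact degree values are my own assumptions for illustration (the numbers are in the ballpark of commonly cited human figures: roughly 200° with both eyes, roughly 160° per eye), not any engine's real API.

```python
# Hypothetical sketch of an "eye close" mechanic: closing one eye
# shrinks the horizontal field of view only a little, which matches
# the point that you don't lose much sight with one eye shut.

BOTH_EYES_FOV = 200  # approx. total horizontal field, both eyes open (degrees)
ONE_EYE_FOV = 160    # approx. horizontal field of a single eye (degrees)

def effective_fov(left_eye_open: bool, right_eye_open: bool) -> int:
    """Return the horizontal field of view (degrees) for the current eye state."""
    if left_eye_open and right_eye_open:
        return BOTH_EYES_FOV
    if left_eye_open or right_eye_open:
        return ONE_EYE_FOV
    return 0  # both eyes shut: total darkness

print(effective_fov(True, True))   # both eyes open
print(effective_fov(True, False))  # one eye closed: only a modest loss
```

The interesting part for gameplay is exactly that small difference: a designer could trade a slightly narrower view for some benefit, like the sanity mechanic above.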
The corpus callosum is the bundle of nerve fibers between the two hemispheres of the brain that links them together. For those who don't know, the left hemisphere of the brain controls the right side of the body, and vice versa. The corpus callosum can actually be severed by a surgeon without killing the person, and this has been used as a treatment for epilepsy in the past. I am very uninformed about how a person's vision is truly affected by this condition, but there is an effect. I did some research, and saw that one possible effect involves math. When a person with a split brain sees a math problem in their right visual field, the left hemisphere will solve the problem with no qualms. But if it's presented to the left visual field, the right hemisphere will be confused, and won't be able to process the problem.
Sure, that sounds sort of lame: how would covering one eye in a video game, or not being able to process a math equation, be innovative gameplay? The answer is that it wouldn't be. But it's the principle I'm trying to convey here: what if we did have to play from the perspective of a character with this condition? How different could our games actually be? This may be something for later down the road, when psychologists and neurologists have performed more tests and discerned more of what each side of the brain does. In any case, developers won't be able to experiment with this sort of idea until games actually have players view the world through binocular vision. If you want to look more at how split-brained people process images, you can try this fun little game here.
One final idea I have for binocular vision is with the Oculus Rift. I read up on the VR headset, and each eye is given its own image inside the headset; the two images are rendered from slightly different positions, which is what creates the sense of depth. I think the Oculus Rift could really bring the idea of binocular vision in video games to life, as the player could literally close one eye and only see the game from the other one. Again, I don't have tons of ideas of where developers could go from that point, but I think it's safe to say that they definitely could do something new and interesting. It'd probably be compared to the impulse triggers in the Xbox One controller, as they're a small addition that adds more immersion for the player.
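For the curious, the core of that two-eyes-two-images idea can be sketched in a few lines. This is not the Rift's actual API, just a generic illustration under an assumed average interpupillary distance (IPD) of about 64 mm: a stereo renderer places two cameras offset half the IPD to either side of the head along its "right" direction, and the small difference between the two resulting images is what the brain reads as depth.

```python
import numpy as np

# Hypothetical sketch of stereo camera placement, as a VR renderer might do it.
IPD = 0.064  # interpupillary distance in metres (assumed average, ~64 mm)

def eye_positions(head_pos, right_vec):
    """Return (left_eye, right_eye) world positions for a stereo camera pair."""
    head = np.asarray(head_pos, dtype=float)
    right = np.asarray(right_vec, dtype=float)
    right = right / np.linalg.norm(right)  # normalise the "right" direction
    left_eye = head - right * (IPD / 2)    # half the IPD to the left
    right_eye = head + right * (IPD / 2)   # half the IPD to the right
    return left_eye, right_eye

# Head at standing eye height, facing down -z, so "right" is +x:
left, right = eye_positions([0.0, 1.7, 0.0], [1.0, 0.0, 0.0])
print(left, right)  # the two cameras sit 3.2 cm either side of the head
```

Rendering the scene once from each of these positions is the whole trick; an "eye close" mechanic would just blank one of the two images.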
What are your thoughts on this matter? Have you ever thought about how first-person games are inherently different from how we actually see the world, or has it never bothered you? Would you like to see this idea in the future? Leave your thoughts below!