You can virtually call out someone acting suspicious, even in virtual reality
“Trust no one” is usually sage advice in a video game where orcs and ogres lurking behind every tree and castle wall could have you for breakfast. But what if the video game is your real life?
Nothing evil from the dungeons of cyberspace is about to jumpscare you and rip your head off. With the COVID-19 pandemic still on uncertain ground, virtual reality is increasingly seen as a viable alternative to meeting in person. The problem is trust. This is why Tara Collingwoode-Williams of Goldsmiths, University of London devised two studies that looked into how we perceive others in VR. Your avatar could affect how much someone trusts you.
Welcome to the game known as the Collaborative Virtual Environment (CVE), where everything you say and do contributes to your professional track record. Someone who isn’t trusted by their peers could spell trouble for a collaborative effort.
“Picking up on deception is such a subtle art form,” Collingwoode-Williams, who recently led two studies published in Frontiers in Virtual Reality, told SYFY WIRE. “What is shared in both a virtual scenario and a real-life one lies in the mixture of verbal and nonverbal behaviour as well as visual appearance. In our VR setup, we focused on body movements, and there may be cues in body language that convey reason for distrust.”
Anyone who has had to be part of a guild in World of Warcraft or another MMORPG may be familiar with needing to trust others in an effort to take down a dragon. Now take it one step further, VR headset and all, to something like AltSpace, Tilt Brush or Oculus Medium. You are not just watching the game as you manipulate your character. You are the character.
Collingwoode-Williams is playing Alliance in this one. She wants to see what kind of avatar setup makes people trust each other the most, rather than pointing the finger at suspicious behavior. When she and her colleagues ran their studies with volunteers, occasionally asking a researcher to join as a “confederate” to see how trust levels would be affected, it was to rule out the types of setups in which people reported suspicious behavior. Whatever looked suspicious was probably only perceived that way because of avatar issues.
It turned out that people using full-body avatars in a CVE are less likely to be trusted by others who are represented only by a disembodied head and hands. There is more to process when you are faced with the visual of an entire body. Whether or not the others were actually trustworthy, they were trusted, which is what needed to be achieved across the board(room). Subjects in one study interacted with each other, while a confederate joined the group in the second study.
"I believe confederates who used full-body avatars were not seen as trustworthy as those who didn't use a full body because some nonverbal cues depicted by their avatar made them appear less trustworthy," she said. "The confederate in this context was asked to act as if they were a participant, and instead of the full-bodied avatar neutralising this deceit, it made it more obvious."
Players participating in both experiments were asked to play a Jenga-type game called Build the Block, which needed both trust and concentration so the tower wouldn't topple. The other game, DayTrader, had more at stake. Subjects had to trust someone enough to give them a certain amount of virtual money that the other person was then supposed to invest and give back when it had accrued interest. Questionnaires were given to all participants, and performance data was analyzed by the researchers.
Watching this play out revealed that groups in which all players used the same type of avatar showed more trust than groups with mixed avatars. When confederates came onto the scene with a full-body avatar, they were thought to be less trustworthy because of how their nonverbal behavior was perceived. Whether this had anything to do with the initial unfamiliarity of the confederate is unknown. Trust in DayTrader was also associated with the amount of money involved: players were more willing to hand imaginary money to someone who used the same type of avatar they did.
Go to any corporate site and read the “our culture” page (or something like it). You’ll be hard-pressed not to find that trust is one of the main buzzwords. As psychology continues to permeate the workplace, trust is essential between both individuals and groups if something is to be accomplished efficiently. Collingwoode-Williams has been trying to find out which types of avatars foster the most trust between the individuals behind them.
“We cannot avoid being misjudged in real life,” she said. “VR doesn’t make it better or worse, as long as we are aware of the limitations of VR as it is now. It does not replace real life, but I do see VR being a useful medium to facilitate the future of work, for instance for group collaboration (since it enables turn-taking cues) or when you need to discuss high-dimensional data remotely.”
There is still much to explore in this cyber frontier. Even if you are on a screen in avatar form and no one can see your actual face on Zoom, some behavior can be taken the wrong way, and not just by your boss and coworkers. The current HR culture is leaning more and more toward what is called a “people-based” model that is really just armchair psychology. This is dangerous not only because of the initial misinterpretation that can happen, but because punishment based on that interpretation can lead to wrongful termination.
Cyber-offices may allow HR unprecedented access to every meeting. It could mean people who are not mental health professionals will analyze the behavior of every employee based on their avatar.
If anything, Collingwoode-Williams’ research could reverse that. Someday, in a not-so-distant future, we may all be able to trust each other—at least in VR.