
Dr Julie Williamson from the University of Glasgow wins £1.75m grant from the European Research Council

The University of Glasgow has won funding for a project that aims to eradicate awkward virtual meetings by better portraying non-verbal cues.

The FUSION project has been awarded £1.75m to create meeting spaces that will incorporate both virtual and physical spaces. People will interact with each other in these spaces both physically and as avatars.

Over the course of the next five years, the project will utilise cameras and sensors to observe volunteers as they interact with each other, both in person and online, while wearing headsets. These observations will help to develop models of social signals, including voice, gestures, and positions, between individuals and across different realities. The aim is to make avatars better represent non-verbal cues.

Dr Julie Williamson from the University of Glasgow (pictured), who is leading the project, said: “Many of us became very familiar with virtual meeting software like Zoom and Skype to help us maintain contact with friends, family and co-workers during Covid lockdowns.

“While those tools can be very useful, they can also be frustrating experiences. People talk over each other or don’t make consistent eye contact with their cameras, for example, and it’s impossible to see non-verbal cues like body language if you’re restricted to only seeing people’s faces.

“More advanced technologies like virtual reality headsets can allow users to feel more present together, but they’re still very crude approximations compared to face-to-face interactions.

“Social signals like gestures, eye contact and personal space are currently very difficult to recreate in virtual spaces, which often prevents interactions with other people from feeling realistic. What we’re aiming to do with FUSION is dissolve the barriers between virtual and physical realities to create social experiences that accurately capture the nuances of human behaviour.”

Analysis of these social cues will allow the researchers to build a database of behavioural patterns that persist across virtual and in-person spaces. This will help create software that improves mixed-reality communication and stabilises interactions for more immersive experiences, the university said.

For instance, users’ positions can be subtly adjusted to create more effective group set-ups or their eye lines could be tweaked to better simulate face-to-face eye contact. Additionally, in cases where multiple people are talking at once, the software may manipulate the audio to focus the group’s attention on a single speaker.
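The audio-focusing behaviour described above could work along these lines. The sketch below is purely illustrative and is not the FUSION project's actual software: it assumes a simple model in which each participant has a measured speech level, and when several people talk at once, everyone except the dominant speaker is "ducked" to a lower gain so the group's attention converges on a single voice. All function and parameter names here are hypothetical.

```python
def focus_gains(speech_levels, focus_gain=1.0, background_gain=0.2):
    """Return a per-participant audio gain map (hypothetical sketch).

    speech_levels: dict mapping participant id -> current speech energy,
    where 0.0 means silent. If more than one participant is speaking at
    once, the loudest keeps full gain and the other simultaneous
    speakers are attenuated to background_gain.
    """
    # Participants currently producing speech.
    active = {p: level for p, level in speech_levels.items() if level > 0.0}

    if len(active) <= 1:
        # Zero or one person speaking: no ducking is needed.
        return {p: focus_gain for p in speech_levels}

    # Pick the dominant speaker by speech energy.
    dominant = max(active, key=active.get)

    # Duck every other simultaneous speaker; silent participants keep
    # full gain since they contribute no audio anyway.
    return {
        p: focus_gain if (p == dominant or p not in active) else background_gain
        for p in speech_levels
    }
```

A real system would of course smooth these gain changes over time to avoid audible clicks, and could apply the same pattern to visual cues, such as gently reorienting avatars toward the current speaker.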

Source:

https://www.xrtoday.com/virtual-reality/university-project-aims-to-dissolve-borders-between-virtual-and-real-world-meetings/
