TechUK is a trade association that brings together people, companies, and organisations to realise the positive outcomes that digital technology can achieve. Its purpose is to champion its members and their technologies, preparing and empowering the UK for what comes next and delivering a better future for people, society, the economy, and the planet. I am a supporter and member of the techUK forums, which are an amazing environment for discussing and discovering emerging technologies, and for enabling all in the UK to adopt and grow with them.
Recently I attended techUK's Immersive Tech Roadshow in Bristol, which consisted of several engaging sessions: a session to convene the community and explore the state of the industry and the use of xR, a networking break with demos from local stakeholders and influencers, and a policy roundtable exploring the opportunities and barriers to greater adoption and growth of xR technologies.
Throughout this article I will refer to immersive technologies collectively as xR, using AR (Augmented Reality) and VR (Virtual Reality) for the specific cases.
The panel for the first session, hosted by Laura Foster, Head of Technology and Innovation at techUK, comprised:
- Alex Harvey, Co-founder and Creative Director at RiVR
- Richard Earley, Head of UK Content Regulation Policy at Meta
- Verity McIntosh, Researcher and senior lecturer in Virtual and Extended Realities at The University of the West of England
- Dr Neil Ralph, Head of Technology Enhanced Learning at NHS England
The panel presented their perspectives and responded to questions from those gathered. I’ll get onto my perspectives from this session in the article summary.
In the break between the sessions there were demonstrations of immersive technologies by a number of organisations, among them: Scentient, Rescape, RiVR and Fracture Reality. Unfortunately, I didn’t manage to experience any of the technologies first-hand, although I could watch others using the technologies.
The closing session was a policy roundtable that focused on some pre-defined questions and encouraged, as the name suggests, an open discussion. There was some great debate in the room, and as could be expected, we ran out of time.
Takeaways from the event – VR and AR
From the great discussions and amazing knowledge of the panel and all present, I was left with a few high-level takeaways.
At the outset I will set aside all consumer use cases, as my focus is on how businesses can develop strategies from the successful application of xR.
The collective opinion is that VR will be used to enhance communication in scenarios such as training, education, and tele-resonance. This last term was a new one for me, but it aligns with use cases such as robot vehicles in dangerous environments being controlled by humans who are removed from the danger yet connected in a fully immersive manner. Outside of tele-resonance, the uses of VR discussed were temporary in nature, although there were some amazing use cases, especially in healthcare, which centre around temporary or periodic use. This is probably a positive, as I didn't relish the idea of wearing VR headsets long-term, even once battery weight, battery life and environmental considerations are addressed.
Augmented Reality (AR) falls into a separate use case, and I heard it described as a tool, like any other in a toolbag, to be used when needed. With AR there are again two scenarios that people's perspectives fell into: one where a camera analyses the wearer's view and overlays related data, whereas the other is more a Heads Up Display (HUD), where data is relayed without analysing the user's visual environment, though it may depend on positional or environmental information. There are already a number of suppliers of VR and AR equipment, and there is a marked difference in price; I expect VR to remain limited to specific and unique use cases, whereas with AR I can see far more rapid adoption.
Speaking of adoption, there were discussions around the blockers to users taking up and using the equipment once configured and supplied. Some of these blockers stem from outdated experiences of theme park-styled rides of years gone by; technology has moved on, and 60 frames per second (FPS) capture and full 360-degree filming in 3D have transformed the experience. This means there will be a massive relearning, especially for those who have previously created content for the screen.
Professor Verity McIntosh shared a term relating to how we use touchscreens in this generation: there was discussion of how, years from now, we will be known as the generation of 'screen stabbers', reflecting the expectation that digital interactions will become far more gesture-based, with less reliance on touch. I found this an interesting concept, and with the rapid advancement of ever smaller, more powerful mobile technology, a very realistic one; especially when we look at cutting-edge ideas such as Hu.ma.ne., which bases its human interface on this concept. I also wonder if adoption issues are once again a case of over-zealous implementation before considering the 'why'.
Microsoft, in one of their Ignite 2023 videos, hinted at the AI and xR connection driving the next level of human interaction and productivity. This falls very much into the tool scenario, one which can easily be applied across sectors: manufacturing, the service industry, automotive and many more. The prime benefit in these use cases is engaging remote support from individuals who may be more experienced or better trained but unable to be physically on site within the required timeframe. Additionally, Microsoft highlighted the use of AI to monitor, advise and assist remote users of xR technology in real time.
One use case that did stand out, and which I was glad to see, came from Alex Harvey of RiVR. Alex shared some of the work they have been carrying out with wheelchair users, allowing them to travel through and experience locations that are impossible for them to access physically. This experience is also being extended to individuals with paraplegia, who, despite extremely limited physical movement, can still experience the wonders of a mountain pass or a busy street. In other technologies we have seen how pupil tracking and small muscular pulses can provide an innovative interface and an amazing level of control for users in this scenario, providing a freedom previously unavailable to them.
Questions to consider
The session did leave several questions open that you may want to consider when looking into xR:
- What is the environmental impact of creating this high-quality content? Can we quantify this and how do you create a worth-it equation?
- What will your guidelines be for VR and AR wear: weight, duration, etc.?
- Learning and information retention was reported to rise by 70% when using VR technologies; how much of this is down to the removal of distractions such as mobile phones and laptops, and how much to the immersive environment itself?
- How does the compute capability compare to traditional screens, and is the extra investment and potential running costs for all case scenarios valid?
- xR requires the acceptance of cameras in almost every environment, and with them becoming ever more discreet, are we ready to accept them and should they become invisible?
- Inclusion and equity are real concerns: these devices can create an inclusive experience for some while excluding others. I discuss this in more detail later, but think about your own organisation.
- High-quality imagery at 60 frames per second creates an amazing experience when sharing real-world scenarios, but what is the processing cost of creating rendered environments, and how much environmental impact will this create?
- With reports showing 70% of VR community users are male and yet 70% of the avatars are female, are we able to draw any conclusions on the equity and attraction of the platform? How will you ensure an equitable and fair adoption across your business?
- Are we at risk of duplicating technology, and by association the environmental impact of manufacturing multiple devices per user with similar capabilities: a screen, a VR device and an AR device? Where xR doesn't directly replace environmentally impactful activities such as travel, or where the expensive and extensive technology required to create the environment becomes overwhelming, does it have a value?
Cameras that can film 360 degrees in 3D are not cheap, or light on technology and components, especially ones that film at 60 frames per second. The rendering and presentation, as well as the storage and computation of the user's interaction, are also not to be sniffed at. A fully immersive 360-degree 3D environment requires a fair amount of processing to create an experience that doesn't invoke motion sickness or leave the user with a sub-optimal experience.
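To put that in perspective, here is a back-of-envelope calculation of the uncompressed data rate for stereoscopic 360-degree capture. The resolution and colour depth are illustrative assumptions, not figures from any particular camera:

```python
# Back-of-envelope estimate of the uncompressed data rate for
# stereoscopic 360-degree video. All figures below are illustrative
# assumptions, not vendor specifications.

RES_W, RES_H = 5760, 2880   # assumed equirectangular resolution per eye
EYES = 2                    # stereoscopic (3D) capture: one view per eye
FPS = 60                    # frame rate, as discussed above
BYTES_PER_PIXEL = 3         # 24-bit RGB, before any compression

bytes_per_second = RES_W * RES_H * EYES * FPS * BYTES_PER_PIXEL
gbit_per_second = bytes_per_second * 8 / 1e9

print(f"Uncompressed: {gbit_per_second:.1f} Gbit/s")  # → roughly 47.8 Gbit/s
```

Even under these modest assumptions, nearly 48 Gbit/s of raw pixel data has to be captured, compressed and stored before any rendering or interaction tracking, which illustrates why production cost and environmental footprint were recurring themes in the discussion.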
One caveat I will leave to the end of this article was raised by Dr Neil Ralph, and it's a very real quandary for adoption in business: how do you ensure equity and accessibility for all with a technology that is primarily focused on sighted individuals? The Government's Equality and Diversity policy and the Public Sector Equality Duty (PSED) contain general expectations that state:
…advance equality of opportunity between people who share a protected characteristic and those who do not share it…
Without an exemption in this framework, can an xR technology be considered fit for use in government and the wider public sector?
It is, without doubt, the amazing audience at the techUK event that made it possible for me to create this article and draw these conclusions, and they all have my heartfelt thanks, especially for enduring my direct questioning.
If you have a ‘why’, then ‘what’ and ‘how’ will become clearer as we embark on this journey; please do reach out if you want to discuss this further and learn about the options open to you.
I am both fully supportive of and excited about the application of this technology in our lives. Its use, adoption and environmental impact must, however, be carefully managed and measured against alternative, perhaps less immersive, approaches.