Testing the Metaverse and Beyond

Why We Can’t Wait to Automate!

Metaverse: a network of 3D virtual worlds focused on social connection and often described as a hypothetical iteration of the Internet as a single, universal, simulated world that is facilitated by virtual and augmented reality technologies.

What is the Metaverse really?

Although this article began with a fancy definition, no one really agrees on what the metaverse is or isn’t. Conceptually, it’s not hard to grasp the metaverse as an online 3D cyberspace connecting users across the globe in various aspects of their lives. Think of multiple computing devices and platforms connected via the Internet, allowing each user’s avatar or character to navigate through a virtual space and become immersed in different experiences, everywhere from virtual hangouts to sports games, bar mitzvahs, weddings, and more. In reality, though, the metaverse doesn’t exist, at least not yet.

The idea of the metaverse was first developed in Neal Stephenson’s science-fiction novel Snow Crash. Stephenson envisioned a virtual-reality (VR) based Internet evolving into a world that resembled a massively multiplayer online (MMO) game. In his metaverse, user-controlled avatars mingle with system daemons running in the background. While the idea of a metaverse was once fiction, it is looking more and more likely that it will soon be a reality. As you may know, the social networking giant Facebook has gone as far as changing its name to Meta to reflect its belief that the metaverse is the next evolution of social connection, and the company is committed to bringing the metaverse to life.

Where Can I Experience the Metaverse?

Let’s take a look at some instances where the global coronavirus pandemic pushed quite a few metaverse-like experiences to the forefront of our personal and professional lives. Here, I’ll share a few of my own experiences and encourage you to chime in with any others…

Gather.Town

In the fall of 2020, I was getting ready to log into the StarWest testing conference. As a testing nerd, this was nothing new for me; after all, I’d attended this and several similar conferences over the last decade or so. That particular year, however, the conference had to be held virtually because of the global shutdowns during the pandemic. StarWest has always had a virtual component that lets folks who cannot attend in person watch the keynote sessions for free. Upon logging in, though, I wasn’t greeted with the usual interface of web links and embedded video streams. Instead, I was transported into a 2D virtual space called Gather.Town.

Waiting for Jason Arbon to show up in Gather.Town for our couch session at StarWest 2020

The floor plan for the virtual space was quite similar to the floor plan for the physical conference. As an attendee, I could select an 8-bit avatar and, instead of jumping from link to link, walk my avatar from one session to another, even bumping into other folks for a quick catch-up along the way. As you walk up to another avatar or group, your video feed automatically appears and you can immediately engage in the conversation. My buddy Jason Arbon and I even held our “Ask Us Anything” couch session virtually in Gather. The above screenshot of me waiting for Jason to show up for our session is living proof that some things probably won’t change, even in the metaverse :)

It was also cool to walk my avatar up onto the main stage and see myself alongside the other keynote speakers. Although the experience was completely retro with its 2D space and 8-bit characters, it gave me a good glimpse of what the metaverse could be like.

My virtual avatar walking up on stage during the lightning keynote session at StarWest 2020

Virtual and Augmented Reality

Once the pandemic hit, I found myself spending a lot more time using my Oculus Quest 2. When it was time for the Olympics, I wondered if there was anything available in VR that could get me in close on the action. Thankfully, the Tokyo Olympics presented some live events and full event replays on the Quest 2 for pay-TV subscribers in the United States. Setup was easy: I just had to download an app called NBC Olympics VR by Xfinity and sign in with my account. And there I was, watching the opening and closing ceremonies, my favorite Track & Field events, and other sports in 3D VR. I even set up a watch party with a few of my Oculus friends who were also subscribers.

Watching the NBC Olympics VR by Xfinity on the Oculus Quest 2

I was also able to get my wife and kids involved in trying out various Quest 2 experiences, ranging from playing Beat Saber to realistic simulations like the Jurassic World: Apatosaurus VR experience. If you’re trying to get an idea of what the metaverse may be like, I’d suggest trying out some of the VR or augmented reality (AR) headsets, as the metaverse will likely be powered by these technologies.

Video Games

You can already find elements of the metaverse in popular video games like Second Life, World of Warcraft, and Fortnite. Because of the services they now offer, along with the emphasis on 3D VR, video games are perhaps the closest you can currently get to a metaverse experience. My kids are huge fans of video games like Roblox and Fortnite, which now host virtual events like concerts and meetups. That’s right: gamers aren’t just playing games anymore; they are using gaming platforms for all aspects of their lives in cyberspace. I just had to check this stuff out for myself, and so, along with 12.3 million other players, I took part in Travis Scott’s virtual in-game music tour in Fortnite.

Travis Scott’s In-Game Music Tour within the Fortnite Metaverse

It is important to note that many of these “metaverse” elements aren’t new to the gaming world. For instance, it’s been 20 years since a wedding was held in the game Second Life and, since 1996, player avatars have been roaming around the 32-bit meadows of Furcadia, one of the oldest massively multiplayer online role-playing games (MMORPGs).

Furcadia, one of the longest-running social MMORPGs, got a visual overhaul in 2016.

To bring things full circle, a group of developers came up with a novel way to host a DevOps conference within the popular Nintendo game Animal Crossing: New Horizons.

Deserted Island DevOps Conference 2020 hosted virtually within the game Animal Crossing: New Horizons.

Why We Can’t Wait to Automate the Metaverse

As the buzz and hype around the metaverse continues, many are raising concerns about the potential risks in an environment where the boundaries between the physical and virtual worlds are blurred. These risks and concerns associated with the metaverse are a large part of the motivation for why we can’t wait to begin investigating automated metaverse testing techniques. Here’s a quick list of some of the key challenges:

Risks and Quality Concerns

  • Identity and Reputation — Ensuring that an avatar is who they claim to be in the metaverse. Along with identity authentication and verification comes the need to protect users against impersonation and activities that may harm their reputation.

  • Ownership and Property — People have already started to purchase digital assets like NFT art and virtual properties. The metaverse raises two key questions: how can ownership rights be granted to creators of digital assets, and how can ownership be verified?

  • Theft and Fraud — As banking, currency, payment systems, and other forms of commerce migrate to the metaverse, we are likely to see an increase in attacks that result in stealing, scamming and other types of crimes for financial gain.

  • Data — Folks are already speculating that the metaverse is yet another ploy by the tech giants to get more data. After all, control of data can help organizations control markets. Whether or not you believe there is a conspiracy to get your data, there’s no doubt that the abuse of data and the spread of misinformation are widespread today and need much attention.

  • Invisible Avatar Eavesdropping — Malicious actors may figure out how to make their presences in the metaverse undetectable, and use such an exploit to invisibly join meetings and eavesdrop on private conversations.

  • Harassment and Personal Safety — Did you know that a woman was sexually harassed on Meta’s VR social media platform? Harassment is not just a physical threat; it can be verbal and, now, a virtual experience, and it must be prevented.

  • Legislation and Jurisdiction — In a virtual space that is accessible to anyone across the world, it is essential to be able to identify any boundaries of that space and put rules into place to make sure it is safe and secure for everyone.

  • User Experience — The ability of the metaverse to become a space where people can connect, form meaningful relationships, and become fully immersed in their surroundings is directly correlated with its visual and graphical fidelity, along with other aspects of look and feel. Visual, audio, performance, accessibility, and other issues are therefore likely to detract from the overall metaverse user experience.

Promising Directions

Even with such a tall order of risks, concerns, and testing challenges, there’s much more behind our motivation to automate testing of the metaverse. AI and machine learning (ML) have been helping us design and develop new, more robust, and more resilient automated testing tools, products, and frameworks. The engineering team at test.ai has pioneered many of these innovations and continues to leverage AI to solve some of the hardest testing problems we have today. Here are some examples of where much of our work to date translates directly into automated testing of metaverse-like experiences.

AI for Testing Digital Avatars

Computer vision unlocks a realm of possibilities for test automation. Bots can be trained to recognize and interact with visual elements, just like humans do. We applied our AI-driven testing technology to the validation of digital personas like those developed by SoulMachines. The engineering team at test.ai trained ML object-detection classifiers to recognize scenarios such as when the digital person was speaking, waiting for a response, smiling, serious, or confused. Leveraging AI, we developed automated tests to validate conversation-based interactions with the SoulMachines digital avatars. This included two forms of input actions: one using the on-screen textual chat window, and the other tapping into the video stream to ‘trick’ the bots into thinking that pre-recorded videos were live interactions with humans. A test engineer could therefore pre-record video questions or responses for the digital person, and the automation could check that the avatar had an appropriate response or reaction.
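As a rough illustration of that pattern, and not the actual test.ai platform API, such a test boils down to polling a classified video stream until the avatar reaches an expected state. The names `AVATAR_STATES`, `wait_for_state`, and the `classify` callback below are all hypothetical stand-ins:

```python
from typing import Any, Callable, Iterable

# States a trained classifier might report for the digital avatar (illustrative).
AVATAR_STATES = {"speaking", "waiting", "smiling", "serious", "confused"}

def wait_for_state(frames: Iterable[Any],
                   classify: Callable[[Any], str],
                   target: str,
                   max_frames: int = 300) -> bool:
    """Scan a stream of video frames until `classify` reports `target`.

    In a real setup, `frames` would come from the avatar's live video feed
    and `classify` from a trained object-detection model; here both are
    stand-ins so the polling logic itself is clear.
    """
    for i, frame in enumerate(frames):
        if i >= max_frames:
            break  # give up: the avatar never reached the expected state
        if classify(frame) == target:
            return True
    return False
```

A test could then play a pre-recorded video question into the stream and assert `wait_for_state(feed, classifier, "smiling")` within some frame budget.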

Even in the early stages of their development, our bots were able to learn to recognize interactions that were relevant to the application context while ignoring others. Let’s look at a concrete example. The figure below shows the results of running a test whose goal was to validate that the bot could respond appropriately to the gesture of smiling. We all know that smiles are contagious and it’s very hard to resist smiling back at someone who smiles at you, so we wanted to test this visual aspect of the bot interactions. The automation therefore launched the digital person, tapped into the live video stream, and showed the digital avatar a video of one of our engineers who, after a few moments, started to smile. The automation then checked the avatar’s response, and here was the result.

Bots detecting visual differences in digital avatars based on the application context, while ignoring others.

As shown in the figure, if you compare the bot’s current observation of the avatar with the prior observation, you will notice two differences. Firstly, the avatar’s eyes are closed at the moment of capture, as indicated by the blue boxes; secondly, it is smiling broadly enough that its teeth are now visible (red boxes). However, the difference mask generated by our platform reports only one difference — the smile. Can you guess why? Perhaps a bug in the test.ai platform? No, quite the contrary. Here the test.ai bots have learned that blinking is part of the regular animation cycle of the digital avatar. The model isn’t trained on a single image, but on videos of the avatars, which include their regular movements. With those animations recognized as part of the ground truth, the bot distinguishes the big smile as a deviation from the norm, and so produces an image difference mask highlighting that change and that change only. Just like a human would, the AI notices that the avatar smiled back in response to someone smiling at it, and knows that the eyes blinking at the moment of screen capture is just coincidental.
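To make the idea concrete, here is a minimal sketch, not the actual test.ai implementation, of how a difference mask can exclude regions learned to belong to the normal animation cycle. The array sizes, threshold, and `blink_region` are illustrative assumptions:

```python
import numpy as np

def diff_mask(prev: np.ndarray, curr: np.ndarray,
              animation_mask: np.ndarray, threshold: int = 30) -> np.ndarray:
    """Pixels that changed between two observations, minus regions the bot
    has learned are part of the avatar's normal animation cycle (e.g. blinks)."""
    changed = np.abs(prev.astype(int) - curr.astype(int)) > threshold
    return changed & ~animation_mask

# Toy 4x4 grayscale "frames": one change in a learned blink region, one genuine.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[0, 0] = 255   # eyes closing: part of the learned blink animation
curr[2, 2] = 255   # a broad new smile: a genuine deviation from the norm
blink_region = np.zeros((4, 4), dtype=bool)
blink_region[0, 0] = True  # learned from training videos of the avatar

mask = diff_mask(prev, curr, blink_region)
# only the smile pixel survives in the reported difference mask
```

In the real platform the "animation mask" is learned from videos rather than hand-coded, but the filtering step works on the same principle.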

Want to play around with AI’s ability to detect emotions for yourself? Or maybe even ON yourself? Head over to https://cloud.google.com/vision and scroll down to the ‘Try It Yourself’ widget and upload a photo of yourself or someone else’s face. Under the ‘Faces’ tab of the output, you’ll see how the AI classifies the facial expression in the picture.
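If you’d rather script it than use the widget, the same capability is exposed through the Cloud Vision client library. Here’s a small sketch using the `google-cloud-vision` Python package; it assumes you have the package installed and application credentials configured, and `detect_emotions` is just an illustrative helper name:

```python
# Cloud Vision reports each emotion as a Likelihood enum value (0-5).
LIKELIHOOD = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE",
              "LIKELY", "VERY_LIKELY"]

def detect_emotions(image_path: str) -> dict:
    """Send a photo to Cloud Vision and return its emotion classifications.

    Requires the google-cloud-vision package and application credentials;
    imported lazily so the mapping above works standalone.
    """
    from google.cloud import vision
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    face = client.face_detection(image=image).face_annotations[0]
    return {
        "joy": LIKELIHOOD[face.joy_likelihood],
        "sorrow": LIKELIHOOD[face.sorrow_likelihood],
        "anger": LIKELIHOOD[face.anger_likelihood],
        "surprise": LIKELIHOOD[face.surprise_likelihood],
    }
```

Calling `detect_emotions("paul.jpg")` on a smiling photo would typically come back with something like joy rated LIKELY or VERY_LIKELY.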

Pretty cool huh? At last year’s Conference of the Association for Software Testing (CAST), I gave a workshop on AI for software testing and had the participants play around with the Cloud Vision API. Testing guru Paul Holland took part in the workshop, and as a group we found his result somewhat amusing :) Check it out…

Testing guru Paul Holland looking joyful according to Google’s Cloud Vision API.

Well what do you think? Does Paul look joyful? Some folks detected a bit of a smile in Paul’s photo but Paul himself wasn’t so convinced :) On the flip side, the bot was 92% confident in its classification of Paul’s emotions!

AI for Testing Video Games

“AI plays games, why not test them too?”

This was the tagline for a chapter I wrote in the 2021–22 World Report on the State of AI Applied to Quality Engineering. In fact, much of my day-to-day focus for 2021 was on R&D related to the use of AI for testing modern video games like League of Legends, Fortnite, Call of Duty, and more. Like a good software engineer, I’m going to apply the DRY (Don’t Repeat Yourself) principle to this article and instead point you to some of the resources we developed and shared last year on the subject.

Here’s a link to the aforementioned chapter entitled Transforming Game Testing with AI-Driven Automation, which also includes an audio book version for those who’d rather listen :) There were also a number of presentations, but one of my personal favorites was the talk I did for the AutomationGuild conference. Our friend and founder of TestGuild Conferences, Joe Colantonio, has granted special permission to share the presentation video recording as part of this post. Enjoy! Thanks Joe!

Automation Guild 2021 Presentation: AI for Game Testing — It’s All Fun And Games Until the Tests Fail!

After watching the presentation, it should be clear why I can’t wait to automate the metaverse. These techniques certainly have a wow factor, and I remember some days just spinning up our tech on some of the latest games, just to watch the bots explore and test them. Finally, using the powers of AI for the good of testing and also getting paid for it :). Definitely a dream job for a testing nerd like me.

AI for Testing Virtual and Augmented Reality

Of course, we couldn’t just stop with 2D and 3D game testing. Huge kudos to Dionny Santiago for this next one. Dionny leads our core platform team at test.ai and recently extended the platform to handle virtual reality (VR) environments. Using an approach similar to how we interact with gaming consoles, controllers, and video streaming devices, we integrated tools and drivers that allow us to manipulate and observe the input-output functions of a VR headset. Once we could control inputs and observe outputs in VR, it was just a matter of tying that API into test.ai’s Game Testing Cortex, an ML brain that combines supervised ML and goal-based reinforcement learning to test video games in real time.

Integrating Virtual Reality Headset I/O into Test.ai’s AI-Driven Testing Platform

The final result is that engineers or external programs can make calls to the VR API controller and leverage it to build and execute tests in that environment. Take a look at it in action as we programmatically modify the yaw, causing the headset to rotate within the virtual space.
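Under the hood, “modifying the yaw” amounts to sending the headset driver a new orientation. As a sketch of the math involved (the `vr_controller.set_orientation` call is a hypothetical name, not our actual API):

```python
import math

def yaw_to_quaternion(yaw_degrees: float) -> tuple:
    """Rotation about the vertical axis, as a (w, x, y, z) unit quaternion,
    which is the usual representation headset drivers consume."""
    half = math.radians(yaw_degrees) / 2.0
    return (math.cos(half), 0.0, math.sin(half), 0.0)

# A hypothetical driver call might then look like:
# vr_controller.set_orientation(yaw_to_quaternion(90.0))  # rotate 90 degrees
```

Sweeping the yaw value over time is what produces the smooth rotation of the headset’s view within the virtual space.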

AI for Testing at Scale

AI brings some other key advantages to the test automation world, including accelerated test coverage and the reuse of tests across applications, domains, and platforms. When it comes to fruition, the metaverse will undoubtedly be a big place, the bounds of which may not even be fully comprehensible to the human mind. Machines, on the other hand, are good at handling large amounts of data and processes at scale, and AI and ML are paving the way forward. This happens to be one of Jason Arbon’s favorite topics, and if you are curious to learn more about this subject, including the future of testing at scale, check out this webinar with Jason and Kevin Pyles. Our DevOps architect Patrick Alt also gave a technical presentation on using Kubernetes, AI, and other techniques for Deploying a Large Scale Army of AI-Driven Testing Bots. After all, our mission at test.ai is to test the world’s apps… how’s that for a testing-at-scale problem? :)

AI promotes accelerated test coverage and reuse across applications, domains, and platforms.

AI for Non-Functional Testing

AI for performance, accessibility, usability, trustworthiness, and security testing are all topics that we have covered over the last year. AI is steadily taking over the testing space. It started with AI being incorporated into specific toolsets and programming languages, but now we’re seeing AI stretch beyond functionality into these cross-cutting concerns. You can expect this type of growth to continue until there is eventually a full-stack replacement for test automation powered by AI. It’s quite exciting to watch, and as ideas like the metaverse become a reality, such AI-powered full-stack test automation will prove necessary to keep up with the pace of development.

Metaverse testing will require a full-stack approach to test automation powered by AI.

If you’re interested in learning more about each of the above dimensions of testing with AI, check out my e-book on AI-Driven Testing. It’s free if you are an O’Reilly Learning Platform subscriber, and if not, for a limited time, Keysight Technologies has made the book freely accessible via the following sign-up and download page.

Get your copy today, it’s free for a limited time :)

AI for Testing AI

So there’s one aspect of the metaverse that I’ve purposely left until last, because it’s both particularly interesting and challenging. Human-controlled avatars aren’t likely to be the only entities you can interact with in the metaverse. Yes, there is a great probability that there will be some equivalent of non-player characters within the metaverse. In gaming, a non-player character (NPC) is any character that is not controlled by a human player. For example, in one of my favorite games, FIFA soccer, there are several NPCs, including the referee, the goalkeeper, and the fans in the stadium.

A non-player character (NPC) in FIFA. There will likely be equivalents of NPCs in the metaverse.

To enhance the experience, NPCs in the metaverse may themselves incorporate AI and ML so that they can adapt and evolve to changing user needs and conditions in the virtual environment. The open question, therefore, is how we test these AI-controlled NPCs in the metaverse. Well, my answer couldn’t possibly get any more meta: use AI to test AI.

Now that I’m 100% confident you’ve seen the true level of my crazy…

Let me explain :)

AI and ML are enabling highly dynamic behaviors in software today. A static approach to testing is likely to be inadequate for validating these self-adaptive and self-evolving capabilities. As a result, many current testing techniques and approaches will only take you so far with such a system. Automation itself will need to evolve, and the testing framework will need to be adaptive so that it can monitor and regulate the system under test. Such a framework would be trained, managed, and monitored by humans, while gathering and coalescing the vast amounts of data on metaverse interactions and transactions. This would allow us to apply the rules, laws, goals, and policies defined for the virtual space. Furthermore, testing will need to become an inherent part of the NPCs and other aspects of the metaverse for the initiative to be successful. It is the only foreseeable way to ensure that all the risks and quality concerns mentioned in this article can be addressed. A holistic testing strategy for the metaverse would involve validation and verification activities across multiple disciplines, triggered both offline, prior to deployment, and online, continuously, after deployment.
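To ground the idea of applying declared rules to a stream of interaction data, here is a toy sketch; the `Event` fields and the single policy are invented for illustration, using the “invisible avatar eavesdropping” risk discussed earlier:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Event:
    """A single (hypothetical) metaverse interaction record."""
    actor: str
    action: str
    visible: bool  # is the actor's avatar rendered to other users?

# A policy is a named predicate over events; True means the event is allowed.
Policy = Callable[[Event], bool]

POLICIES: Dict[str, Policy] = {
    # Disallow joining a meeting while undetectable to its participants.
    "no_invisible_join": lambda e: not (e.action == "join_meeting"
                                        and not e.visible),
}

def violations(events: List[Event]) -> List[str]:
    """Return the names of policies violated anywhere in the event stream."""
    return [name for e in events
            for name, allowed in POLICIES.items() if not allowed(e)]
```

In practice the monitoring framework would run such checks continuously against live data, and the policies themselves could be learned or refined by ML rather than hand-written.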

Holistic quality engineering practices in the Metaverse will stretch across multiple disciplines.

As we’ve been building out the next generation of automated testing at test.ai, I have noticed a need to promote and share our practical experiences and strategies for testing AI-based systems. Furthermore, beyond testing, an end-to-end quality engineering approach should chart the path forward with AI. I recently had the privilege of working with former O’Reilly AI/ML editor Rebecca Novak on designing a course on Quality Engineering for AI and Machine Learning. A short version of the course is available for individual subscribers on O’Reilly’s Learning Platform, or, if you’d like your engineering team to go through it, you can register your team for an extended version of the course.

Wrap Up

The metaverse isn’t here yet, but we can’t wait to automate it. Firstly, it’s too risky to wait around; secondly, the challenge is being met by some pretty cool and awesome technology driven by AI. It’s a great time to be in technology, and an even greater time to be in testing, so let’s look forward to testing the metaverse and beyond.

A huge thanks to Yashas Mavinakere, Jonathan Beltran, Justin Phillips, Jason Stredwick, and the entire engineering team at test.ai for all the work they do to make things like this possible.

— Tariq King, Chief Scientist, test.ai

References

Balance Academy. “What Is the Metaverse?” Published Sept. 21, 2021; last updated Jan. 13, 2022.

Cecilia D’Anastasio. “Video Games Already Do What the Metaverse Just Promises.” Wired, Jan. 10, 2022.

Facebook. “Introducing Meta: A Social Technology Company.” Oct. 28, 2021.

Michael Kan. “New Use for Animal Crossing: Virtual Tech Conference Venue.” Apr. 30, 2020.

Tanya Basu. “The Metaverse Has a Groping Problem Already.” MIT Technology Review, Dec. 16, 2021.
