BY ALAN SMITHSON FOR ADELLO MAGAZINE
A lot of technology has gone into what we call the Internet. From unique IP addresses and fiber-optic cables to taxonomy, naming conventions, standards, and protocols, new technologies have converged to solve increasingly complex challenges as the internet’s use cases and user base developed. That growth has been exponential: it took twelve years for the internet to reach one billion users and only five more to reach two billion. That was in 2011.
Now think about Apple; it took the company more than 30 years to reach a $1 trillion market cap and only four more years to triple it to $3 trillion.
That’s the thing about exponential growth: human brains have trouble predicting it and grasping the consequences, sometimes until it’s too late. But as we start spending more time in virtual worlds, trading virtual goods, and speaking with virtual avatars, we need to seriously consider the potential for bias and harm while providing a free and open structure on which the world can be built.
The metaverse has been a hot topic lately, but few consider what it will do for humanity. I believe that the metaverse will be the internet that we enjoy today, with video, audio, text, and images that we scroll and swipe through, plus a more immersive, interactive, decentralized, and autonomous layer on top.
JP Morgan, Goldman Sachs, PwC, Grayscale, Citi, and Morgan Stanley predict the metaverse will be worth $6–13 trillion by the end of the decade, with revenues from virtual worlds alone expected to reach $800 billion in 2025.
What are the business use cases for the metaverse? The high-level use cases being deployed today include training, marketing, engineering, digital twins for maintenance, remote collaboration, virtual retail, and education, among many others.
Recent advances in three seemingly disparate fields of technology are starting to converge, and together they hold the promise of a new paradigm of computing in three dimensions. With the explosive growth of all three, the question becomes: what industry or field of human endeavor will the metaverse not impact?
The first innovation underlying metaverse development is the visual layer. XR is a catch-all term for real-time 3D, virtual, augmented, and mixed reality. Some say multiplayer games like Fortnite and Roblox are early prototypes of the metaverse, while others believe that platforms like Decentraland or Sandbox are the first ‘real’ metaverse platforms due to their open and decentralized nature. Still others point to VR worlds like VRChat, AltSpace, or Meta’s Horizon as the start of the metaverse due to their immersive nature.
The above examples are all built using similar technology stacks for rendering real-time 3D. Before building any virtual world, 3D website, interactive training system, AR virtual try-on, or game, designers and developers create 3D assets using software like Maya, Cinema 4D, Blender, and Adobe Substance Painter. Assets can also be purchased from stock 3D model markets such as Sketchfab and TurboSquid, or created by 3D-scanning physical items with the LiDAR scanners built into newer iPhone and Samsung models.
There is not yet consensus on which 3D format will be universally accepted across all platforms and devices, but The Khronos Group and, more recently, the Metaverse Standards Forum have brought together over 1,500 organizations in pursuit of standardization in the XR and 3D industry, similar to what happened with JPEG (images), MP3 (audio), and MPEG-4 (video). So far, the two contenders are the GL Transmission Format (glTF) and Universal Scene Description (USD).
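One reason glTF has traction is that it is a JSON-based format, so even a hand-written document is valid. As a rough sketch (the node name and generator string are invented for illustration; the spec requires only the `asset.version` field), a minimal glTF 2.0 file can be assembled in a few lines of Python:

```python
import json

# A minimal glTF 2.0 document. The spec requires only the "asset" object
# with a "version" string; real assets add "meshes", "buffers", "materials",
# and so on.
gltf = {
    "asset": {"version": "2.0", "generator": "hand-written example"},
    "scene": 0,
    "scenes": [{"nodes": [0]}],
    "nodes": [{"name": "EmptyNode"}],
}

# Serialize to the .gltf (JSON) flavor of the format.
document = json.dumps(gltf, indent=2)
print(document)
```

Any standards-compliant viewer should accept this file, which is exactly the interoperability the Metaverse Standards Forum is after.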
Once you have the 2D and 3D files required to build your experience, it’s time to jump into creation platforms or game engines, which make it easier to build scenes, games, and worlds adapted to popular gadgets. Platforms like Unity, Unreal, and MetaVRse can quickly help designers, developers, and creators build and deploy quality immersive and interactive experiences.
Creating compelling real-time 3D metaverse experiences demands skill in storytelling, architecture, game mechanics, sound, animation, avatars, and more. Each discipline comes together to create a magical experience for the end user. Think of XR as the visual layer of the metaverse.
The second technology that will power the metaverse is the blockchain, or distributed ledger technology (DLT). A number of technologies are built on the blockchain: non-fungible tokens (NFTs), decentralized autonomous organizations (DAOs), cryptocurrencies, utility tokens, and others that can be used for commerce, value transfer, governance, proof of provenance, and traceability of assets or commodities, both physical and virtual. It promises metaverse users an immutable record and a fairer, more distributed economic system, free from government oversight. We are at the nascent stage of blockchain technology, but rapid advancements in the field have attracted billions of dollars in investment, with the hope of building the next great platform for Web3.
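To see why a distributed ledger is tamper-evident, it helps to strip the idea to its core. The toy hash chain below (field names and "transactions" are invented for illustration; no real blockchain uses this exact format) shows the key property: each block commits to the hash of its predecessor, so editing an early record invalidates every block after it.

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents (toy example)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records):
    """Link records into a chain: each block stores the previous block's hash."""
    chain, prev = [], "0" * 64  # all-zero "genesis" predecessor
    for data in records:
        block = {"data": data, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain):
    """Recompute every link; any edited block breaks the chain after it."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["alice pays bob 5", "bob pays carol 2"])
print(verify(chain))          # the untouched ledger checks out
chain[0]["data"] = "alice pays bob 500"  # tamper with history
print(verify(chain))          # the tampering is immediately detectable
```

Real networks add consensus on top of this structure so that thousands of independent nodes agree on the same chain, which is what makes the record effectively immutable.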
Many great books have been written about blockchain and its benefits to society, but few mention its direct relationship to the metaverse as the foundational layer of trust and trade for 3D assets across platforms. Let’s break down the individual technologies:
An NFT is, at its simplest, a tag or receipt of ownership of a digital or physical asset, recorded on a distributed ledger. The underlying asset (avatar, 3D object, art piece, etc.) is rarely held by the NFT itself, but rather by a third-party hosting service, like AWS or Azure. Artists, brands, creators, and holders of IP love the idea of NFTs because, depending on the smart contract, the creator of the project can take a percentage of any subsequent sale. Imagine if da Vinci’s paintings had been minted as NFTs, so that every subsequent resale netted his estate 5%.
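That royalty mechanic is just arithmetic encoded in the smart contract. Here is a minimal sketch of the da Vinci thought experiment (the 5% rate and the resale prices are made up for illustration, and real contracts settle in on-chain currency, not floats):

```python
def settle_resale(price, royalty_rate=0.05):
    """Split a resale into a creator royalty and seller proceeds (toy model)."""
    royalty = price * royalty_rate
    return royalty, price - royalty

# A painting resold three times; the creator's estate collects 5% each time.
resales = [1_000_000, 2_500_000, 10_000_000]
estate_total = 0.0
for price in resales:
    royalty, to_seller = settle_resale(price)
    estate_total += royalty

print(f"Estate earns {estate_total:,.0f} across {len(resales)} resales")
# → Estate earns 675,000 across 3 resales
```

The point is that the split is enforced automatically at transfer time, with no gallery, lawyer, or auction house in the loop.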
DAOs are organizations without a hierarchical management structure. They are decentralized to give token holders voting rights, and they work best when formed for a single purpose (e.g., ConstitutionDAO, formed to bid on a rare copy of the US Constitution). As business structures become more complicated, on-chain governance turns rigid, since every change to the underlying code requires consensus. Newer DAO structures, like Madder.Science DAO, are acting as incubators/accelerators for promising projects in the metaverse space. However, this business model is still unproven and will take time, and a community focused on growth, to succeed. (Disclosure: The author is a founding member of Madder.Science DAO.)
Cryptocurrencies are the economic layer of the metaverse, with the promise of instant, cross-border transactions without government oversight and taxation, at least in theory. Of course, assuming any unregulated asset class would remain so would be a grave miscalculation, and assuming governments won’t come looking for tax would be equally so. Since it is relatively easy to create and distribute a cryptocurrency, many have been created: to date, over 20,000 different cryptocurrencies with a total market capitalization of over $1 trillion. These coins (or tokens) range from popular ones such as Bitcoin (BTC), Ethereum (ETH), and Tether to obscure tokens such as Pitbull (PIT) and Vaquita (VAT).
Not all blockchains are created equal. Some, like Bitcoin, use a process called proof-of-work (PoW), in which miners expend computer processing to ‘mine’ coins, and this consumes a great deal of energy. Newer, faster, and more environmentally sustainable protocols like Hedera (HBAR) and Solana (SOL) use a proof-of-stake (PoS) model, in which approved validators stake tokens rather than burn computation. Ethereum moved to a PoS system in mid-September 2022.
The third and potentially most impactful technology required to power the metaverse is Artificial Intelligence (AI). As the world moves from phone to face over the next decade, new technologies will be required to help create virtual worlds and objects.
AI is a computing concept that helps machines think and solve complex problems as we humans do with our intelligence. A human performs a task, makes mistakes, and learns from them; likewise, an AI is supposed to work in the same fashion as part of its self-improvement (Dibbyo Saha, Ryerson University).
AI is built upon several academic subdomains: computer vision, deep learning, machine learning, natural language processing, neural networks, evolutionary computation, suggestion algorithms, and simultaneous localization and mapping (SLAM).
Creation — DALL-E by OpenAI, Midjourney, NightCafe, and Deep Dream Generator create novel 2D visual art in seconds. This has been a dream of computer scientists since the beginning of AI, but I fear that graphic artists will struggle as this form of AI creates license-free works instantly for pennies.
Other, newer systems, such as Anything.World, can create entire 3D environments and scenes from simple text input. Kaedim is a new platform that promises to turn 2D images into 3D objects with mesh-level details.
There is also an entire practice, known as deepfakes, of turning 2D videos of one person into another. DeepFakesWeb and ReFace allow you to pretend to be a celebrity using deepfake videos. This is all pretty revolutionary and will have a massive impact on the creation of the metaverse on a global scale.
Conversation — Natural language processing (NLP) is the study of enabling computers and humans to communicate in a more natural way. New AI platforms such as Inworld.AI, Dialogflow, and LivePerson are already operating at scale, servicing websites and applications across myriad industries.
Computer Vision (CV) — CV uses cameras and sensors to understand the world around you and return data in real time. The best example is simultaneous localization and mapping (SLAM), which uses the RGB cameras on your phone to work out where the ground is and where objects are in relation to you, projecting virtual graphics on top in the form of augmented reality. CV is also used to recognize faces, places, and things (try Snapchat’s plant identifier) and will be a big part of the metaverse when we live half our lives there.
Personalization — Personalization engines are the main drivers behind better movie recommendations (Netflix, TikTok) and better product suggestions (Amazon, Alibaba). In the context of the metaverse, these suggestion algorithms will help you find new communities, learning opportunities, and job prospects. But bias can be baked into these algorithms, and great care must be taken to ensure that they do not exclude or harm their users.
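Under the hood, many suggestion algorithms come down to one idea: score candidates by similarity to what a user already engages with. A minimal sketch (the interest vectors and community names are made up; real engines use learned embeddings over millions of features):

```python
import math

def cosine(a, b):
    """Cosine similarity between two interest vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

# Toy feature space: [gaming, education, art]
user_profile = [0.9, 0.3, 0.1]  # heavy gamer, mild learner
communities = {
    "Esports World":   [1.0, 0.1, 0.0],
    "Virtual Campus":  [0.1, 1.0, 0.2],
    "NFT Gallery":     [0.2, 0.2, 1.0],
}

ranked = sorted(communities,
                key=lambda c: cosine(user_profile, communities[c]),
                reverse=True)
print(ranked)  # most relevant community first
```

Note that whatever bias lives in those profile features flows straight through to the ranking, which is exactly why the care urged above matters: the algorithm faithfully amplifies its inputs, fair or not.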
As you can see, the metaverse will be a complicated mess of new technologies, virtual worlds, talking avatars, suggestion algorithms, predictions, and computers that know more about the world around us than we do. Great care must be taken to ensure that the metaverse is built for everyone, with inclusion and diversity at its heart. It will be imperative over the next few years that we build technology to serve humanity, not just to make more money for the ultra-rich. This is the promise of a decentralized, democratized metaverse. Let’s take care to build the future we want.