Metaverse Interoperability Space


  • Users of the Metaverse will be spread across worlds; high concurrent user count is only required when users converge on a single (unsharded) area. Region-based scaling is already offered as a service by SpatialOS and Big World Technology.
  • Interoperability for a MetaGalaxy (i.e., a subverse of the Metaverse) is easier than for the Metaverse, because the authority of the subverse dictates the standard.
  • Updating standards and frameworks causes fragmentation of the interoperability space and risks the creation of legacy systems.
  • If web standards are to form the Metaverse, then native OS applications would have to be re-engineered for interoperability at cost.
  • If native OS applications are to be part of the Metaverse, the OS would be responsible for switching between worlds.
  • Client-side interoperability allows for Ubiquity, i.e., users accessing the Metaverse on all devices; server-side interoperability allows for virtual worlds to use services provided by other virtual worlds.
  • It will be hard to support some low-power devices if the footprint for interoperability is large e.g., a peer-to-peer system.
  • There is already fragmentation in the blockchain interoperability space; virtual world owners can choose to reject blockchain, leading to potentially more fragmentation in the Metaverse interoperability space.
  • Whether fraudulent identity or assets are recorded on blockchain or held through the authority of another virtual world makes no difference; administration is required to correct the situation.
  • From an interoperability perspective, trusted virtual worlds can offer identities and assets, similar to how blockchain provides them; the Metaverse can form through the self-organization of various agreements, blockchain included.
  • IPSME can be used for both client-side and server-side interoperability, or between blockchains; systems can be integrated without being taken down and re-engineered; avoids the need for standardization and supports legacy systems; and, reduces the complexity of integrating systems.
  • IPSME can enable an ‘industry of integrations’; anyone knowing two systems could write a translation and market the translation.

Web servers, serving up web sites, are linked together through the Internet to form the World Wide Web. The ability to link through the Internet is referred to as technical ‘interoperability’ (Noura, Atiquzzaman, and Gaedke 2019). The Internet, as an enabler of networking, allows other types of servers to be connected as well e.g., email, file transfer and databases. This article will concern itself with the higher levels of interoperability (e.g., syntactic, semantic) needed to connect ‘virtual worlds’ (Nevelsteen 2018) and ‘subverses’ (Nevelsteen 2021a) together.

In Nevelsteen (2021a), I argue that interoperability "is the determining factor for individuals as to how close we are to obtaining the Metaverse".

Before diving into interoperability, I will first speculate that there are basically three ‘camps’ to which people align, with respect to how the Metaverse will evolve.

  • The first camp is the ‘Web camp’; web standards will continue to expand and evolve, and form the basis of the Metaverse (Louis 2021). This would mean that all virtual worlds in the Metaverse would be accessible via an Internet browser available on each Operating System (OS).
  • Those in the second camp, the ‘Blockchain camp’, consider blockchain technologies so profound that blockchain will form the basis for the Metaverse. I think it is fair to say that the Metaverse OS (Burke 2021) and the Crucible Network fall in this camp.
  • The third and last camp is the ‘camp of Everything’, which harbors the idea that everything should have the opportunity to connect to the Metaverse, without costly re-engineering for interoperability e.g., converting to web applications. I speculate that the Metaverse Primer (Ball and Navok 2021) is written in this context e.g., "we want as much of the world to integrate into the Metaverse as possible".

Basics: serving up one world

A virtual world server can be a single machine that serves up a shared world for its users. To connect to the virtual world server, users often run special client software on their devices. If the virtual world is web-based, the Internet browser is the client software.

Instead of a single server, a cluster of servers can be used to scale up a service to serve a massive number of users e.g., cloud computing. Such a cluster forms a multi-server architecture (Yahyavi and Kemme 2013). This technique works for web sites (e.g., Google search) as well as for virtual worlds. Virtual worlds can be scaled up by partitioning virtual space using ‘regions’ and/or ‘shards’.

“Using regions, the virtual space is divided into static or dynamic areas, with each area handled by a different group of servers; players can still move throughout the entire virtual space. With shards, players are divided up into groups and assigned to a unique copy (a shard) of the virtual space, with each shard handled by a different group of servers; players are prohibited from moving between shards. Shards are copies of the same virtual space that do not synchronize with each other. Note, that a shard can be divided up into regions.” (Nevelsteen 2018)

Region-based scalability is now offered as a service by modern game engines e.g., SpatialOS and Big World Technology. We are still only serving up one virtual world, but to a massive number of users. World of Warcraft was divided up into shards; each shard was referred to as a ‘realm’. In Nevelsteen (2018), whether a world is contained in a single shard determines whether it constitutes one virtual world or several. Because each realm is a different shard, each realm is a different virtual world of the same game.
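To make the two partitioning schemes concrete, here is a minimal Python sketch; all function names are illustrative and not taken from any particular engine. Regions map a position in one shared space to a server group, while shards pin each player to one fixed copy of the space.

```python
# Sketch (hypothetical names) of the two partitioning schemes:
# regions divide one shared space among server groups; shards assign
# players to independent, non-synchronized copies of the space.

def region_for(position, region_size=100):
    """Regions: static grid cells of one shared space; players can
    cross between regions (a handoff between server groups)."""
    x, y = position
    return (int(x // region_size), int(y // region_size))

def shard_for(player_id, num_shards=4):
    """Shards: each player is pinned to one copy of the space and
    cannot move between copies."""
    return hash(player_id) % num_shards

# Two players far apart in the same world land in different regions...
assert region_for((50, 50)) != region_for((250, 50))
# ...but a player's shard is fixed, regardless of position.
assert shard_for("alice") == shard_for("alice")
```

Note that, as the quote above states, a shard can itself be divided into regions; the two schemes compose.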

Concurrent User Count, Continuous Space

Users of the Internet as a whole are spread out over different web sites, hosted by many servers. A web site doesn’t usually go down unless an extreme number of users try to access the same site simultaneously, overloading it i.e., essentially a Denial of Service attack. Similarly, in the Metaverse, consisting of a collection of many virtual worlds/subverses, users will also be spread amongst all participating worlds and subverses. Server load becomes a problem when users converge on a single (unsharded) area e.g., EVE Online used game design in an attempt to avoid a convergence of users (Emilsson 2010). RP1 advertises a "persistent, unsharded, real-time platform with infinite scalability" for the Metaverse i.e., an infinite number of concurrent users. RP1 doesn’t provide any details for their platform, but the claim seems far-fetched considering the number of pairwise interactions grows as O(N²−N). They would have to have a form of interest management to keep the number of interactions in check.
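The quadratic growth can be illustrated with a small sketch. The `with_interest_management` function below is a hypothetical illustration of the general technique (limiting each user to nearby peers), not RP1's undisclosed implementation; a real system would use a spatial index rather than a brute-force filter.

```python
# With N users converged in one area, the naive pairwise interaction
# count is N*(N-1), i.e., O(N^2 - N). Interest management limits each
# user to nearby peers, keeping per-user work roughly constant.

def naive_interactions(n):
    return n * (n - 1)  # every user considers every other user

def with_interest_management(positions, radius=10.0):
    """Only pairs within `radius` interact. A production system would
    use a grid or quadtree instead of this O(N^2) filter."""
    pairs = []
    pts = list(positions)
    for i, (x1, y1) in enumerate(pts):
        for x2, y2 in pts[i + 1:]:
            if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= radius ** 2:
                pairs.append(((x1, y1), (x2, y2)))
    return pairs

assert naive_interactions(1000) == 999_000  # grows quadratically
# Spread users out and only close pairs remain to be processed:
spread = [(i * 100.0, 0.0) for i in range(1000)]
assert len(with_interest_management(spread)) == 0
```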

Another successful technique is to simply limit the number of users that are able to connect to a particular service or server. Ball (2020) mentions the Marshmello concert of 2019, where 11 million people experienced the same event. In order to avoid having all those people converge on a single concert space, more than 100,000 sharded ‘instances’ (Nevelsteen 2018) were orchestrated simultaneously, each with 100 people or less.

Interoperability Introduction

A ‘MetaGalaxy’ can be formed, by having "multiple virtual worlds clustered together as perceived collectives under a single authority" (Dionisio, Burns III, and Gilbert 2013). Users of the MetaGalaxy can then create a unique virtual world inside the galaxy, according to specifications provided by the MetaGalaxy. There are many of these MetaGalaxies in existence today, sometimes being referred to as ‘a metaverse’ (Batchelor 2021). Each virtual world inside the MetaGalaxy can be served by a single server or multiple servers. To achieve a MetaGalaxy, higher levels of interoperability are required between the various virtual worlds; participants must speak the same protocol. A common protocol can be achieved through ‘standardization’ or by requiring systems to adopt a ‘framework’ as a dependency (Nevelsteen and Wehlou 2021). Because a MetaGalaxy is controlled by a single authority, that authority can specify the standard protocol or framework to which all worlds in the MetaGalaxy must conform.

To obtain ‘the Metaverse’ (Nevelsteen 2021a), there must be interoperability between all participating virtual worlds and MetaGalaxies (i.e., subverses of the Metaverse). There are, currently, many groups developing (possibly competing) standards or frameworks for a possible Metaverse e.g., Epic Games, the Open Metaverse Interoperability Group, those advocating the Open Metaverse OS, and more.

Standardization and Frameworks

When a protocol is specified to be a ‘standard’, those who want to participate must adhere to the standard in order to communicate. If a group of ‘stakeholders’ (Madni and Sievers 2014) can agree on a common protocol, then that protocol is a standard for that group. The process of standardization can be shorter in smaller groups. Within a MetaGalaxy, it is possible that the single controlling authority specifies the standard protocols and all virtual worlds in the MetaGalaxy conform. Specifying a public standard is a more arduous process as the number of stakeholders increases. Standardization is "a slow and limiting solution which can stifle fast paced innovation" (Nevelsteen and Wehlou 2021).

Probably the most successful public standard is that of HTML. According to W3Techs (2021), on 6 September 2021, 88.9% of all web sites support HTML version 5; 4.3% support a different version and 6.8% support something different. That means 88.9% of web sites should render in a modern browser, with 11.1% not being supported. Each update to the web standard causes further fragmentation e.g., when an update is released, the interoperability space splits into those supporting the new standard and those that do not. All systems not updated risk becoming legacy systems (Selberg and Austin 2008). Legacy systems are difficult or impossible to re-engineer and bring up to standard (Madni and Sievers 2014; Selberg and Austin 2008).

People in the web camp support the idea that web protocols will form the Metaverse. This means existing native OS applications would have to be re-engineered (if possible) to web applications e.g., via WebAssembly (Louis 2021). Such re-engineering can be costly.

If disparate standards for the Metaverse arise, the interoperability space for the Metaverse also risks fragmentation, with each standard gathering only a smaller share. The stats by Levin (2021) can be used as an example of a fragmented interoperability space for Internet APIs in 2014; the shares for REST, SOAP, JavaScript and XML-RPC being 69%, 18%, 5% and 2%, respectively; 6% other.

Frameworks are similar to standards except that, instead of a protocol, a framework (e.g., Software Development Kit) is provided which must be conformed to. The framework implements the common protocol. As long as systems use the latest framework, communication compatibility can be maintained. Updating the framework requires the re-engineering of systems to adhere to the update, or risk legacy systems.

Game Engines

Ball (2020) mentions that game engines such as Unity or Unreal Engine could implement common protocols allowing for the formation of the Metaverse. Unity and Unreal are already frameworks for a large share of video games. A tactic to form the Metaverse could be to implement interoperability in these existing frameworks, to try to obtain unanimous adoption. Epic Games already offers "online services" via a framework for any game developer wanting support services such as voice chat, in-game achievements, matchmaking and more; it is only a small step to provide more interoperability services.

The problem is that competing standards and frameworks effectively fragment the Metaverse interoperability space e.g., if Unity and Unreal implement competing protocols.

If Unity and Unreal were to implement the same protocols, then the adoption share might be quite high. All worlds/subverses of the Metaverse would have to either use Unity/Unreal or implement the standard protocol themselves. The problem with this approach is that, after each new release, existing systems must be re-engineered to support the release; this re-engineering can be costly. Failure to update risks creating legacy systems. This tactic is in the camp of Everything. For the tactic to work, fragmentation would have to be low; a standard with a large share that is not updated often, e.g., HTML. The slow updating of the standard could, however, stifle fast-paced innovation for the Metaverse.

If the protocol implemented by Unity/Unreal were the web standard, then the adoption share could be even higher. Using Unity, it is already possible to export a project for web and native platforms. Unless all native OS applications are re-engineered to web applications, as described in the Web camp scenario above, this approach is in the camp of Everything. The problem is that many existing applications are not web applications. The OS would be responsible for switching between native applications and the browser for web applications, for which there is currently little or no interoperability.

Client-side vs Server-side

‘Ubiquity’ (Nevelsteen 2021a) for the Metaverse means access via any device e.g., desktops, laptops, gaming consoles and more, using text descriptions, 2D, 2.5D or 3D. This would be the client-side of the Metaverse interoperability space. Fragmentation of this space depends on the range of devices supported, and is especially likely in the early stages of interoperability. Users without a particular device could be excluded from accessing some virtual worlds e.g., "all of these virtual worlds [Cryptovoxels, Decentraland, etc] still require at least a smartphone, which still currently excludes 6/10 of the global population" (Burke 2021). Any client-side interoperability solution will have to cater to, and maintain compatibility with, various devices; failure to do so will also result in a greater degree of fragmentation. Client-side interoperability means protocols are in place so that clients can switch virtual worlds/subverses, similar to how a web browser switches web sites when the user clicks a link.

Ball (2021a) states that, "we need to think of the Metaverse as a sort of successor state to the mobile internet". With respect to the fact that interface devices (i.e., client-side) are getting smaller and more personal, this would be correct. However, with respect to interconnecting virtual worlds (Dionisio, Burns III, and Gilbert 2013) from the server-side, it would be more correct to consider the Metaverse as a successor of the Internet in general (Nevelsteen 2021a). Server-side interoperability allows virtual worlds to make use of services from other servers. The reusability and interoperability of such server-side services converge into strategies such as Service Oriented Architectures (Madni and Sievers 2014). Server-side interoperability includes the usage of Blockchain technologies and connects the Internet of Things (IoT).

Access via Device

If user devices are to be connected to the Metaverse, those devices must support client-side interoperability. If we specify a Virtual Reality (VR) head-mounted display (HMD) as the access device, then such a device has an OS with native applications running on the device; one of those applications can be a web browser. If we subscribe to the Web camp, then native applications would be disconnected from the Metaverse, and the web browser used by the user to navigate from one virtual world/subverse to the next (including possibly local on-device worlds). If native applications are to access the Metaverse, then the camp of Everything is subscribed to, and interoperability must be sought between native OS applications and the web browser e.g., to allow switching between browser and native applications when appropriate.

Augmented Reality (AR) can be used to access the Metaverse (Nevelsteen 2021b) e.g., through a mobile phone. The scenario on a mobile phone is similar to that of the VR HMD i.e., an OS running native applications, including a web browser. For other devices (e.g., smart glasses, contact lenses, autonomous vehicles), a web browser might be absent and only an architecture with an SDK might be available. This is problematic for the Web camp, unless custom web browsers are created for those platforms. The conclusion for the camp of Everything is that the OS is the interface for the Metaverse, not the browser. If the OS is the interface, that means Valve, Apple, Facebook and Snap have control over their respective platforms: Valve Index, iPhone/macOS, Oculus and Spectacles. This means those companies have control to promote or stifle interoperability e.g., Apple being the gatekeeper for applications installed on their platform or Oculus blocking competing HMDs from accessing their content.

Some devices are so low-power that there are limits to the interoperability footprint possible on the device e.g., wearables or IoT devices. A large interoperability footprint could be problematic for the Blockchain or Web camps. Low-power devices can make use of gateways as mediators, but this pushes the problem of interoperability to the gateways e.g., competing gateways (Noura, Atiquzzaman, and Gaedke 2019) already fragment the IoT interoperability space.

Decentralized or Distributed

It has been said that ‘decentralization’ is a characteristic of the Metaverse (Nevelsteen 2021a). It has already been discussed how a distributed system can be used to scale one virtual world to serve a massive number of users. The approach is still a centralized one, in the sense that all users connect to one centralized world, consisting of a distributed multi-server architecture. If the Metaverse is "a collection of many virtual worlds/subverses" (Nevelsteen 2021a), then that collection is a decentralized one; many users, each behind a client device, connected to one or more virtual worlds, with each world perhaps a distributed system of servers.

The Metaverse would still not be a fully distributed system, unless clients connected to each other i.e., a ‘peer-2-peer’ (Yahyavi and Kemme 2013) system where every compute node is both client and server. Those in the Blockchain camp might argue that the Metaverse will be an ultra-large peer-2-peer system. Such an architecture could prove problematic for some devices e.g., mobile devices without constant availability are not ideal for server functions, and neither are low-powered devices lacking performance. The interoperability footprint is too high for such devices. The Metaverse would lack ubiquity, meaning the interoperability space would not be covered.

One might suggest a hybrid architecture; peer-to-peer combined with a server-based architecture (Yahyavi and Kemme 2013). But, the purpose of such an architecture is not to lighten the load of each node, but to ensure proper operation of the system e.g., avoid cheating.

Blockchain

The next generation of the web is said to be "increasingly decentralised and based on user centricity and the sovereignty of their data and wealth ... a paradigm ultimately based on blockchains" (Burke 2021). Blockchain technologies have the potential to allow users to utilize a single identity with assets, in one virtual world/subverse to another, across the Metaverse. Ball (2021b) states the shortcomings of blockchain, "specifically because of their decentralization", to currently be: costly transaction fees, slowness and energy use.

Blockchain is a decentralized peer-to-peer technology, and Burke (2021) has peer-to-peer as the basis for the Metaverse OS. But, peer-to-peer is not suitable for all devices (e.g., low-powered), making it hard for blockchain to obtain ubiquity, and risking fragmentation of the Metaverse interoperability space. This makes it problematic for blockchain to be the basis for the Metaverse, as prescribed by those in the Blockchain camp. Ball (2021b) already notes that "all of the major NFT platforms and blockchain-based worlds are browser-based and lack console releases and mobile applications".

Supporting blockchain requires implementing standard protocols or frameworks. There are already many different blockchains in existence today (i.e., competing standards), with blockchain interoperability as an open issue. If there is already fragmentation of the blockchain interoperability space, then why would there not be an increase in fragmentation when considering the entire Metaverse, i.e., fragmentation of the Metaverse interoperability space? With an open Metaverse, anyone can add a server to the Metaverse and be the authority of the virtual world on that server. That authority can choose whether to support or reject blockchain, leading to potentially more fragmentation.

In Nevelsteen (2018), virtual worlds are distinguished based on their single shared data space, a shard. Blockchain technology can be a horizontal layer overlaid across the vertical silos of virtual worlds. In the extreme case, a virtual world could be composed primarily of blockchain elements. This means virtual worlds become "increasingly interoperable and interconnected to the point it will be hard to distinguish them as separate but rather different instances of a whole" (Burke 2021). It is not difficult to understand why those in the Blockchain camp already call this collection of blockchain-connected worlds ‘the Metaverse’.

Although blockchain has incredible potential, there are still some fundamental problems with respect to interoperability. An identity can be registered on a blockchain, allowing worlds to authenticate identity. From an interoperability standpoint, this is identical to using another trusted virtual world as identity authenticator, provided server-side interoperability exists allowing one virtual world to query the other. If an identity is spoofed, then it doesn’t matter whether it is on blockchain or maintained by another virtual world; administration will be required to correct the fraudulent behavior. This is currently the case for many artists who are having their art "stolen" e.g., minted on the blockchain and sold by a fraudulent artist. If a virtual world in the Metaverse is malicious, it could use blockchain as a source of data, but have a blatant disregard for rights and permissions e.g., allowing users to import any blockchain data into the world, refusing to import a user’s data, or neglecting to revoke access to an item when ownership is transferred. The scenario is the same if another virtual world were the authority over the virtual item e.g., World of Warcraft dictating who owns an ancient artifact.

Having blockchain as an authority for identity and assets across worlds could be powerful, especially since the authority would not be under the control of a single company or organization. However, I think it would be naive to discount the possibility that other virtual worlds/subverses can also offer their identities and assets out to the Metaverse. This is exactly what server-side interoperability offers; virtual worlds can provide services to other worlds. This is especially true for virtual worlds that do not want to support blockchain or those worlds that would like a hybrid approach. Instead of trying to blanket the Metaverse interoperability space with one overarching technology, the Metaverse could form through the self-organization of various agreements between worlds, blockchain included. This is exactly why server-side interoperability is important.


What does IPSME (Nevelsteen and Wehlou 2021) have to offer?

IPSME is not a protocol or framework, but a set of conventions, with a super small footprint, making it possible to implement on almost any system (i.e., even low-powered devices); the pubsub dependency forming the messaging environment usually already exists on most platforms. This means IPSME can be utilized for both client-side and server-side interoperability, interoperability between competing blockchains, and the interoperability for an OS to switch between web and native OS applications. IPSME describes a dynamic evolutionary architecture for a system of systems e.g., the Metaverse.

Integrations between systems are made external to the systems being integrated. When interfaces utilized in an integration are updated, the systems being integrated do not have to be re-engineered, meaning those systems don’t even have to be taken down. Translations can be updated, or new translations added to the system, dynamically and externally, incorporating the updates. In the analogy by Ball (2021a), industrial plants "continued to use a lumbering network of cogs and gears that were messy and loud and dangerous, difficult to upgrade or change". Rather than requiring existing systems to be re-engineered to support a particular protocol or framework, IPSME allows systems to keep using their "cogs and gears", but adds translations for interoperability.
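As a rough illustration of this idea, the sketch below shows a translation installed externally into a shared pub/sub messaging environment, so that neither system is touched. All names here are illustrative assumptions, not the IPSME specification.

```python
# Sketch of external integration over pub/sub: system X keeps its own
# message format ("cogs and gears"); a translation, external to both
# systems, is added to the shared messaging environment so that system
# Y can understand X's broadcasts. Names are hypothetical.

class MessagingEnvironment:
    def __init__(self):
        self.subscribers = []
    def subscribe(self, fn):
        self.subscribers.append(fn)
    def publish(self, msg):
        for fn in list(self.subscribers):
            fn(msg)

env = MessagingEnvironment()
received = []

# System Y only understands messages tagged with its own protocol "Y".
env.subscribe(lambda m: received.append(m) if m.get("proto") == "Y" else None)

# External translation X -> Y, installed without re-engineering X or Y.
def translate_x_to_y(m):
    if m.get("proto") == "X":
        env.publish({"proto": "Y", "body": m["body"]})
env.subscribe(translate_x_to_y)

# System X broadcasts in its own protocol; Y receives the translation.
env.publish({"proto": "X", "body": "hello"})
assert received == [{"proto": "Y", "body": "hello"}]
```

If X's interface is later updated, only the translation function needs replacing, which can happen at runtime without taking either system down.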

The role of IPSME reflectors is to allow any organizational topology for communicating participants i.e., to promote scalability. Reflectors provide interoperability between competing platforms (e.g., different operating systems) by leaving the protocol by which reflectors communicate undefined. Reflectors also separate complexity, so that authors of participants have a very limited scope of other communicating participants they must take into account during development.

IPSME avoids the need for standardization, by allowing any number of protocols to be used simultaneously. Deviations from a standard can be supported by adding translations. Legacy systems can be supported without requiring them to be re-engineered, which is sometimes costly or impossible.

IPSME reduces the complexity of the system of systems, by reducing the number of integrations required for a fully connected system. Translations are reusable and transitively applicable. If someone creates a translation between interfaces A and B, and installs it on system X, then a system Y can also install that translation to gain the same capability. If three systems, X, Y and Z, and two translations, A : X↔Y and B : Y↔Z, are communicating via IPSME, then a translation X↔Z is implicit. System X broadcasts out a message which is translated from X to Y by A, and again from Y to Z by B. As the number of translations in the system of systems grows, complexity can be detected and translations added dynamically, to increase performance and improve message translation quality.
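The transitive chain can be sketched as follows, with illustrative names only: translations A : X↔Y and B : Y↔Z subscribed to the same messaging environment let a message broadcast by X reach Z, even though no X↔Z translation was ever written.

```python
# Sketch of transitive translation: with A (X -> Y) and B (Y -> Z) in
# one messaging environment, an X -> Z translation emerges implicitly.

class Env:
    def __init__(self):
        self.subs = []
    def publish(self, msg):
        for fn in list(self.subs):
            fn(msg)

env = Env()
log = []

def translator(src, dst):
    """One-directional translation from protocol `src` to `dst`."""
    def fn(m):
        if m["proto"] == src:
            env.publish({"proto": dst, "body": m["body"]})
    return fn

env.subs.append(translator("X", "Y"))              # translation A
env.subs.append(translator("Y", "Z"))              # translation B
env.subs.append(lambda m: log.append(m["proto"]))  # observe all traffic

env.publish({"proto": "X", "body": "ping"})
# A translated X to Y, then B translated Y to Z; a system speaking Z
# can now receive the message with no explicit X <-> Z translation.
assert "Z" in log
```

Note that each hop re-publishes into the same environment, which is what makes chains of translations compose; detecting long chains and replacing them with a direct translation is how performance and translation quality can be improved.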

Rather than requiring the organizations behind each system to develop an integration for each system they wish to interoperate with, IPSME offers the possibility of an ‘industry of integrations’. Anyone who knows the interfaces of two systems could write a translation, integrating the systems, and market that translation. If a translation is poorly written, then others can write a competing translation. Ball (2021a) attributes the success of the iPhone to the fact that the "iPhone had ‘an app for that’ because millions of developers built them"; the success of the Metaverse could be because IPSME allows developers to build a translation for that.