Considerations for regulating the metaverse: New models for content, commerce, and data

As regulators catch up to Web2, human behavior and technological change continue to drive evolution in metaverse experiences. How can companies and regulators plan for a more distributed, immersive, and tangible internet?

Jana Arbanas, Jennifer McMillan, Chris Arkenberg, and Michael Steinhart | United States

Don’t count the metaverse out. The grand vision of hyperscale metaverse experiences bringing potentially billions of users together into vast 3D worlds, enabled and managed by Web3 protocols like blockchains and cryptocurrencies, may indeed face technical and organizational challenges as it weaves together and unfolds. But the currents moving people and businesses into more digital and embodied interactions could continue to swell into a sea change. Regardless of how people interface or which services they use to do so, the concept of the metaverse is an important way to understand how more of the world is migrating into the internet and how digital systems are connecting more deeply with the physical. As businesses grapple with the opportunities, regulators are considering the implications.

Data from Deloitte’s 2023 Digital Media Trends survey shows that digital “places” are a very real part of users’ lives. Around one-third of US respondents consider online experiences to be meaningful replacements for in-person experiences, and half of all Gen Zs and millennials surveyed agree. Among these younger generations, 48% agree that they spend more time interacting with others on social media than in the physical world, and 40% say they socialize more in video games.1 These generations are much more engaged with multiplayer video games that support self-expression, social connection, and immersion—and they’re spending money to personalize their avatars with digital goods.2

Regulators are still considering the challenges of the Web2 era: endless content, data collection, privacy, and increasingly complex digital economies. But are current regulations prepared to address the challenges and harness the opportunities presented by hyperscale metaverse experiences? Are businesses ready for the challenges of content and conduct, privacy, and trade when people don digital avatars to express themselves, socialize, and transact, casting large data shadows across vast networks? What are some of the implications when glances, gestures, and even speech are fully digitized, recorded, and stored, when identities can be distributed and obfuscated, and when source and provenance are even more uncertain thanks to generative AI? How can ownership of digital things, dominion over virtual territory, and currency that comprises a bundle of crypto tokens be defined?

These questions may not have answers yet, but business leaders and regulators should be asking them. Technology and media companies are committing billions of dollars to develop metaverse technologies in a race to capture consumer and business demographics. Investors are eyeing trillions in value.3 The hype may have died down, but the deeper currents are still building.

In this article, we explore how regulators are working to tackle the challenges of hyperscale social media and content creation—issues such as content moderation, conduct, privacy, copyright and fair use, and tax and trade—and then consider how metaverse experiences could amplify these challenges and necessitate new regulatory considerations.

Why is the metaverse different?

However it’s defined, the metaverse is a place. It approximates the physical world and typically acts as if it were solid. It has physics, and its users have bodies through which they can act, interacting with objects, others, and the virtual world. Social video games illustrate this “placeness” and the embodiment of avatars in interactive environments. Today’s most popular social video games host tens of millions of players, with all the attendant behaviors such large and diverse groups entail.4

If a metaverse enables potentially tens of millions of users to interact together outside the confines of a specific game, how might this amplify the challenges of moderation, behavior, harassment, and misinformation? If a user says something in a metaverse space, is that the same as posting content? Who owns that content, and who is liable? Is an avatar an individual whose speech is protected? What constitutes abuse? Are the rules defined by the location of the users or the location of the service provider, or both? And where is the metaverse, anyway?

This lack of location and the abstraction of humans into avatars and things into data objects challenge companies and regulators to think differently about existing categories of law in metaverse experiences. A European Parliament briefing, for example, outlines the potential opportunities and risks for competition, liability, and intellectual property.5 South Korean lawmakers stated that metaverse use cases raise many issues around data governance, privacy, and user safety.6 The Japanese Ministry of Economy, Trade and Industry (METI) has established a “Web 3.0 Policy Office” to connect finance and tax departments with media, sports, fashion, and other industries to foster economic interoperability in the metaverse.7 Such efforts now exist alongside broad Web2 regulatory frameworks.

Yet more questions are likely needed to prepare for the metaverse—and for the regulations it may require. Do legal definitions of content apply to 3D objects and assemblies? Will metaverse experiences make data capture much more powerful? How should we view the emergence of AI influencers and digital twins of celebrities?8 What are the tax implications when identity and location may be hidden by encrypted blockchains? Amid so much continuous disruption, regulators are often caught between enforcing existing laws, determining where those laws might apply to metaverses, and deciding when entirely new rulesets may be needed.9

The regulatory focus: Content, conduct, privacy, and trade

By analyzing current regulations, the human behaviors that drive their evolution, and the novel implications that metaverse interaction models can create, business leaders have an opportunity to inform their strategies and implementations. Staying ahead of domestic and global regulatory developments can help mitigate risk and define the scope of innovative experiences. These considerations could enable a more equitable and safer user environment in the metaverse, drive public adoption of the technologies, and help inform future policy developments.

When anticipating how regulators may seek to manage hyperscale metaverse experiences, companies should consider how they’re currently addressing content, conduct, privacy, and trade issues such as tax and finance.

Content, conduct, and safety

Providers and regulators are now contending with the consequences of digitizing human behaviors into hyperscale social environments—the leading social media services that span continents and gather hundreds of millions (and even billions) of users. Social media, user-generated content services, and multiplayer video games are working to monitor conduct and moderate content, aiming to protect their brands and support positive experiences for their users while establishing responses for when problems arise.10

These services are used by billions of people across many cultures, normative spaces, and legal jurisdictions. Yet they are still navigating which kinds of content and conduct require moderation. Depending on how it’s created and distributed, content can undermine ownership, as with copyright violations. Depending on the views it expresses and the images it contains, content can be illegal, like hate speech, or offensive by the varying standards of different cultures. Conduct such as harassment or bullying can lead to harms and undermine the safety of these experiences.11 In the US, Section 230 of the Communications Decency Act generally provides online platforms with immunity from liability based on third-party content. Many lawmakers are pushing for this immunity to be scaled back.12

In the European Union, the Digital Services Act (DSA) established a regulatory regime that requires platforms to manage the risks that illegal and harmful content and activity may pose to users and society.13 Aimed primarily at services that allow users to share content (including social media platforms), the DSA seeks to increase online safety, drive accountability, and create transparency on how content and behavior are managed on platforms.

While the EU has put in place a regulatory framework across its member states, there is limited federal law in the United States, where individual states are legislating piecemeal. A challenge in creating social media regulation is the need not only to protect against harmful content but also to protect free speech. California’s AB 587, for instance, requires social networks to post their content moderation policies and describe their processes for flagging and reporting problematic content such as hate speech, racism, extremism, dis- and misinformation, harassment, and political interference.14

There seems to be little global consensus on how to regulate human behaviors in social experiences. How such rules might play out in metaverse spaces, where codes of conduct are suggested but enforceable rules are scarce, could be even more fraught.15 Where are the lines of self-expression through avatars? What about virtual violence? How should copyright apply to avatars and digital clothing, or to deepfakes and generative AI content? In the absence of regulation, many of these issues are being tested, and legal positions defined, through lawsuits. For instance, issues such as fair use and the copyright status of generative AI works are being debated in courts in multiple jurisdictions.16 The EU aims to address the risks of AI-generated content in its Digital Services Act and draft AI Act, requiring those in scope to classify levels of risk and enable labeling of deepfake and inauthentic content, for example.17

These issues may become more problematic in the presence of younger users. The UK’s Age-appropriate Design Code (for privacy) and pending Online Safety Bill (for content), along with the EU’s Better Internet for Kids (BIK+) strategy, seek to clarify how services should protect younger users, including recommendations for enabling “strong” privacy by default, turning off location tracking, and not prompting children for personal information.18 Providers are asked to assure the age of users, protect their data, and shield them from harmful and illegal content.19 The US state of Utah recently enacted requirements for social networks to secure parental consent before any child account is created and to set curfews on child accounts, preventing access between 10:30 PM and 6:30 AM.20 China limited the amount of time minors can play video games in 2019, and tightened the limits further in 2021, aiming to deter “internet addiction.”21 In more immersive experiences, will younger users find it difficult to limit their time online? And whose responsibility is it to set those limits?

Critical considerations

When negative consequences are allowed to continue unchecked, regulators can be compelled toward strong responses that can limit growth and innovation. As regulations come into effect, tech companies should be proactive in embedding protections into their current policies and implementations and adopt leading practices that help support positive outcomes for users and society.

Protection by default, trust by design: Providers should enable protections by default, with straightforward and easy-to-manage user controls and policies for content and conduct.22 Restricting unsafe search results for younger users, blocking younger users from appearing in searches, disabling direct messaging from unknown accounts, and establishing new accounts as private until configured otherwise are steps that can help companies demonstrate compliance and dedication to user safety. Some may consider how to selectively “mute” other avatars, keep avatars at a distance, spin off safer metaverse spaces designed for children or teens, and follow a minimal data collection policy for younger users.
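As an illustration, the defaults described above can be sketched as a small policy object. The field names and logic here are hypothetical, not any platform’s actual settings API; they simply encode the “safe until configured otherwise” posture, with tighter defaults for verified minors.

```python
from dataclasses import dataclass

@dataclass
class AccountSafetyDefaults:
    """Illustrative safe-by-default settings applied at account creation."""
    is_minor: bool
    profile_private: bool = True      # new accounts start private
    appears_in_search: bool = True
    safe_search: bool = False
    dms_from_unknown: bool = True
    location_tracking: bool = False   # off by default, per UK/EU guidance

def apply_minor_protections(acct: AccountSafetyDefaults) -> AccountSafetyDefaults:
    """Tighten defaults when the user is verified as a minor."""
    if acct.is_minor:
        acct.appears_in_search = False   # hide minors from search results
        acct.safe_search = True          # restrict unsafe search results
        acct.dms_from_unknown = False    # block DMs from unknown accounts
    return acct

adult = apply_minor_protections(AccountSafetyDefaults(is_minor=False))
child = apply_minor_protections(AccountSafetyDefaults(is_minor=True))
print(child.appears_in_search, child.dms_from_unknown)  # prints: False False
```

The same pattern could extend to metaverse-specific controls, such as default avatar “mute” lists or personal-space distances.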

Real-time content moderation: Moderating such enormous amounts of content and conduct can be very difficult and costly—and moderating metaverse experiences can be even harder.23 But as potential harms become evident, regulators are able to impose greater punitive measures on service providers.24 Providers should pay attention to AI and large language models (LLMs) that may be better able to moderate at scale.25 Such tools may help mitigate harms, avoid legal challenges, and foster enjoyable experiences for most users.

Risk analysis: Metaverse innovations may inadvertently create new ways for bad actors to engage in harmful conduct and exert influence. It is unclear how issues like bullying, harassment, stalking, grooming, and hate speech may manifest in metaverse environments. Companies should consider scenario modeling to identify new risks and mitigation strategies, then communicate them to users and regulators to build awareness.

Global regulators put people in the center

So far, the EU has played a leading role in advancing regulations focused on consumer protections, with a stated “people-first paradigm” that aims to emphasize the safety of individual users. Among these are the EU’s General Data Protection Regulation (GDPR) and Digital Services Act (DSA). These regulations apply across member countries and carry hefty fines for companies found in violation, including for non-EU businesses that operate there.26 They could be applied to metaverse use cases.

On the content and conduct front, a comprehensive Online Safety Bill is advancing through UK Parliament. It includes rules that online platforms must follow to protect children from harmful content and to identify and delete “illegal” content. Lawmakers have indicated that they deem these rules “future-proof” and applicable in metaverse environments, which may create a significant challenge for platforms that don’t authenticate users’ ages.27

In the US, federal frameworks tend to regulate individual industries (finance and healthcare, for instance), whereas consumer protection rules often vary by state. Some US states are adopting rules modeled after those in the EU for protection of digital privacy.28 A federal-level “Kids Online Safety Act,” the goal of which is to protect younger users from social content that may be considered harmful, is under consideration in the Senate and the subject of robust debate.29


Privacy

In the Web2 era, many tech companies and social media platforms have built their businesses around collecting and monetizing user data. However, there are concerns that consumers may not understand how and which kinds of data are being collected, the difference between personally identifiable information (PII) and anonymized data, how it’s being stored and protected, and how it’s being used.30 Tracking users can potentially infringe on privacy rights, and resellers can, if not regulated, provide data to third parties with potentially very different uses in mind.31 Data breaches have put consumers at risk and enabled additional targeting, persuasion, and abuse.32 Given the scale of data in the digital economy and the potential risks to consumers and providers, some governments have shifted to establishing much stronger legal and regulatory frameworks around personal data and privacy.

The EU’s GDPR requires any company that collects and processes personal information of EU residents to provide transparent privacy notices to users, obtain their explicit consent to collect and use their information, and implement data-protection measures.33 The proposed ePrivacy Regulation would govern the use of cookies, electronic marketing, and confidentiality of communications, and sets rules around tracking technologies and electronic marketing practices.34

In the US, data is regulated by industry, and consumer privacy laws vary from state to state, which may create complications for companies whose business crosses state borders.35 Several individual states have data privacy regulations in effect as of this writing. California’s Consumer Privacy Act (CCPA) has similarities to the GDPR, such as the rights to access, delete, and opt out of the sale of personal information.36 Many of the state-level data privacy rules apply primarily to businesses operating in those states, regarding data of residents of those states, and they do not govern data transfers.37

Many of these may apply to the metaverse, but a world of much more embodied and digitized interactions—where gestures, facial expressions, and conversations are captured and stored, for example—could amplify these challenges and enable new ones. A Web 3.0–enabled metaverse could obfuscate user identity through encrypted blockchains or make it difficult to locate data trackers. Metaverse use cases could create new kinds of personal information, like sentiments and emotions collected from virtual reality headsets, and 3D user-generated content co-created with generative AI. Synthetic influencers and celebrity digital twins are already challenging licensing models and even personhood itself.38

Critical considerations

To help mitigate risks and anticipate future conditions, companies should view their data collection and usage through a regulatory lens, implementing privacy protection measures during development, not merely when problems arise. In effect, data should be considered an asset that is also a potential liability requiring risk management.

Operate across borders: Jurisdiction-based privacy laws can be difficult to apply when users in many locations interact with entities in multiple countries. Blockchains and other encrypted distributed ledgers can make ascertaining jurisdiction easier—or more difficult.39 A decentralized framework for digital identity may enable users to “travel” across environments with their unique identity, information, and ownership of goods, but companies would then likely need compatible interfaces that recognize and support each user. Tech companies may consider sharing their operational challenges with regulators to frame common standards and harmonize approaches.40
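A minimal sketch of such a portable identity record, loosely modeled on the W3C Decentralized Identifiers concept, may make the interoperability burden concrete. The field names and prefix check below are illustrative assumptions, not the actual specification: the point is that a host environment can only admit a “traveling” user if it can parse and verify the record.

```python
def make_identity(did: str, public_key: str, goods: list) -> dict:
    """Build a portable, DID-style identity record a user could carry
    across metaverse environments (illustrative fields only)."""
    return {
        "id": did,                         # globally unique, provider-independent ID
        "verificationMethod": public_key,  # lets a host world verify the visitor
        "ownedGoods": goods,               # digital goods that travel with the user
    }

def host_recognizes(identity: dict, supported_prefix: str = "did:") -> bool:
    """A host world can admit a visiting user only if it can parse the record."""
    uid = identity.get("id")
    return isinstance(uid, str) and uid.startswith(supported_prefix)

user = make_identity("did:example:alice", "z6Mk...publickey", ["avatar-jacket"])
print(host_recognizes(user))               # compatible interface: True
print(host_recognizes({"id": "user-123"})) # legacy platform-local ID: False
```

Real DID documents carry considerably more structure (verification methods, service endpoints), which is why shared standards, rather than each provider’s ad hoc format, would likely be needed.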

Privacy by design: Operating from a privacy-first perspective may help companies stay ahead of rules around consent, transparency, and data protection. Instead of finding pockets of customer information and attempting to lock them down, companies should consider a holistic and scalable data management approach. Solutions such as public data vaults and secure peer-to-peer storage architectures could remove some of the burdens of data security and compliance from companies.41

Let customers decide: Establishing policies with metaverse-specific language and empowering users to decide how their data is collected and used can help reduce liability and hedge against emerging regulations.42 Companies should approach this effort strategically rather than responding ad hoc to the uneven patchwork of regulatory regimes. For data-sharing and advertising, financial incentives between consumers and companies may encourage users to share more freely and give them a stronger stake in the success of providers. Customers can also be given more tools to monitor and even modify their data on company platforms.

Tax and finance

Businesses that operate in or transact through Web2 services have been navigating a patchwork of shifting tax jurisdictions, raising many questions about indirect tax, the definition of digital goods and services, and how to handle multiple jurisdictions based on the locations of the buyer, the seller, the service, and the benefit derived.43 On the runway from Web2 to Web3 and metaverse experiences, key factors include establishing identity and location, extending the definition of goods to include digital things, and addressing the challenges of cryptocurrencies.

In Europe, digital service taxes are levied across countries in the Organisation for Economic Co-operation and Development (OECD) on digital activities such as the sale of online advertisements or customer data, and on companies that maintain a digital presence in a given jurisdiction. For example, when a company collects user data, applies analytics, and serves ads, the profits are taxed based on the company location as well as the user’s location where the profit-generating event occurred.44

Value-added tax (VAT) may also apply based on the location of the supply (for physical goods) or the user’s location (for digital goods). However, there is no consensus yet on whether, for example, the organizers of events in virtual spaces, or exchanges of virtual currency for virtual goods, are subject to VAT. The debate hinges on how to define cryptocurrencies and virtual assets.45 The OECD offers guidance for transactions involving crypto assets, and the EU’s Markets in Crypto-Assets (MiCA) regulation provides consumer protections and imposes stricter requirements on crypto exchanges.46

These two developments illustrate traceability and oversight considerations: the parties to a transaction should be identified, the flow of assets should be defined, and this information should travel with the transaction and be retained by both sides. Depending on their implementation, cryptocurrencies can make this easier or harder.47

In the US, the Securities and Exchange Commission has brought cryptocurrency exchanges to court for alleged fraudulent behaviors and has asserted that the majority of cryptocurrencies are securities rather than commodities. The distinction between securities and commodities can determine how they are taxed and regulated.48 To add more transparency to online transactions and deter criminal activity, the INFORM Consumers Act, which took effect in June 2023, requires online marketplaces to collect information about “high-volume” sellers,49 block those that don’t provide the information, and provide means for consumers to report problems.50

In metaverse environments, identification and tracking may be more difficult, as different parties have different levels of access to user identity, information, and location. Many companies have experimented with bundles of real and virtual assets to encourage metaverse engagement and nurture brand loyalty.51 But regulators could struggle to define these assets and may apply different rulesets. For hyperscale metaverse services and the economies they support, specifying effective taxation and auditing may present unforeseen challenges, but companies have a chance to help cocreate those specifications.

Critical considerations

Work by analogy: In countries where rules around digital goods and virtual transactions are not clearly defined, companies and regulators should collaborate on analogies that correlate to existing rules—for example, whether nonfungible tokens (NFTs) are collectibles, or purchases executed with cryptocurrency are subject to capital gains taxes.52 These analogies could help in conversations with regulators and auditors.

Expand tax data reporting capabilities: When transactions cross jurisdictions with different tax and reporting requirements, companies may need to ensure they have systems and processes to capture data, retain it securely, and produce it as needed. However, many back-end systems may not be set up to capture, track, and report the information necessary to determine appropriate taxation and comply with transparency rules. Some legislative bodies are considering real-time reporting, which will require companies to grant them access directly to transaction data.53 It may be wise to include tax and finance professionals in product development discussions in order to help ensure that new models conform to applicable rulesets.

Leverage blockchain: Smart contracts, ledgers, and other blockchain technologies represent a potential solution for tracking and storing transaction information in Web2 and Web3 environments. Real-time information exchange between the platforms and authorities may enable companies to improve compliance and reporting once interoperability and scalability challenges are solved.54
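As a simplified illustration of how transaction details and tax data might travel together on a ledger, consider the hash-chained record below. This is a sketch, not a production blockchain or any jurisdiction’s actual reporting format; the party identifiers and VAT rates are invented for the example. Each entry names both parties and the jurisdiction of the taxable event, computes the VAT owed, and links cryptographically to the prior entry for tamper evidence.

```python
import hashlib
import json

def record_transaction(ledger: list, seller: str, buyer: str,
                       jurisdiction: str, amount: float, vat_rate: float) -> dict:
    """Append a transaction whose tax data travels with it; each entry is
    hash-chained to the previous one for tamper evidence."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "seller": seller,              # identified parties ...
        "buyer": buyer,
        "jurisdiction": jurisdiction,  # ... and where the taxable event occurred
        "amount": amount,
        "vat": round(amount * vat_rate, 2),
        "prev_hash": prev_hash,        # links this entry to the prior one
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)
    return entry

ledger = []
record_transaction(ledger, "seller:acme", "buyer:alice", "DE", 100.0, 0.19)
record_transaction(ledger, "seller:acme", "buyer:bob", "FR", 50.0, 0.20)
print(ledger[1]["vat"], ledger[1]["prev_hash"] == ledger[0]["hash"])  # 10.0 True
```

Because both sides retain identical chained records, an auditor (or a real-time reporting regime) could verify the flow of assets without reconstructing it after the fact.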

Do the work today for a more successful tomorrow

The rise of the social web has mobilized billions of people onto leading providers, bringing with them many of the expectations and behaviors of the physical world. The success of these services has underlined their value while revealing the very real consequences and side effects that attend their mass adoption. The popularity of social media, and now social gaming, reflects deeper currents in how people are migrating to digital experiences, further blurring the lines between the physical and the virtual. The vision of “the metaverse” is a way to understand where these currents seem to be leading us. If, in 2005, we had understood the trajectory of early social media, would we have done things differently?

This seeming inevitability is why we should take the metaverse seriously, and why companies and regulators should be working together to establish the guardrails needed for robust and safe metaverse experiences and opportunities—to moderate content and guide conduct, prioritize privacy and safety, and define metaverse economies clearly. Doing the work today can secure better user experiences while enabling businesses to innovate and drive value with significantly less risk.


Endnotes

1. Kevin Westcott, Jeff Loucks, and Jana Arbanas, 2023 Digital media trends: Immersed and connected, Deloitte Insights, April 2023.

2. Newzoo, “U.S. core gamers: 81% of those aware of in-game cosmetics want to trade skins for real-world money,” 2023.

3. Reuters, “Metaverse could contribute up to 2.4% of US GDP by 2035, study shows,” May 10, 2023.

4. Matt Schmidt, “Why the gaming industry could be the new social media,” Forbes, March 16, 2021.

5. Tambiama Madiega, Polona Car, Maria Niestadt, and Louise Van de Pol, Metaverse: Opportunities, risks and policy implications, European Parliamentary Research Service, June 2022.

6. South Korean Ministry of Science and ICT, “MSIT to vitalize the digital economy with regulation amendments,” September 2022; CryptoNews, “South Korea release guidelines to reduce crime in the metaverse,” Binance Feed, November 30, 2022.

7. Ministry of Economy, Trade and Industry (Japan), “Web 3.0 Policy Office established in the minister’s secretariat as a cross-departmental internal organization,” press release, July 15, 2022; Sohini Bagchi, “Tech companies in Japan collaborate for metaverse economic zone,” Techcircle, February 28, 2023.

8. Chris Arkenberg, Generative AI is already disrupting media and entertainment, Deloitte Insights, June 29, 2023.

9. Jack Schickler, “EU’s leaked metaverse strategy proposes regulatory sandbox, new global governance,” CoinDesk, July 6, 2023; Euronews and Reuters, “EU is analysing the metaverse ahead of possible regulation, says anti-trust chief Margrethe Vestager,” August 2, 2022.

10. Michael A. Cusumano, David B. Yoffie, and Annabelle Gawer, “Pushing social media platforms to self-regulate,” Regulatory Review, January 3, 2022; Meta, “Code of conduct for virtual experiences,” December 2022.

11. For a discussion of human behavior vis-à-vis the metaverse, please see Duleesha Kulasooriya, Michelle Khoo, and Michelle Tan, Being human in a digital world: Questions to guide the internet’s evolution, Deloitte Insights, June 26, 2023.

12. Katie Paul, “Bipartisan U.S. bill would end Section 230 immunity for generative AI,” Reuters, June 14, 2023.

13. European Union Agency for Criminal Justice Cooperation, Digital Services Act: Ensuring a safe and accountable online environment, November 16, 2022.

14. California Legislative Information, “AB-587 Social media companies: Terms of service,” November 18, 2022.

15. Meta, “Code of conduct for virtual experiences.”

16. Arkenberg, Generative AI is already disrupting media and entertainment.

17. European Union Agency for Criminal Justice Cooperation, Digital Services Act: Ensuring a safe and accountable online environment; European Parliament News, “EU AI Act: First regulation on artificial intelligence,” June 14, 2023.

18. Information Commissioner’s Office, “Introduction to the Children’s code,” September 2021; GOV.UK, “A guide to the Online Safety Bill,” August 30, 2023.

19. European Commission, “A European strategy for a better internet for kids (BIK+),” accessed September 2023.

20. California Legislative Information, “AB-2273: The California Age-Appropriate Design Code Act,” November 2022; Utah.gov, “Holding social media accountable,” May 2023.

21. Zen Soo, “China keeping 1 hour daily limit on kids’ online games,” AP News, January 19, 2023.

22. Marine Cole, “Trust by design: A path to inclusive, compliant, safe products,” Deloitte, May 2023.

23. Tate Ryan-Mosley, “How an undercover content moderator polices the metaverse,” MIT Technology Review, April 28, 2023.

24. European Commission, “Rules for business and organisations,” accessed July 2023.

25. PR Newswire, “Global Content Moderation Solutions Market Report 2023: Increasing need to monitor user-generated content on various online platforms drives growth,” Yahoo! Finance, May 11, 2023.

26. European Commission, “Rules for business and organisations.”

27. Camomile Shumba, “UK’s new Online Safety Bill applies to the metaverse, lawmakers agree,” CoinDesk, July 13, 2023.

28. Fredric D. Bellamy, “U.S. data privacy laws to enter new era in 2023,” January 12, 2023.

29. Amy Novotney, “Kids Online Safety Act: APA leads more than 200 advocates in urging Senate to pass bill,” American Psychological Association, July 18, 2023; Jason Kelley, “The Kids Online Safety Act is still a huge danger to our rights online,” Electronic Frontier Foundation, May 2, 2023.

30. Jana Arbanas, Paul H. Silverglate, Susanne Hupfer, Jeff Loucks, Prashant Raman, and Michael Steinhart, Consumers embrace connected devices and virtual experiences for the long term, Deloitte Insights, September 2023.

31. Bess Pierre and R.J. Cross, “Why are data breaches bad?” Public Interest Research Group, June 2023.

32. Ibid.

33. European Commission, “Data protection in the EU,” accessed July 2023.

34. European Commission, “Proposal for an ePrivacy Regulation,” accessed July 2023.

35. Thorin Klosowski, “The state of consumer data privacy laws in the US (and why it matters),” New York Times, September 2021; State of California Department of Justice, “California Consumer Privacy Act (CCPA),” May 2023.

36. Ibid.

37. National Conference of State Legislatures, State laws related to digital privacy, June 7, 2022; Anokhy Desai, “US State Privacy Legislation Tracker,” International Association of Privacy Professionals, August 2023.

38. Arkenberg, Generative AI is already disrupting media and entertainment.

39. Jana Arbanas, Kevin Westcott, Allan V. Cook, and Chris Arkenberg, The metaverse and Web3: The next internet platform, Deloitte Insights, July 25, 2022.

40. World Economic Forum, “New initiative to build an equitable, interoperable and safe metaverse,” press release, May 25, 2022.

41. World Wide Web Consortium, “Decentralized Identifiers (DIDs) v1.0: Core architecture, data model, and representations,” July 2022.

42. Arbanas, Westcott, Cook, and Arkenberg, The metaverse and Web3: The next internet platform.

43. Deloitte, “Emerging trends: Moving to SaaS and XaaS,” infographic, March 2023.

44. European Commission, “Fair taxation of the digital economy,” March 2018, accessed September 2023.

45. VAT Committee, Commission question on non-fungible tokens, European Commission, March 21, 2023.

46. European Parliament News, “Crypto-assets: Green light to new rules for tracing transfers in the EU,” press release, April 20, 2023.

47. Arbanas, Westcott, Cook, and Arkenberg, The metaverse and Web3: The next internet platform.

48. Aaron Klein and Adrianna Pita, “What do the SEC’s lawsuits signal for the future of cryptocurrency?,” Brookings Institution, June 14, 2023.

49. “The term ‘high-volume third-party seller’ means a participant on an online marketplace’s platform who is a third-party seller and who, in any continuous 12-month period during the previous 24 months, has entered into 200 or more discrete sales or transactions of new or unused consumer products and an aggregate total of $5,000 or more in gross revenues”; taken from United States Code, Title 15—Commerce and trade, accessed September 2023.

50. Federal Trade Commission (US), “INFORM Consumers Act takes effect on June 27th. Is your business ready?,” June 8, 2023.

51. Tropee, “5 brands using NFTs to sell physical products,” March 18, 2022.

52. Rob Massey, Casey Kroma, Conor O’Brien, Jason Dimopoulos, Paul Gladman, Nate Tasso, and Sha Zhang, “Tax positions on crypto transactions: Preparing for the 2022 tax season,” Deloitte, January 2023, accessed September 2023.

53. Conor Walsh and Alan Kilmartin, “Real-time VAT reporting – it’s not near, it’s here,” Deloitte, June 2022, accessed September 2023.

54. Deloitte, “Blockchain: Ready for business,” December 7, 2021.

Acknowledgments

The authors would like to thank Jeff Loucks, Tanneasha Gordon, Joanna Conway, Nikki Cope, Ali Ziaee, Radhika Sanghani, Doreen Cadieux, Brian Pinto, Tyler Spalding, Barbara Bona, Kyle Cooke, Meghan Burns, Michele Jones, Nick Seeber, Duncan Stewart, Andy Bayiates, Blythe Hurley, Alexis Werbeck, and Prodyut Borah for their help with this project.

Cover image by: Alexis Werbeck