
Opinions expressed by Entrepreneur contributors are their own.
You are reading Entrepreneur India, the international franchise of Entrepreneur Media.
IT research firm Everest Group has released a report titled "Metaverse Growth Increases Trust and Safety (T&S) Risks for Businesses and Users," in which it says it expects the metaverse to grow rapidly into a $679 billion industry by 2030. This growth, however, has implications for T&S, including threats to user safety, increased abuse, the proliferation of objectionable content and financial fraud.
“The metaverse is attracting significant investment from tech giants such as Google, Meta, Microsoft and Nvidia to make virtual worlds a reality, and these applications, for better or worse, have limitless economic and social potential. As organizations build for virtual worlds, trust and safety issues need to be paramount considerations. Businesses may be able to adapt some of today’s best practices, but they also need to address scenarios and use cases unique to virtual worlds. Addressing these new challenges will require a collaborative approach between businesses, policymakers, academia, and T&S service providers to realize the metaverse’s full potential as an immersive and safe place for users.”
Exploring the metaverse’s impact on the third-party T&S market, the report also states that T&S services are one of the fastest-growing segments of the business process services market, expected to reach $15 billion to $20 billion by 2024. The market is expected to grow at 35% to 38% through 2024 and accelerate to 60% to 68% after 2024 as technology and infrastructure advance beyond the nascent stage.
The Everest Group report also proposes metaverse risk-mitigation strategies for T&S risks such as avatar abuse, which includes personal space intrusion, impersonation, harassment, assault, bullying, stalking, and espionage. The report further points to concerns about data privacy and user security, the security of virtual assets against financial crime and identity theft, the well-being of content moderators who may face physical and mental health hazards from prolonged exposure to VR headsets and content, and regulatory ambiguity.