
Policing the Metaverse

As the metaverse begins to unfold, a new digital frontier emerges, bringing unique challenges and opportunities. The metaverse holds immense potential. However, like any new territory, it also introduces a complex array of legal, ethical, and societal dilemmas that need addressing. Policing this vast, interconnected space is one of the most pressing concerns, as law enforcement agencies worldwide grapple with enforcing laws and protecting users in a realm that transcends physical borders.

The metaverse represents a convergence of the digital and physical worlds, where actions in a virtual space can have real-world implications. Dr. Madan Oberoi, Interpol’s Executive Director of Technology and Innovation, highlighted this in a 2023 interview: “My typically used example is that if you have to save a drowning person, you need to know how to swim.” This analogy aptly captures the current situation facing law enforcement. Just as the early internet was once viewed as a lawless space, the metaverse is now a vast, largely unregulated digital frontier where anything seems possible. However, as with the internet, it will soon become clear that what happens in the metaverse can significantly impact the real world.

One of the most complex challenges of policing the metaverse is establishing clear jurisdiction for crimes and disputes. In a virtual world without physical borders, traditional concepts of jurisdiction become murky. As Benson Varghese, a board-certified criminal lawyer and the founder and managing partner of Varghese Summersett, points out: “Defining jurisdictional boundaries will be tremendously difficult in metaverses without regard for national borders. International cooperation will be paramount, yet navigating diplomatic sensitivities is no easy task.”

The metaverse requires rethinking legal frameworks that traditionally rely on geographical boundaries. Determining which laws apply becomes a significant challenge in a space where users from different countries can interact seamlessly. Varghese suggests that specialised legal codes, with authority based on user location or dashboard origins rather than arbitrary geographical limits, might be necessary. This approach would require unprecedented levels of international collaboration and the creation of new legal mechanisms to handle cross-border virtual crimes.

Interpol is already taking steps in this direction, recognising that cyberspace does not respect national borders. The organisation is working with its member countries on rules and standards that can be enforced across different virtual platforms, but the challenges are immense. Law enforcement agencies will need to adapt and evolve to meet the demands of policing in the metaverse, much as they did during the early days of the internet.

As the metaverse expands, new types of criminal activity are likely to emerge, many of which may differ significantly from traditional cybercrimes. A 2023 Interpol report highlights early examples, such as sexual harassment cases in virtual spaces. However, what constitutes a crime in the real world is not always immediately applicable to the metaverse. New laws and regulations are needed to address these emerging threats.

In addition to harassment, other forms of crime that could become prevalent in the metaverse include fraud, identity theft, and even virtual property theft. The anonymity provided by virtual avatars can make it easier for criminals to engage in these activities without fear of detection. Also, the ability to create and manipulate multiple identities in the metaverse adds another layer of complexity for law enforcement. As Varghese notes, “Accountability too poses headaches as individuals represent themselves through multiple avatars. Strong identification and documentation protocols coupled with legal recognition of virtual identities may help, though privacy concerns abound.”

One potential solution to these challenges is the use of cryptographic signatures to track avatar actions, ensuring that individuals can be held accountable for their behaviour in the metaverse. However, this approach raises significant privacy concerns and would require careful oversight to prevent abuse. Balancing security and privacy will be crucial in creating a safe and fair metaverse.
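
As a rough illustration of what such an accountability mechanism might look like in practice, the sketch below shows how a platform could have each avatar sign its in-world actions with a private key, while the operator keeps only the corresponding public key for verification. This is a hypothetical, minimal example rather than any existing platform’s API: the function names, the in-memory registry, and the choice of Ed25519 signatures via Python’s third-party cryptography package are all assumptions made for illustration.

```python
# Illustrative sketch only: binding avatar actions to a registered identity
# with Ed25519 signatures. All names here (register_avatar, sign_action,
# verify_action, registry) are hypothetical, not a real platform API.
# Requires the third-party "cryptography" package.
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# The platform stores only each avatar's public key; the private key stays
# on the user's device, so the operator can verify actions but not forge them.
registry: dict[str, Ed25519PublicKey] = {}


def register_avatar(avatar_id: str) -> Ed25519PrivateKey:
    """Create a key pair for an avatar and record the public half."""
    private_key = Ed25519PrivateKey.generate()
    registry[avatar_id] = private_key.public_key()
    return private_key


def sign_action(private_key: Ed25519PrivateKey, avatar_id: str, action: str) -> tuple[bytes, bytes]:
    """Serialise an action record and sign it with the avatar's private key."""
    record = json.dumps(
        {"avatar": avatar_id, "action": action, "ts": time.time()},
        sort_keys=True,
    ).encode()
    return record, private_key.sign(record)


def verify_action(avatar_id: str, record: bytes, signature: bytes) -> bool:
    """Check that a logged action really came from the registered avatar."""
    try:
        registry[avatar_id].verify(signature, record)
        return True
    except (KeyError, InvalidSignature):
        return False


# Usage: sign an in-world action, then verify it against the registry.
key = register_avatar("avatar-123")
record, sig = sign_action(key, "avatar-123", "entered_private_lobby")
print(verify_action("avatar-123", record, sig))  # True
```

Because only the public key is held centrally, the operator can attribute a signed action to a registered identity without being able to fabricate actions on a user’s behalf, which is one way such a scheme could try to reconcile accountability with the privacy concerns raised above.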

Privacy concerns are particularly relevant in the metaverse, where users may be more vulnerable to surveillance and data collection than in the physical world. The metaverse has the potential to become a space where every action is tracked and monitored, raising questions about how to balance the need for security with the right to privacy.

Divergent privacy standards across jurisdictions could challenge law enforcement as they attempt to navigate the different legal frameworks governing the metaverse. One potential approach is to develop international agreements that establish common privacy standards for virtual environments. These agreements could be modelled on existing frameworks, such as the General Data Protection Regulation (GDPR) in Europe, which has set a global benchmark for data privacy.

Future policing

Effective international cooperation will be essential for enforcing laws across the metaverse. Such agreements could require participating countries to dedicate some of their cyber investigation resources to monitoring the metaverse, encouraging collaboration and intelligence sharing between jurisdictions.

Such cooperation will be vital in addressing the unique challenges of policing the metaverse. As technology continues to evolve, the line between virtual and real legal consequences will become increasingly blurred.

Policing the metaverse presents new challenges, from establishing jurisdiction in a borderless world to balancing privacy and security concerns. As the metaverse evolves, law enforcement agencies must adapt and collaborate internationally to address these challenges. By developing new legal frameworks and fostering cooperation between countries, we can create an innovative and safe metaverse for all users. The journey ahead is complex, but with careful planning and open dialogue, we can navigate this new frontier and ensure that the metaverse remains a space where creativity and safety coexist.

Adam Pilton, Senior Cyber Security Consultant at CyberSmart and a former Detective Sergeant who investigated cybercrime, shares his perspective.

How do you envision the “law enforcement” concept evolving in the metaverse context? Are there historical parallels that can help us understand this evolution?

“To continue evolving, law enforcement needs money, and it is money that they won’t be receiving. That money is needed to continue developing knowledge and skills around digital evidence recovery and digital investigations. Police forces are equipped to investigate digital crimes, but continuing to adapt and evolve at the speed of our digital world, and maintaining the breadth and depth of knowledge, requires government funding.

“The real issue is the criminal justice system. Courts must be equipped to handle the vast amounts of modern data. Court processes cause delays in justice; we see this in the case backlog. Look at fraud cases, which frequently require a digital investigation, and the backlogs rocket.”

How can we establish clear jurisdiction for crimes and disputes in a virtual world without borders? What challenges do you foresee in enforcing laws across different virtual platforms?

“International law enforcement works well together; look at Europol and Interpol; we are seeing successes from them with ever-increasing frequency.

“Earlier this year, we saw LockBit taken under the control of law enforcement. This started with work from the UK’s South West Regional Organised Crime Unit and culminated in international partners working together.

“The challenges that I have seen firsthand are these: offenders are frequently based in countries that do not work with us, such as Russia; volatile data is just that, volatile; and engagement from some platforms is limited and slow, impacting investigation speed and quality and amplifying the volatility issue.”

What types of criminal activities are most likely to emerge in the metaverse? How might these differ from traditional cybercrimes?

“The metaverse will only increase the types of crime that utilise social engineering. The metaverse will also heighten cybercriminals’ ability to gain trust. I believe that people will struggle to distinguish between real-life and virtual-world relationships and will believe everything they see and hear on screen. In the real world, if I’m standing in front of a person, I have an understanding of who they are. I do not know who I am talking to in the virtual world.”

How should we balance privacy concerns with the need for security in the metaverse? Should virtual environments have surveillance measures similar to those of the real world?

“Regulations and legislation that can and do cater to real-world and virtual-world privacy concerns already exist, such as GDPR and the Investigatory Powers Act, and surveillance exists in both worlds. As a byproduct of the metaverse working, all the commands we submit and the actions computers output are recorded. Whether this data is retained, reviewed, or utilised is another question that will vary from platform to platform.”

What role do AI and other advanced technologies play in policing the metaverse? Could these technologies create ethical dilemmas?

“AI can be supremely powerful in policing the metaverse and should be used to support investigations. It can sift through vast amounts of data swiftly and provide consolidated, human-readable output that law enforcement can use for investigations and the criminal justice system can consume simply. However, AI cannot be trusted to give a true, accurate picture. We have all heard about AI hallucinations, and this is something that cannot be allowed in the criminal justice system.”

How should law enforcement address the concept of identity in the metaverse, where users can assume multiple avatars? What are the implications for accountability?

“The Investigatory Powers Act supports law enforcement. Governments need to do more to regulate online platforms and ensure users are identifiable, should that be required. The issue is not one for law enforcement to address but for governments. Accountability is essential for democracy. In the recent riots, the feeling of anonymity increased the number of people involved. When accountability came into focus and those responsible were promptly brought to justice, the number of people rioting declined and the illegal activity stopped.”

Can proactive strategies be implemented to prevent crime in the metaverse? How can virtual worlds be designed to promote safety?

“Yes. At a basic level, visual cues in the metaverse could remind people of key questions such as ‘do you really know who you’re talking to?’ These are the same kind of reminders that we get from our banks when we transfer money.

“In addition, platform regulation would ensure that platforms are compelled to protect their users both before any possible criminal act takes place and after an offence has occurred, when they must work with law enforcement to help identify offenders and provide evidence to the criminal justice system to ensure justice is done.

“Ultimately, though, awareness is and always will be crucial when it comes to cybercrime. Those individuals who can recognise potential criminal activity will be able to question it and seek the support and guidance they need to stop it from evolving and prevent themselves from falling victim.”

How important is international cooperation in policing the metaverse? What frameworks or agreements might be necessary for effective collaboration?

“In the UK, we have legislation such as the Investigatory Powers Act, which international organisations recognise, but it does not compel them to act. In 2019, the UK and the US signed a bilateral data access agreement that modernised the way law enforcement agencies in the two countries can work together. The agreement increased efficiency, focusing on obtaining volatile data swiftly so justice could be done for victims. It replaces the often cumbersome mutual legal assistance process, under which obtaining the necessary data could take years, with an efficient process that reduces this to mere weeks.”

Looking ahead, how do you think the concept of justice will evolve as the metaverse becomes more integrated into our daily lives? What are the potential long-term impacts on society?

“Anonymity, if allowed, will see an increase in poor behaviour and criminal activity. Justice may be harder to come by. This may create a polarised world in which we can choose the profile of people we interact with and compound our current beliefs and values, rightly or wrongly. A virtual-world fight is not and should not be a crime. A real-world fight and assault will always be a crime. I do foresee, though, that harassment laws will need to be reviewed, particularly when looking at areas such as domestic abuse.”
