'Its 'curators' invite you through peer-to-peer networks (social media sites), and peer-to-peer networks mean there's no centralised authority to destroy,' says Ashish Sharma.
Illustration: Dominic Xavier/Rediff.com
After several claims that an online challenge game called Blue Whale is leading children to their deaths, the Indian government has swung into action.
But bringing down the game won't be so easy, because its 'curators' invite you through peer-to-peer networks (social media sites), and peer-to-peer networks mean there's no centralised authority to destroy.
No wonder sociopaths are creating addictive worlds on peer-to-peer networks, prompting the government to intervene.
This will become especially true of the even more immersive virtual worlds to come, where we will face threats to the one organ exposed there -- the mind.
Unfortunately, should the mind end up immersed in a game like Blue Whale, virtually all of its inputs will be mediated by a 'curator'.
There is obvious potential for abuse here.
For example, Blue Whale keeps children involved in its virtual world by structuring gameplay around a curator who assigns tasks.
The game's administrators require everyone to agree to all kinds of conditions before entering their world.
A journalist working at Radio Free Europe created an account on a social network, posing as a girl willing to join the game. Here's a transcript of the exchange.
Player: I want to play the game.
Administrator: Are you sure? There's no way back.
Player: Yes. What does that mean? There's no way back?
Administrator: You can't leave the game once you begin.
Player: I'm ready.
Administrator: Carry out each task diligently and no one must know about it.
When you finish the task, you send me a photo.
At the end of the game, you die. Are you ready?
Player: And if I wanna get out?
Administrator: I have all your information, they will come after you.
As seen above, the game asks you to surrender. This is by no means insignificant.
In fact, this toxic plunge into a virtual world calls for real-life intervention, because the victims won't cry for help -- they won't know they need help.
Perhaps now would be a good time to recall the science fiction film The Matrix, in which machines live off humans' body heat and electrochemical energy, imprisoning their minds within an artificial reality.
The humans, of course, can't cry for help, because they don't know they need help, imprisoned as they are in a simulated reality.
Replace evil machines with evil curators, and Blue Whale looks no different from The Matrix.
Now, suppose that a real-life security force has been told to disable or control a virtual world like Blue Whale.
What can it do? Its options may be limited.
In the case of Napster, a music-file-sharing service, the police could simply walk up to the central servers and the wires that ran in and out of them, and pull the plug. In fact, they did.
This was possible because a single central server held all of Napster's critical data. And since Napster's file-sharing threatened the profits of certain corporate interests, all the authorities had to do was go over to its headquarters and shut that server down: easy.
But then a new generation of music-sharing technology came along, based not on a central server but on a peer-to-peer system, in which each user's machine acts like a mini-server.
Such a network won't stop even if the police arrest one, two, or two dozen teenagers for 'stealing' music.
The same holds true for Blue Whale, which resides on peer-to-peer networks: the police can arrest one, two, or two dozen players, but they cannot take down the entire network, as the sketch below illustrates.
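To make the contrast concrete, here is a minimal sketch in Python. It is purely illustrative and models no real file-sharing protocol: it simply contrasts a centralised network, which goes dark the moment its hub is unplugged, with a peer-to-peer ring that keeps relaying messages even after individual peers are removed.

```python
# A minimal, purely illustrative sketch (no real protocol is modelled):
# contrast a centralised network, which dies when its hub is unplugged,
# with a peer-to-peer ring that survives the removal of individual peers.

class Node:
    def __init__(self, name):
        self.name = name
        self.links = []    # nodes this one can talk to directly
        self.seen = set()  # message ids already relayed

    def connect(self, other):
        # Links are symmetric: each side can act as a mini-server.
        self.links.append(other)
        other.links.append(self)

    def relay(self, msg_id):
        # Flood the message onward, once per message id.
        if msg_id in self.seen:
            return
        self.seen.add(msg_id)
        for n in self.links:
            n.relay(msg_id)

    def unplug(self):
        # Sever every link: the pulled plug, or the 'arrest'.
        for n in list(self.links):
            n.links.remove(self)
        self.links.clear()

def reach(msg_id, nodes):
    return sum(msg_id in n.seen for n in nodes)

# 1. Napster-style: one hub, every client connects only to it.
hub = Node("server")
clients = [Node(f"client{i}") for i in range(8)]
for c in clients:
    c.connect(hub)
hub.unplug()                      # pull the plug on the central server
clients[0].relay("song")
print("centralised:", reach("song", clients), "of 8 clients reached")   # 1 of 8

# 2. Peer-to-peer: ten peers joined in a ring, no hub anywhere.
peers = [Node(f"peer{i}") for i in range(10)]
for i in range(10):
    peers[i].connect(peers[(i + 1) % 10])
peers[0].unplug()                 # 'arrest' two of the peers
peers[1].unplug()
peers[5].relay("song")
print("peer-to-peer:", reach("song", peers[2:]), "of 8 peers reached")  # 8 of 8
```

In the first network every message must pass through the hub, so unplugging it silences everyone; in the second, each peer is simultaneously client and server, so messages simply route around the missing nodes.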
The only way to affect events in such a world, then, is to enter it. And by doing so, the best you can achieve is to interact with the people who use the network and persuade the great majority to stop using it.
As society moves more of its interests into cyberspace, the likelihood that our security forces will be called upon to engage in this way is great.