It’s just a technicality.

Ars Technica:

“Domain tasters” take advantage of a five-day domain name grace period to perform risk-free cybersquatting. Since ICANN upped the penalty for excessive cancellations, however, the practice has essentially disappeared.

As our economy becomes increasingly symbolic, the exploitation of technical loopholes becomes an ever more lucrative business. Consider the recent coverage of flash orders and high-speed trading, which are “useful only to traders who have computers powerful enough to act on the data within milliseconds.”

The quickening pace of this game of cops and robbers is not, however, what should concern us. It’s the new players who are joining: players whose involvement may change the rules of the game permanently, and not necessarily in our favor.

Consider the incredible but all too real Storm botnet, which has exploited loopholes in the Windows operating system to create a criminal network of roughly 500,000 slave computers.

On the face of it, deriving power from a technicality is nothing new. Arcane scrolls and their elite interpreters gave ancient kingdoms as much stability as swords and soldiers – if not more. (See the imperial cult, apostolic succession, or the satire Dead Souls.) What is new is the complexity of our code and the ways in which our society is organizing itself around “black boxes.”

Unlike Dr. Frankenstein’s ignoble monster, Asimov’s noble creations, or Rossum’s rebellious robots, the automated processes with which we must now contend are awesome not because they amplify some flaw or virtue of human society but because they will increasingly exploit what we do not – or cannot – know about ourselves. They are becoming a new kind of parasite.

In previous fiction, the narrative ends when the monster (noble or not) teaches us something about ourselves. This is not by chance: the word monster is related to the Latin monstrare, “to show” – as in to “demonstrate” or render intelligible. Tomorrow’s monsters may not be able to teach us anything about ourselves, though they could certainly end up teaching us a lesson we’ll never forget.