Tuesday, July 24, 2012

Rule 34

A few years ago I had the opportunity to go and hear one of my literary heroes, William Gibson, give a talk. After the talk I asked him which author he thought was his literary heir apparent. He said Charles Stross without hesitation. The next day I picked up a copy of Halting State, and thus began my love affair with the writings of Mr. Stross.

I recently finished Rule 34, which was published in 2011 and is the second book in the Halting State series. I loved this book. Not only is Stross a brilliant storyteller, but he also has his finger on the pulse of the current bleeding edge of cultural and technological evolution, and an amazing vision of where that evolution is going to take our species in the near and far future.

I don’t plan to give away the plot, but there are a couple of themes in this book that I will hint at because I have spent a lot of time cogitating about them over the last year.


The Rate of Technology Adoption by Law Enforcement

In Rule 34 Stross imagines a near-future where the criminal justice system is driven by sophisticated machine learning and augmented reality software systems. Stross’ technical vision cannot be faulted; all of the applications of technology that he describes are not only entirely feasible, but are mostly already commercially available today in some form or another. However, having worked in the Justice and Public safety domain for the last year, I have to question the rate of cultural evolution in law enforcement that this vision implies.

Despite what we see on television in shows like CSI, in reality law enforcement agencies are glacially slow to adopt even relatively new technologies. In the United States and Canada today, apart from the large federal and state/provincial agencies, most local agencies and police departments do not have fully integrated systems, let alone systems that leverage emerging technologies. The US Federal Government has a number of initiatives to address this, including the GRA and NIEM standards, but adoption of these has been very slow indeed.

In Rule 34 Stross hints at a cataclysmic cyber-attack that motivates a significant acceleration in technology adoption by law enforcement agencies across the planet. The 9/11 attacks failed to motivate this acceleration, so I am not convinced that a cyber-attack would, even if it went so far as to take out national power, telecommunication, and transportation infrastructure.

The current Holy Grail of law enforcement is Predictive Policing (think Minority Report sans the half-naked psychics in the wading pool). The data storage and real-time analysis technologies needed to implement predictive policing solutions are available today in off-the-shelf products, e.g. Hadoop, Mahout, and MarkLogic Server, and even in products that agencies are probably already using, like Microsoft SQL Server. Despite this, most agencies are only now beginning to look into how they might integrate their existing systems, automate currently paper-based processes, analyse their data using 40-year-old OLAP technologies, and migrate existing 25-year-old mainframe applications. I imagine that it is going to be a while before we see predictive, machine learning-based systems go into production in these agencies, let alone systems that leverage augmented reality user interfaces.
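To make the idea concrete, here is a deliberately naive sketch of the simplest possible predictive-policing baseline: rank map grid cells by historical incident counts and send patrols to the hottest cells. The function name and the incident data are hypothetical, invented purely for illustration; real systems (the Hadoop/Mahout class of products mentioned above) would of course use far richer features and models than raw frequency.

```python
from collections import Counter

def predict_hotspots(incidents, top_k=3):
    """Rank grid cells by historical incident count.

    A toy 'predictive policing' baseline: assume crime will recur
    where it clustered before, and return the top_k busiest cells.
    """
    counts = Counter((x, y) for (x, y) in incidents)
    return [cell for cell, _ in counts.most_common(top_k)]

# Hypothetical incident log: one (grid_x, grid_y) tuple per reported crime.
history = [(1, 2), (1, 2), (1, 2), (4, 4), (4, 4), (0, 0)]
print(predict_hotspots(history, top_k=2))  # [(1, 2), (4, 4)]
```

Even this trivial frequency count captures the core premise, and everything beyond it (regression on time of day, weather, events; online learning) is refinement, which is why the gap between what is technically possible and what agencies actually deploy is so striking.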


Free Will and Criminal Justice

One of the characters in Rule 34 asks a very interesting question: if humans do not have free will, then how relevant or useful is a system of justice based on increasingly complex laws and statutes, and on the enforcement of those laws and statutes primarily through the threat of punishment? Recent research in cognitive neuroscience, using functional brain imaging techniques, has revealed that we start acting on a decision (i.e. muscle-actuating nerve impulses are sent) before we consciously become aware that we have made that decision. This implies that most, if not all, of the decision-making process happens in the unconscious mind, and that once the decision has been made, the conscious mind is informed of it as a courtesy.

This seems to imply that Martin Luther was right[ish] all along: man does not have free will after all. If this is the case, then how can we expect a threat of punishment aimed at the conscious mind to motivate obedience? And more importantly, how can we justify punishing those who transgress? Are they not merely victims of their own pathology? I don’t claim to have answers to these questions, but I do know that the findings of this research bring into question many of our most deeply held assumptions about our own nature, and about the structures that we have put in place to govern that nature at scale.

Update (August 26th 2012): If you are at all interested in this latter topic, I highly recommend that you read “Free Will” by Sam Harris. It is a thorough analysis of the ongoing demise of our illusion of free will.

There is a lot more to Charles Stross’ Rule 34 than these two themes; it is a feast for the [conscious] mind, so I highly recommend that you pick up or download a copy and read it.

Wednesday, July 4, 2012

Pure Genius!

Here is a bit of pure genius that I have to share:


This should be mandatory reading for all software developers and architects.