Alan Turing, Theory, and the Diminishing Hegemony of the Status Quo

(This piece initially appeared in TechCrunch.)

Like many in Silicon Valley, I recently saw Morten Tyldum’s “The Imitation Game”. Admittedly, I have a soft spot for underdog academic narratives and actually teared up. However, I couldn’t shake the feeling that the film pigeonholed the breadth and depth of Turing’s work into early cryptography and its mechanized instantiation in WWII.

Cryptography aside, Turing’s work, theory, and models still underpin undergraduate curricula in Computer Science, Mathematics, and Philosophy. His models of computation form the basis for how mathematicians and computer scientists frame both what is solvable and the efficiency with which we can algorithmically solve answerable questions. His Church-Turing Thesis, coupled with Gödel’s Incompleteness Theorems, still has philosophers debating the existence of universal constraints on human knowledge itself.

Finally, and perhaps most pressing given the ongoing renaissance in Machine Learning, the inconspicuously titled ‘Turing Test’ remains the de facto yardstick against which we measure progress and traction in Artificial Intelligence. Whereas “The Imitation Game” centers its focus on cryptography and wartime technology, there is no doubt that Turing’s work covers not only cryptography, but AI, theoretical Computer Science, Mathematics, and even Epistemology.

The realization that Turing’s work retains such high relevance in both academia and industry got me mulling over how and why this is the case. Can we - as entrepreneurs and technologists - learn from Turing’s intellectual longevity to help guide our own direction and focus?

I start with a few broad-stroke assumptions. (The focus on high-level ‘Big-O’ approximations only seems fitting for this piece.)

[Figure: On Theory, Pace of Innovation, and Life Expectancy of the Status Quo]

1) Take for granted that, with time, the state of infrastructure grows linearly

That is, infrastructure layers of software and technology face pressure to innovate and mature, yet they don’t necessarily compound. The development of SQL didn’t push the standardization of TCP/IP, and TCP/IP didn’t accelerate the advent and growth of Bitcoin. While some are certainly necessary precursors to others, we see linear growth and creation here: one building off the other at a roughly constant velocity.

2) As infrastructure grows we see an exponential increase in the pace of innovation

Innovators no longer need to build full-stack, proprietary solutions and hence are able to spend the majority of their cycles and bandwidth on the most important and differentiated components of their businesses. As such, latency in feedback loops drops, and the pace of innovation itself increases.

Taking 1) and 2) into account, we get the following rule:

3) The pace of innovation grows exponentially with respect to time

Conversely, as the pace of innovation increases, the life expectancy, or expected window of relevancy, of the status quo decreases. This should intuitively check out: the more innovation there is, the more disruption occurs, and the shorter any one entity can retain hegemony without being thwarted by a disruptive upstart. Calculus aside, let’s take for granted that this relationship, too, is linear.

4) Life expectancy of the status quo decreases linearly as pace of innovation increases

And combining 3) and 4) we get our final result:

5) Life expectancy of the status quo decreases exponentially with time
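The chain of rules above can be sketched numerically. The functional forms and constants below are my own illustrative assumptions (a unit-slope linear infrastructure curve, an exponential pace curve, and a floor of zero on shelf life), not calibrated to anything real:

```python
import math

# Illustrative model only; constants and functional forms are assumptions.

def infrastructure(t, a=1.0):
    # Rule 1: infrastructure grows linearly with time.
    return a * t

def pace(t, k=0.5):
    # Rule 2: pace of innovation grows exponentially with infrastructure,
    # which with Rule 1 gives Rule 3 (exponential in time).
    return math.exp(k * infrastructure(t))

def shelf_life(t, base=10.0, c=1.0):
    # Rule 4: life expectancy of the status quo drops linearly with pace;
    # clamped at zero so it stays meaningful. With Rule 3 this yields
    # Rule 5: exponential decay of the status quo's shelf life over time.
    return max(0.0, base - c * pace(t))

for t in (0, 2, 4, 6):
    print(f"t={t}: pace={pace(t):.2f}, shelf life={shelf_life(t):.2f}")
```

Under these toy assumptions the shelf life erodes slowly at first and then collapses, which is the qualitative shape the argument predicts.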

This is perhaps the most interesting result. We have an argument that the knowledge, hegemony, and power associated with the current status quo are becoming increasingly less valuable and stable. As an example, top software companies from the late ’80s and ’90s had less pressure to innovate because their technology was more difficult to disrupt and hence had a longer shelf life. Over time, infrastructure matured, the pace of innovation increased, the potential for disruption grew, and the inertia associated with the current state, or status quo, dropped.

How does Turing’s work relate to this pseudo-anthropological and economic postulating? Well, etched in Turing’s work was his ability to cut through the engineering limitations of his time and grapple with the underlying theory. By decoupling the current state of the art from the theoretical ground truth, Turing produced work that hasn’t lost applicability and has shown a near-infinite shelf life. His work carries just as much applicability today, if not more (given the just-now-feasible engineering possibilities), as it did in his own time. This isn’t wholly dissimilar from how theoretical physicists working at chalkboards view their work in juxtaposition to applied physicists at cutting-edge linear accelerators.

As the window of relevancy for the status quo shrinks, investors and entrepreneurs alike must shift emphasis towards a Turing-like approach and hone a deeper understanding of the theory underlying the applied engineering of the day-to-day. While technology and technological hegemony are increasingly vulnerable to disruption, a sound grasp of baseline theory lets us start, build, and invest in companies that aren’t fully dependent on the status quo but rather are aware of the accelerating state of the art. As seen through the longevity of Turing’s work, the increasing pace of innovation highlights the increasing importance of theoretical understanding for entrepreneurs and investors alike.

