Seeing Like a Statistical Learning Algorithm


I recently had the pleasure of reading James Scott’s “Seeing Like a State,” which examines a certain strain of failure in large, centrally-organized projects. These failures come down to the kinds of knowledge available to administrators and governments: aggregates and statistics, as opposed to the direct experience of the people living ‘on the ground,’ in situations where the centralized knowledge either fails to capture, or has no chance of capturing, a complex reality.  The book classifies these two kinds of knowledge as techne (general knowledge) and metis (local knowledge).  In my reading, techne – in both its strengths and shortcomings – resembles the knowledge we obtain from traditional algorithms, while metis is just starting to become available via statistical learning algorithms.

In this (kinda long) post, I will outline some of the major points of Scott’s argument and look at how they relate to modern machine learning.  In particular, the divides Scott observes between the knowledge of administrators and the knowledge of communities suggest an array of research topics.  Beyond simply noting the difference between how humans and machines process data, we can identify areas where traditional, centralized data analysis has systematically failed, and these failures suggest where machine learning systems need to improve before they can solve the underlying problems.
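To make the analogy concrete, here is a minimal sketch (my own toy example, not from Scott or from the post) contrasting a techne-style codified rule with a metis-style model fit to local data. It assumes Python with numpy and scikit-learn; the task, thresholds, and variable names are all invented for illustration.

```python
# Toy illustration (hypothetical, not from the post): a legible,
# centralized rule versus a model that learns local structure from data.
# Assumes numpy and scikit-learn are installed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented "local reality": outcomes depend on a messy interaction
# between two variables that no single threshold captures.
X = rng.uniform(0.0, 1.0, size=(1000, 2))
y = ((X[:, 0] > 0.3) & (np.abs(X[:, 1] - X[:, 0]) < 0.25)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def techne_rule(x):
    # A centrally legible rule: act whenever the first variable
    # clears a fixed threshold. Simple, auditable, and often wrong.
    return int(x[0] > 0.5)

# Metis-style: learn the messy interaction directly from examples.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

rule_acc = np.mean([techne_rule(x) == t for x, t in zip(X_test, y_test)])
print(f"codified rule accuracy: {rule_acc:.2f}")
print(f"learned model accuracy: {model.score(X_test, y_test):.2f}")
```

The point isn’t that the forest is “smarter,” only that it can represent local structure the fixed rule flattens away, which is roughly the gap Scott describes between administrative and on-the-ground knowledge.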


Code, Debt, and Bitcoin

Once upon a time in the late nineties, the internet was a crypto-anarchist’s dream.  It was a new trans-national cyberspace, mostly free of the meddling of any kind of government, where information could be exchanged with freedom, anonymity, and (with a bit of work) security.  For a certain strain of crypto-anarchist, Temporary Autonomous Zone was a guiding document, advocating small anarchist societies in the blank spaces of existing society, temporarily beyond the reach of government surveillance or regulation.  This was a great idea with some obvious drawbacks: on the one hand, TAZ served as a direct inspiration for Burning Man.  On the other hand, it eventually came out that Peter Lamborn Wilson (who authored TAZ under the pseudonym Hakim Bey) was an advocate of pedophilia, which had clear implications as to why he wanted freedom from regulation.  It’s a document whose history highlights the simultaneous boundless possibilities and severe drawbacks of anarchism.

Against this background, Lawrence Lessig’s Code made the case that the internet TAZ was in fact temporary.   Lessig argued that the internet’s behaviour is determined by a combination of computer code and legal code, and that while the legal code hadn’t been written yet, it would be soon.  His prediction (which has largely been realized) was that the internet would lose its anarchic character through government regulation mixed with a need for security and convenience in commercial transactions. (In addition to these forces, social media also came along, in which people largely sacrificed their anonymity willingly for the convenience of being able to easily communicate with their meatspace social networks.)

In thinking about Bitcoin, it’s useful to see how regulation came to change the internet.  The prediction (again, largely correct) was that regulations would target large companies rather than individual users.  Companies are compelled to follow the law under the ultimate threat of not being allowed to operate at all.  And because users tend to glom onto just a few working solutions, regulators can target a handful of large entities and thereby reach a broad base of users.
