Saturday, August 10, 2013

Truth exposed by Big Data: Humans are fundamentally of a greater order than algorithms

This is a short review of ideas presented by Stephen Cohen, a co-founder of Palantir, at a Wired magazine seminar last year. For the purposes of this discussion, an algorithm is defined as a plan so well specified that there is no ambiguity in its execution.

That definition might need to sit for a bit, but Cohen explains it through history: before the Industrial Revolution, every product was made individually, using ad hoc methods. Then came the Industrial Revolution, where we learned to mass-produce and in doing so created algorithms, which consistently reproduced the steps in a production process.

This is the essence of an algorithm, which in turn is the foundation of all computing, built on the confines of Boolean logic (if-then-else). An algorithm is ruthless in its ability to repeat its performance from a set of explicit, upfront inputs. It never wavers, and it can do the same repetitive task many times over.
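To make that concrete, here is a minimal sketch of an "algorithm" in exactly this sense: every step is an explicit if-then-else, so the same input always produces the same output, no matter how many times it runs. The routing task and its thresholds are made up purely for illustration.

```python
def sort_part(weight_grams: int) -> str:
    """Route a part on a production line by weight.

    The thresholds are hypothetical; the point is that every branch
    is explicit and upfront, leaving no ambiguity in execution.
    """
    if weight_grams < 100:
        return "bin-light"
    elif weight_grams < 500:
        return "bin-medium"
    else:
        return "bin-heavy"

# Run it many times over: the answer never wavers.
assert all(sort_part(250) == "bin-medium" for _ in range(100_000))
```

The function is boring by design: ruthless repeatability is the whole value proposition.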

Big data is the phenomenon of these algorithms not only doing their task, but putting out information along the way. In other words, big data is a phenomenon of algorithms self-propagating an information flow. Due to this recurring nature, the sheer amount of data is exploding not only in size, but also in type.

So, as exciting as the algorithms are, what can’t they do?

Well, as amazing as algorithms are, the fact that they make decisions without any context for quality is exactly their limitation, and it eventually bounds their potential. An algorithm cannot process or produce qualitative data like hunger, fear, or happiness. And it is exactly this qualitative data that we humans need to make decisions.

For algorithms to be able to make decisions, I have to strip the data of its qualitative nature, making it shallow and open to interpretation. I have to rate my hunger on a scale from one to ten, which cannot be done without ambiguity.
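A small sketch of what that stripping looks like in practice. The mapping below from a qualitative description to a number is entirely made up; the point is that the mapping itself is where the ambiguity creeps in, because many distinct states collapse onto one number.

```python
def rate_hunger(description: str) -> int:
    """Map a qualitative description of hunger to a 1-10 score.

    This table is a hypothetical illustration: any state the table
    does not anticipate collapses to a middle value.
    """
    scale = {
        "not hungry": 1,
        "peckish": 3,
        "could eat": 5,
        "hungry": 7,
        "starving": 10,
    }
    return scale.get(description, 5)  # unknown states default to 5

# Two very different states can land on the same number...
assert rate_hunger("could eat") == rate_hunger("vaguely anxious about lunch")
# ...so downstream code treats them as identical, though a human would not.
```

Once the data is flattened like this, the algorithm runs happily, but the qualitative distinctions it needed to preserve are already gone.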

Algorithms also fail to capture subtle context; forcing them to would make their efficiency go away. Algorithms are efficient precisely because everything fed to them must be explicit and upfront. A complex human situation can be understood by humans, but it is very hard to communicate, and thus hard to break down into purely quantitative data and feed into an algorithm.

The final result of all this good stuff: Computers will never replace humans.

All of this information is copied 100% from a speech given by Stephen Cohen, founding member of Palantir. No original parts have been added and I take no credit for authorship.
