Saturday, October 3, 2015

Is data the new oil in the information era?

Anderson's article extols Google's analytical tools and how successfully they have generated profits for the company. I find this approach of big-data, petabyte-scale analysis, exemplified by Netflix's Cinematch recommendation engine, quite useful for audience research. But wouldn't it polarize the audience? We can argue indefinitely about whether this mathematical approach, which defies the traditional scientific method, is right or wrong, because it is an epistemological matter.
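To make the polarization worry concrete, here is a toy collaborative-filtering sketch of my own (the names, movies, and ratings are all invented, and this is nowhere near Netflix's actual Cinematch system): recommendations come from users whose past ratings correlate with yours, so you tend to be fed more of what you already like.

# Toy collaborative filtering, Cinematch-style in spirit only.
# All ratings below are hypothetical.

ratings = {
    "ann":  {"drama1": 5, "drama2": 4, "action1": 1},
    "ben":  {"drama1": 4, "drama2": 5, "drama3": 5, "action2": 2},
    "cara": {"action1": 5, "action2": 4, "drama1": 1},
}

def similarity(a, b):
    """Cosine similarity over the movies two users both rated."""
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    dot = sum(ratings[a][m] * ratings[b][m] for m in shared)
    norm_a = sum(ratings[a][m] ** 2 for m in shared) ** 0.5
    norm_b = sum(ratings[b][m] ** 2 for m in shared) ** 0.5
    return dot / (norm_a * norm_b)

def recommend(user):
    """Suggest unseen movies rated highly by the most similar users."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        for movie, rating in ratings[other].items():
            if movie not in ratings[user]:
                scores[movie] = scores.get(movie, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ann"))  # drama3 ranks first: the drama fan gets more drama

Running this, drama3 tops Ann's list: the more her history correlates with other drama fans', the more drama she is shown, which is exactly the feedback loop I worry about.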
Google's founding philosophy is that it doesn't care about human beings and their causal relationships, cultures, contexts, behaviors, motivations, and so on; all it cares about is correlation. If the statistics of incoming links say this page is better than that page, that's good enough. This might be useful for commercial businesspeople or economists. The epistemological problem is that the study of social science, human beings, and communication doesn't work that way.
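As a concrete illustration of that link-statistics logic, here is a minimal PageRank-style power iteration I sketched myself (the toy link graph is invented, and real web ranking is vastly more elaborate): pages are scored purely from the statistics of incoming links, with no model of why anyone linked.

# Minimal PageRank-style power iteration (illustrative sketch only).
# The link graph is hypothetical; web-scale ranking is far more complex.

def pagerank(links, damping=0.85, iterations=50):
    """Score pages purely from incoming-link statistics."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# A toy web: the ranking knows nothing about content, context, or motive.
toy_links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(toy_links))  # C scores highest: it has the most incoming links

Note that nothing in the computation asks who linked, or why; the correlation between inbound links and quality is simply taken on faith.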
Attempting to understand human beings and their driving motivations, cultures, and contexts matters because accuracy and outliers matter. Google Translate, for example, is anything but reliable. It draws on an enormous corpus of textual references to translate one language into another, but when two languages differ radically in grammar, structure, and nuance, its output becomes so inaccurate that it is simply not useful at all. Moreover, in the realms of society and communication, it is sometimes the outlier that really matters.
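A small sketch of that last point (the engagement scores below are invented for illustration): a correlation-driven summary averages the anomaly away, while for a qualitative researcher that one response may be where the real question begins.

import statistics

# Hypothetical audience engagement scores; one viewer reacts very differently.
scores = [52, 49, 51, 50, 48, 53, 50, 5]

mean = statistics.mean(scores)
stdev = statistics.pstdev(scores)
print(f"mean={mean:.1f}, stdev={stdev:.1f}")  # the summary just says ~44.8

# A purely statistical report moves on after the mean; the qualitative
# question is *why* one viewer scored 5 -- the outlier may be the story.
outliers = [s for s in scores if abs(s - mean) > 2 * stdev]
print("outliers:", outliers)  # [5]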
The article contends that we don't need to know why people do what they do, as long as they do it. It seems to criticize traditional inductive scientific reasoning, but it neglects to mention the possibility of deductive reasoning. I'm also not sure whether the author means to equate statistical algorithms with the interpretations qualitative researchers produce when they analyze their corpora of research data.
