Wednesday, June 19, 2013

Handicapping the Half-Life of ‘Big Data’

The technologies that now fly under the banner of Big Data are undeniably advancing across the economy, well beyond their early stronghold in consumer Internet companies like Google, Amazon and Facebook. Exploring that evolution is the main theme of the articles in a special section of The New York Times on Thursday.

But what about the term Big Data itself? What is its likely life span?

I’ve written about the origins of the term before. The first person to use Big Data in its current meaning, it seems, was John Mashey, chief scientist of Silicon Graphics in the 1990s. But that is a probability, not a certainty, which is appropriate enough, since Big Data is all about probabilities and correlations.

As to its longevity, Big Data, I’m betting, will cycle out of general use over time. Not because it is a catchall marketing term, which it is, among other things. The best marketing and sales language is distilled communication.

My bet against Big Data, I suppose, boils down to a linguistic bias. It is too straightforward. It lacks the whiff of poetry, the tension that tightens the bond between words in a phrase. By way of contrast, look at “artificial intelligence.” Its inspiration was a sales pitch of sorts.

In 1955, John McCarthy, a mathematician and computer scientist, was seeking funds from the Rockefeller Foundation for a conference the following year that would explore the growing excitement that computers might become more than big number-crunching calculators: that they might, in their way, actually be able to mimic human thought.

In an interview 45 years later, Mr. McCarthy explained that he “just cooked up the phrase” when drafting the grant proposal. As Mr. McCarthy recalled, “My idea was to nail the flag to the mast, as it were.”

Mr. McCarthy, who died two years ago, certainly did that. Artificial intelligence was a deft turn of phrase in the 1950s, and one that took on greater meaning over the years, evoking both the inspiring ambition of science and unnerving qualms about machine intelligence.

But the history of artificial intelligence is instructive in another way. The enthusiasm for the technologies of artificial intelligence, or A.I., has gone through up and down cycles, and at times they were very down. There were two long stretches when investment and interest in artificial intelligence fell sharply, roughly 1974-80 and 1987-93. Those years were known as “A.I. winters.” Researchers scrambled to find other names for projects, anything but artificial intelligence.

A similar pattern seems likely for the technologies of Big Data. There may be up and down cycles, and the term Big Data may fall from favor, but the technology itself will keep progressing. Indeed, the tools of artificial intelligence, like machine learning, are behind the promise of Big Data, which is to find useful insights in an ever-growing universe of digital data.