June 07, 2012

How Data Became Big

In addition to researching A Very Short History of Data Science, I have also been looking at the history of how data became big. Here I focus on the history of attempts to quantify the growth rate in the volume of data, or what has popularly been known as the information explosion (a term first used in 1941, according to the OED). The link below lists the major milestones in the history of sizing data volumes, plus other firsts and observations pertaining to the evolution of the idea of big data. Please let me know if you think there is a relevant event, milestone, study, or observation I overlooked.

http://whatsthebigdata.com/2012/06/06/a-very-short-history-of-big-data/#more-501

April 26, 2012

Milestones in the Evolution of "What is Data Science?"

I'm in the process of researching the origin and evolution of data science as a discipline and a profession. See here, http://whatsthebigdata.com/2012/04/26/a-very-short-history-of-data-science, for the milestones that I have picked up so far, tracking the evolution of the term "data science," attempts to define it, and some related developments. I would greatly appreciate any pointers to additional key milestones (events, publications, etc.).

November 03, 2011

"Innovation"as "Technology" has Become the Human Experience

Philosophers of technology and scholars working on the themes of "innovation" and "design" will have heard that Steve Jobs is no more. On the sad demise of the Apple co-founder, we as philosophers of technology should pay our homage and tribute to a unique and all-time great innovator.

Steve Jobs once said on accessibility/disability and technology use:

"Having children really changes your view on these things. We're born, we live for a brief instant, and we die. It's been happening for a long time. Technology is not changing it much - if at all."

"These technologies can make life easier; can let us touch people we might not otherwise. You may have a child with a birth defect and be able to get in touch with other parents and support groups, get medical information, the latest experimental drugs. These things can profoundly influence life. I'm not downplaying that. But it's a
disservice to constantly put things in this radical new light - that it's going to change everything. Things don't have to change the world to be important."

We all know that technology has created new opportunities to connect and interact. But researchers are increasingly concerned that heavy technology use is changing people's behavior in less-than-desirable ways.

Despite all these facts, Steve Jobs, with his thoughts and innovations, changed the world we live in, and his technology and innovation will continue to shape the human experience for years to come.

October 07, 2011

Steve Jobs and the NeXT Big Thing

On October 12, 1988, Steve Jobs unveiled the NeXT Computer at Symphony Hall in San Francisco. A day or two later, I was among a standing-room-only crowd at Symphony Hall in Boston, admiring the all-black, beautifully designed workstation with its brand-new optical drive (no hard disk drive in the computer of the future, according to Jobs) as it played a duet with a human violinist.

That night I sent a gushing memo to my colleagues at DEC, telling them that the future had arrived and that Jobs' education-sector-first marketing strategy was brilliant. Indeed, CERN was one of the early adopters, and Tim Berners-Lee developed the first WWW browser/editor on the NeXT workstation. But NeXT Computer, Inc. went on to sell only 50,000 beautifully designed "cubes," getting out of the hardware business altogether in 1993.

For many years, I have kept in my office the "Computing Advances to the NeXT Level" poster I got that night as a reminder that forecasting the next big (or small) thing in technology is tough, even impossible. And yet, many people believe that technology marches according to some "laws" or pre-defined trajectory and that all we have to do is decipher the "evolutionary" path technology (or the economy or society) is destined to follow.

Jobs went on to introduce the iPod and the iPad, industry-changing devices whose invention was made possible, among other things, by a tiny disk drive. The possibility of a significant boost to the simultaneous shrinking (in size) and enlarging (in capacity) of disk drives had been known since the discovery of the giant magnetoresistance effect in the very same year the NeXT Computer was introduced, 1988. Still, no one predicted the iPod. Similarly, in 1990 no one predicted how the Web would change our lives, or in 2000, how virtualization would change the lives of IT managers, although both technologies existed at the time.

To quote Ebenezer Scrooge, who had the opportunity to meet his future: "Men's courses will foreshadow certain ends, to which, if persevered in, they must lead. But if the courses be departed from, the ends will change."

We cannot predict our future. But, like Steve Jobs, we can create it.

January 24, 2011

Algorithm-less Information Processes?

The opening article of the symposium on computation suggested that some information processes may not have a controlling algorithm. A reader said that would be a profound claim. As the author of the opening statement, here is my take on it.

If, as many of the contributors do, you regard the information process as the fundamental entity, you are led to ask: "If I observe an information process, can I infer that it was generated by an algorithm?" Given the way some contributors define an information process, it is not obvious that you can always find such an algorithm. So this is a legitimate question.

I suggested a definition of an information process: a sequence of representations, with each transition controlled by a representation. This is more constrained than some of the other proposed definitions of an information process -- for example, Paul Rosenbloom does not accept the notion that each transition is controlled. If you accept my definition, then you can look for a controlling algorithm, but there is no guarantee you can find it. If you accept Rosenbloom's definition, you have no assurance there is an algorithm behind it -- but you can certainly search for one and test your hypotheses against the observed information process.

There may be countability arguments applicable here. Algorithms, however they are encoded, are finite strings over a finite alphabet, so there are only countably many of them. What if the number of information processes is non-denumerably infinite? That is a possibility with continuous processes. Then there aren't enough algorithms to go around, and most information processes would not have a controlling algorithm.
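
To make the cardinality gap explicit, here is a minimal sketch in standard notation. It is my own illustration rather than part of the original argument, and it assumes only that an algorithm can be encoded as a finite string over some finite alphabet Sigma.

    % Algorithms, encoded as finite strings over a finite alphabet \Sigma,
    % form a countable set:
    \[ |\{\text{algorithms}\}| \le |\Sigma^{*}| = \aleph_0. \]
    % Continuous information processes, identified for instance with
    % real-valued trajectories on [0,1], form an uncountable set:
    \[ |\{\, f : [0,1] \to \mathbb{R} \,\}| \ge |\mathbb{R}| = 2^{\aleph_0} > \aleph_0. \]
    % If each algorithm controls at most countably many processes, the set of
    % algorithm-controlled processes is countable, so "most" continuous
    % processes would have no controlling algorithm.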

Notice that I am not making claims, just speculating about possibilities.

December 08, 2010

Doing Business With Leaks

Wikileaks has been in the news a lot lately. There has been plenty of discussion about whether its founder (Assange) has committed crimes or whether he was exercising free speech rights. That debate may go on for a while.

However, whether or not Wikileaks survives, we may be entering a new era in which private communications cannot be considered truly private. New groups of freedom fighters may seek to steal private corporate or government information in order to make it public. Others will retaliate by trying to block the websites of those doing the releasing.

Will this change the way we do corporate business? State diplomacy?

October 22, 2010

Symposia on Ubiquity

We computing professionals are confronted with new and complex questions nearly every day. The events of the day move so fast that we do not have the chance to reflect on those questions and find new answers or even just new understanding. With the launch of the New Ubiquity, we are introducing a new feature, the Symposium. The editors will pose a question and invite a set of leading thinkers to reflect on it. We will publish their reflections. Their thinking may help you accelerate yours on that question.

The first symposium, which will launch around November 1, addresses the question, "What is computation?" This question is at the heart of everything we do as computing professionals. There have been so many changes and new developments, with computation appearing in so many new fields and being discovered in nature, that many of us have sought out new thinking and new clarity to help us deal with our customers and our projects.

A new insight emerged for me as editor while reading drafts of the symposium essays: the notion that a computation consists of the actions of the computational agent doing the computing. I had not been making a sharp enough distinction between the structure of the agent and the types of computations it most easily generates. I tried to interpret everything as a Turing machine, and often the Turing machine just didn't resemble the computational entity. For example, DNA transcription looks computational and can be modeled as a Turing machine, but that model does not give much insight into the natural processes behind the transcription. Many researchers have therefore created computational models that look like DNA strings being transformed. They hope one day to understand what is controlling those computations and perhaps to modify the controls.
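
To give a feel for what a model that "looks like DNA strings being transformed" might mean, here is a toy sketch of my own (not taken from the symposium essays): transcription of a DNA template strand into messenger RNA written directly as a string transformation, ignoring strand directionality and all regulatory machinery.

    # Toy sketch: DNA-to-mRNA transcription modeled as a plain string
    # transformation. An illustrative simplification, not a claim about how
    # the symposium contributors formalize such models.
    COMPLEMENT = {"A": "U", "T": "A", "G": "C", "C": "G"}

    def transcribe(template_strand: str) -> str:
        """Replace each DNA base on the template strand with its RNA complement."""
        return "".join(COMPLEMENT[base] for base in template_strand.upper())

    print(transcribe("TACGGATTC"))  # -> AUGCCUAAG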

Come join us for this exciting Symposium!