Source: Entrepreneur and Media Lab researcher David Rose talks ‘enchanted objects’
The book on Amazon: Enchanted Objects: Design, Human Desire, and the Internet of Things
In November 1998, I sent to my then-colleagues at EMC an email with the subject line “The Demise of Dell.” I wrote:
“My fail-proof crystal ball just talked to me again: By the end of 2000, Dell’s market cap (today at $80B) will be cut in half.
Dell’s only strength, as we all know, is in low-cost distribution. Distribution (of everything) is going to undergo a radical change in the near future because of the Internet. There will be new players in the PC market that will figure out how to sell PCs over the Internet at half the cost of Dell’s distribution infrastructure. On top of that, the corporate PC market will grind to a halt and we may even see a slight drop in PC revenues in the year 2000. On the consumer side, appliances is where the action will be—led by new players.”
After I sent my email, Dell’s stock went on to almost double to a peak of just over $56 in March 2000. It closed yesterday at $14.09, about half of where it was in late 1998.
In many companies today, the “consumerization of IT” is turning into the “digitization of IT.” The spread of consumer technologies and services into the workplace is being expanded into a larger set of IT practices, borrowed from Silicon Valley innovators and adapted to the needs of enterprises in a variety of industries.
The old IT was analog IT: A single-purpose function designed to automate specific business activities, provide support and governance, and “keep the trains running on time.” The new IT is digital: Multi-purpose, extremely flexible, woven into every aspect of the business, and gushing with unexplored and previously unknown opportunities.
The digitization of IT means that the IT organization is both stable and innovative, fault tolerant and fast learning, reliable and experimental. It solves the paradox of “safe is risky, stable is dangerous.” It promotes a culture of constant change which ensures resilience, and experimentation which safeguards continuity. Yes, you can have the best of both worlds.
“Early adopters of Big Data analytics have gained a significant lead over the rest of the corporate world. Examining more than 400 large companies, we found that those with the most advanced analytics capabilities are outperforming competitors by wide margins.”
Source: Bain & Company
30 years ago today, Steve Jobs unveiled the Macintosh. More accurately, The Great Magician took it out of a bag and let it talk to us. The Macintosh, as I learned from first-hand experience in 1984, was a huge leap forward compared to the PCs of the time. But I couldn’t have written and published the previous words and shared a digitized version of Jobs’ performance so easily, to a potential audience of 2.5 billion people, without two other inventions, the Internet and the Web.
45 years ago this year (October 29, 1969), the first ARPANET (later to be known as the Internet) link was established between UCLA and SRI. 25 years ago this year (March 1989), Tim Berners-Lee circulated a proposal for “Mesh” (later to be known as the World Wide Web) to his management at CERN.
The Internet started as a network for linking research centers. The World Wide Web started as a way to share information among researchers at CERN. Both have expanded to touch a third of the world’s population today because they have been based on open standards. The Macintosh, while a breakthrough in human-computer interaction, was conceived as a closed system and did not break from the path established by its predecessors: It was a desktop/personal mainframe. One ideology was replaced by another, with very little (and very controlled) room for outside innovation. (To paraphrase Search Engine Land’s Danny Sullivan, the Big Brother minions in Apple’s “1984” Super Bowl ad remind one of the people in Apple stores today.)
This is not a criticism of Jobs, nor is it a complete dismissal of closed systems. It may well be that the only way for his (and his team’s) design genius to succeed was by keeping complete ownership of their proprietary innovations. But the truly breakthrough products they gave us—the iPod (and iTunes), and especially the iPhone (and “smartphones”)—were highly dependent on the availability and popularity of an open platform for sharing information, based on the Internet and the Web.
Creating a closed and proprietary system has been the business model of choice for many great inventors and some of the greatest inventions of the computer age. That’s where we were headed in the early 1990s: The establishment of global proprietary networks owned by a few computer and telecommunications companies, whether old (IBM, AT&T) or new (AOL). Tim Berners-Lee’s invention and CERN’s decision to offer it to the world for free in 1993 changed the course of this proprietary march, giving a new—and much expanded—life to the Internet (itself a response to proprietary systems that did not inter-communicate) and establishing a new, open platform for a seemingly infinite number of applications and services.
As Bob Metcalfe told me in 2009: “Tim Berners-Lee invented the URL, HTTP, and HTML standards… three adequate standards that, when used together, ignited the explosive growth of the Web… What this has demonstrated is the efficacy of the layered architecture of the Internet. The Web demonstrates how powerful that is, both by being layered on top of things that were invented 17 years before, and by giving rise to amazing new functions in the following decades.”
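To make the layering Metcalfe describes concrete, here is a minimal sketch (not from the article) of an HTTP request sent over a raw TCP socket: the Web’s 1989-era protocol riding on the Internet’s decades-older transport layer. The host name below is just an illustrative placeholder.

```python
# A minimal illustration of the Web layered on top of the Internet:
# HTTP text sent over an ordinary TCP connection.
import socket

host = "example.com"  # placeholder host, assumed for illustration
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

# The lower, older layer: a TCP connection provided by the Internet.
with socket.create_connection((host, 80)) as sock:
    # The newer layer added on top: an HTTP request.
    sock.sendall(request.encode("ascii"))
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk

# First line of the reply, e.g. "HTTP/1.1 200 OK"
print(response.decode("utf-8", errors="replace").splitlines()[0])
```

The point of the sketch is that the upper layer needed no changes to the layers beneath it, which is exactly the property Metcalfe credits for the Web’s explosive growth.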
Metcalfe also touched on the power and potential of an open platform: “Tim Berners-Lee tells this joke, which I hasten to retell because it’s so good. He was introduced at a conference as the inventor of the World Wide Web. As often happens when someone is introduced that way, there are at least three people in the audience who want to fight about that, because they invented it or a friend of theirs invented it. Someone said, ‘You didn’t. You can’t have invented it. There’s just not enough time in the day for you to have typed in all that information.’ That poor schlemiel completely missed the point that Tim didn’t create the World Wide Web. He created the mechanism by which many, many people could create the World Wide Web.”
“All that information” was what the Web gave us (and what was also on the mind of one of the Internet’s many parents, J.C.R. Licklider, who envisioned it as a giant library). But this information comes in the form of ones and zeros; it is digital information. In 2007, when Jobs introduced the iPhone, 94% of storage capacity in the world was digital, a complete reversal from 1986, when 99.2% of all storage capacity was analog. The Web was the glue and the catalyst that would speed up the spread of digitization to all analog devices and channels for the creation, communication, and consumption of information. It has been breaking down, one by one, proprietary and closed systems with the force of its ones and zeros.
Metcalfe’s comments were first published in ON magazine, which I created and published for my employer at the time, EMC Corporation. For a special issue (PDF) commemorating the 20th anniversary of the invention of the Web, we asked some 20 members of the Inforati how the Web has changed their and our lives and what it will look like in the future. Here’s a sample of their answers:
Guy Kawasaki: “With the Web, I’ve become a lot more digital… I have gone from three or four meetings a day to zero meetings per day… Truly the best will be when there is a 3-D hologram of Guy giving a speech. You can pass your hand through him. That’s ultimate.”
Chris Brogan: “We look at the Web as this set of tools that allow people to try any idea without a whole lot of expense… Anyone can start anything with very little money, and then it’s just a meritocracy in terms of winning the attention wars.”
Tim O’Reilly: “This next stage of the Web is being driven by devices other than computers. Our phones have six or seven sensors. The applications that are coming will take data from our devices and the data that is being built up in these big user-contributed databases and mash them together in new kinds of services.”
John Seely Brown: “When I ran Xerox PARC, I had access to one of the world’s best intellectual infrastructures: 250 researchers, probably another 50 craftspeople, and six reference librarians all in the same building. Then one day to go cold turkey—when I did my first retirement—was a complete shock. But with the Web, in a year or two, I had managed to hone a new kind of intellectual infrastructure that in many ways matched what I already had. That’s obviously the power of the Web, the power to connect and interact at a distance.”
Jimmy Wales: “One of the things I would like to see in the future is large-scale, collaborative video projects. Imagine what the expense would be with traditional methods if you wanted to do a documentary film where you go to 90 different countries… with the Web, a large community online could easily make that happen.”
Paul Saffo: “I love that story of when Tim Berners-Lee took his proposal to his boss, who scribbled on it, ‘Sounds exciting, though a little vague.’ But Tim was allowed to do it. I’m alarmed because at this moment in time, I don’t think there are any institutions out there where people are still allowed to think so big.”
Dany Levy (founder of DailyCandy): “With the Web, everything comes so easily. I wonder about the future and the human ability to research and to seek and to find, which is really an important skill. I wonder, will human beings lose their ability to navigate?”
Howard Rheingold: “The Web allows people to do things together that they weren’t allowed to do before. But… I think we are in danger of drowning in a sea of misinformation, disinformation, spam, porn, urban legends, and hoaxes.”
Paul Graham: “[With the Web] you don’t just have to use whatever information is local. You can ship information to anyone anywhere. The key is to have the right filter. This is often what startups make.”
How many startups have flourished on the basis of the truly great products Apple has brought to the world? And how many startups and grown-up companies today are entirely based on an idea first fleshed out in a modest proposal 25 years ago? And there is no end in sight for the expanding membership in the latter camp, now also increasingly including the analogs of the world. All businesses, all governments, all non-profits, all activities are being eaten by ones and zeros. Tim Berners-Lee has unleashed an open, ever-expanding system for the digitization of everything.
We also interviewed Berners-Lee in 2009. He said that the Web has “changed in the last few years faster than it changed before, and it is crazy for us to imagine this acceleration will suddenly stop.” He pointed out the ongoing tendency to lock what we do with computers in a proprietary jail: “…there are aspects of the online world that are still fairly ‘pre-Web.’ Social networking sites, for example, are still siloed; you can’t share your information from one site with a contact on another site.” But he remained both realistic and optimistic, the hallmarks of an entrepreneur: “The Web, after all, is just a tool…. What you see on it reflects humanity—or at least the 20 percent of humanity that currently has access to the Web… No one owns the World Wide Web, no one has a copyright for it, and no one collects royalties from it. It belongs to humanity, and when it comes to humanity, I’m tremendously optimistic.”
[Originally published on Forbes.com]
Chris Pouliot, the Director of Analytics and Algorithms at Netflix: “…my team does not only personalizations for movies, but we also deal with content demand prediction. Helping our buyer down in Beverly Hills figure out how much do we pay for a piece of content. The personalization recommendations for helping users find good movies and TV shows. Marketing analytics, how do we optimize our marketing spend. Streaming platform, how do we optimize the user experience once I press play. There’s a wide range of data, so there’s a lot of diversity. We have a lot of scale, a lot of challenging problems. The question then is, how do we attract great data scientists that can just see this as a playground, a sandbox of really exciting things. Challenging problems, challenging data, great tools, and then just the ability to have fun and create great products.”
[Video: http://www.youtube.com/watch?v=pJd3PKm9XUk]
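As a rough illustration of the personalization work Pouliot mentions, here is a toy item-item collaborative-filtering sketch. The ratings matrix is invented for the example, and this is in no way Netflix’s actual algorithm; it only shows the general idea of recommending titles similar to those a user already rates highly.

```python
# Toy item-item collaborative filtering on a made-up ratings matrix.
import numpy as np

# rows = users, columns = titles; 0 means "not rated"
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
    [1, 0, 4, 5],
], dtype=float)

# Cosine similarity between title (column) vectors.
norms = np.linalg.norm(ratings, axis=0)
sim = (ratings.T @ ratings) / np.outer(norms, norms)

# Predict how user 0 would rate title 2 by weighting the titles
# they have already rated by their similarity to title 2.
user = ratings[0]
rated = user > 0
predicted = sim[2, rated] @ user[rated] / sim[2, rated].sum()
print(f"predicted rating for title 2: {predicted:.2f}")
```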
The Goal of Data Science is to Study the Phenomena and Laws of Datanature
Yun Xiong is an Associate Professor of Computer Science and the Associate Director of the Center for Data Science and Dataology at Fudan University, Shanghai, China. She received her Ph.D. in Computer and Software Theory from Fudan University in 2008. Her research interests include dataology and data science, data mining, big data analysis, and developing effective and efficient data analysis techniques for applications in finance, economics, insurance, bioinformatics, and sociology. The following is an edited version of our recent email exchange.
How has data science developed in China?
[Source: U.S. Census Bureau]
“I am a firm believer that without speculation there is no good and original observation”—Charles Darwin
“It is the theory that determines what we can observe”—Albert Einstein
“I suspect, however, like as it is happening in many academic fields, the NSA is sorely tempted by all the data at its fingertips and is adjusting its methods to the data rather than to its research questions. That’s called looking for your keys under the light”—Zeynep Tufekci
“Large open-access data sets offer unprecedented opportunities for scientific discovery—the current global collapse of bee and frog populations are classic examples. However, we must resist the temptation to do science backwards by posing questions after, rather than before, data analysis. A scant understanding of the context in which data sets were collected can lead to poorly framed questions and results, and to conclusions that are plain wrong. Scientists intending to make use of large composite data sets need to work closely with those responsible for gathering the data. Standard scientific principles and practice then demand that they first frame the important questions, then design and execute the data analyses needed to answer them”—David B. Lindenmayer and Gene E. Likens
“The wonderful thing about being a data scientist is that I get all of the credibility of genuine science, with none of the irritating peer review or reproducibility worries… I thought I was publishing an entertaining view of some data I’d extracted, but it was treated like a scientific study… I’ve enjoyed publishing a lot of data-driven stories since then, but I’ve never ceased to be disturbed at how the inclusion of numbers and the mention of large data sets numbs criticism”—Pete Warden
In the first quarter of 2013, the stock of big data experienced sudden declines followed by sporadic bouts of enthusiasm. The volatility—a new big data “V”—continues this month, and Ted Cuzzillo summed up the recent negative sentiment in “Big data, big hype, big danger” on SmartDataCollective:
“A remarkable thing happened in Big Data last week. One of Big Data’s best friends poked fun at one of its cornerstones: the Three V’s. The well-networked and alert observer Shawn Rogers, vice president of research at Enterprise Management Associates, tweeted his eight V’s: ‘…Vast, Volumes of Vigorously, Verified, Vexingly Variable Verbose yet Valuable Visualized high Velocity Data.’ He was quick to explain to me that this is no comment on Gartner analyst Doug Laney’s three-V definition. Shawn’s just tired of people getting stuck on V’s.”
Indeed, all the people who “got stuck” on Laney’s “definition” conveniently forgot that he first used the “three-Vs” to describe data management challenges in 2001. Yes, 2001. If big data is a “revolution,” how come its widely-used “definition” is based on a dozen-year-old analyst note?