IBM, Watson, and Cognitive Computing

Reacting to 10 quarters in a row of declining revenues and the abandonment of IBM’s profit target for 2015, UBS’s Steve Milunovich asked on the Q3 earnings call about IBM’s appeal to Silicon Valley startups. Giving voice to the rising conviction on Wall Street and beyond that the answer to the “disruption” of large companies is to “focus,” Milunovich stated that “they all argue of course they are going to disrupt the large companies, that the large companies basically have to break up.”

IBM’s CEO Ginni Rometty had a two-fold answer. The new areas of “higher value” (big data analytics, the cloud, social/mobile/security) grew almost 20%. IBM’s investments and offerings in these markets appeal to startups, argued Rometty, as evidenced by the 3,000 applications to join the Watson ecosystem. IBM can and will deliver the type of innovative, non-traditional IT infrastructure and solutions startups typically use.

Innovation is also very much on the mind of IBM’s traditional customers. Rometty reported on a meeting she had recently with 30 CIOs of IBM’s largest customers, where IBM was called “a navigator,” the company that understands “how an enterprise operates and how you should pull all of this together.”

It is important to keep in mind that innovation—new technologies, new business models, new processes—is making a big impact not only on IT vendors such as IBM but also on the customers of these vendors. The investments IBM is making in new growth areas are important not only for its appeal to startups, but also for its ability to help its traditional customers innovate. The success of IBM’s reinvention hangs on its ability to help others reinvent themselves.

At the forefront of IBM’s reinvention journey is a $1 billion investment in Jeopardy!-winning Watson, which it hopes will usher in a new era of “cognitive computing.” Earlier this month, Rometty and Mike Rhodin, head of IBM’s Watson business unit, opened the unit’s worldwide headquarters in the heart of New York’s Silicon Alley, across the street from Facebook. IBM also announced new Watson customers in 20 different countries, new partners developing Watson apps, five new Watson client experience centers around the world, and that Watson has started to learn Spanish so it can help Spain’s CaixaBank employees advise the bank’s customers.

The cognitive computing era is defined by “systems that can understand natural language, that can start to connect the dots or create an understanding of what they read, and then learn through practice,” Rhodin told me last month on the sidelines of the EmTech MIT event hosted by MIT Technology Review. He added: “Eras are measured in decades. We are in year three. Every day we are finding new things we could be doing.”

These are indeed early days. At the time of the Jeopardy! contest, each time a new document was added to Watson’s library, it needed to read the entire library again. Now, Watson can ingest new information in real time. Other challenges are yet to be resolved: teaching Watson to carry context from question to question to enable continuous dialog, for example, or teaching it when not to answer a question and how to break a question into multiple questions.
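
To appreciate the difference between the two approaches, here is a minimal sketch (my own illustration, not Watson’s actual architecture): a batch-built index has to re-read the whole corpus whenever a document is added, while an incremental index only processes the new document.

```python
from collections import defaultdict

def build_index(corpus):
    """Batch approach: re-read every document to rebuild the index."""
    index = defaultdict(set)
    for doc_id, text in corpus.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

class IncrementalIndex:
    """Incremental approach: ingest new documents as they arrive."""
    def __init__(self):
        self.index = defaultdict(set)

    def ingest(self, doc_id, text):
        for word in text.lower().split():
            self.index[word].add(doc_id)

corpus = {"d1": "Watson won Jeopardy", "d2": "Watson reads medical journals"}
full = build_index(corpus)          # must be rerun in full if a document is added

live = IncrementalIndex()
for doc_id, text in corpus.items():
    live.ingest(doc_id, text)
live.ingest("d3", "Watson learns Spanish")  # only the new document is read
```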

As IBM learns from its work with customers and partners and overcomes these types of challenges, Rhodin sees Watson’s great promise mainly in its ability to help humans deal with information overload. He says: “In many professions, what we are seeing is that the information is overwhelming. I don’t know how doctors or lawyers or teachers keep up with the amount of things that are changing around them. The idea of tooling to help them makes sense to me.”

In medicine, the answer to information overload is over-specialization. But specialization can stand in the way of more holistic treatments of patients and personalized medicine. Watson can help a highly specialized physician—or just about any other professional—see the bigger picture but it can also help newcomers to the profession learn best practices and get answers to their questions.

Help or replace? At the end of his 2011 Jeopardy! contest with Watson, Ken Jennings added to his final response “I for one welcome our new computer overlords.” He later wrote: “When I was selected as one of the two human players… I envisioned myself as the Great Carbon-Based Hope against a new generation of thinking machines… ‘Quiz show contestant’ may be the first job made redundant by Watson, but I’m sure it won’t be the last.”

IBM responds to the endless talk about “the rise of the machines” by emphasizing Watson’s “partnership” with humans and the way it “enhances” their work. As an example, Rhodin brought up IBM’s work with Genesys, a leading call center vendor. Watson is used both to help callers by answering frequently asked questions and as agent-assist technology when the call is escalated to a human. Rometty is quoted by Walter Isaacson in his new book, The Innovators: “I watched Watson interact in a collegial way with the doctors. It was the clearest testament of how machines can truly be partners with humans rather than try to replace them.”

In addition to age-old fears about automation and loss of jobs, there are other potential societal challenges for Watson and cognitive computing. One that Rhodin talked about is the need to educate the market that Watson was designed as a probabilistic rather than a deterministic system. “Probabilistic systems are going to give you different answers in different times based on the best available information,” says Rhodin. “They are going to be based on a confidence level supported by evidence as opposed to a degree of certainty. Watson is giving you hypotheses with a confidence factor and these help you explore other avenues.”
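
To make the distinction concrete, here is a minimal, hypothetical sketch (my own illustration, not IBM’s Watson API): a probabilistic system returns ranked hypotheses, each with a confidence score and supporting evidence, rather than a single definitive answer.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    answer: str        # a candidate answer, not a verdict
    confidence: float  # 0.0-1.0, based on the best available evidence
    evidence: list     # sources supporting this hypothesis

def rank_hypotheses(hypotheses):
    """Return candidate answers sorted by confidence, highest first."""
    return sorted(hypotheses, key=lambda h: h.confidence, reverse=True)

# Illustrative values only: the same question may be ranked differently
# at different times as new evidence is ingested.
candidates = [
    Hypothesis("Diagnosis A", 0.62, ["journal article 1", "case study 3"]),
    Hypothesis("Diagnosis B", 0.27, ["clinical guideline 2"]),
    Hypothesis("Diagnosis C", 0.11, ["textbook chapter 7"]),
]
for h in rank_hypotheses(candidates):
    print(f"{h.answer}: {h.confidence:.0%} confidence")
```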

Indeed, explaining to the public and to Watson’s users how it works and what to expect from it will require a concerted educational effort by IBM. People, including educated professionals, demand answers and certainty, not hypotheses, especially when they interact with technology and engage with science. Priyamvada Natarajan sums up this educational challenge in The New York Review of Books, questioning the degree to which people understand the scientific method and “whether they have an adequate sense of what a scientific theory is, how evidence for it is collected and evaluated, how uncertainty (which is inevitable) is measured, and how one theory can displace another, either by offering a more economical, elegant, honed, and general explanation of phenomena or, in the rare event, by clearly falsifying it…. In a word, the general public has trouble understanding the provisionality of science.”

Automation and augmentation of work can free us to engage in more interesting tasks or become more productive or simply enjoy life better… as long as we don’t blindly rely on it and believe that the machine can “think” for us, completely replace us, even have better judgment without us. In Smart Machines: IBM’s Watson and the Era of Cognitive Computing, John E. Kelly III (head of IBM’s research organization) and Steve Hamm state this position clearly: “The goal isn’t to replicate human brains… This isn’t about replacing human thinking with machine thinking. Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results, each bringing their own superior skills to the partnership.”

Still, while the goal “isn’t to replicate the human brain,” Kelly and Hamm devote an entire chapter to IBM’s TrueNorth chip. The language used to describe the effort is far from consistent (maybe Watson could have helped). Is it a “brain-inspired” chip? Or is it a “brain-based” chip? (“Based” means, at least to me, that we have a complete understanding of how the brain works.) And why lump Watson, TrueNorth, and attempts at computer simulation of the brain (e.g., European Union’s Brain Simulation Platform) together as “cognitive computing”?

These are not just some minor quibbles. A number of prominent academics have recently commented on the “brain-like” hype. Cognitive scientist and machine learning expert Michael Jordan: “We have no idea how neurons are storing information, how they are computing, what the rules are, what the algorithms are, what the representations are, and the like. So we are not yet in an era in which we can be using an understanding of the brain to guide us in the construction of intelligent systems.” Deep Learning expert Andrew Ng agrees, stating at the EmTech event that “We don’t really know how the brain works.”

When you have a massive educational project on your hands, you’d better be very cautious, accurate, and consistent about your claims for a “new era” and what it represents. Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, writes: “Watson was an impressive demonstration but it was narrowly targeted at Jeopardy and exhibited very little semantic understanding. Now Watson has become an IBM brand for any knowledge based activity they do. The intelligence is largely in their PR department.” It may well be that the IBM DNA, while providing it with a great blueprint for getting a message out and getting people excited about what it does, could also be the wrong path to follow today.

In 1948, IBM opened its first frontier homestead in New York, that one for the era of (just) computing. In late 1947, Thomas Watson Sr., IBM’s CEO at the time, “made a decision that forever altered the public perception of computers and linked IBM to the new generation of information machines,” writes Kevin Maney in The Maverick and his Machine. Maney: “He told the engineers to disassemble the SSEC [IBM’s Selective Sequence Electronic Calculator] and set it up in the ground floor lobby of IBM’s 590 Madison Avenue headquarters. The lobby was open to the public and its large windows allowed a view of the SSEC for the multitudes cramming the sidewalks on Madison and 57th street. … The spectacle of the SSEC defined the public’s image of a computer for decades. Kept dust-free behind glass panels, reels of electronic tape ticked like clocks, punches stamped out cards and whizzed them into hoppers, and thousands of tiny lights flashed on and off in no discernable pattern… Pedestrians stopped to gawk and gave the SSEC the nickname ‘Poppy.’ … Watson took the computer out of the lab and sold it to the public.”

Watson understood that successful selling to the public was an important factor in the success of selling to businesses (today it’s called “thought leadership”). IBM has successfully continued to capitalize and improve on this tradition.

It may well be, however, that our times call for a somewhat different approach. IBM should extend and expand the brilliant Jeopardy! public relations coup, maybe even provide the public with free access to some of Watson’s capabilities (IBM already provides a cloud-based version of Watson to 10 universities in North America for their students to use in cognitive computing classes). At the same time, it’s probably best not to generate unnecessary hype and speculation, and not indulge in grand visions of where computing may be going. After all, we’ve gotten used to surprising and useful new technologies coming from unexpected corners that succeed or fail based on the benefits they provide us. Google, Facebook, Baidu, and all the other companies investing in a new generation of artificial intelligence systems don’t talk about a new era.

What Watson has done so far is quite impressive, so why not stick to its achievements and avoid using vague language about a new era of computing? Isn’t Watson Oncology, providing medical diagnostics to parts of the world where access to modern medicine is limited, an impressive achievement all on its own?

It will be great to see many more similar achievements by IBM and its partners in the years to come. What’s required are long-term investments, the elimination of unnecessary hype, and not breaking up IBM. The abandonment of the profit road map first announced by Rometty’s predecessor is a giant leap on the road to reinvention.

[Originally published on Forbes.com]

 

Posted in AI

Jeopardy champion Jennings on how a computer beat him at his own game (Video)

[youtube https://www.youtube.com/watch?v=b2M-SeKey4o?rel=0]

Jennings in Slate:

…there’s no shame in losing to silicon, I thought to myself as I greeted the (suddenly friendlier) team of IBM engineers after the match. After all, I don’t have 2,880 processor cores and 15 terabytes of reference works at my disposal—nor can I buzz in with perfect timing whenever I know an answer. My puny human brain, just a few bucks worth of water, salts, and proteins, hung in there just fine against a jillion-dollar supercomputer.

“Watching you on Jeopardy! is what inspired the whole project,” one IBM engineer told me, consolingly. “And we looked at your games over and over, your style of play. There’s a lot of you in Watson.” I understood then why the engineers wanted to beat me so badly: To them, I wasn’t the good guy, playing for the human race. That was Watson’s role, as a symbol and product of human innovation and ingenuity. So my defeat at the hands of a machine has a happy ending, after all. At least until the whole system becomes sentient and figures out the nuclear launch codes. But I figure that’s years away.

Posted in AI

Recruiting Data Scientists to Mine the Data Explosion


 

Wes Hunt, Chief Data Officer (CDO) at Nationwide Mutual Insurance Co., on recruiting data scientists:

Finding talent is my largest challenge. Someone who understands our business, who has quantitative skills, who has the technical skills to create the models, and who is able to persuade others that the insights they’ve come up with are ones you can trust and take action on. The hardest part is persuasion. You get the quantitative skills, but there’s a struggle in that ability to communicate effectively. We’ll often pair people together, but we’d really like to grow the talent.

When I was in marketing, we put a focus on liberal-arts-educated individuals, because abstract thinking where there are ambiguous data sets is an area where they are comfortable. Ph.D.s in psychology were a great recruiting pool. A psych Ph.D. has a fair amount of statistical training. We created a program to recruit Ph.D.s.

There’s not yet an educational discipline and curriculum that produces data scientists at the scale that would clear the market. So the way we’ve focused on it is to find people with innate curiosity and critical thinking. You can teach the other skills. On my team, I have a pathologist, a bioengineering student who trained in doing heart research, an M.B.A., and someone who is trained in traditional data architecture. I also have a landscape construction engineer and a psychology Ph.D.

 

Posted in Data Growth, Data Science, Data Science Careers

Doug Cutting on Hadoop, October 2014 (Video)

[youtube https://www.youtube.com/watch?v=0GOxDBR6VAU?rel=0]

Posted in Misc

Home Monitoring mHealth Wearable Devices 2013-2019


ABI Research:

Over the next 5 years, a new generation of elderly home care services will drive wearable device shipments to more than 44 million in 2019, up from just 6 million in 2013. In 2014 alone, shipments of wearable devices linked to elderly care systems will more than double over those in 2013, finds the latest ABI Research analysis of the mHealth market.

Growing adoption comes as tech savvy families increasingly turn to home monitoring offerings for assurance their aging parents and family members are safe and well. In addition, new offerings are boosting and extending a market that has long been the territory of dedicated, “Help! I’ve fallen and I can’t get up”-type personal emergency response systems. A host of niche players including BeClose, GrandCare Systems, Independa and others have all emerged to capitalize on a combination of market demand and the potential to leverage connected devices and systems.

In the past few months alone, one start-up, Live!y, has revamped and re-launched its offering to include a watch that offers activity tracking alongside personal emergency response services, while AT&T has added elderly care monitoring to its Digital Life smart home package. These players reflect how device manufacturers and service providers alike are increasingly targeting the elder care market and doing so with more feature-rich offerings.

These findings are part of ABI Research’s mHealth Wearables, Platforms and Services Market Research which looks at the rapidly developing market for wearable wireless sensors, by device, connectivity and region across sports, fitness and wellbeing, home care monitoring, remote patient monitoring, and on-site professional healthcare markets.

Posted in Misc

Most companies analyze a mere 12% of their data (Infographic)


 

Source: Platfora

 

Posted in Big Data Analytics

Gartner on Top Trends for 2015 and Beyond

On the occasion of its sold-out Symposium/ITxpo this week, Gartner revealed its predictions for top technology trends, the impact of technology on businesses and consumers, and the continuing evolution of the IT organization and the role of the CIO. Here is my summary of Gartner’s press releases:

Digital disruption will give rise to new businesses, some created by machines

By 2017, says Gartner, a significant disruptive digital business will be launched that was conceived by a computer algorithm. The most successful startups will mesh the digital world with existing physical logistics to create a consumer-driven network—think Airbnb and Uber—and challenge established markets consisting of isolated physical units.

The meshing of the digital with the physical will impact how we think about product development. By 2015, more than half of traditional consumer products will have native digital extensions, and by 2017, 50% of consumer product investments will be redirected to customer experience innovations. The wide availability of product, pricing, and customer satisfaction information has eroded the competitive advantage afforded in the past by product innovation and is shifting attention to customer experience innovation as the key to lasting brand loyalty.

The meshing of the digital with the physical also means the rise of the Internet of Things. “This year,” says Gartner, “enterprises will spend over $40 billion designing, implementing and operating the Internet of Things.” That’s still a tiny slice of worldwide spending on IT, which is projected to surpass $3.9 trillion in 2015, a 3.9% increase from 2014. But much of this spending will be driven by the digital economy, and the Internet of Things is in the driver’s seat. Gartner defines digital business as new business designs that blend the virtual and physical worlds, changing how processes and industries work through the Internet of Things.

Consider this: Since 2013, 650 million new physical objects have come online; 3D printers became a billion-dollar market; 10% of automobiles became connected; and the number of Chief Data Officer and Chief Digital Officer positions has doubled. But fasten your seat belts: In 2015, Gartner predicts, all of these things will double again.

Managing and leading the digital business, mostly by humans

Gartner’s survey of 2,800 CIOs in 84 countries showed that CIOs are fully aware that they will need to change in order to succeed in the digital business, with 75% of IT executives saying that they need to change their leadership style in the next three years.

“The exciting news for CIOs,” says Gartner, “is that despite the rise of roles, such as the chief digital officer, they are not doomed to be an observer of the digital revolution.” According to the survey, 41% of CIOs are reporting to their CEO. Gartner notes that this is a return to one of the highest levels ever, no doubt because of the increasing importance of information technology to all businesses.

Still, reporting to the CEO does not necessarily mean leading the digital initiatives of the business. Reports from the Symposium highlighted another Gartner finding: While CIOs say they are driving 47% of digital leadership, only 15% of CEOs agree that they do so. Similarly, while CIOs estimate that 79% of IT spending will be “inside” the IT budget (up slightly from last year), Gartner says that 38% of total IT spending is outside of IT already, and predicts that by 2017, it will be over 50%. This is a “shift of demand and control away from IT and toward digital business units closer to the customer,” says Gartner. It further estimates that 50% of all technology sales people are actively selling direct to business units, not IT departments.

Gartner sees the established behaviors and beliefs of the IT organization, the “best practices” that have served it well in previous years, as the biggest obstacle for CIOs in their pursuit of digital leadership. Process management and control are not as important as vision and agility. Compounding the problem of obsolete leadership style and inadequate skills is a fundamental requirement of the digital business: It must be unstable. By 2017, 70% of successful digital business models will rely on deliberately unstable processes designed to shift as customer needs shift. “This holistic approach,” says Gartner, “blending business model, processes, technology and people will fuel digital business success.”

The rise of smart machines

By 2015, there will be more than 40 vendors with commercially available managed services offerings leveraging smart machines and industrialized services. By 2018, the total cost of ownership for business operations will be reduced by 30% through smart machines and industrialized services.

Smart machines are an emerging “super class” of technologies that perform a wide variety of work, of both the physical and the intellectual kind. Smart machines will automate decision making. Therefore, they will not only affect jobs based on physical labor, but they will also impact jobs based on complex knowledge worker tasks. “Smart machines,” says Gartner, “will not replace humans as people still need to steer the ship and are critical to interpreting digital outcomes.” But these humans will have new types of jobs.

Top digital jobs

By 2018, Gartner predicts, digital business will require 50% fewer business process workers and 500% more key digital business jobs, compared with traditional models. The top jobs for digital over the next seven years will be:

• Integration Specialists
• Digital Business Architects
• Regulatory Analysts
• Risk Professionals

Gartner: “You must build talent for the digital organization of 2020 now. Not just the digital technology organization, but the whole enterprise. Talent is the key to digital leadership.”

Where things can go wrong

Gartner highlights two areas of potential vulnerability as businesses pursue digital opportunities: lack of portfolio management skills and inadequate risk management.

By year-end 2016, 50% of digital transformation initiatives will be unmanageable due to lack of portfolio management skills, leading to measurable loss of market share. The digital business brings with it vastly different and higher levels of risk, say 89% of CIOs, and 69% believe that the discipline of risk management is not keeping up.

Gartner: “CIOs need to review with the enterprise and IT risk leaders whether risk management is adapting fast enough to a digital world.” This is also an urgent task, I might add, for CEOs and the board of directors.

The pursuit of longer life and increased happiness (i.e., better shopping experience)

Gartner predicts that all these connected devices will have a positive impact on our health. By 2017, the use of smartphones will reduce by 10% the costs for diabetic care. By 2020, life expectancy in the developed world will increase by 0.5 years due to widespread adoption of wireless health monitoring technology.

The retail industry could be the industry most impacted by the digital tsunami, drastically altering our shopping experience. By year-end 2015, mobile digital assistants will have taken on tactical mundane processes such as filling out names, addresses and credit card information. By year-end 2016, more than $2 billion in online shopping will be performed exclusively by mobile digital assistants, representing about 2.5 percent of mobile users trusting assistants with $50 a year.

By 2017, U.S. customers’ mobile engagement behavior will drive mobile commerce revenue in the U.S. to 50% of U.S. digital commerce revenue. A renewed interest in mobile payments will arise in 2015, together with a significant increase in mobile commerce. By 2016, there will be an increase in the number of offers from retailers focused on customer location and the length of time in store. By 2020, retail businesses that utilize targeted messaging in combination with internal positioning systems (IPS) will see a five percent increase in sales.

These trends seem to be a logical extension of current technologies. But 3D printing will bring us a completely new shopping experience and will greatly expand what’s on offer. By 2015, more than 90% of online retailers of durable goods will actively seek external partnerships to support the new “personalized” product business models, and by 2017, nearly 20% of these retailers will use 3D printing to create personalized product offerings.

Gartner: “The companies that set the strategy early will end up defining the space within their categories.”

This statement, which Gartner made specifically about the personalized products business, is true for all types of digital businesses and for all the new management processes and attitudes that all organizations should put in place, sooner rather than later.

Sources:

Gartner Reveals Top Predictions for IT Organizations and Users for 2015 and Beyond

Gartner Survey of More Than 2,800 CIOs Reveals That CIOs Must “Flip” Their Leadership Styles to Grasp the Digital Opportunity

Gartner Says Digital Business Economy is Resulting in Every Business Unit Becoming a Technology Startup

Gartner Identifies the Top 10 Strategic Technology Trends for 2015

[Originally published on Forbes.com]

Posted in Predictions

Oren Etzioni on Building Intelligent Machines

[youtube https://www.youtube.com/watch?v=E_6AZ8slivc?rel=0]
“There are more things in AI than classification… the entire paradigm of classification, which has fueled machine learning and data mining, is very limited… What we need is a process that is structured, multi-layered, knowledge-intensive, much more like kids playing Lego, instead of a karate chop that divides things into categories… Current knowledge bases are fact-rich but knowledge poor…’You can’t play 20 questions with Nature and win’ (Allen Newell, 1973)… What we need is knowledge, reasoning, and explanation.”

Slides (from KDD Keynote, scroll all the way down the page) are here 

From GigaOm:

Oren Etzioni, executive director of the Allen Institute for Artificial Intelligence (formerly founder of Farecast and Decide.com), takes a contrarian view of all the deep learning hype. Essentially, he argues, while systems that are better than ever at classifying images or words are great, they’re still not “intelligent.” He describes work underway to build systems that can truly understand content, including one capable of passing fourth-grade short-answer exams.

Etzioni on reddit:

“I think that people are often confusing computer autonomy with computer intelligence. Computer viruses are autonomous, dangerous, but not particularly intelligent. Chess playing programs are intelligent (in a sense) but very limited. They don’t even play checkers!”

“I love star trek and particularly the star trek computer because AI is used there as a tool to help and inform Captain Kirk and the crew. That’s a much better model than fear mongering in movies like HER and Transcendence. AI can be used to help us and enhance our abilities. For example, we are all inundated with huge amounts of text, articles, technical papers, and nobody can keep up! How about if your doctor had a tool that would help him or her to figure out the latest studies and procedures relevant to your condition? Even better—what if you had a tool to help figure out what’s going on that’s much better [than] google or webmd.”

[Why do you enjoy working on AI? What first motivated you to get into the field?] “It is one of the most fundamental intellectual problems and it’s really, really hard. I find computers so rigid, so stupid that it’s infuriating. My goal is to fight “artificial stupidity” and to build AI programs that help scientists, doctors, and regular folks make sense of the world and the tsunami of information that we all face every day.”

“The Turing test is about tricking someone to believe that a computer is human. At AI2 we are working on programs that will try to pass tests in science & math, which requires the program to understand the questions (hard), utilize background knowledge (even harder), and accumulate that knowledge automatically (great big challenge).”

[Would the IBM computer that played Jeopardy be called intelligent by your metrics?] “Watson was an impressive demonstration but it was narrowly targeted at Jeopardy and exhibited very little semantic understanding. Now Watson has become an IBM brand for any knowledge based activity they do. The intelligence is largely in their PR department.”

 

 

Posted in AI, Machine Learning

Josh Wills on Machine Learning in a Business Setting

[youtube https://www.youtube.com/watch?v=IgfRdDjLxe0?rel=0]
Academic machine learning is all about optimization. Machine learning in a business setting is all about understanding: “My focus is always on how do I understand what the system is doing, come up with new hypotheses about this very complex system, test them, and then use what I’ve learned from those tests to find new ways to improve the system.”

The talk also gives an overview of Cloudera’s current data science tools, including Oryx and Spark for building and serving machine learning models, Gertrude for multivariate testing, and Impala for ludicrously high-performance SQL queries against HDFS.
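
Wills describes machine learning in a business setting as a loop of hypothesizing, testing, and learning. As a minimal illustration of the testing step (a generic two-variant comparison of my own, not Cloudera’s Gertrude), the sketch below runs a simple two-proportion z-test to ask whether variant B converts better than variant A:

```python
from math import sqrt, erf

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: did variant B convert better than variant A?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # one-sided p-value from the standard normal CDF
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return p_b - p_a, p_value

lift, p = ab_test(200, 10_000, 240, 10_000)
print(f"observed lift: {lift:.2%}, p-value: {p:.3f}")  # small p -> keep the change
```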

Josh Wills is Cloudera’s Senior Director of Data Science.

Posted in Data Science, Data Scientists, Machine Learning

Neil Gershenfeld on Turning Data into Things and Things into Data (Video)

[youtube https://www.youtube.com/watch?v=L0RDrSKenGo]

Neil Gershenfeld, Director of MIT’s Center for Bits and Atoms, at the 2014 Solid Conference: Analog telephone calls degraded with distance; digitizing communications allowed errors to be detected and corrected, leading to the Internet. Analog computations degraded with time; digitizing computing again allowed errors to be detected and corrected, leading to microprocessors and PCs. Manufacturing today remains analog; although the designs are digital, the processes are not. Neil Gershenfeld presents emerging research on digitizing fabrication by coding the construction of functional materials, and explores its implications for programming the physical world.
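
Gershenfeld’s error-correction point is worth making concrete. Below is a minimal sketch (my own illustration, not from the talk) of the simplest form of the idea that digitization enables: appending a parity bit to a digital message lets the receiver detect a corrupted copy, something an analog signal cannot do once it has degraded.

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def is_intact(bits_with_parity):
    """Detect single-bit corruption: an odd number of 1s signals an error."""
    return sum(bits_with_parity) % 2 == 0

message = [1, 0, 1, 1]           # original digital message
sent = add_parity(message)       # [1, 0, 1, 1, 1]

received_ok = sent[:]            # arrives unchanged
received_bad = sent[:]
received_bad[2] ^= 1             # one bit flipped in transit

print(is_intact(received_ok))    # True  -> accept
print(is_intact(received_bad))   # False -> detect the error and request a resend
```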

Gershenfeld wrote in his 1999 book, When Things Start to Think: “Beyond seeking to make computers ubiquitous, we should try to make them unobtrusive…. For all the coverage of the growth of the Internet and the World Wide Web, a far bigger change is coming as the number of things using the Net dwarf the number of people. The real promise of connecting computers is to free people, by embedding the means to solve problems in the things around us.”

Recently, Gershenfeld published (with JP Vasseur) “As Objects Go Online” in Foreign Affairs:

“Although the Internet of Things is now technologically possible, its adoption is limited by a new version of an old conflict. During the 1980s, the Internet competed with a network called BITNET, a centralized system that linked mainframe computers. Buying a mainframe was expensive, and so BITNET’s growth was limited; connecting personal computers to the Internet made more sense. The Internet won out, and by the early 1990s, BITNET had fallen out of use. Today, a similar battle is emerging between the Internet of Things and what could be called the Bitnet of Things. The key distinction is where information resides: in a smart device with its own IP address or in a dumb device wired to a proprietary controller with an Internet connection. Confusingly, the latter setup is itself frequently characterized as part of the Internet of Things. As with the Internet and BITNET, the difference between the two models is far from semantic. Extending IP to the ends of a network enables innovation at its edges; linking devices to the Internet indirectly erects barriers to their use…

The size and speed of the Internet have grown by nine orders of magnitude since the time it was invented. This expansion vastly exceeds what its developers anticipated, but that the Internet could get so far is a testament to their insight and vision. The uses the Internet has been put to that have driven this growth are even more surprising; they were not part of any original plan. But they are the result of an open architecture that left room for the unexpected. Likewise, today’s vision for the Internet of Things is sure to be eclipsed by the reality of how it is actually used. But the history of the Internet provides principles to guide this development in ways that are scalable, robust, secure, and encouraging of innovation.

The Internet’s defining attribute is its interoperability; information can cross geographic and technological boundaries. With the Internet of Things, it can now leap out of the desktop and data center and merge with the rest of the world. As the technology becomes more finely integrated into daily life, it will become, paradoxically, less visible. The future of the Internet is to literally disappear into the woodwork.”

 

Posted in Internet of Things