Human level AI has been 15 to 25 years away for the past 50 years https://t.co/DTP0mvjnaN
— Mitch Kapor (@mkapor) November 12, 2014
Source: Armstrong, Stuart, and Kaj Sotala. 2012. “How We’re Predicting AI—or Failing To.”
400 million connected light bulbs installed by 2020
ABI Research:
The market for connected LED or “smart” light bulbs is in its embryonic stage of development, with shipments totaling fewer than 2.5 million units in 2013. However, falling LED prices and the endorsement of ZigBee Light Link as the preferred connectivity solution by a number of leading lighting manufacturers means that annual smart bulb shipments are set to increase to 223 million by 2020, achieving a total installed base of over 400 million. “Because of the additional dimensions smart lighting brings to the consumer lifestyle, including lighting automation, and because of its carbon footprint efficiency, this industry will rapidly become one of the key technologies that could bring IoE closer to consumers,” commented Malik Saadi, Practice Director at ABI Research.
802.15.4-enabled light bulbs will become the dominant connectivity solution, taking a three-quarter share of the market throughout this period. Among these, ZigBee will be the most prolific. As time progresses, ZigBee will face increasing competition from emerging alternatives such as 6LoWPAN-based protocols like Thread, Bluetooth Smart, and Wi-Fi, particularly mesh and low-power variants.
The prices of LED bulbs continue to fall rapidly, and because their replacement cycles are significantly longer than those of traditional light bulbs, the market is predicted to saturate quickly. “Overall LED light bulb sales are currently driven by new additions but as the market saturates, industry players will increasingly use connectivity to bring added value features and services to residential lighting systems and attempt to boost consumers’ interest in upgrading to a smarter lighting environment,” Saadi said.
These findings are part of ABI Research’s IoE Semiconductors and Smart Home Market Research.
IDC: 60% of CIOs will be supplanted by Chief Digital Officers by 2020
In a webinar last week, IDC discussed its CIO predictions for 2015 and beyond (Louis Columbus provides an excellent summary here). Of IDC’s 10 predictions, the one that intrigued me most predicted that most of the audience, presumably CIOs and senior IT executives, will eventually relinquish their most interesting and important responsibilities to Chief Digital Officers (CDOs):
“By 2020, 60% of CIOs in global organizations will be supplanted by the Chief Digital Officer (CDO) for the delivery of IT-enabled products and digital services.”
In its presentation, IDC added that “some CIOs will find an opportunity to expand their role leveraging their experience in setting strategy, innovation, and relationships.” But why will only 40% be able to expand their role while the others “will be challenged to fill the growing gap for the CDO role”?
Chief Digital Officers (CDOs) are a new breed of senior executives and their ranks have doubled since 2013, Gartner recently estimated. With all the excitement about the promise of “digital” in general and “big data” in particular, there is an urgent need to harness and manage the multiple, sometimes duplicate or even conflicting activities across the business. The Chief Digital Officer is expected to provide a unifying vision and develop a digital strategy, transforming existing processes and products and finding new digital-based profit and revenue opportunities.
In short, the Chief Digital Officer is in charge of digital governance.
Here we go again: The CIO is relegated to “keeping the trains running on time.” Shouldn’t CIOs take charge of digital governance? Why not entrust CIOs with finding new insights in the data or uncovering the new digital business opportunities? The answer seems to be that they are not considered to have the right experience and skills to lead the “digital transformation” of the business. As Dan Woods observed: “Most of the time CDOs are people who had marketing, PR, or communications backgrounds or other roles that were focused on the customer…. the CDO role, staffed by people who were technology outsiders, may become the most important… in defining the role of technology in many businesses. CIOs and CTOs who let this happen only have themselves to blame.”
Indeed, it all depends on what is expected of CIOs, but even more, on what CIOs themselves expect from their roles. Many of IDC’s other predictions illustrate why adding CDO-type responsibilities (or not relinquishing them) is so difficult in the current environment—there’s so much else to do.
Security is identified by IDC as the most important issue to consider, especially in the next 12 months: “By 2016, security will be a top 3 business priority for 70% of CEOs of global enterprises.” Note the “CEOs” in the prediction—as the IDC analysts noted, the Target security breach showed the potential personal consequences for CEOs and CIOs everywhere. “It’s not just high school kids anymore,” IDC noted, as nations, corporations, and organized crime “institutionalize cyber warfare.”
The same force that has given rise to the CDO—the digitization of all businesses and the increased use of information technology in all aspects of the business—is also responsible for the rise in importance of the CIO. As Gartner observed recently, based on its survey of 2,800 CIOs in 84 countries, 41% of CIOs are reporting to their CEO, a return to one of the highest levels ever.
As the IDC analysts said on the webinar, “the CIO is perfectly positioned to take a senior leadership role regarding security.” And possibly a few other new responsibilities, according to their predictions:
“By 2017, 80% of the CIO’s time will be focused on analytics, cybersecurity and creating new revenue streams through digital services.”
“By 2016, 80% of CIOs will deliver a new architectural framework that enables innovation and improved business decision-making.”
“By 2018, 30% of CIOs of global organizations will have rolled out a pan-enterprise data and analytics strategy.”
Looks like there are many new opportunities for CIOs “to take a leadership position.” Still, the IDC analysts said in response to a question that the majority of CIOs are “survivors,” adopting a wait-and-see attitude, waiting for the lines of business to introduce new technologies. This is in contrast to the “thrivers,” who make IT more strategic, are not afraid to experiment and fail, opt for the anti-fragile approach (see my take on the subject here), and go beyond their comfort zone.
The central IT organization and CIOs may become irrelevant in the digital economy. Or, CIOs could use this opportunity to demonstrate leadership that is based on deep experience with and understanding of what data, big or small, is all about—its management, its analysis, and its use in the service of innovation, the driving force of any enterprise.
[Originally published on Forbes.com]
IBM, Watson, and Cognitive Computing
Reacting to 10 quarters in a row of declining revenues and the abandonment of IBM’s profit target for 2015, UBS’s Steve Milunovich asked on the Q3 earnings call about IBM’s appeal to Silicon Valley startups. Giving voice to the rising conviction on Wall Street and beyond that the answer to the “disruption” of large companies is to “focus,” Milunovich stated that “they all argue of course they are going to disrupt the large companies, that the large companies basically have to break up.”
IBM’s CEO Ginni Rometty had a two-fold answer. The new areas of “higher value”–big data analytics, the cloud, social/mobile/security–grew almost 20%. IBM’s investments and offerings in these markets appeal to startups, argued Rometty, as evidenced by the 3,000 applications to join the Watson ecosystem. IBM can and will deliver the type of innovative, non-traditional IT infrastructure and solutions startups typically use.
Innovation is also very much on the mind of IBM’s traditional customers. Rometty reported on a meeting she had recently with 30 CIOs of IBM’s largest customers, where IBM was called “a navigator,” the company that understands “how an enterprise operates and how you should pull all of this together.”
It is important to keep in mind that innovation—new technologies, new business models, new processes—is making a big impact not only on IT vendors such as IBM but also on the customers of these vendors. The investments IBM is making in new growth areas are important not only for its appeal to startups, but also for its ability to help its traditional customers innovate. The success of IBM’s reinvention hangs on its ability to help others reinvent themselves.
At the forefront of IBM’s reinvention journey is a $1 billion investment in Jeopardy-winning Watson, which it hopes will usher in a new era of “cognitive computing.” Earlier this month, Rometty and Mike Rhodin, head of IBM’s Watson business unit, opened the unit’s worldwide headquarters in the heart of New York’s Silicon Alley, across the street from Facebook. IBM also announced new customers for Watson in 20 different countries, new partners developing Watson apps, five new Watson client experience centers around the world, and that Watson has started to learn Spanish so it could help Spain’s CaixaBank employees advise the bank’s customers.
The cognitive computing era is defined by “systems that can understand natural language, that can start to connect the dots or create an understanding of what they read, and then learn through practice,” Rhodin told me last month on the sidelines of the EmTech MIT event hosted by MIT Technology Review. He added: “Eras are measured in decades. We are in year three. Every day we are finding new things we could be doing.”
These are indeed early days. At the time of the Jeopardy! contest, each time a new document was added to Watson’s library, it needed to read the entire library again. Now, Watson can ingest new information in real time. Other challenges are yet to be resolved, such as teaching Watson to carry context from question to question to enable continuous dialog, or teaching it when not to answer a question and how to break one question into multiple questions.
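The context-carrying challenge can be illustrated with a deliberately crude sketch (everything below, from the class name to the pronoun heuristic, is invented for illustration and says nothing about Watson’s actual implementation): a follow-up question like “When was it announced?” only makes sense if the system remembers what “it” refers to.

```python
# Toy illustration of carrying context from question to question:
# remember the last named entity so a follow-up pronoun can be resolved.

class DialogContext:
    """Remembers the last named entity mentioned, so follow-up
    questions that use a pronoun can be rewritten against it."""

    PRONOUNS = {"it", "he", "she", "they", "them"}
    QUESTION_WORDS = {"who", "what", "when", "where", "why", "how"}

    def __init__(self):
        self.last_entity = None

    def resolve(self, question):
        words = question.rstrip("?").split()
        # Substitute the remembered entity for any pronoun.
        resolved = [
            self.last_entity if w.lower() in self.PRONOUNS and self.last_entity else w
            for w in words
        ]
        # Crude heuristic: any capitalized, non-question word becomes
        # the new focus entity for the next question.
        for w in words:
            if w[:1].isupper() and w.lower() not in self.QUESTION_WORDS:
                self.last_entity = w
        return " ".join(resolved) + "?"

ctx = DialogContext()
q1 = ctx.resolve("Who designed Watson?")    # establishes "Watson" as the focus
q2 = ctx.resolve("When was it announced?")  # "it" is rewritten to "Watson"
print(q2)  # When was Watson announced?
```

A real system would need full coreference resolution rather than this one-entity memory, which is precisely why continuous dialog remains a research challenge.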
As IBM learns from its work with customers and partners and overcomes these types of challenges, Rhodin sees Watson’s great promise mainly in its ability to help humans deal with information overload. He says: “In many professions, what we are seeing is that the information is overwhelming. I don’t know how doctors or lawyers or teachers keep up with the amount of things that are changing around them. The idea of tooling to help them makes sense to me.”
In medicine, the answer to information overload is over-specialization. But specialization can stand in the way of more holistic treatments of patients and personalized medicine. Watson can help a highly specialized physician—or just about any other professional—see the bigger picture but it can also help newcomers to the profession learn best practices and get answers to their questions.
Help or replace? At the end of his 2011 Jeopardy! contest with Watson, Ken Jennings added to his final response “I for one welcome our new computer overlords.” He later wrote: “When I was selected as one of the two human players… I envisioned myself as the Great Carbon-Based Hope against a new generation of thinking machines… ‘Quiz show contestant’ may be the first job made redundant by Watson, but I’m sure it won’t be the last.”
IBM responds to the endless talk about “the rise of the machines” by emphasizing Watson’s “partnership” with humans and the way it “enhances” their work. As an example, Rhodin brought up IBM’s work with Genesys, a leading call center vendor. Watson is used both to help callers by answering frequently asked questions and as agent-assist technology when the call is escalated to a human. Rometty is quoted by Walter Isaacson in his new book, The Innovators: “I watched Watson interact in a collegial way with the doctors. It was the clearest testament of how machines can truly be partners with humans rather than try to replace them.”
In addition to age-old fears about automation and loss of jobs, there are other potential societal challenges to Watson and cognitive computing. One that Rhodin talked about is the need to educate the market that Watson was designed as a probabilistic, rather than a deterministic system. “Probabilistic systems are going to give you different answers in different times based on the best available information,” says Rhodin. “They are going to be based on a confidence level supported by evidence as opposed to a degree of certainty. Watson is giving you hypotheses with a confidence factor and these help you explore other avenues.”
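The deterministic-versus-probabilistic distinction Rhodin draws can be sketched in a few lines (the function, the candidate names, and the scores below are all invented for illustration; this is not Watson’s API): instead of returning one fixed answer, a probabilistic system returns every candidate it considered, ranked by a confidence score derived from the supporting evidence.

```python
# Illustrative sketch: turn raw evidence scores into confidence-ranked
# hypotheses, the way a probabilistic system presents alternatives
# instead of a single deterministic answer.

def rank_hypotheses(evidence_scores):
    """evidence_scores maps each candidate answer to a list of
    evidence strengths (arbitrary non-negative numbers).
    Returns (answer, confidence) pairs sorted by confidence,
    with confidences normalized to sum to 1."""
    totals = {ans: sum(scores) for ans, scores in evidence_scores.items()}
    grand_total = sum(totals.values()) or 1.0
    ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
    return [(ans, total / grand_total) for ans, total in ranked]

# Candidate diagnoses with made-up evidence strengths from different sources.
hypotheses = rank_hypotheses({
    "Condition A": [0.9, 0.7],  # two strong pieces of evidence
    "Condition B": [0.4],       # one moderate piece
    "Condition C": [0.1, 0.1],  # weak evidence only
})

for answer, confidence in hypotheses:
    print(f"{answer}: {confidence:.0%} confidence")
```

Note that adding a new piece of evidence shifts every confidence value, which is exactly Rhodin’s point: the system gives different answers at different times based on the best available information.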
Indeed, explaining to the public and to Watson’s users how it works and what to expect from it would require a concerted educational effort by IBM. People, including educated professionals, demand answers and certainty, not hypotheses, especially when they interact with technology and engage with science. Priyamvada Natarajan sums up this educational challenge in The New York Review of Books, questioning the degree to which people understand the scientific method and “whether they have an adequate sense of what a scientific theory is, how evidence for it is collected and evaluated, how uncertainty (which is inevitable) is measured, and how one theory can displace another, either by offering a more economical, elegant, honed, and general explanation of phenomena or, in the rare event, by clearly falsifying it…. In a word, the general public has trouble understanding the provisionality of science.”
Automation and augmentation of work can free us to engage in more interesting tasks, become more productive, or simply enjoy life more… as long as we don’t blindly rely on it and believe that the machine can “think” for us, completely replace us, or even exercise better judgment without us. In Smart Machines: IBM’s Watson and the Era of Cognitive Computing, John E. Kelly III (head of IBM’s research organization) and Steve Hamm state this position clearly: “The goal isn’t to replicate human brains… This isn’t about replacing human thinking with machine thinking. Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results, each bringing their own superior skills to the partnership.”
Still, while the goal “isn’t to replicate the human brain,” Kelly and Hamm devote an entire chapter to IBM’s TrueNorth chip. The language used to describe the effort is far from consistent (maybe Watson could have helped). Is it a “brain-inspired” chip? Or is it a “brain-based” chip? (“Based” means, at least to me, that we have a complete understanding of how the brain works.) And why lump Watson, TrueNorth, and attempts at computer simulation of the brain (e.g., European Union’s Brain Simulation Platform) together as “cognitive computing”?
These are not just some minor quibbles. A number of prominent academics have recently commented on the “brain-like” hype. Cognitive scientist and machine learning expert Michael Jordan: “We have no idea how neurons are storing information, how they are computing, what the rules are, what the algorithms are, what the representations are, and the like. So we are not yet in an era in which we can be using an understanding of the brain to guide us in the construction of intelligent systems.” Deep Learning expert Andrew Ng agrees, stating at the EmTech event that “We don’t really know how the brain works.”
When you have a massive educational project on your hands, you’d better be very cautious, accurate, and consistent about your claims for a “new era” and what it represents. Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, writes: “Watson was an impressive demonstration but it was narrowly targeted at Jeopardy and exhibited very little semantic understanding. Now Watson has become an IBM brand for any knowledge based activity they do. The intelligence is largely in their PR department.” It may well be that IBM’s DNA, while providing it with a great blueprint for getting a message out and getting people excited about what it does, could also be the wrong path to follow today.
In 1948, IBM opened its first such frontier homestead in New York, that one for the era of (just) computing. In late 1947, Thomas Watson Sr., IBM’s CEO at the time, “made a decision that forever altered the public perception of computers and linked IBM to the new generation of information machines,” writes Kevin Maney in The Maverick and his Machine. Maney: “He told the engineers to disassemble the SSEC [IBM’s Selective Sequence Electronic Calculator] and set it up in the ground floor lobby of IBM’s 590 Madison Avenue headquarters. The lobby was open to the public and its large windows allowed a view of the SSEC for the multitudes cramming the sidewalks on Madison and 57th street. … The spectacle of the SSEC defined the public’s image of a computer for decades. Kept dust-free behind glass panels, reels of electronic tape ticked like clocks, punches stamped out cards and whizzed them into hoppers, and thousands of tiny lights flashed on and off in no discernable pattern… Pedestrians stopped to gawk and gave the SSEC the nickname ‘Poppy.’ … Watson took the computer out of the lab and sold it to the public.”
Watson understood that successful selling to the public was an important factor in the success of selling to businesses (today it’s called “thought leadership”). IBM has successfully continued to capitalize and improve on this tradition.
It may well be, however, that our times call for a somewhat different approach. IBM should extend and expand the brilliant Jeopardy! public relations coup, maybe even provide the public with free access to some of Watson’s capabilities (IBM already provides a cloud-based version of Watson to 10 universities in North America for their students to use in cognitive computing classes). At the same time, it’s probably best not to generate unnecessary hype and speculation, and not indulge in grand visions of where computing may be going. After all, we’ve gotten used to surprising and useful new technologies coming from unexpected corners that succeed or fail based on the benefits they provide us. Google, Facebook, Baidu, and all the other companies investing in a new generation of artificial intelligence systems don’t talk about a new era.
What Watson has done so far is quite impressive, so why not stick to its achievements and avoid using vague language about a new era of computing? Isn’t Watson Oncology, providing medical diagnostics to parts of the world where access to modern medicine is limited, an impressive achievement all on its own?
It will be great to see many more similar achievements by IBM and its partners in the years to come. What’s required are long-term investments, eliminating unnecessary hype, and not breaking up IBM. The abandonment of the profit road map first announced by Rometty’s predecessor is a giant leap on the road to reinvention.
[Originally Published on Forbes.com]
Jeopardy champion Jennings on how a computer beat him at his own game (Video)
[youtube https://www.youtube.com/watch?v=b2M-SeKey4o?rel=0]
Jennings in Slate:
…there’s no shame in losing to silicon, I thought to myself as I greeted the (suddenly friendlier) team of IBM engineers after the match. After all, I don’t have 2,880 processor cores and 15 terabytes of reference works at my disposal—nor can I buzz in with perfect timing whenever I know an answer. My puny human brain, just a few bucks worth of water, salts, and proteins, hung in there just fine against a jillion-dollar supercomputer.
“Watching you on Jeopardy! is what inspired the whole project,” one IBM engineer told me, consolingly. “And we looked at your games over and over, your style of play. There’s a lot of you in Watson.” I understood then why the engineers wanted to beat me so badly: To them, I wasn’t the good guy, playing for the human race. That was Watson’s role, as a symbol and product of human innovation and ingenuity. So my defeat at the hands of a machine has a happy ending, after all. At least until the whole system becomes sentient and figures out the nuclear launch codes. But I figure that’s years away.
Recruiting Data Scientists to Mine the Data Explosion
Wes Hunt, Chief Data Officer (CDO) at Nationwide Mutual Insurance Co. on recruiting data scientists:
Finding talent is my largest challenge. Someone who understands our business, who has quantitative skills, who has the technical skills to create the models, and who is able to persuade others that the insights they’ve come up with are ones you can trust and take action on. The hardest part is persuasion. You get the quantitative skills, but there’s a struggle in that ability to communicate effectively. We’ll often pair people together, but we’d really like to grow the talent.
When I was in marketing, we put a focus on liberal-arts-educated individuals, because abstract thinking where there are ambiguous data sets is an area where they are comfortable. Ph.D.s in psychology were a great recruiting pool. A psych Ph.D. has a fair amount of statistical training. We created a program to recruit Ph.D.s.
There’s not yet an educational discipline and curriculum that produces data scientists at the scale that would clear the market. So the way we’ve focused on it is to find people with innate curiosity and critical thinking. You can teach the other skills. On my team, I have a pathologist, a bioengineering student who trained in doing heart research, an M.B.A., and someone who is trained in traditional data architecture. I also have a landscape construction engineer and a psychology Ph.D.
Doug Cutting on Hadoop, October 2014 (Video)
[youtube https://www.youtube.com/watch?v=0GOxDBR6VAU?rel=0]
Home Monitoring mHealth Wearable Devices 2013-2019
ABI Research:
Over the next five years, a new generation of elderly home care services will drive wearable device shipments to more than 44 million in 2019, up from just 6 million in 2013. In 2014 alone, shipments of wearable devices linked to elderly care systems will more than double over those in 2013, finds the latest ABI Research analysis of the mHealth market.
Growing adoption comes as tech-savvy families increasingly turn to home monitoring offerings for assurance their aging parents and family members are safe and well. In addition, new offerings are boosting and extending a market that has long been the territory of dedicated, “Help! I’ve fallen and I can’t get up”-type personal emergency response systems. A host of niche players including BeClose, GrandCare Systems, Independa and others have all emerged to capitalize on a combination of market demand and the potential to leverage connected devices and systems.
In the past few months alone, one start-up, Live!y, has revamped and re-launched its offering to include a watch that offers activity tracking alongside personal emergency response services, while AT&T has added elderly care monitoring to its Digital Life smart home package. These players reflect how device manufacturers and service providers alike are increasingly targeting the elder care market and doing so with more feature-rich offerings.
These findings are part of ABI Research’s mHealth Wearables, Platforms and Services Market Research which looks at the rapidly developing market for wearable wireless sensors, by device, connectivity and region across sports, fitness and wellbeing, home care monitoring, remote patient monitoring, and on-site professional healthcare markets.