The Internet of Things is Social Media for Machines (Video)
[youtube https://www.youtube.com/watch?v=qWR32v5uaI8?rel=0]
A panel discussion at the Milken Institute Global Conference 2015, moderated by Michael Schrage, Research Fellow, MIT Sloan Initiative on the Digital Economy. Panelists: Marc Goodman, Author, “Future Crimes”; Chair for Policy, Law and Ethics, Singularity University; Alex Hawkinson, Founder and CEO, SmartThings; Bridget Karlin, Managing Director, IoT Strategy and Technology Office, Internet of Things Group, Intel Corp.; Gary Shapiro, President and CEO, Consumer Electronics Association.
Schrage: “The glib way of thinking about the Internet of Things is social media for machines.”
Marc Goodman: “The concept of my refrigerator tweeting, snapchatting, or sexting is kind of disturbing.”
Shapiro: “The Internet spreads information; the Internet of Things is actually actionable information.”
Why The Future Is Not What It Used To Be
The IEEE Computer Society published in March a report titled “What Will Our World Look Like in 2022?” It identified 23 technology areas that we can expect to disrupt the state of the art, ranging from medical robotics to big data and analytics to photonics to 3D integrated circuits to quantum computing.
The unifying theme for all these technologies is “seamless intelligence,” where everything is connected through ubiquitous networks and interfaces. “We project that by 2022,” the authors of the report say, “society will advance so that intelligence becomes seamless and ubiquitous for those who can afford and use state-of-the-art information technology.”
The IEEE report is a bit different from similar attempts at predicting the future because it comes from technologists, some in academia but others who work at corporate research labs, and is based in part on a survey of members of the IEEE Computer Society. Typically, predictions are the stock-in-trade of think tanks and research firms. Last year, for example, the McKinsey Global Institute published “Disruptive technologies: Advances that will transform life, business, and the global economy,” identifying 12 technologies that “could drive truly massive economic transformations and disruptions in the coming years.” Earlier this year, Juniper Research published “The World in 2020: A Technology Vision,” identifying 11 key technologies that it believes “will become the most disruptive by 2020.”
Beyond the use of the word “disruptive,” there are other commonalities among the three reports. Robotics and drones, 3D printing, the Internet of Things and wearables, self-driving cars, and cloud computing appear in all of the reports or in at least two of them. But, for the most part, there is not a whole lot of agreement on the disruptive technologies of the future. Photonics, real-time translation, and renewable energy, for example, each appear in only one of the reports.
The IEEE report opens with the famous Yogi Berra quote: “It’s tough to make predictions, especially about the future.” In the rest of this post, I will discuss three reasons why.
- Innovations that have made a strong impact on us in recent times obscure more important recent innovations.
The first item on The New York Times’ list of greatest inventions of the 19th century, published in 1899, was friction matches, introduced in their modern form in 1827. “For somebody to whom the electric light was as recent an innovation as the VCR is to us, the instant availability of fire on demand had indeed been one of the greatest advances of the century,” wrote Frederick Schwartz in 2000 in Invention & Technology. Which invention of the last 100 years or even 10 years is overshadowing an even more important invention of recent years?
- The road taken is no less important than the end result.
Another difficulty in predicting what the world will look like just 5 or 7 years from now is that some predictions eventually become a reality but still miss altogether exactly how we are going to get there. Often, this is the most important (and practical) part of the prediction. To paraphrase Lewis Carroll, if you know where you are going, it matters a lot which road you take.
Many commentators writing this month about the 50th anniversary of Gordon Moore’s article charting a future course for the semiconductor industry (what came to be known as “Moore’s Law”) mentioned his predictions regarding home computers and “personal portable communications equipment.” But they ignored Moore’s prediction that “the biggest potential lies in the production of large systems. In telephone communications, integrated circuits in digital filters will separate channels on multiplex equipment. Integrated circuits will also switch telephone circuits and perform data processing.”
Moore was right that integrated circuits would have an impact on large systems but failed to see that “the biggest potential” of the constant and predictable miniaturization he forecast would be in smaller and smaller devices, in ubiquitous computing. In 1965, it was difficult to see that centralized systems would be replaced by distributed, anywhere computing. Which is why Moore qualified his use of the term “home computers” with “or at least terminals connected to a central computer.”
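To get a feel for why that compounding mattered, here is a toy sketch of exponential growth in components per chip. The doubling period and starting count are illustrative assumptions based on the popularized “doubling every two years” form of Moore’s Law, not figures from Moore’s 1965 article:

```python
# Toy illustration of exponential component growth under Moore's Law.
# The base count and two-year doubling period are illustrative
# assumptions, not figures from Moore's 1965 article.
def components(year, base_year=1965, base_count=64, doubling_years=2):
    """Project components per chip, doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1965, 1975, 1985, 1995, 2005, 2015):
    print(f"{year}: ~{components(year):,.0f} components per chip")
```

Fifty years of doublings turn a few dozen components into billions, and it is that headroom that eventually made distributed, anywhere computing possible.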
- We extrapolate from the present and ignore or misunderstand non-technological factors.
Many predictions are what the forecasters want the future to be or simply an extension of what they are familiar and comfortable with. I have in my files a great example of the genre, a report published in 1976 by the Long Range Planning Service of the Stanford Research Institute (SRI), titled “Office of the Future.”
The author of the report was a Senior Industrial Economist at SRI’s Electronics Industries Research Group, and a “recognized authority on the subject of business automation.” His bio blurb indicates that he “also worked closely with two of the Institute’s engineering laboratories in developing his thinking for this study. The Augmentation Research Center has been putting the office of the future to practical test for almost ten years… Several Information Science Laboratory personnel have been working with state-of-the-art equipment and systems that are the forerunners of tomorrow’s products. The author was able to tap this expertise to gain a balanced picture of the problems and opportunities facing office automation.”
And what was the result of all this research and analysis? The manager of 1985, the report predicted, will not have a personal secretary. Instead he (decidedly not she) will be assisted, along with other managers, by a centralized pool of assistants (decidedly and exclusively, according to the report, of the female persuasion). He will contact the “administrative support center” whenever he needs to dictate a memo to a “word processing specialist,” find a document (helped by an “information storage/retrieval specialist”), or rely on an “administrative support specialist” to help him make decisions.
Of particular interest is the report’s discussion of the sociological factors driving the transition to the “office of the future.” Forecasters often leave out of their analysis the annoying and uncooperative (with their forecast) motivations and aspirations of the humans involved. But this report does consider sociological factors, in addition to organizational, economic, and technological trends. And it’s worth quoting at length what it says on the subject:
“The major sociological factor contributing to change in the business office is ‘women’s liberation.’ Working women are demanding and receiving increased responsibility, fulfillment, and opportunities for advancement. The secretarial position as it exists today is under fire because it usually lacks responsibility and advancement potential. The normal (and intellectually unchallenging) requirements of taking dictation, typing, filing, photocopying, and telephone handling leave little time for the secretary to take on new and more demanding tasks. The responsibility level of many secretaries remains fixed throughout their working careers. These factors can negatively affect the secretary’s motivation and hence productivity. In the automated office of the future, repetitious and dull work is expected to be handled by personnel with minimal education and training. Secretaries will, in effect, become administrative specialists, relieving the manager they support of a considerable volume of work.”
Regardless of the women’s liberation movement of his day, the author could not see beyond the creation of a 2-tier system in which some women would continue to perform dull and unchallenging tasks, while other women would be “liberated” into a fulfilling new job category of “administrative support specialist.” In this 1976 forecast, there are no women managers.
But this is not the only sociological factor the report missed. The most interesting sociological revolution of the office in the 1980s – and one missing from most (all?) accounts of the PC revolution – is what managers (male and female) did with their new word processing, communicating, calculating machine. They took over some of the “dull” secretarial tasks that no self-respecting manager would deign to perform before the 1980s.
This was the real revolution: The typing of memos (later emails), the filing of documents, the recording, tabulating, and calculating. In short, a large part of the management of office information, previously exclusively in the hands of secretaries, became in the 1980s (and progressively more so in the 1990s and beyond) an integral part of managerial work.
This was very difficult, maybe impossible, to predict. It was a question of status. No manager would type before the 1980s because it was perceived as work that was not commensurate with his status. Many managers started to type in the 1980s because now they could do it with a new “cool” tool, the PC, which conferred on them the leading-edge, high-status image of this new technology. What mattered was that you were important enough to have one of these cool things, not that you performed with it tasks that were considered beneath you just a few years before.
What was easier to predict was the advent of the PC itself. And the SRI report missed this one, too, even though it was aware of the technological trajectory: “Computer technology that in 1955 cost $1 million, was only marginally reliable, and filled a room, is now available for under $25,000 and the size of a desk. By 1985, the same computer capability will cost less than $1000 and fit into a briefcase.”
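A quick back-of-the-envelope check, using only the three data points the report quotes, shows how steep (and accelerating) the implied price decline was:

```python
import math

# Cost data points quoted in the 1976 SRI report: (year, dollars).
points = [(1955, 1_000_000), (1976, 25_000), (1985, 1_000)]

# Implied halving time of cost between each pair of observations.
for (y0, c0), (y1, c1) in zip(points, points[1:]):
    halvings = math.log2(c0 / c1)  # number of times cost halved
    print(f"{y0}-{y1}: cost halved every {(y1 - y0) / halvings:.1f} years")
```

By the report’s own numbers, cost halved roughly every four years through 1976 and was projected to halve about every two years thereafter, a trajectory that pointed squarely at the personal computer.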
But the author of the report (just like Gordon Moore in 1965) could only see a continuation of the centralized computing of his day. The report’s 1985 fictional manager views documents on his “video display terminal” and the centralized (and specialized) word processing system of 1976 continues to rule the office ten years later.
This was a failure to predict how the computer that would “fit into a briefcase” would become personal, i.e., would take the place of the “video display terminal” and then augment it as a personal information management tool. And the report also failed to predict the ensuing organizational development in which distributed computing replaced, or was added to, centralized computing.
Yes, predicting is hard to do. But compare forecasters and “analysts” with another human subspecies: Entrepreneurs. Entrepreneurs don’t predict the future; they make it happen.
A year before the SRI report was published, in January 1975, Popular Electronics ran a cover story on the first do-it-yourself PC, or what it called the first “minicomputer kit,” the Altair 8800. Paul Allen and Bill Gates, and Steve Jobs and Steve Wozniak, founded their companies around the time the SRI report was published, not because they read reports about the office of the future. They simply imagined it.
Update: Gordon Moore, quoted in VentureBeat: “Once I made a successful prediction, I avoided making another,” and “I wish I had seen the applications earlier. To me the development of the Internet was a surprise. I didn’t realize it would open up a new world of opportunities.”
Originally published on Forbes.com
ABI Research: 7.7 Million Autonomous Truck Fleets to Ship by 2025
ADAS = Advanced Driver Assistance Systems
Truck platoons are the most imminently anticipated application of highly automated driving in commercial vehicles. A fusion of forward-looking radar and V2V communication enables fleets of trucks to safely maneuver with a short distance between vehicles. The reduction in aerodynamic drag for following vehicles and the buildup of pressure behind the lead vehicle yield impressive fuel efficiencies, with various tests reporting convoy savings of between 5% and 10%. “With most fleet operators attributing some 30 to 40% of their operating costs to fuel expenditure, the savings presented by platooning are significant,” comments James Hodgson, Research Analyst, ABI Research.
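Combining the two quoted ranges gives a rough sense of the bottom-line impact; the multiplication below is an illustration of my own, not ABI’s calculation:

```python
# Implied reduction in total operating costs from platooning,
# combining the two ranges quoted above (illustrative arithmetic only).
fuel_share = (0.30, 0.40)   # fuel as a share of operating costs
fuel_saving = (0.05, 0.10)  # convoy fuel savings reported in tests

low = fuel_share[0] * fuel_saving[0]
high = fuel_share[1] * fuel_saving[1]
print(f"Operating-cost reduction: {low:.1%} to {high:.1%}")
# -> Operating-cost reduction: 1.5% to 4.0%
```

Even the low end of that range is meaningful for fleet operators, which is why Hodgson calls the savings significant.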
As technology progresses and regulations adapt to accommodate greater vehicle automation, further benefits to fleet operators will come in the form of labor productivity gains and better asset utilization. Currently, solutions from pioneers such as Peloton Technology require active intervention from the following driver to keep the vehicle within the lane of travel, but in the future the driver of the lead vehicle could be in sole control of all vehicles in the convoy, allowing following drivers to rest or eliminating the need for them altogether.
A free ADAS and Active Safety webinar on May 21, 2015, at 11 am ET will offer a deeper look at ABI Research’s commercial trucking coverage and the convergence of ADAS, telematics, autonomous driving, and big data.
These findings are part of ADAS and Autonomous Driving Technology in Trucks and Commercial Vehicles, a report from ABI Research’s Automotive Safety and Autonomous Driving and Commercial Vehicle Telematics Market Research.
Data Science on Cloud Foundry (Video)
[youtube https://www.youtube.com/watch?v=n95hCVvuPKQ?rel=0]
Data scientists frequently need to create applications that enable interactive data exploration, deliver predictive analytics APIs, or simply publish results. Cloud Foundry provides an ideal platform for data scientists by making it easy to quickly deploy data-driven apps backed by a variety of data stores.
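As a minimal sketch of what deploying such an app involves (the endpoint, placeholder model, and names below are hypothetical, not from the talk):

```python
# app.py - a minimal predictive-analytics API sketch for Cloud Foundry.
# The "model" is a placeholder; a real app would load a trained model.
import os

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json(force=True)["features"]
    # Placeholder predictor: sum of features stands in for a real model.
    return jsonify({"prediction": sum(features)})

if __name__ == "__main__":
    # Cloud Foundry tells the app which port to bind via $PORT.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

With a requirements.txt listing Flask and a manifest.yml naming the app, a single `cf push` stages it with the Python buildpack and binds whatever data-store services the manifest declares.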
Big Data Market 2011-2026: From $7.6 to $84.69 Billion
Wikibon: For the calendar year 2014, the Big Data market – as measured by revenue associated with the sale of Big Data-related hardware, software and professional services – reached $27.36 billion, up from $19.6 billion in 2013. While growing significantly faster than other enterprise IT markets, the Big Data market’s overall growth rate slowed year-over-year from 60% in 2013 to 40% in 2014. This is to be expected in an emerging but quickly maturing market such as Big Data, and Wikibon does not believe this slightly slower growth rate indicates any structural market issues.
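The growth rates follow directly from the revenue figures; here is a quick check of the numbers quoted above and of the projection in the headline:

```python
# Growth implied by the Wikibon figures quoted above ($ billions).
growth_2014 = 27.36 / 19.6 - 1
print(f"2013 -> 2014 growth: {growth_2014:.0%}")  # -> 40%

# Implied compound annual growth rate for the 2011-2026 projection.
cagr = (84.69 / 7.6) ** (1 / 15) - 1
print(f"2011 -> 2026 implied CAGR: {cagr:.1%}")   # -> ~17.4%
```

Even with the year-over-year rate slowing, the long-range projection still implies better than 17% compound annual growth.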
This Is Not Your Father’s IT Industry: The Rise of the Outsiders
You have probably heard the news about the CFO of Morgan Stanley becoming the new Google CFO and the ongoing migration of Wall Street bankers to Silicon Valley (here and here). But did you know that this is just a surface manifestation of an underlying secular trend, nothing less than a new restructuring of the IT industry and the emergence of completely new developers and sellers of information technology products and services?
You could argue that at any point in the last half-century it was fair to say of the fast-changing IT industry that it was not your father’s IT industry. But remarkably, the structure of the IT industry has been very stable, with only one major redrawing of its landscape throughout its history. Today, the “IT industry” is being redefined, with two major clusters of companies increasingly competing with the incumbents and reshaping the old IT world order.
First, what did your father’s IT industry look like? Until about 1990, the industry was dominated by IBM and a few smaller companies (e.g., DEC, HP, Wang, Prime). These companies were offering one-stop-shopping, providing to enterprises the entire IT stack, from hardware platforms (including proprietary chips) to peripherals to operating systems to networking to applications and to services.
The advent of the PC and, in particular, the networking of PCs in the 1980s, gave rise to a new tidal wave of digital data. As a result, between 1990 and 2000, the industry’s structure expanded to include large vendors focused on one layer of the IT stack: Intel in semiconductors, EMC in storage, Cisco in networking, Microsoft in operating systems, Oracle in databases. IBM saved itself from the fate of DEC, Wang, and Prime by focusing on services. DEC could have saved itself had it realized in time the value of focusing on its biggest competitive advantage—networking—instead of letting a number of focused players enter this market (which Cisco eventually dominated). At the same time, a number of focused PC vendors, primarily Compaq and Dell, carved a larger role for themselves by gradually expanding their reach into other layers of the IT stack, emulating the old model of a vertically-integrated IT vendor. By and large, they were less successful than the new, horizontally-focused IT vendors.
The restructured IT industry and, specifically, the focused, “best-in-class” vendors answered a pressing business need. Digitization and the rapid growth of data unleashed new business requirements and opportunities that called for new ways to sell and buy IT. The new competitive and business pressures to keep more and more data online and for longer durations, to mind and mine the data, and to share and move it around all contributed to the demand for a flexible IT infrastructure in which buyers assemble the pieces of their IT infrastructure from different vendors. Most important, in the late 1980s and early 1990s, businesses fundamentally changed their attitude towards data and the scope of what they did with it: from a primarily internal, back-office bookkeeping, “how did we do last quarter?” focus on the past, to external, customer-, partner-, and supplier-oriented data creation, collection, and mining, with a focus on the present and on “let’s understand how to serve our stakeholders better.”
Now, how is today’s or tomorrow’s IT industry different from what it was just fifteen years ago? We still have more or less the same dominant players—IBM, Cisco, EMC, Oracle, HP, a reinvigorated Dell, etc. If there has been any marked change, it has been a shift back to a vertically-integrated model, with all of these players providing the whole IT stack (or what they call, with their insatiable appetite for new buzzwords, “converged infrastructures”).
I would argue that this “back to the future” model is on its last legs and the real future belongs to two clusters of companies: Digital natives (e.g., Google, Facebook, Amazon) and large, IT-intensive enterprises (e.g., financial services companies).
In the mid-2000s, a new business need emerged: “Let’s make sense of the mountains of data we continue to accumulate,” focusing on the future and on mining data to better understand the likely results of different courses of action and decisions. Unlike in the previous shift in the fortunes of the IT industry, the companies driving this shift were not old or new “IT vendors.” They were the companies born on the Web, the digital natives, with data (and ever more data) at the core of everything they did. Google, Facebook, Amazon, Netflix—all built their IT infrastructure from scratch to their own specifications, inventing and re-imagining the entire IT stack in the process. They were the first outsiders insisting that IT was such a core competency for them that they could not trust it to “IT vendors” and could do a more efficient and effective job on their own, thank you.
Google and Amazon later became insiders by offering their infrastructure as a service to small and large businesses. But other companies born on the Web, such as Netflix, simply continued to amass unparalleled knowledge about state-of-the-art IT, develop innovative IT solutions which they then made available to the world as open source software, and hire (or train) the best and the brightest IT professionals. This became a new IT ecosystem, a whole new world of IT far surpassing what was happening in the “IT industry” on any measure of innovation and competitiveness.
Some of the large enterprises which used to be the most important customers of the traditional IT vendors are now joining this new IT ecosystem, reinforcing the reinvention of how IT is developed, sold and bought. Consider these recent news items:
- Santander is the first global bank to offer cloud data storage services to corporate customers. “As I think how I am going to compete with all these new technology players, I can offer the same services as some of these big guys,” Santander’s chairman, Ana Botín, told the Financial Times.
- By 2018, Bank of America plans to have 80% of its workloads running on software-defined infrastructure inspired by Web companies, a process that began in 2013. “The transformation at Bank of America reflects the migration of state-of-the art information technology developed by Internet companies into the broader economy,” reports the Wall Street Journal.
- Facebook announced a new type of server, a collaboration with Intel based on its own design, which it hopes other companies will adopt as well. The announcement was one of many at a recent gathering of the Open Compute Project, a nonprofit group formed by Facebook in 2011 to adapt principles of open-source software to hardware. “Members develop and share designs for servers, networking gear and storage devices that any company can build and sell, creating competition that helps hold down hardware costs,” reports the Wall Street Journal.
- Fidelity Investments reconfigured its data centers to better fit its business needs, engaging its engineering team in redesigning a revolutionary new rack, and reducing energy consumption by 20%. The announcement from the Open Compute Project reads in part: “Fidelity’s Enterprise Infrastructure team also wanted to transform the way its members worked. Instead of maintaining a closed shop, the team was looking to open up and engage with an external community of engineers as a way to keep up with the latest developments and innovate in-house… The Open Bridge Rack is a convertible datacenter rack that can physically adjust to hold any size server up to 23 inches across. It’s a key link in helping enterprises make the switch from proprietary gear to Open Compute… Fidelity designed (patent pending) and donated it to the Open Compute Foundation, making it available to different manufacturers.”
- Apple has acquired speedy and scalable NoSQL database company FoundationDB, TechCrunch reports. With this acquisition, Apple aims to bolster the reliability and speed of its cloud services, and possibly provide video streaming for its rumored TV service.
The new “IT vendors” are companies that see IT not only as an integral part of their business strategy but go even further to view IT—and the data they collect and mine—as what their company is all about. This new breed of IT vendors has emerged over the last decade from the ranks of the digital natives and is now joined by large companies that want to get further returns—and possibly new sources of revenue—from their large investments in IT. Furthermore, just like the digital natives, these large established companies don’t want to be beholden anymore to the lock-in tactics of traditional IT vendors.
The size of the IT industry worldwide in 2010 was about $1.5 trillion. Gartner predicts that the industry (including telecom) will grow to $3.8 trillion this year. The real “IT industry,” however, is much larger than that, given all the IT-related activities happening outside the traditional boundaries of the industry. And if you include in the “IT industry” everything that’s being digitized—content, communications, consumer electronics, commerce—we are looking at an industry that will grow to at least $20 trillion by 2020. In this greatly expanded industry there will probably still be room for the traditional IT players. But the growth spurts, innovation, and new skills will come from today’s outsiders, from the companies whose core competency is the collection, processing, analysis, distribution, and use of data.
Originally published on Forbes.com
What is Data Science? (Video)
[youtube https://www.youtube.com/watch?v=tlEuunxdIVM?rel=0]
Analysis of Smartphone Usage on US Highways
Wefi, the Mobile Data Analytics company, analyzed data collected from over 70,000 devices driven over 3.5 million miles on six major interstates.
Data Science Is Making Trains More Efficient (Video)
[youtube https://www.youtube.com/watch?v=PxxUAlm2Jnw?rel=0&w=560&h=315]