Worldwide Shipments of Wearables to Surpass 200 Million in 2019

IDC:

The worldwide wearable device market will reach a total of 111.1 million units shipped in 2016, up a strong 44.4% from the 80 million units expected to ship in 2015. By 2019, the final year of the forecast, total shipments will reach 214.6 million units, resulting in a five-year compound annual growth rate (CAGR) of 28%.

One of the most popular types of wearables will be smartwatches, reaching a total of 34.3 million units shipped in 2016, up from the 21.3 million units expected to ship in 2015. By 2019, the final year of the forecast, total shipments will reach 88.3 million units, resulting in a five-year CAGR of 42.8%.
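
A note on the arithmetic: IDC’s “five-year” CAGR covers the five calendar years 2015-2019, which means four compounding periods from the 2015 base. A quick sanity check of both figures in Python (a sketch; the shipment numbers are the ones quoted above):

```python
def cagr(start, end, periods):
    """Compound annual growth rate over `periods` compounding periods."""
    return (end / start) ** (1 / periods) - 1

# Four compounding periods take the 2015 base to the 2019 forecast.
print(f"Wearables overall: {cagr(80.0, 214.6, 4):.1%}")  # ~28.0%, matching IDC
print(f"Smartwatches:      {cagr(21.3, 88.3, 4):.1%}")   # ~42.7%, vs. IDC's 42.8%
```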

Apple’s watchOS will lead the smartwatch market throughout our forecast, with a loyal fanbase of Apple product owners and a rapidly growing application selection, including both native apps and Watch-designed apps. Very quickly, watchOS has become the measuring stick against which other smartwatches and platforms are compared. While there is much room for improvement and additional features, there is enough momentum to keep it ahead of the rest of the market.

Android/Android Wear will be a distant second behind watchOS even as its vendor list grows to include technology companies (ASUS, Huawei, LG, Motorola, and Sony) and traditional watchmakers (Fossil and Tag Heuer). The user experience on Android Wear devices has been largely the same from one device to the next, leaving little room for OEMs to differentiate and leaving users to choose solely on price and watch design.

Smartwatch pioneer Pebble will cede market share to Android Wear and watchOS but will not disappear altogether. Its simple user interface and devices make for an easy-to-understand use case, and its price point relative to other platforms makes Pebble one of the most affordable smartwatches on the market.

Samsung’s Tizen stands to be the dark horse of the smartwatch market and poses a threat to Android Wear, offering compatibility with most flagship Android smartphones and an application selection that rivals Android Wear’s. Moreover, with Samsung behind it, Tizen has benefited from technology developments including a QWERTY keyboard on a smartwatch screen, cellular connectivity, and new user interfaces. It’s a combination that helps Tizen stand out, but not enough to keep up with Android Wear and watchOS.


Price is #1 Barrier to the Purchase of IoT Devices

[Charts: barriers to the purchase of IoT devices; IoT devices owned]

Accenture:

Consumers report that price is the top barrier to the purchase of IoT devices, with 62 percent believing these devices are too expensive. This perception is almost consistent across age groups and countries—with mature markets only slightly less concerned about the price than emerging markets. Russia, Romania and the Philippines report the highest share of consumers stating price is a barrier.

Source: eMarketer


Data visualization: Plotting life expectancy against income for 200 countries over 200 years

[youtube https://www.youtube.com/watch?v=jbkSRLYSojo?rel=0]

Hans Rosling’s famous lectures combine enormous quantities of public data with a sports commentator’s style to reveal the story of the world’s past, present and future development. Now he explores stats in a way he has never done before – using augmented reality animation. In this spectacular section of ‘The Joy of Stats’ he tells the story of the world in 200 countries over 200 years using 120,000 numbers – in just four minutes. Plotting life expectancy against income for every country since 1810, Hans shows how the world we live in is radically different from the world most of us imagine.
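
For readers who want to recreate Rosling’s chart themselves, here is a minimal sketch using Plotly Express and its bundled Gapminder sample data. (The sample covers 1952-2007 rather than the full 200 years in the film, and the column names below come from that bundled dataset.)

```python
import plotly.express as px

# Bundled Gapminder sample: life expectancy, income and population by country.
df = px.data.gapminder()

# Rosling's encoding: income on a log x-axis, life expectancy on y,
# bubble size by population, one animation frame per year.
fig = px.scatter(
    df, x="gdpPercap", y="lifeExp",
    size="pop", color="continent", hover_name="country",
    animation_frame="year", log_x=True, size_max=60, range_y=[20, 90],
)
fig.show()
```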


A Reasonable Discussion of Deep Learning’s Great Expectations

[youtube https://www.youtube.com/watch?v=aygSMgK3BEM?rel=0]

Luke Hewitt:

I have no doubt that the next few years will see neural networks turn their attention to yet more tasks, integrate themselves more deeply into industry, and continue to impress researchers with new superpowers. This is all well justified, and I have no intention to belittle the current and future impact of deep learning; however, the optimism about just what these models can achieve in terms of intelligence has been worryingly reminiscent of the 1960s.

Extrapolating from the last few years’ progress, it is enticing to believe that Deep Artificial General Intelligence is just around the corner and just a few more architectural tricks, bigger data sets and faster computing power are required to take us there. I feel that there are a couple of solid reasons to be much more skeptical.

To begin with, it is a bad idea to intuit how broadly intelligent a machine must be, or have the capacity to be, based solely on a single task. The checkers-playing machines of the 1950s amazed researchers and many considered these a huge leap towards human-level reasoning, yet we now appreciate that achieving human or superhuman performance in this game is far easier than achieving human-level general intelligence. In fact, even the best humans can easily be defeated by a search algorithm with simple heuristics. The development of such an algorithm probably does not advance the long-term goals of machine intelligence, despite the exciting intelligent-seeming behaviour it gives rise to, and the same could be said of much other work in artificial intelligence such as the expert systems of the 1980s. Human or superhuman performance in one task is not necessarily a stepping-stone towards near-human performance across most tasks. …
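
To make that concrete, the “search algorithm with simple heuristics” Hewitt has in mind is essentially depth-limited minimax: brute-force look-ahead plus a crude evaluation function, with no general intelligence anywhere. A minimal sketch (the `game` interface here is hypothetical, not any particular checkers engine):

```python
def minimax(state, depth, maximizing, game):
    """Depth-limited minimax over a two-player, zero-sum game.

    `game` is an assumed interface exposing legal_moves(state),
    apply(state, move), is_terminal(state) (true when the game is over or a
    player cannot move), and a simple heuristic evaluate(state) -- e.g. a
    plain piece count for checkers.
    """
    if depth == 0 or game.is_terminal(state):
        return game.evaluate(state)
    values = [
        minimax(game.apply(state, move), depth - 1, not maximizing, game)
        for move in game.legal_moves(state)
    ]
    return max(values) if maximizing else min(values)
```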

The many facets of human thought include planning towards novel goals, inferring others’ goals from their actions, learning structured theories to describe the rules of the world, inventing experiments to test those theories, and learning to recognise new object kinds from just one example. Very often they involve principled inference under uncertainty from few observations. For all the accomplishments of neural networks, it must be said that they have only ever proven their worth at tasks fundamentally different from those above. If they have succeeded in anything superficially similar, it has been because they saw many hundreds of times more examples than any human ever needed to.

Deep learning has brought us one branch higher up the tree towards machine intelligence and a wealth of different fruit is now hanging within our grasp. While the ability to learn good features in high dimensional spaces from weak priors with lots of data is both new and exciting, we should not fall into the trap of thinking that most of the problems an intelligent agent faces can be solved in this way. Gradient descent in neural networks may well play a big part in helping to build the components of thinking machines, but it is not, itself, the stuff of thought.
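
That closing distinction is worth unpacking: gradient descent is just iterated parameter updates against the gradient of a loss. A minimal sketch on a toy least-squares problem (NumPy, with made-up data purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # toy inputs
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)   # noisy targets

w = np.zeros(3)
lr = 0.1
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(y)     # gradient of the mean squared error
    w -= lr * grad                            # the entire "learning" step

print(w)  # converges close to w_true: powerful optimization, not thought
```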


The Economy of Cloud Computing (Infographic)

[Infographic: The Economy of Cloud Computing]

Source: Soliant Consulting


IoT Data Traffic Growth: As many machines as people roaming by 2020

[Chart: IoT/M2M roaming traffic growth]

Machina Research estimates that there are now 350 million cellular-based connections worldwide, and this will grow to 1.3 billion over the next five years. However, the proportion of M2M connections accounted for by roaming is growing even faster. As a global provider of roaming services including billing and clearing to network operators, Starhome Mach is able to determine that the number of roaming registrations that can be attributed to M2M devices increased by 100% last year to reach 7% of all roamers. The rate of growth is such that it is entirely possible that there will be as many machines as people roaming by 2020.
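
The arithmetic behind both claims is easy to check. A quick sketch (the assumption that the roaming share keeps doubling every year is mine, extrapolated from the 100% growth reported above):

```python
# Cellular M2M connections: 350 million today, 1.3 billion in five years.
implied_cagr = (1300 / 350) ** (1 / 5) - 1
print(f"Implied connection growth: {implied_cagr:.0%} per year")  # ~30%

# Roaming: M2M devices are 7% of roamers and doubled last year.
# If the doubling continues (an assumption), machines pass humans quickly.
machines, humans = 7.0, 93.0   # shares of all roamers
year = 2015
while machines < humans:
    machines *= 2
    year += 1
print(f"Machines outnumber human roamers around {year}")  # 2019, i.e. "by 2020"
```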


Use of IoT in Healthcare and Life Sciences

[Chart: use of IoT in healthcare and life sciences]

eMarketer:

Despite numerous regulatory and privacy constraints, organizations inside and outside the healthcare industry are exploring ways to put the IoT to work. Players include pharma and biopharma manufacturers; hospitals and clinics; physicians, nurses and other healthcare providers (HCPs); health insurers; fitness companies; and tech firms. The goals are to cut costs, boost efficiency and improve the way illnesses are diagnosed, treated and prevented.

At the same time, an increasing number of digitally empowered consumers are taking more responsibility for their health. Primed to use fitness wearables and smartphone apps, people are growing more comfortable with new types of sensors that capture and analyze their health and medical data. It will only be a matter of time before this information is seamlessly integrated into larger healthcare systems to make their care more precise and efficient.

Though their number is growing steadily, many IoT healthcare projects are still in their infancy, and remain a patchwork of disparate and isolated initiatives. And while it’s not clear yet how things will shake out, there is also no shortage of ideas. Many large and influential tech firms—including Apple, Google (and its parent company Alphabet), Samsung, Philips, IBM, General Electric and SAP—have entered the IoT space in a big way and are hoping to make things happen quickly.

The result is that hospitals and healthcare systems are using the IoT to make their facilities more efficient. Initiatives include sharing records to ensure higher-quality care, tracking medical supply inventory and communicating with field personnel.

Many pharma companies and medical device makers are already incorporating IoT components into their manufacturing and distribution operations. They are also exploring more strategic ways to harness it to make their products better during the research and development phase and in clinical trials.


The Machine Learning Landscape

[Chart: The Machine Intelligence Landscape]

Shivon Zilis:

Most of these machine intelligence startups take well-worn machine intelligence techniques, some more than a decade old, and apply them to new data sets and workflows. It’s still true that big companies, with their massive data sets and contact with their customers, have inherent advantages—though startups are finding a way to enter.

Achieving autonomy

In last year’s roundup, the focus was almost exclusively on machine intelligence in the virtual world. This time we’re seeing it in the physical world, in the many flavors of autonomous systems: self-driving cars, autopilot drones, robots that can perform dynamic tasks without every action being hard-coded. It’s still very early days—most of these systems are just barely useful, though we expect that to change quickly.

These physical systems are emerging because they meld many now-maturing research avenues in machine intelligence. Computer vision, the combination of deep learning and reinforcement learning, natural language interfaces, and question-answering systems are all building blocks to make a physical system autonomous and interactive. Building these autonomous systems today is as much about integrating these methods as inventing new ones.

The new (in)human touch

The virtual world is becoming more autonomous, too. Virtual agents, sometimes called bots, use conversational interfaces (think of Her, without the charm). Some of these virtual agents are entirely automated; others are “human-in-the-loop” systems, where algorithms take on “machine-like” subtasks and a human adds creativity or execution. (In some, the human is training the bot while she or he works.) The user interacts with the system by either typing in natural language or speaking, and the agent responds in kind.
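
A minimal sketch of that “human-in-the-loop” pattern may help (everything here is hypothetical: the canned intents, the confidence threshold, and the toy keyword lookup standing in for a trained intent classifier):

```python
CANNED_REPLIES = {
    "reset password": "You can reset it at example.com/reset.",
    "billing": "Your plan and invoices are under Account > Billing.",
}

def classify(message):
    """Toy intent 'classifier': keyword lookup plus a fake confidence score."""
    for intent, reply in CANNED_REPLIES.items():
        if intent in message.lower():
            return reply, 0.9
    return None, 0.0

def escalate_to_human(message):
    # The human's answer can double as training data for the bot.
    return f"[routed to a human agent]: {message}"

def respond(message, threshold=0.8):
    """Machine-like subtasks go to the bot; everything else goes to a human."""
    reply, confidence = classify(message)
    return reply if confidence >= threshold else escalate_to_human(message)

print(respond("I need to reset password"))          # automated path
print(respond("My order arrived damaged, help!"))   # human adds judgment
```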

These services sometimes give customers confusing experiences, like mine the other day when I needed to contact customer service about my cell phone. I didn’t want to talk to anyone, so I opted for online chat. It was the most “human” customer service experience of my life, so weirdly perfect I found myself wondering whether I was chatting with a person, a bot, or some hybrid. Then I wondered if it even mattered. I had a fantastic experience and my issue was resolved. I felt gratitude to whatever it was on the other end, even if it was a bot.

On one hand, these agents can act utterly professional, helping us with customer support, research, project management, scheduling, and e-commerce transactions. On the other hand, they can be quite personal and maybe we are getting closer to Her — with Microsoft’s romantic chatbot Xiaoice, automated emotional support is already here.

As these technologies warm up, they could transform new areas like education, psychiatry, and elder care, working alongside human beings to close the gap in care for students, patients, and the elderly.


IIA, Forrester, IDC, and Gartner on the Future of Big Data Analytics and Cognitive Computing

[Image: cognitive computing]

Big data analytics is the next trillion-dollar market, says Michael Dell. IDC has a more modest and specific prediction, forecasting the market for big data technology and services to grow at a 23.1% compound annual growth rate, reaching $48.6 billion in 2019.

The larger market for business analytics software and business intelligence solutions, which now includes the new disciplines of data science and cognitive computing (e.g., IBM Watson), is at least five times bigger. But a much larger market, which may indeed approach a trillion dollars sometime in the not-distant future, includes the revenues companies in any industry will generate from “monetizing” their data and algorithms.

Here’s my summary of predictions for big data analytics and cognitive computing from the International Institute for Analytics (IIA), Forrester, IDC, and Gartner.

Big data analytics will be embedded everywhere

IIA predicts that computing will become increasingly microservice-enabled, where everything – including analytics – will be connected via an API. IDC predicts that by 2020, 50% of all business analytics software will include prescriptive analytics built on cognitive computing functionality, and that cognitive services will be embedded in new apps. Embedded data analytics will provide U.S. enterprises with more than $60 billion in annual savings by 2020.
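
As an illustration of what “analytics connected via an API” might look like, here is a minimal Flask sketch; the route, the fixed weights, and the whole toy model are hypothetical placeholders, not anyone’s actual product:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def score(features):
    """Placeholder 'model': a fixed linear score standing in for real analytics."""
    weights = [0.4, -0.2, 0.1]
    return sum(w * x for w, x in zip(weights, features))

@app.route("/v1/score", methods=["POST"])
def score_endpoint():
    # Any application in the organization can embed analytics by POSTing here.
    features = request.get_json()["features"]
    return jsonify({"score": score(features)})

if __name__ == "__main__":
    app.run()
```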

Goodbye data preparation, hello data science

IIA predicts that automated data curation and management will free up analysts and data scientists to do more of the work they want to do. Forrester says that in 2016, machine learning will begin to replace manual data wrangling and data governance dirty work, and vendors will market these solutions as a way to make data ingestion, preparation, and discovery quicker. Through 2020, according to IDC, spending on self-service visual discovery and data preparation tools will grow 2.5x faster than traditional IT-controlled tools for similar functionality.
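
For context, the “dirty work” in question is hand-written cleanup of the kind sketched below (pandas, with made-up column names and data); the prediction is that much of this gets automated away:

```python
import pandas as pd

# A made-up raw extract with the usual problems: duplicates, nulls,
# and strings where numbers should be.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "signup_date": ["2016-01-04", "2016-01-07", "2016-01-07", None],
    "revenue": ["1,200", "950", "950", "780"],
})

clean = (
    raw.drop_duplicates()
       .assign(
           signup_date=lambda d: pd.to_datetime(d["signup_date"], errors="coerce"),
           revenue=lambda d: d["revenue"].str.replace(",", "").astype(float),
       )
       .dropna(subset=["signup_date"])
)
print(clean)
```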

The meager supply of people with the right data analysis skills will continue to baffle experts

Automated data preparation will help address the limited supply of analysts and data scientists. However, opinions differ regarding when supply will start meeting demand. The talent crunch, says IIA, will ease as many new university programs come online, and it will stop being a challenge for large corporations, which will find ways to address their requirements for number-crunching, model-spewing staff.

No, says IDC, the shortage of skilled staff will persist and extend from data scientists to architects and experts in data management. As a result, the market for big data professional services will expand rapidly, with a CAGR of 23% through 2020. Forrester agrees that the “huge demand” will not be met in the short term, “even as more degree programs launch globally.” In 2016, Forrester predicts, firms will turn to insights-as-a-service providers and data-science-as-a-service firms and to labor-saving options such as algorithm markets and self-service advanced analytics tools.

There’s risk in them thar data hills

Gartner predicts that due to the volume and variety of data and the sophistication of advanced analytics capabilities, the risks associated with big data analytics projects will continue to be larger than those associated with typical IT projects. In addition, by 2018, 50% of business ethics violations will occur through improper use of big data analytics, according to Gartner. Forrester highlights some of the risks associated with the ever-changing big data vendor hype, predicting that half of all “big data lake” investments will stagnate or be redirected. Forrester also warns that immature data science teams will improperly exploit algorithm markets, and spend precious time either developing an algorithm they could have bought or trying to apply an algorithm incorrectly.

We will have a new buzzword

Cognitive technology will become the follow-on to automated analytics, predicts IIA. For many enterprises, the association between cognitive computing and analytics will solidify in much the same way that businesses now see similarities between analytics and big data. IIA adds to the mix yet another term, predicting also that data science and predictive/prescriptive analytics will become one and the same.

How about going back to “data mining”?

Data monetization will take off

By 2020, IDC predicts, data monetization efforts will result in enterprises increasing the marketplace’s consumption of their own data by 100-fold or more. Also by 2020, the amount of data that is worth analyzing will double. Forrester predicts that as firms try to sell their data, “many will sputter.” In 2016, an increasing number of firms will look to drive value and revenue from their “data exhaust.” Only 10% of enterprises took their data to market in 2014, but 30% reported data commercialization efforts in 2015, a 200% increase.

Forrester declares that “all companies are in the data business now.” IDC predicts that by 2020, organizations able to analyze all relevant data and deliver actionable information will achieve an extra $430 billion in productivity benefits over their less analytically oriented peers. A similar figure for revenues associated with data monetization will get us closer to Michael Dell’s trillion-dollar prediction. In the same interview, Dell described the current state of data mining/predictive analytics/data science/prescriptive analytics/cognitive computing: “If you look at companies today, most of them are not very good at using the data they have to make better decisions in real time.”

Sources

IIA

2016 analytics priorities and predictions webinar

2016 analytics priorities and predictions research brief

Forrester

Predictions 2016: The Path From Data To Action For Marketers

IDC

IDC On-Demand Webcasts: Worldwide Big Data and Analytics 2016 Predictions

New IDC Forecast Sees Worldwide Big Data Technology and Services Market Growing to $48.6 Billion in 2019, Driven by Wide Adoption Across Industries

Gartner

Gartner Says Customer Data Has Monetary Value but Many Organizations Ignore It

Gartner Says, By 2018, Half of Business Ethics Violations Will Occur Through Improper Use of Big Data Analytics

Originally posted on Forbes.com



Most In-Demand Data Science Skills

[Chart: most in-demand data science skills, 2016]

Source: CrowdFlower, based on “3500 relevant job openings from LinkedIn.”

The folks at CrowdFlower excluded Excel from their list but noted that “that’s still something you see in myriad job listings. Old habits die hard.” Of course, data scientists don’t want to associate the “sexiest job of the 21st century” with old habits. Employers, however, want to cover all bases, sexy or not.
