DeepBench from Baidu: Benchmarking Hardware for Deep Learning

Source: Greg Diamos and Sharan Narang, “The need for speed: Benchmarking deep learning workloads,” O’Reilly AI Conference

At the O’Reilly Artificial Intelligence conference, Baidu Research announced DeepBench, an open source benchmarking tool for evaluating the performance of deep learning operations on different hardware platforms. Greg Diamos and Sharan Narang of Baidu Research’s Silicon Valley AI Lab talked at the conference about the motivation for developing the benchmark and why faster computers are crucial to the continued success of deep learning.

The harbinger of the current AI Spring, deep learning is a machine learning method using “artificial neural networks,” moving vast amounts of data through many layers of hardware and software, each layer coming up with its own representation of the data and passing what it “learned” to the next layer. As a widely publicized deep learning project demonstrated four years ago, feeding such an artificial neural network with images extracted from 10 million videos can result in the computer (in this case, an array of 16,000 processors) learning to correctly identify and label an image of a cat. One of the leaders of that “Google Brain” project was Andrew Ng, who is today the Chief Scientist at Baidu and the head of Baidu Research.

Research areas of interest to Baidu Research include image recognition, speech recognition, natural language processing, robotics, and big data. Its Silicon Valley AI Lab has deep learning and systems research teams that work together “to explore the latest in deep learning algorithms as well as find innovative ways to accelerate AI research with new hardware and software technologies.”

DeepBench is an attempt to accelerate the development of the hardware foundation for deep learning, by helping hardware developers optimize their processors for deep learning applications, and specifically, for the “training” phase in which the system learns through trial and error. “There are many different types of applications in deep learning—if you are a hardware manufacturer, you may not understand how to build for them. We are providing a tool for people to help them see if a change to a processor [design] improves performance and how it affects the application,” says Diamos.  One of the exciting things about deep learning for him (and no doubt for many other researchers) is that “as the computer gets faster, the application gets better and the algorithms get smarter.”

Case in point is speech recognition. Or more specifically, DeepSpeech, Baidu Research’s “state-of-the-art speech recognition system developed using end-to-end deep learning.” The most important aspect of this system is its simplicity, says Diamos, with audio on one end, text on the other end, and a single learning algorithm (a recurrent convolutional neural network) sitting in the middle. “We can take exactly the same architecture and apply it to both English and Mandarin with greater accuracy than systems we were building in the past,” says Diamos.

In Mandarin, the system is more accurate in transcribing audio to text than native speakers, as the latter may have difficulty understanding what is said because of noise level or accent. Indeed, the data set used by DeepSpeech is very large because it was created by mixing hours of synthetic noise with the raw audio, explains Narang. The largest publicly available data set is about 2,000 hours of audio recordings, while the one used by DeepSpeech clocks in at 100,000 hours, or 10 terabytes of data.

The approach taken by the developers of DeepSpeech is superior to other approaches, argue Narang and Diamos. Traditional speech recognition systems, which use a “hand-designed algorithm,” get more accurate with more data but eventually saturate, requiring a domain expert to develop a new algorithm. The hybrid approach adds a deep convolutional neural network; the result is better scaling, but performance again eventually saturates. DeepSpeech uses deep learning as the entire algorithm and achieves continuous improvement in performance (accuracy) with larger data sets and larger models (more and bigger layers).

Bigger is better. But to capitalize on this feature (pun intended) of deep learning, you need faster computers. “The biggest bottleneck,” says Narang, “is training the model.” He concludes: “Large data sets, a complex model with many layers, and the need to train the model many times is slowing down deep learning research. To make rapid progress, we need to reduce model training time. That’s why we need tools to benchmark the performance of deep learning training. DeepBench allows us to measure the time it takes to perform the underlying deep learning operation. It establishes a line in the sand that will encourage hardware developers to do better by focusing on the right issues.”
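The kind of measurement DeepBench makes can be illustrated with a short sketch. This is not Baidu's actual code (DeepBench benchmarks vendor libraries such as cuBLAS and cuDNN from C++); it is just a minimal NumPy illustration of the idea of timing a dense matrix multiply (GEMM), the operation that dominates neural-network training, at a few illustrative layer sizes:

```python
# Minimal sketch of a DeepBench-style measurement: time a dense
# matrix multiply (GEMM) at sizes resembling real network layers.
import time
import numpy as np

def time_gemm(m, n, k, repeats=10):
    """Return the best wall-clock time in seconds for an (m x k) @ (k x n) multiply."""
    a = np.random.rand(m, k).astype(np.float32)
    b = np.random.rand(k, n).astype(np.float32)
    a @ b  # warm-up run so one-time setup costs are not measured
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        a @ b
        best = min(best, time.perf_counter() - start)
    return best

# Layer sizes below are illustrative, not taken from the DeepBench suite.
for m, n, k in [(1760, 128, 1760), (2560, 64, 2560)]:
    secs = time_gemm(m, n, k)
    gflops = 2 * m * n * k / secs / 1e9  # a GEMM costs ~2*m*n*k floating-point ops
    print(f"GEMM {m}x{k} @ {k}x{n}: {secs * 1e3:.2f} ms, {gflops:.1f} GFLOP/s")
```

Reporting the achieved GFLOP/s against the hardware's theoretical peak is what lets a hardware designer see whether a processor change actually helps the operations that dominate training time.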

Originally published on Forbes.com

Posted in deep learning | Tagged | Leave a comment

Digital Tipping Point: Internet Advertising Surpassing TV Advertising in 2016

Source: PwC

On October 27, 1994, HotWired, the first commercial Web magazine, gave birth to the first Web banner ad and the Internet advertising industry. PwC predicts that Internet advertising revenues worldwide will surpass TV advertising in 2016 and reach $260.4 billion in 2020.

More about the first six online ads here. One of them, an ad for AT&T, simply said: “Have you ever clicked your mouse right HERE? You will!”


Posted in Misc | Tagged | Leave a comment

Maana Deploys AI to Optimize Enterprise Knowledge at Maersk

Maana Knowledge Platform for Oil and Gas

Does your company suffer from corporate amnesia? Palo Alto, California-based startup Maana has developed a cure for what ails organizations everywhere: Knowledge of how to perform a certain task or make a specific decision walks out the door with employees migrating to another job or retiring. Even when this tacit knowledge is captured, codified and stored in a database, it may not be accessible to the people who need it, when they need it. “We patented a unique and novel way of indexing and organizing the knowledge that is locked in data silos across the organization,” says founder and CEO Babur Ozden. Today, Maana released a new version of its AI-driven platform.

Failing organizational memory is particularly harmful when there is a “decision deadline,” explains Ozden: “These are decisions that need to take place along the workflow of an operation and need to be taken in a few hours or a few minutes.” Maana’s knowledge graph, which captures complex relations between actions, processes, and assets, coupled with advanced AI algorithms, semantic search, and deep learning, helps employees make faster and more relevant data-driven decisions by providing them with the relevant pieces of organizational memory at the moment they need it most.

Maana’s technology “captures the knowledge people acquire on the job and enables other employees, who do not have a similar experience, to have a head start in making a decision instead of starting from zero,” says Ibrahim Gokcen, Head of Data Science & Analytics at Maersk. The Maersk Group is a worldwide conglomerate that operates in 130 countries with a workforce of over 89,000 employees. Headquartered in Copenhagen, Denmark, with 2015 revenues of $40.3 billion, it owns Maersk Line, the world’s largest container shipping company, and is involved in a wide range of activities in the shipping, logistics, and the oil and gas industries.

“We want to make AI part of our digital journey,” says Gokcen. “Strong technology platforms with AI capabilities help the data science and analytics people focus on the business logic, on the algorithms, and on churning models very quickly. These platforms give a head start not just to employees making decisions but also to our data scientists.”

Adds Donald Thompson, Maana’s founder and president: “We capture in a pragmatic way the knowledge of subject matter experts and business users and make it explicit so more people can take advantage of it.”

In a statement, Thompson said that the new version of Maana’s platform is “introducing our first collection of Knowledge Assistants and Knowledge Applications that really bring out the value of our user-guided and machine-assisted approach. People at all levels are empowered to rapidly gain the understanding they need in order to make the best decisions, while generating new knowledge assets (models) that others can use or build upon.”
Originally published on Forbes.com

Posted in AI | Tagged | Leave a comment

Visually Linking AI, Machine Learning, Deep Learning, Big Data and Data Science

Source: Battle of the Data Science Venn Diagrams

HT: KDnuggets

What’s the Difference Between Artificial Intelligence, Machine Learning, and Deep Learning?

Over the past few years AI has exploded, especially since 2015. Much of that has to do with the wide availability of GPUs that make parallel processing ever faster, cheaper, and more powerful. It also has to do with the simultaneous one-two punch of practically infinite storage and a flood of data of every stripe (that whole Big Data movement) – images, text, transactions, mapping data, you name it.

 

Posted in AI, Data Science, deep learning, Machine Learning | Tagged | Leave a comment

40 Startups in the IoT in Retail Market Landscape

CB Insights:

Beacon- and sensor-based analytics – These companies provide hardware and software to help stores track visitors. They focus on data collection for internal analytics, such as merchandise tracking, adjusting staffing levels, monitoring promotions, etc. Euclid Analytics, for example, promises visitor tracking to monitor the impact of promotions on driving store visits and better understand stores’ busy times and aisles.

Beacon-based marketing – These companies also track visitors, but focus more on proximity marketing use cases (which may also include some analytic insights). Estimote provides small, colorful beacons that send push notifications to users’ phones about products or promotions when they sense someone nearby. Kimetric’s sensors aim to visually identify shoppers’ age, gender, eye focus, and clothing style to present them with personalized marketing.

Beacon analytics and marketing – These startups track visitors and provide a mix of internal analytics and proximity marketing services.

Inventory tracking – QueueHop provides connected anti-theft tags for items that automatically unclip after payment. Cosy is building inventory-tracking robots that use indoor mapping software.

Indoor mapping – These startups take advantage of connected devices to create detailed indoor maps of stores and malls. Stores can help users find the right items and direct them to promotions.

Service robots – Simbe Robotics and Fellow Robots are designing robots for use in-store, to help customers find items and ensure the shelves stay stocked. Fellow Robots worked with Lowe’s to launch the LoweBot in eleven stores this fall.

Loss prevention – Gatekeeper uses RFID tags that work with wheel-locking features to automatically stop shopping carts that leave the store area, helping prevent theft. Carttronics tracks baskets and carts with RFID tags and provides cameras that receive signals from the tags and can film anyone leaving the store with unauthorized items.

At-home shopping buttons – Kwik and Hiku offer connected devices that can automatically place online orders for goods from users’ homes, comparable to the Amazon Dash buttons. Kwik will provide branded buttons that let customers re-order items with one touch, while the Hiku device can also scan barcodes, to identify items for re-ordering, and can recognize users’ voices.

Smart dressing rooms – Oak Labs created an interactive, touchscreen mirror that lets shoppers request new items, adjust fitting room lighting, and see outfit recommendations. The mirror can sense which products the shopper brought into the room using RFID technology, and then present related products, save the items to shoppers’ online accounts, or display related items. Oak has worked with Polo Ralph Lauren.

 

Posted in Internet of Things | Tagged | Leave a comment

Artificial Intelligence: Enterprise Adoption by Industry

AI Trends:

In terms of plans to deploy AI commercially in the near future (before the end of 2018), healthcare is clearly the vertical sector in which the largest percentage of companies worldwide will take action, followed by the consumer content and apps industry (including gaming), manufacturing, retail and automotive.

Posted in AI | Tagged | Leave a comment

IDC Survey Finds IoT is All About The Data

IDC just released the results of its 3rd annual survey of IoT decision-makers (press release here and webcast with IDC’s Vernon Turner and Carrie MacGillivray here). The IoT market is maturing, says IDC, going beyond its initial focus on connecting more and more things. Data management is fast becoming the overarching theme, with analytics and the IoT platform emerging as the main requirements of the 31.4% of organizations surveyed that have already launched IoT solutions, and the additional 43% looking to deploy in the next 12 months.

The survey was conducted in July and August 2016, with 4,500 decision-makers from more than 25 countries participating, from enterprises with more than 100 employees in a wide range of industries. Here are the highlights:

  • 55% say IoT is strategic to their business as a means to compete more effectively. 21% regard it as “transformative”—they know it holds promise and are looking for the right investment, says IDC.
  • Top reasons to invest in IoT: increased productivity (24%), faster time-to-market (22.5%), and process automation (21.7%). IDC noted that internal and operational benefits are still the main drivers of IoT deployment. However, in a possible sign of market maturation, “reducing costs” was not mentioned much this year (in contrast to previous years) and “time-to-market” appeared for the first time.
  • The business side of the enterprise is responsible for most IoT initiatives. 62% of respondents said business units fund IoT, as opposed to 3% of projects funded by IT and 35% where IT provides the funds but the business units are involved in managing the project. 18% of projects are run by specially created IoT business units. “It’s not a technology solution, it’s a business solution,” says IDC. Of the projects funded by the business, IT is involved in the project (36%) or IT is aware but not involved (25%). Only 1% of respondents reported “shadow IT for IoT,” where IT is not aware of the IoT project.
  • Top IoT challenges include security (26%), upfront cost (22%), privacy (21%), ongoing cost (19%), IT infrastructure (16%), and IoT skills (14%). The issue of IoT-related skills came up for the first time this year, in apparent response to the challenge of handling the influx of IoT data.
  • The security challenge is addressed in an ad hoc manner, with enterprises opting for a variety of solutions: security processes integrated into the IoT workflow (23%), a tiered approach where devices are secured by firewalls between tiers (21%), and an extension of existing IT security policies (20%).
  • Where is IoT data being processed? 54% collect data at the edge and transmit it to the enterprise; 29% collect and process data at the point of creation; 14% collect and process some data at the point of creation and transmit the rest to the enterprise.
  • Industries that lead in the adoption of IoT include financial services (including insurance), retail, and manufacturing. Lagging sectors include government, healthcare, and (surprising to me) utilities.

Survey results show that IBM and Microsoft have taken a leading role in almost all IoT segments, especially the ones ascending in importance—analytics, software, systems integration, and providing an IoT platform. This is due, IDC says, to their success in blending a cloud strategy with analytics and software capabilities. To IDC’s question about most important digital transformation projects, survey respondents cited cloud transformation/transition (66%), IoT (32%), and big data/cognitive solutions (27%). IDC noted that these transformational initiatives are interlinked: The cloud gives IoT a platform on which to scale, and the IoT lays the foundation for investments in big data and cognitive solutions, to make sense of all the data generated by the IoT and residing in the cloud.

Originally published on Forbes.com

 

Posted in Internet of Things, Misc | Tagged | Leave a comment

Who is Buying All the AI Startups? Google, Intel, Apple, Twitter and Salesforce

CB Insights:

Nearly 140 private companies working to advance artificial intelligence technologies have been acquired since 2011, with over 40 acquisitions taking place in 2016 alone (as of 10/7/2016). Corporate giants like Google, IBM, Yahoo, Intel, Apple and Salesforce are competing in the race to acquire private AI companies, with Samsung emerging as a new entrant this month with its acquisition of startup Viv Labs, which is developing a Siri-like AI assistant.

Posted in AI | Tagged | Leave a comment

Only Humans Need Apply: Winners and Losers in the Age of Smart Machines

Under pressure to remove alleged human bias from its “Trending Topics” section, in August Facebook fired the editors who were selecting and writing headlines for the stories, explaining that this “will make the product more automated.” The results of trusting algorithms more than humans have continued to make headlines ever since, with the Trending “product” promoting a fake news story about Fox News’ Megyn Kelly, a conspiracy article claiming the 9/11 twin towers collapsed because of “controlled demolition,” and Apple’s Tim Cook announcing that Siri will physically come out of the phone and do all the household chores (a story from an Indian satirical website, Faking News, that was Trending’s top story on the day of the iPhone 7 launch event), to mention just a few of the more embarrassing machine failures.

Silicon Valley has never displayed much love for fallible humans, but has shown a lot of confidence in the continuous improvement and now, self-improvement, of machines. Do humans still have an important role to play in our automated lives which are increasingly controlled by sophisticated algorithms and seemingly smarter machines?

In Only Humans Need Apply: Winners and Losers in the Age of Smart Machines, knowledge work and analytics expert Tom Davenport and Julia Kirby, a contributing editor for the Harvard Business Review, offer optimistic, upbeat and practical answers to this much-debated question. “The upside potential of the advancing technology is the promise of augmentation—in which humans and computers combine their strengths to achieve more favorable outcomes than either could do alone,” they write.

There is not much difference, contend Davenport and Kirby, between technologies of automation and technologies of augmentation. The difference lies in the goals and attitudes behind the application of these technologies. Automation is unidirectional and focuses “primarily or exclusively on cost reduction” via the elimination of human labor. In contrast, “augmentation approaches tend to be more likely to achieve value and innovation” and they are bidirectional, making “humans more capable of what they are good at” and “machines even better at what they do.”

It is a shortsighted (and short-term) strategy for companies to favor automation over augmentation: “If the goal is to provide truly exceptional or differentiated products and services at scale, only an augmentation arrangement can accomplish that,” write Davenport and Kirby. They advocate a “workplace that combines sophisticated machines and humans in partnerships of mutual augmentation” and mutual benefit.

Competitive considerations apply not only to companies in the race against the machine, but also to their employees. The book addresses primarily the plight of knowledge workers who thought they would escape the fate of factory workers but are now increasingly automated out of a job. “The advice on avoiding that fate,” say Davenport and Kirby, “has been noticeably thin. For the most part, the experts boil it down to a single, daunting task: Keep getting smarter. We are going to argue that there are other strategies, all of them featuring augmentation of human work by machines.”

The authors describe in detail—with vivid and engaging examples—five “options for augmentation:” Stepping Up or moving a level above the machines and making high level decisions about augmentation; Stepping Aside or choosing to pursue a job that computers are not good at, such as selling or motivating; Stepping In or monitoring and improving the computer’s automated decisions; Stepping Narrowly or finding a specialty area in a specific profession that wouldn’t be economical to automate; and Stepping Forward or becoming involved in creating the very technology that supports intelligent decisions.

These strategies will work for knowledge workers (or all workers) who are “willing to work to add value to machines, and who are willing to have machines add value to them.” They will also work for organizations that understand that “no matter how smart these machines get, there is still some potential value from human augmentation.”

What a refreshing perspective in these times of machine-worship, where Silicon Valley’s automation addiction has spread far and wide. Mark Fields, the chief executive of Ford Motor Company, recently promised completely self-driving cars by about 2025, displaying a very Silicon Valley (and silly) attitude by saying “a driver is not going to be required.”

Fields and the many other executives of established companies racing against the disruptive Silicon Valley machine should read Only Humans Need Apply where Davenport and Kirby warn that companies investing in self-driving cars “could find that they have put a lot of energy into developing vehicles that drive themselves but are stuck with regulations that require an alert driver with hands on the steering wheel and feet on the pedals. If that happens, perhaps a company whose strategy all along has given careful thought to how to redeploy the human attention that is freed up by the technology—will win big.”

Just because you can automate, doesn’t mean you should. This is the important lesson of this contrarian, timely, and well-argued book. Augmentation, say Davenport and Kirby, is something “societies should encourage in ways big and small.” Hear! Hear!

Originally published on Forbes.com

Posted in Misc | Leave a comment

Neural Networks Typology

Source: The Asimov Institute

With new neural network architectures popping up every now and then, it’s hard to keep track of them all. Knowing all the abbreviations being thrown around (DCIGN, BiLSTM, DCGAN, anyone?) can be a bit overwhelming at first.

So I decided to compose a cheat sheet containing many of those architectures. Most of these are neural networks; some are completely different beasts. Though all of these architectures are presented as novel and unique, when I drew the node structures… their underlying relations started to make more sense…

Composing a complete list is practically impossible, as new architectures are invented all the time. Even when architectures are published, they can still be quite challenging to find, or sometimes you just overlook some. So while this list may provide you with some insights into the world of AI, please by no means take it to be comprehensive; especially if you read this post long after it was written.

For each of the architectures depicted in the picture, I wrote a very, very brief description. You may find some of these useful if you’re quite familiar with some architectures but not with a particular one.

 

Posted in AI | Tagged | Leave a comment