Machine learning is a mid-century concept, so why is it “suddenly” powering every hot technology on the planet, including Greenwave’s AXON Predict?
The term “machine learning” was coined by computer science legend Arthur Lee Samuel in reference to his work programming the IBM 704 to play checkers back in the 1950s. His then-revolutionary efforts explored non-numerical programming (functionality beyond mere computation) and utilized alpha-beta pruning to trim search trees and efficiently determine the odds of scoring from various positions on a checkers board. He “trained” his program by having it play thousands of games against itself and storing the outcome of each move to “learn” which moves would result in success. Still, the program never had much luck beating human experts. As a fascinating article in The Atlantic explained, “While his techniques were groundbreaking, 10 years later Samuel had made, in his own words, ‘limited progress’… As with so many of the currently hot AI techniques, he just did not have the computational horsepower or datasets that he needed to make the beautiful ideas work.”
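To see why alpha-beta pruning was such a big deal, here is a minimal sketch of the idea on a toy game tree — not Samuel’s original checkers program, just an illustration of how the technique skips branches that cannot change the outcome:

```python
def alphabeta(node, depth, alpha, beta, maximizing):
    """Minimax search with alpha-beta pruning over a toy game tree.

    A node is either a numeric leaf score or a list of child nodes.
    Alpha tracks the best score the maximizer can guarantee so far,
    beta the best the minimizer can guarantee; when they cross, the
    remaining siblings cannot affect the result and are pruned.
    """
    if depth == 0 or isinstance(node, (int, float)):
        return node
    if maximizing:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:  # prune: minimizer will never allow this line
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, depth - 1, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:  # prune: maximizer already has a better option
                break
        return value

# Toy tree: two moves for the maximizer, each answered by the minimizer.
tree = [[3, 5], [6, 9]]
best = alphabeta(tree, 2, float("-inf"), float("inf"), True)
print(best)  # 6: the maximizer's best guaranteed score
```

The same pruning logic, applied to checkers positions instead of bare numbers, is what let Samuel search deeper than brute force would allow on 1950s hardware.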
Some things are just ahead of their time, and similar stories hold true for advanced use of neural networks, natural language processing, fast Fourier transforms, backpropagation, LSTM units, and a host of other “old” ideas that underpin modern wonders such as DeepMind and Watson and even Alexa. The full potential of machine-learning solutions couldn’t be realized until datacenters and processing resources — and the supporting tools that harness those resources (TensorFlow, scikit-learn, etc.) — matured enough to support their applications. This recent convergence of supporting technology is what has made everything “old” new again. Another part of the equation is the growth in business use cases for machine learning, such as speech processing, text scanning, and image recognition, in conjunction with the ability to apply these in a scalable cloud environment.
Machine Learning Comes of Age
For a good indicator of just how accessible all this technology has become, try ordering a server instance on Amazon EC2. You now get the option to choose GPU-optimized servers for machine learning!
GPU optimization perfectly illustrates how a supporting technology feeds the machine-learning trend. What makes such an option so good for machine learning? While their individual cores typically run at slower clock speeds, GPUs have loads of simple cores that can work in parallel (versus CPUs, which typically have only a few cores). Massive numbers of cores operating in parallel are extremely good at churning through massive numbers of floating-point operations. And when training complex networks for machine learning, a LOT of parallel floating-point operations are required. One GPU manufacturer recently previewed a board for autonomous cars that features new GPUs for machine learning and computer vision and will reportedly perform 320 trillion operations per second. GPUs also tend to have higher memory bandwidth, which makes them ideally suited for deep neural networks. After a network is trained, the computations required to produce answers are far less intensive and can be easily accommodated.
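To get a feel for the workload in question, consider a single dense layer of a neural network. This CPU-side NumPy sketch (the layer sizes are illustrative, not from any particular network) shows the shape of the computation: one matrix multiply is millions of independent multiply-accumulate operations, which is exactly the kind of parallel floating-point work a GPU’s many cores can spread out:

```python
import numpy as np

rng = np.random.default_rng(0)
batch = rng.standard_normal((256, 512))    # 256 samples, 512 features each
weights = rng.standard_normal((512, 128))  # one dense layer's weight matrix

# One forward pass through the layer: 256 x 512 x 128 multiply-adds,
# roughly 16.8 million of them, all independent of one another.
activations = batch @ weights
print(activations.shape)  # (256, 128)
```

Training repeats this kind of operation (plus its gradient counterparts) millions of times over, which is why parallel floating-point throughput, not single-core speed, is the bottleneck.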
Modern Machine-Learning Application
Aside from fueling the future in robocars, machine learning is also increasingly used to tackle more commonplace concerns. At the moment, normal business cases are concentrated around three areas:
1. User Experience: Machine-learning systems are used to make it easier for the user to interface with technology; for instance, labeling the pictures in my folder according to what is contained in the images—family, friends, pets, landmarks.
2. Expert Systems: Machine learning conveys additional knowledge or assists in decision making; for example, analyzing speech to power customer chatbots, or performing image recognition on X-rays for diagnostic purposes in medical settings.
3. Process Improvements: Machine learning is used to speed up existing processes and make things more efficient. In industrial manufacturing, for example, predictive equipment monitoring and automated sorting from interpreted video feeds of the assembly line result in faster production and less downtime.
Many machine-learning solutions are being trained to address combinations of these areas. Greenwave’s AXON Predict can be loaded onto chipsets to analyze IoT/IIoT data in real-time, present that information in a visual interface for scoring advanced statistical or algorithmic models, and leverage machine learning to empower all kinds of companies to advance their products, provide better services, and offer more features to their customers.
With the advanced state of enabling technologies, machine learning really is newly capable of untold applications. The checkers puzzle that stumped Arthur Samuel’s program has since been solved, as have more complex games and more complex business problems. Better computing resources have enabled better business; you need only mount up and start applying “old” machine-learning knowledge to the problems of today.