Cloud Platforms

Google Cloud

Google Cloud Machine Learning

The Google Cloud Platform Console (GCPC), as the name implies, operates in the Google Cloud and is designed to make ML accessible to users through a simple interface to ML resources. The workflow for a user of the GCPC ML application model includes: training on user-provided data; evaluating the accuracy of the results; deploying the trained model; requesting predictions for new data; monitoring predictions online; and ongoing management and updates. Google's TensorFlow tool for deep learning is a major part of the process.
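The train/evaluate/predict loop described above can be sketched generically. This is a local illustration using scikit-learn in place of Google's cloud services, not the GCPC API itself; the data set and model choice are arbitrary.

```python
# Generic sketch of the ML workflow: train on provided data, evaluate
# accuracy, then request predictions (scikit-learn stands in for the
# cloud service here; this is not Google's API).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic "user-provided" data.
X, y = make_classification(n_samples=200, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # train
accuracy = model.score(X_test, y_test)                           # evaluate
predictions = model.predict(X_test)                              # predict
```

In the actual GCPC workflow, the deployment and monitoring steps would happen in the cloud rather than in the local process.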

IBM Watson

In line with IBM's corporate core focus, Watson is a cognitive-computing platform designed specifically for enterprise business solutions. Watson is aimed at several problems: easing the AI business skills shortage by simplifying AI usage; providing tools such as SnapML that reduce the time needed to determine the appropriate ML technology or tool for a specific problem; and, with Watson Assistant, letting users build conversational interfaces into applications. Partnering with Staples, IBM Watson created an Easy Button that places orders by voice command, and with GM it offers location-based products to drivers.



Infosys Nia

Infosys's second-generation Nia AI platform merges several AI technologies: Big Data analytics, ML, knowledge management, cognitive-computing capabilities, optical character recognition (OCR), and natural language processing. Infosys Nia lets customers build custom applications to suit their business requirements using a wide set of AI-specific tools. The goal of Nia's flexible architecture is to leverage these technologies by automating customers' repetitive tasks with swift data processing, enhancing data visualizations, and providing smarter analytics, all aimed at supporting advantageous business decisions.


Microsoft Azure Machine Learning

One of the biggest issues facing the successful production of an application today is the sheer scale of the data being collected and the need to quickly understand and act on what that data can tell you. Azure Data Explorer is an ML tool for sorting through your data, quickly bringing it into a central store, and running ad-hoc queries along the way. Azure ML in the cloud propels large-scale consumerization of machine learning and predictive analytics, easing the work of developers. The Azure Marketplace hosts many application programming interfaces (APIs) built by innovative data scientists using voice recognition, ML, predictive analytics, and more; these APIs are published, searchable, and monetized through usage charges.

Software Tools



Reimagine Risk Sensing

Deloitte and SAP created the "Reimagine Risk Sensing" application, which runs on SAP's cloud and Leonardo platform and uses cognitive computing, ML, and robotic process automation. It is designed to help controllers, auditors, accountants, and finance teams identify, manage, and mitigate risk; align with compliance requirements; and recognize potential financial abnormalities.



H2O

The H2O company's vision is to make ML accessible to business users and allow them to extract insights without deploying ML models themselves. The H2O open-source software provides data structures and methods that let users analyze and visualize big data in full, rather than operating on a small subset with a conventional statistical package. The software depends on updates and innovative contributions from approximately 100,000 data scientists and 10,000 organizations. For a fee, H2O provides customer service and customized software extensions, and it partners with major companies such as IBM and Microsoft as users.



Uber Ludwig

Uber's AI Lab, in concert with the corporate belief that AI technology is critical to Uber's growth, powers ML innovation and development. Recently the AI Lab engineers open-sourced several building blocks for deep learning, including Ludwig, a toolbox for training deep learning models without writing code. From simple training files, Ludwig can train a deep learning model using the declared input and output data points, predict all outputs simultaneously, evaluate the results, and provide a series of models that are continually evaluated and can be combined into a final architecture.
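Ludwig's "no code" approach rests on a declarative model definition listing only input and output features. The sketch below builds that kind of definition as a plain Python dictionary; the feature names are hypothetical, and the commented lines indicate roughly how Ludwig's Python API would consume it (Ludwig itself is not run here).

```python
# Illustrative Ludwig-style model definition: the user declares what goes
# in and what comes out, and Ludwig picks and trains an appropriate deep
# learning architecture. Feature names below are hypothetical.
config = {
    "input_features": [
        {"name": "review_text", "type": "text"},
    ],
    "output_features": [
        {"name": "sentiment", "type": "category"},
    ],
}

# With Ludwig installed, training would then look roughly like:
#   from ludwig.api import LudwigModel
#   model = LudwigModel(config)
#   train_stats = model.train(dataset="reviews.csv")
```

The same definition can also be written as a YAML file and passed to Ludwig's command-line interface, which is how the tool supports users who write no code at all.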



SAS Visual Data Mining and Machine Learning

The SAS Visual Data Mining and Machine Learning tool delivers a unified machine learning process using open-source software, including statistics, deep learning, and text-analysis algorithms that accelerate the exploration of structured and unstructured data. The platform manages enterprise data requirements and unifies the process from data access and transformation through deployment, with the express goal of better decision making.



TensorFlow

TensorFlow is an open-source ML framework with deep learning and neural-network capabilities that enables users to get started quickly in a cloud environment in areas such as computer vision, natural language, and speech translation. It is particularly effective on massive data sets, for example in the hunt for new planets. TensorFlow is used as a core tool by Amazon Web Services, by the AlphaGo system of Google's AI subsidiary DeepMind, and by Google's Cloud Vision. It can also be easily used by individuals, as it comes with extensive documentation, tutorials, and visualization tools.
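To show how quickly an individual can get started, here is a minimal sketch using TensorFlow's Keras API; it assumes TensorFlow is installed, and the toy data and layer sizes are arbitrary.

```python
import numpy as np
import tensorflow as tf

# Toy regression data (hypothetical): learn y = 2x.
x = np.random.rand(64, 1).astype("float32")
y = 2.0 * x

# A tiny feed-forward neural network defined with the Keras API.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=5, verbose=0)

pred = model.predict(x[:3], verbose=0)  # one prediction per input row
```

The same few lines scale up: swapping in larger layers, image or text inputs, and cloud-hosted data is what the production systems named above do at far greater scale.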



Whetstone

Whetstone was developed by researchers at Sandia National Laboratories. An open-source software tool, it lets engineers working in the neural networks field refine the output of artificial neurons, which in turn enables neural networks to process information significantly faster than those currently used in industry. This greatly reduces the circuitry needed to complete difficult tasks in autonomous vehicles, image interpretation, and similar applications.
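The core idea can be sketched outside the library (this is a conceptual illustration, not Sandia's Whetstone API): a smooth activation function is gradually steepened during training until it behaves like a binary spiking neuron, whose 0/1 outputs are what allow simpler, faster hardware to run the trained network.

```python
import numpy as np

def sharpened_activation(x, sharpness):
    """Sigmoid whose slope grows with `sharpness`; at high sharpness it
    approximates a 0/1 step function, i.e. a spiking neuron."""
    return 1.0 / (1.0 + np.exp(-sharpness * x))

x = np.linspace(-1.0, 1.0, 5)
soft = sharpened_activation(x, sharpness=1.0)       # smooth, easy to train
spiking = sharpened_activation(x, sharpness=100.0)  # nearly binary output
```

Training starts with the smooth version so gradients flow, then the sharpness is raised step by step; by the end the neuron effectively fires or stays silent, with no intermediate values to compute.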


Open-Source Software

Readily available AI software makes the development of an AI application accessible for small- and medium-scale solutions. Many of these packages have been available as open-source software for decades; examples include CLIPS, Drools, OpenL Tablets, Jess, and Prolog, among others.


Electronic Systems

The trend toward packing greater computing power and storage into smaller spaces greatly impacts the AI field. Tiny computer chips and very small server hardware can hold AI algorithms, designed to perform specific tasks for smartphones, drones, IoT sensors, satellites, and autonomous cars, that previously could run only on larger computer systems. Examples include: the Microsoft HoloLens 2 augmented-reality headset, which has a number of built-in AI technologies used in manufacturing, hospitals, and other fields where training and maintenance are required; Microsoft Azure Kinect, an intelligent device 5 in long by 1.5 in deep with high-resolution panoramic cameras and a 7-microphone array that can see, hear, and understand people and actions in an environment such as a hospital, proactively alerting nurses when a patient falls or is likely to fall; and Intel's USB stick with embedded computer-vision and deep-neural-network software, made for a wide audience of data scientists and developers looking to embed AI capabilities in IoT applications or prototype deep learning on a laptop.
