What’s Next to Interact with Industrial Robots
Human interaction with industrial robots has been limited over the last 50 years. Imitation learning, cobots, and natural language processing are enabling new human-machine interactions.
The workhorses of leading manufacturing facilities are industrial robots. Industrial robots date back to the 1960s, and they haven’t changed much. They often look like a human arm, called articulated robots, and perform tasks such as welding, painting, and material handling. These robots are installed in a fixed location on the factory floor and rarely move over their lifespan. They are often surrounded by metal cages and have blaring alarms so that humans do not get close to them while they are in operation. Despite their flaws, they are so incredibly productive that nearly 2 million robots are in use across the world’s factories. This large demand and installed base have made small details such as ‘cable-path optimization’ worthwhile problems to solve and have allowed new business models such as robots-as-a-service to emerge. These types of robots will likely remain in service for decades to come, but their feature set may change considerably.
Typical industrial robots lack one killer feature: human interaction. An operator can interact with a robot by pressing an on-off switch or selecting between a few modes of operation, but they cannot tell the robot what to do. That remains the domain of a skilled programmer accessing the control system from a computer. For process steps such as welding and painting, setting up the robot once and letting it perform the operation millions of times is valuable but limiting. For products that call for a personalizing human touch, operators need to be able to work alongside the robot.
Human-computer interaction research began in the 1950s. This research has compounded into incredible consumer electronics such as the Apple iPhone and Amazon Alexa. By comparison, industrial human-computer interaction technologies have lagged behind. Because of the harsh environments and safety requirements industrial equipment must operate within, many of the technologies that work in a consumer product are not feasible in industry.
However, a new focus on human-machine interaction has begun due to recent advances in robotic technology. Cobots are being applied “to improve the efficiency and versatility of factories” and are at the core of Industry 5.0. Emerging startups such as Micropsi Industries are raising millions to “retrain industrial robots using human demonstrations” through a technique called imitation learning. Elsewhere in machine learning research, Google and Everyday Robots are teaching robots to follow verbal instructions. They even found that “periodic human interventions are a simple but important technique for achieving good performance.” Sounds just like training humans! Enabling robots to respond to human feedback in industrial environments will unlock the next wave of exponential growth for industrial robots.
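In its simplest form, imitation learning is behavioral cloning: recorded human demonstrations become supervised training data for a policy that maps robot state to action. A minimal sketch of the idea, where the toy demonstration data and the linear policy are illustrative stand-ins rather than any vendor’s actual system:

```python
import numpy as np

# Behavioral cloning: learn a policy that maps the observed robot state
# to the action a human demonstrator took in that state.
# Toy data: states are (joint_angle, target_offset); actions are joint
# velocity commands recorded during a hypothetical human demonstration.
rng = np.random.default_rng(0)
states = rng.uniform(-1.0, 1.0, size=(500, 2))
actions = 0.8 * states[:, 1] - 0.5 * states[:, 0] + rng.normal(0, 0.05, 500)

# Fit a linear policy with least squares (a stand-in for the neural
# networks used in practice).
X = np.hstack([states, np.ones((len(states), 1))])  # add a bias term
weights, *_ = np.linalg.lstsq(X, actions, rcond=None)

def policy(state):
    """Predict the action the demonstrator would have taken."""
    return np.array([*state, 1.0]) @ weights

print(policy([0.2, -0.4]))  # velocity command for an unseen state
```

The appeal for factories is that retraining means recording a few new demonstrations rather than reprogramming the control system.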
Assembly Line
DeepETA: How Uber Predicts Arrival Times Using Deep Learning
Date: February 10, 2022
Authors: Xinyu Hu, Olcay Cirit, Tanmay Binaykiya, Ramit Hora
For several years, Uber used gradient-boosted decision tree ensembles to refine ETA predictions. The ETA model and its training dataset grew steadily larger with each release. To keep pace with this growth, Uber’s Apache Spark™ team contributed upstream improvements [1, 2] to XGBoost to allow the model to grow ever deeper, making it one of the largest and deepest XGBoost ensembles in the world at that time. Eventually, we reached a point where increasing the dataset and model size using XGBoost became untenable. To continue scaling the model and improving accuracy, we decided to explore deep learning because of the relative ease of scaling to large datasets using data-parallel SGD.
Read more at Uber Engineering Blog
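Data-parallel SGD, the scaling property cited above, shards each mini-batch across workers, has each worker compute gradients independently, and averages them before a shared weight update. A single-process simulation of that loop on a placeholder linear model (the data and model are invented for illustration and are not DeepETA):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(10_000, 8))                          # features
y = X @ rng.normal(size=8) + rng.normal(0, 0.1, 10_000)  # targets

w = np.zeros(8)                    # shared model weights
n_workers, lr, batch = 4, 0.1, 256

for step in range(200):
    idx = rng.choice(len(X), batch, replace=False)
    shards = np.array_split(idx, n_workers)  # shard the mini-batch
    # Each worker computes the MSE gradient on its own shard...
    grads = [2 * X[s].T @ (X[s] @ w - y[s]) / len(s) for s in shards]
    # ...then gradients are averaged (the all-reduce step) and applied.
    w -= lr * np.mean(grads, axis=0)

print(np.mean((X @ w - y) ** 2))  # final training MSE
```

Because the averaged gradient matches the full-batch gradient in expectation, adding workers scales throughput without changing the optimization target, which is the ease-of-scaling property the excerpt refers to.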
Software-as-a-Service, Artificial Intelligence and Industry 4.0 with Daniel Harari, CEO of Lectra
Enhancing Datasets For Artificial Intelligence Through Model-Based Methods
Date: February 10, 2022
Authors: Dirk Mayer, Ulf Wetzker
In industrial processes, data from time series play a particularly important role (e.g., sensor data, process parameters, log files, communication protocols). They are available in very different temporal resolutions – a temperature sensor might deliver values every minute, while a spectral analysis of a wireless network requires over 100 million samples per second.
The objective is to reflect all relevant states of the process, and the uncertainties due to stochastic effects, within the augmented time series. To add values to a measured time series of an industrial process, insight into the process itself is beneficial. Such a representation of the physical background can be called a model.
Read more at Semi Engineering
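One reading of “model-based” augmentation: use a simple physical model of the process to generate plausible variants of a measured series, with stochastic effects layered on top. A sketch using a hypothetical first-order temperature response, where the setpoint, time constants, and noise levels are assumptions chosen purely for illustration:

```python
import numpy as np

def temperature_model(setpoint, tau, t, t0=20.0):
    """First-order response: temperature approaching a setpoint."""
    return setpoint + (t0 - setpoint) * np.exp(-t / tau)

rng = np.random.default_rng(2)
t = np.arange(0, 600, 60.0)  # one sample per minute for ten minutes

def augment(n_series, setpoint=80.0):
    """Generate augmented series by sampling model parameters and noise."""
    series = []
    for _ in range(n_series):
        tau = rng.uniform(90.0, 150.0)      # process uncertainty
        noise = rng.normal(0, 0.5, len(t))  # sensor noise
        series.append(temperature_model(setpoint, tau, t) + noise)
    return np.array(series)

augmented = augment(100)
print(augmented.shape)  # (100, 10) -> 100 plausible sensor traces
```

Each generated trace stays physically plausible because it comes from the process model, while the sampled parameters and noise cover the stochastic variation a classifier or anomaly detector must tolerate.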
Andrew Ng: The AI pioneer says it’s time for smart-sized, “data-centric” solutions to big issues
Semi-virtual site visits deliver enhanced customer value
Date: February 11, 2022
Author: Stacey Phillips
AR technology offers opportunities to schedule visits that have been difficult to arrange in the past because sites are in remote locations, offshore environments, or other restricted areas. Headset technology allows field engineers to reach someone where they are and streamline the visit process. As a result, insights arrive in real time because remote engineers can see what is happening onsite rather than trying to identify issues through messages or images. Problems can be solved more quickly, which saves time and money.
Read more at Plant Engineering
Digital Twins Improve Plant Design and Operational Performance
Date: February 11, 2022
Commissioning and start-up are two of the most crucial use cases for digital twins, as people become less dependent on physical devices. The value of the digital twin is in quicker configuration and modernization of lifecycle processes in a simulated environment.
Imagine operating with all the accuracy of a physical device but without its boundaries. The simulated device understands the environment and sends values back to the user, while the information model comes directly from the device.
Read more at FDT Group Blog
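In the spirit of the excerpt, a simulated device can stand in for hardware during commissioning: it carries an information model and returns process values the way the physical device would. A minimal illustrative sketch; the class and its parameters are hypothetical and not part of any FDT specification:

```python
import random

class SimulatedFlowSensor:
    """Digital stand-in for a flow sensor during commissioning."""

    # Information model: the parameters the real device would expose.
    info_model = {"unit": "l/min", "range": (0.0, 120.0), "firmware": "sim-0.1"}

    def __init__(self, nominal_flow=60.0):
        self.nominal_flow = nominal_flow

    def read(self):
        """Return a simulated process value with measurement noise."""
        value = self.nominal_flow + random.gauss(0, 1.5)
        lo, hi = self.info_model["range"]
        return max(lo, min(hi, value))

# Configure and test control logic against the twin before hardware arrives.
sensor = SimulatedFlowSensor()
print(sensor.info_model["unit"], round(sensor.read(), 1))
```

Swapping the simulated class for a driver that talks to the real device is then a configuration change rather than a rewrite, which is where the quicker commissioning comes from.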