The Second Wave Of Digital Transformation: The New Intelligent Machine Economy Calls For Modern Ways Of Development

We are fast entering the era of the new intelligent machine economy. This is when machines are joining—not replacing—humans as intelligent participants in the software-defined and AI-driven environment.

For this era to flourish, we need to embark on the second wave of digital transformation. Many companies dove into the first wave of digital transformation when they invested in information technology. This first wave gave us the ability to search, shop or perform business transactions using a browser or mobile device; it gave us access to collaboration tools that make remote work possible; and it enabled many other capabilities and functions. The first wave focused mostly on humans using technology to find information, connect with other humans and do things more efficiently.

The second wave does a similar thing—but for machines. The second wave of digital transformation is the migration of the cloud-native and AI-driven application investments from the IT (Information Technology) world to the OT (Operational Technology) world. It applies to devices and machines in the physical world around us across multiple industries, including aerospace, automotive, defense, industrial, medical and telecommunications.

After years of digital transformation associated with IT—focusing mostly on information flow in the digital ether—the focus has now turned to machines. McKinsey’s Digital Manufacturing Global Expert survey reveals that most manufacturing companies (68%) consider connectivity, intelligence and flexible automation to be their top priority. The global industrial automation market is expected to reach US$326.14 billion by 2027, growing at a CAGR of 8.9% over the decade, according to Fortune Business Insights.

Unlocking The Opportunities At The Intelligent Edge

The standalone devices we knew in the past—say, a heart monitor whose only function was to measure heart rate, without doing much else with the result—are hardly state-of-the-art anymore. Today’s devices gather and analyze data, communicate with one another and act on the data.

A heart monitoring device can transmit a patient’s data to a doctor or set off an alarm in real time when readings signal danger to the patient’s health. Autonomous cars can communicate with road infrastructure, sense other cars in the vicinity and act on this information in real time by initiating accident avoidance. AI-driven power grids can automatically manage production and use across multiple, distributed energy resources.

Such connected applications depend heavily on cloud computing, analytics, AI and machine learning, and 5G connectivity to enable them—and all of these new opportunities converge at the intelligent edge.

Edge is a location, not a thing. It defines where the data sensing and processing happens. The edge of the network is at the farthest distance from the central data center, within or very close to machines such as cars, planes or robots. Some processing of the data from the sensors embedded in machines needs to happen in situ, at the edge, making it the intelligent edge, while other data can be pushed to the cloud for further processing.

Multiple machines and devices operating at the intelligent edge share information with one another and the data centers, forming digital loops. Such digital feedback loops are tied to big data systems to perform functions such as predictive outage avoidance, event correlation for operational faults across subsystems, software automation and oversight, and event detection and resolution.
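The shape of such a digital feedback loop can be sketched in a few lines: act locally on anomalous readings without waiting for a round trip to the data center, and batch summaries upstream for the heavier analytics. This is a minimal illustration, not a real device API; the class, window size and threshold are all assumptions.

```python
from statistics import mean, stdev

# Hypothetical sketch of an edge-side feedback loop: readings far outside
# a rolling band trigger a local action immediately, while aggregated
# summaries are batched for the cloud. All names here are illustrative.
class EdgeLoop:
    def __init__(self, window=20, sigma=3.0):
        self.window, self.sigma = window, sigma
        self.history = []
        self.cloud_batch = []      # summaries destined for the data center
        self.local_alerts = []     # actions taken in situ, no round trip

    def ingest(self, reading):
        if len(self.history) >= self.window:
            mu, sd = mean(self.history), stdev(self.history)
            if sd and abs(reading - mu) > self.sigma * sd:
                self.local_alerts.append(reading)  # act at the edge, in real time
        self.history = (self.history + [reading])[-self.window:]
        self.cloud_batch.append(reading)
        if len(self.cloud_batch) >= self.window:   # push aggregates, not raw data
            summary = {"mean": mean(self.cloud_batch), "n": len(self.cloud_batch)}
            self.cloud_batch = []
            return summary                         # would be sent upstream here
        return None
```

The design point is the split: the latency-critical decision (the alert) never leaves the device, while the bandwidth-heavy work (trend analysis, correlation across subsystems) runs on the summaries in the cloud.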

How To Develop For The New Intelligent Machine Economy

The complexity of intelligent systems means that embedded systems companies need to transform digitally to enable the development, deployment, operation and servicing of such systems. To this end, they need to adopt tools, capabilities and processes, including:

Cloud-native and edge-friendly development techniques and tools, which are necessary to keep pace with time-to-market, system complexity and resource shortages. As we move towards edge computing, cloud hosting platforms will need to adapt to become edge-friendly or be revamped to be edge-native. An edge-native platform will retain the capabilities of a cloud platform but will also address the new demands created by the edge. Wind River Studio provides a cloud- and edge-native platform for the development, deployment, operation and servicing of mission-critical intelligent systems. Such cloud-native tools also allow developers to work anywhere, anytime or anyhow (office, remote, PC, tablet, etc.).

High-level software automation. With intelligent systems, deployment at the edge often means that payloads need to be deployed at scale over hundreds or potentially tens of thousands of geographical locations. It is not possible to deploy, operate or service applications at the edge manually. Automation is key to reducing costs for deployed distributed-edge systems both for device and cloud infrastructure types of installations.
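The core of such rollout automation can be sketched as a wave-based loop: deploy to a small batch of sites, verify health, and halt before a bad payload spreads to thousands of locations. This is an illustrative sketch under assumed interfaces; the `deploy` and `healthy` callables stand in for whatever orchestration a real fleet uses.

```python
# Illustrative wave-based rollout across many edge sites: deploy in small
# batches, verify health after each wave, halt on the first failure.
# Site names and the health check are hypothetical stand-ins.
def rollout(sites, deploy, healthy, wave_size=3):
    done, remaining = [], list(sites)
    while remaining:
        wave, remaining = remaining[:wave_size], remaining[wave_size:]
        for site in wave:
            deploy(site)
        failed = [s for s in wave if not healthy(s)]
        if failed:                 # stop the rollout before it spreads further
            return {"status": "halted", "deployed": done, "failed": failed}
        done.extend(wave)
    return {"status": "complete", "deployed": done, "failed": []}
```

A failed health check in wave two leaves the thousands of sites in later waves untouched, which is exactly the cost-control argument for automating distributed-edge deployments.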

DevOps is key to assembling complex embedded software at the intelligent edge. Traditionally, embedded software developers wrote code; when they finished, and the application had been through quality assurance, the embedded “Ops” (production) team installed the systems. This sequential “waterfall” model is too slow for the intelligent edge, which operates in real time.

Under the DevOps banner, different embedded developer personas (e.g., platform developers, application developers, operators, data scientists or DevOps engineers) work in scrums. They push out new software releases as part of agile teams and do it so rapidly that it’s better to integrate the Ops and QA (quality assurance, testing) teams into the development process.

Continuous integration and continuous deployment (CI/CD) tools take new code and place it directly into a production application without stopping any functions from running. The pace of code releases has grown so quickly—and many releases are just small updates to existing applications—that it no longer makes sense to run a big uninstall/reinstall routine every day. CI/CD solves this problem. It is like the old “change the tire while the car is moving” concept. But here, it works.
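The tire-change-while-moving idea reduces to one mechanism: the running version keeps serving while a candidate build is validated, and an atomic reference swap puts it live only if the gate passes. The sketch below is a minimal illustration of that pattern; the version callables and smoke test are assumptions, not any particular tool's API.

```python
# Minimal sketch of a zero-downtime update: traffic is always served from
# the active version, and a candidate only replaces it after passing a
# smoke test. The version objects and smoke test are illustrative.
class LiveService:
    def __init__(self, version):
        self.active = version            # requests are always served from here

    def handle(self, request):
        return self.active(request)

    def deploy(self, candidate, smoke_test):
        if not smoke_test(candidate):    # CI gate: reject a bad build, keep serving
            return False
        self.active = candidate          # CD step: swap without stopping traffic
        return True
```

A rejected candidate never interrupts service, which is why small, frequent releases become cheap: the cost of a bad build is a failed gate, not an outage.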

Certification. Software development for critical infrastructure has moved to cloud-based, agile DevOps principles. However, the safety certification of such software still follows old-fashioned development paradigms and involves expensive manual work, which drives up the cost per line of code, hinders fast adoption of new features, and slows down deployment and operation. To cut certification costs and reach a faster time to market, a new certification approach is required, one aligned with a modern DevSecOps methodology and integrated into the continuous delivery process using automation, AI/ML and digital feedback loops. Releasing new code continuously also creates security risk exposure, so developer teams have started to add security practices to the software development and delivery process to protect valuable assets at start-up, at runtime and at rest. The result is a workflow known as DevSecOps.
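Integrating security practices into continuous delivery means security checks run as first-class pipeline stages alongside build and test, so a release only ships if every gate passes. The sketch below illustrates that gating idea; the stage names and the toy checks are hypothetical, not a real scanner's interface.

```python
# Hedged sketch of a DevSecOps-style delivery pipeline: each stage is a
# named gate, any failing gate blocks the release, and later stages are
# skipped once one fails. Stage names and checks are illustrative.
def run_pipeline(artifact, stages):
    results = {}
    for name, check in stages:
        results[name] = check(artifact)
        if not results[name]:            # any failing gate blocks the release
            return {"released": False, "results": results}
    return {"released": True, "results": results}

stages = [
    ("build", lambda a: bool(a)),
    ("unit_tests", lambda a: True),
    ("dependency_scan", lambda a: "known_bad_lib" not in a),  # security gate
    ("secrets_scan", lambda a: "PRIVATE_KEY" not in a),       # security gate
]
```

Because the security scans sit in the same loop as the functional tests, protecting assets is not a separate, slower track: a vulnerable dependency fails the release the same way a failing unit test does.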

The new intelligent machine economy promises not only to unlock economic value but also to make lives easier and safer. To succeed, embedded systems companies must undergo the second wave of digital transformation and use modern, digital edge-friendly platforms, tools and processes.
