Architecting the edge for performance and reliability
Many companies are looking to their edge computing systems for improvements in the way they collect and process data resulting from the explosive growth of internet-connected devices. Yet for most operational technology (OT) professionals who have long been responsible for edge-based computing systems in their organisations, the evolution of the edge may seem like a moving target.
As business and technology trends continue to reshape how computing happens at the edge, many companies are searching for the best way to select, implement and capitalise on the opportunities that futureproof edge computing strategies offer today.
Welcome to the new edge
Today, many critical operations and processes take place at the edge of an organisation’s network, such as unmanned machinery, public safety systems, and power and energy production.
These operations, and the equipment they rely on, are connected to a rapidly growing number of IoT devices. These sensors and devices collect real-time data that can be used to streamline production cycles, improve product or service quality, or shorten response times. As a result, better management of edge data can help the entire business improve critical processes, reduce costs and gain a significant competitive advantage.
Yet reaping these benefits is more difficult than it might appear, especially when attempting to process and manage edge data through centralised cloud-based applications or on-premise data centres. While these computing models have their advantages, they are not particularly well suited to supporting environments where IT staffing is limited, connectivity is poor or expensive, and operations are especially time-sensitive. All of these conditions are usually found at the edge.
The potential of the edge
To overcome such inherent challenges, many organisations have begun to rethink their edge computing infrastructures, which generally have limited or no connectivity to a remote data centre or the cloud. Edge data may be processed directly at the point of origin or it can be sent to a gateway networking device or intermediary server. But either way, the usefulness of the data is often constrained by the scale and capacity limitations of hardware — sometimes 30 years old — that make up most edge infrastructure.
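As a rough illustration of the gateway pattern described above, the sketch below buffers sensor readings locally and forwards them only when the upstream link is available. This is a hypothetical Python example: the class and method names, buffer size and link check are illustrative assumptions, not any specific product's API.

```python
from collections import deque
from dataclasses import dataclass
import time


@dataclass
class Reading:
    sensor_id: str
    value: float
    timestamp: float


class EdgeGateway:
    """Buffers sensor readings locally and forwards them when a link is up."""

    def __init__(self, max_buffer: int = 10_000):
        # Bounded queue: the oldest readings are dropped first if the
        # upstream link stays down long enough to fill the buffer.
        self.buffer = deque(maxlen=max_buffer)

    def ingest(self, sensor_id: str, value: float) -> None:
        """Accept a reading at the point of origin."""
        self.buffer.append(Reading(sensor_id, value, time.time()))

    def flush(self, link_up: bool) -> list:
        """Drain the buffer when connectivity is available; otherwise hold."""
        if not link_up:
            return []
        sent = list(self.buffer)
        self.buffer.clear()
        return sent
```

The bounded buffer reflects the hardware scale and capacity constraints mentioned above: when connectivity to the data centre or cloud is poor, something at the edge has to decide what to keep and what to drop.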
Edge computing today is going through significant changes, starting with increased usage. This brings new pressures as well as new opportunities, which together are driving the rapid transformation of what happens at the edge.
For organisations that capitalise on these opportunities, the payoff can be tremendous. For example, applications related to analytics, vehicle-to-vehicle communications, power production and manufacturing equipment benefit from improved edge computing infrastructure that more effectively processes and uses streaming data from IoT devices and sensors. Such applications can improve predictive maintenance to prevent defects and optimise production.
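As a simple sketch of how streaming sensor data can feed predictive maintenance at the edge, the example below flags readings that drift far from a rolling baseline. The window size, sigma threshold and names are illustrative assumptions, not a description of any vendor's algorithm.

```python
from collections import deque
from statistics import fmean, pstdev


class VibrationMonitor:
    """Flags readings that deviate sharply from a rolling baseline,
    a common first step in edge-side predictive maintenance."""

    def __init__(self, window: int = 50, n_sigma: float = 3.0):
        self.window = deque(maxlen=window)  # recent history only
        self.n_sigma = n_sigma

    def check(self, reading: float) -> bool:
        """Return True if the reading is anomalous versus recent history."""
        anomalous = False
        if len(self.window) >= 10:  # wait for a minimal baseline first
            mean = fmean(self.window)
            sigma = pstdev(self.window)
            if sigma > 0 and abs(reading - mean) > self.n_sigma * sigma:
                anomalous = True
        self.window.append(reading)
        return anomalous
```

Running a check like this directly on the edge device means a worn bearing or failing drive can be flagged immediately, without waiting on a round trip to a remote data centre.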
Changing requirements of edge computing
Edge systems are usually deployed in remote locations beyond the typical data centre or, by definition, at the very edge of computing networks. Simplicity is key at the edge, but so are serviceability and usability. As skilled people become increasingly scarce, edge environments often have to rely on OT professionals who may not be IT specialists but must still be able to get a system serviced and back up and running quickly.
Many new edge applications either hold a significant amount of data or stream data to other locations, which makes data integrity extremely important. Any failure could break the data stream, creating problems for existing processes and compliance efforts.
Also, applications at the edge generally do not get updated very often. This means companies need technology with a service model and tools that enable them to deploy the system and essentially leave it alone for a very long time. In addition, in many manufacturing environments, edge systems are now being deployed with the intent of keeping them in place for 10 years or more. This contrasts with a traditional data centre where hardware infrastructure tends to be refreshed every three to five years, and its applications and middleware are routinely updated.
New trends, complications and opportunities
Organisations computing at the edge today may believe that their existing approach is sufficient — for now. However, many trends and shifts are occurring in technology that will require them to change how they think and act about edge computing:
- Device evolution: Increased volumes will drive down unit costs, making IoT devices cheaper, more capable and standardised. For example, PLCs are already becoming more intelligent, open standards-based and much more cost-effective.
- OT will evolve: The OT role will continue to evolve because IT departments are already stretched too thin, and the edge is simply not a top priority for them. OT will assume more responsibility for those technologies that are critical to them, such as edge computing.
- Increasing machine-based automation: Emerging technologies such as real-time analytics and AI will increase the criticality of edge data. Automation will become increasingly fast and much more capable, even surpassing the limits of human ability.
- New vendors and new business models: New technological solutions, such as AI, machine learning and advanced analytics, will continue to drive innovation, agility and capital efficiency. Organisations must consider partnering with vendors, technology and solutions that are best suited to their long-term strategies.
Clearly, operating at the edge will continue to require new ways of thinking and acting. These trends must be carefully evaluated by any organisation deploying an edge computing strategy.
Today’s solutions fall short
There is widespread recognition that today’s edge-based systems need new enabling technologies to meet changing demands. However, the supply of such technologies lags behind the need.
There are computing/communications technologies available that address the industrial automation (IA) sector. However, they tend to focus on the needs that IA end-user customers have had for decades, rather than on the edge-specific pressures (such as the need to perform predictive maintenance or integrate with big data analytics) that are driving the IA field today. As such, each represents a suboptimal mix of capabilities and deficiencies.
Examples of these technologies include:
- IA network computers from traditional data-centre vendors can collect data from sensors or devices on the edge and send it to a data centre or public cloud. They are less expensive than other classes of edge solutions, but do not scale up well, and do not provide continuous availability or operational simplicity — both of which are essential on the edge.
- Hyper-converged platforms integrate data capture, control, compute and storage capabilities in a single hardware device. They emphasise operational simplicity and can perform real-time analytics at the edge, but at the expense of continuous availability and affordability.
- High-availability solutions rely on virtualisation to deliver higher levels of availability at acceptable cost, but sacrifice operational simplicity by requiring IT-level skill sets. Many also lack the computational strength and performance required for real-time data synthesis and analysis.
Clearly, the 30-year-old industrial control systems that are still the norm in many manufacturing and IA settings need a substantial upgrade to move towards the Industrial IoT, pervasive clouds and fog computing. However, a rip-and-replace approach will not work. The challenges facing these companies during their transition to the edge include continuing to collect data from their legacy systems, extending the lifecycle of those systems where possible and incrementally adding capabilities.
Additionally, not all companies need to move all the way along the continuum of the edge. Their business needs will dictate how much of an edge transformation they require. Yet in order for any organisation to reap the maximum benefits that the evolving edge can deliver to its business, a new technology approach is needed.
The ideal solution
To overcome these shortcomings, an ideal solution would satisfy enterprise edge needs for reliable, scalable, high-performance computing. It would deliver near-term benefits and meet current requirements, while also accommodating an evolving edge infrastructure.
More specifically, the essential attributes for a move to the edge include:
- Operational simplicity that provides low-touch to zero-touch operation, serviceability and usability — all critical since IT resources are scarce at the edge or on the plant floor.
- Uninterrupted production for no unplanned downtime, plus the assurance of data integrity, since newer edge applications either hold large volumes of data or stream data to remote locations (such as the cloud).
- Virtualisation and reliability capabilities that enable manufacturers and industrial automation organisations to deploy highly available — yet highly efficient — business-critical systems and databases.
- Interoperability designs that leverage current standards (eg, operating systems) as well as emerging standards that will enable devices and systems to operate with each other.
- Cybersecurity, including built-in protection of all components of the edge computing system, the data they handle and their communications with each other and externally.
An example of edge-optimised architecture and delivery
Today, industry-leading technology vendors are delivering an innovative new edge infrastructure approach that incorporates all the attributes described above. Moreover, this solution is comprehensive: a complete, turnkey computing platform that enables IA companies to update ageing systems, virtualise industrial control applications and increase operational efficiency.
This new edge approach consists of three essential elements:
- Zero-touch computing platform: An automated, fully integrated and self-protecting industrial computing platform that reduces the need for IT support while increasing operator efficiency and lowering downtime risk.
- Software-defined edge infrastructure: A single interface to a full stack of essential applications, including virtualised compute, storage and networking; cybersecurity; IoT gateways; VPNs; routers; analytics; and artificial intelligence.
- Proactive managed services: A revolutionary approach to deploying, monitoring and maintaining edge solutions and infrastructure. Unlike yesterday’s reactive service, such a proactively managed service can be easily overseen by OT professionals, executed by the system itself and supported by a single, locally based service provider.
The combination of all three elements — a zero-touch computing platform, the software-defined edge infrastructure and proactive managed services — creates a new infrastructure solution that is optimised for organisations at the edge.
Key advantages of the optimised solution are:
- It is suitable for running business-critical industrial applications quickly and reliably in remote locations with limited or no IT resources.
- It increases operational efficiency with pre-installed virtualisation software and intuitive, user-friendly configuration and management tools.
- It is easy to deploy and can be installed quickly by users, reducing the time it takes to get critical applications up and running.
- It reduces IT’s burden with self-monitoring and self-protecting capabilities that make it useful for unmanned facilities.
- It predictively prevents unplanned downtime, via redundant on-premise systems backed by a managed service cloud.
- It supports multiple ecosystems — a wide range of architectures and applications, without modification.
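The "redundant on-premise systems" point above can be illustrated with a toy failover check: if a primary node misses its heartbeat deadline, a standby takes over. This is purely illustrative; real continuous-availability platforms rely on far more sophisticated mechanisms (such as synchronised virtual machines), and all names and timings here are assumptions.

```python
class HeartbeatMonitor:
    """Promotes a standby node if the primary misses its heartbeat deadline."""

    def __init__(self, timeout: float = 2.0):
        self.timeout = timeout      # seconds of silence tolerated
        self.active = "primary"
        self.last_beat = 0.0        # time of the primary's last heartbeat

    def beat(self, now: float) -> None:
        """Record a heartbeat from the primary at time `now`."""
        self.last_beat = now

    def poll(self, now: float) -> str:
        """Return the currently active node, failing over if needed."""
        if self.active == "primary" and now - self.last_beat > self.timeout:
            self.active = "standby"  # failover; sticky until manual reset
        return self.active
```

Passing timestamps in explicitly keeps the sketch deterministic; a real deployment would drive this from a clock and a network channel between the two nodes.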
Such a comprehensive edge platform satisfies the essential requirements of today’s edge environment, including operational simplicity, virtualisation, uninterrupted production and interoperability.