Earlier this year, surgeons at the OLV Hospital in Belgium performed a successful partial nephrectomy on a patient with early-stage kidney cancer. But this wasn't just any surgery; this was the first-ever AI-powered, robot-assisted surgery.
The surgery had been in the works for a while, but the surgeons from Orsi Academy kept running into a latency problem: the data from the sensors in the operating room took too long to reach and be processed at a nearby cloud server.
What solved the high latency problem? NVIDIA's Holoscan. More specifically, Holoscan's ability to process streaming data at the edge in real time. Instead of sending the data to a data center, the team moved the data processing ability into the operating room and completely eliminated the latency issue.
This computing model, where data is collected and processed on the outer edges of a system instead of at a central repository, is called edge computing.
Beyond healthcare, edge computing provides significant benefits to businesses and consumers in sectors such as industrial IoT, smart cities, retail, and autonomous vehicles. Qualities such as faster processing, lower latency, and reduced bandwidth costs position edge computing to become the foundation of the next generation of digital technologies.
1. Reduces Operational Costs
Edge computing reduces operational costs by moving data processing near the points of data collection. This reduces the costs associated with transmitting large amounts of data and maintaining expensive infrastructure such as servers and data centers. It can also help you save costs on your cloud storage plans, as you can scale them down and distribute some of the processing to your edge devices.
Organizations all over the world spent more than $191 billion on data centers in 2021, and costs are projected to exceed $212 billion in 2023. That doesn't even include the amount spent on communications services, devices, and software systems. But these data center costs can be reduced with the help of edge computing.
Think of a manufacturing company that uses hundreds of sensors to collect data about the status of the machinery on its production line. In a conventional model, this temperature, pressure, and vibration data would go to a central repository for processing and analysis. The repository would need multiple servers and enough computing capability to store and analyze this data, and data streaming in from hundreds of sensors would also require a lot of bandwidth to reach it.
By processing the data at the edge (i.e., nearer the sensors), the company can reduce the amount of data that needs to be transmitted to the central data center, reducing network bandwidth requirements and associated costs.
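To make this concrete, here is a minimal sketch of what edge-side aggregation might look like. The sensor readings, the anomaly threshold, and the summary fields are all illustrative assumptions, not a real manufacturing API: the point is that only a compact summary (and any anomalous samples) leaves the edge device, instead of every raw reading.

```python
from statistics import mean

# Hypothetical raw readings from one vibration sensor, sampled once per second.
readings = [0.42, 0.40, 0.41, 0.43, 0.39, 0.44, 1.80, 0.41]

ANOMALY_THRESHOLD = 1.5  # assumed limit; a real deployment would calibrate this


def summarize_window(samples, threshold):
    """Reduce a window of raw samples to a compact summary for upstream transfer."""
    return {
        "count": len(samples),
        "mean": round(mean(samples), 3),
        "max": max(samples),
        "anomalies": [s for s in samples if s > threshold],
    }


summary = summarize_window(readings, ANOMALY_THRESHOLD)
print(summary)  # the only payload sent to the central data center
```

Instead of transmitting eight raw samples per window, the device sends one small record, and the central system still sees the anomalous 1.80 reading it would care about.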
2. Boosts Performance by Reducing Latency
Edge computing is used for latency-critical and bandwidth-hungry tasks because the data travels a shorter distance and is processed faster. The output also reaches the end user quicker than with cloud computing.
A study published by the National Science Foundation shows that 58% of end users can reach a nearby edge server in less than 10 milliseconds, which is significantly faster than the results obtained with a cloud model.
Remember the AI-powered, robot-assisted surgery mentioned earlier? It couldn't be done properly using a cloud-based model because the latency was too high. Even though the team at Orsi Academy was using the best available technology, a difference of a few milliseconds can mean the difference between life and death in invasive surgical procedures. By moving the data processing inside the operating room, the team boosted the performance of the procedure and increased their chances of success.
3. Provides Scalability to Solutions That Require Real-time Data Analysis
Edge computing uses a distributed computing model that provides horizontal scalability, where computing resources are distributed across multiple edge devices. This approach allows organizations to scale their computing resources by adding more edge devices as needed without experiencing the same network latency and bandwidth constraints associated with centralized data centers.
In contrast, cloud computing achieves the same scalability through vertical scaling, which means adding computing resources to a centralized data center. Vertical scaling makes rapid expansion possible, but it can also lead to issues with network latency and bandwidth constraints when processing large amounts of data.
To get the best of both worlds, organizations can use a hybrid approach where edge computing leverages cloud computing resources for more complex workloads. The hybrid model in edge computing deploys local edge devices and remote cloud servers to provide a more balanced and efficient computing infrastructure.
For example, a hybrid edge computing setup could involve a network of local edge devices, like sensors and cameras, that not only collect and process data but also leverage the computing power of remote cloud servers to perform more complex analyses or store large amounts of data. This can result in faster response times, reduced latency, and better scalability than relying solely on either local or remote computing resources.
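The hybrid pattern described above can be sketched in a few lines. The frame limit and the handler names below are illustrative assumptions rather than any real framework's API; the idea is simply that the edge device decides locally whether a workload is light enough to handle on-site or heavy enough to offload.

```python
# A minimal sketch of hybrid edge/cloud routing under assumed thresholds.

EDGE_FRAME_LIMIT = 100  # frames an edge device can analyze locally (assumed)


def process_on_edge(frames):
    return f"edge: analyzed {len(frames)} frames locally"


def offload_to_cloud(frames):
    return f"cloud: queued {len(frames)} frames for deep analysis"


def route(frames):
    """Handle small workloads at the edge; offload heavy ones to the cloud."""
    if len(frames) <= EDGE_FRAME_LIMIT:
        return process_on_edge(frames)
    return offload_to_cloud(frames)


print(route(list(range(20))))   # small batch stays at the edge
print(route(list(range(500))))  # large batch goes to the cloud
```

In practice the routing decision might depend on model size, battery, or link quality rather than batch size, but the division of labor is the same: fast local answers by default, cloud horsepower on demand.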
Whether hybrid or not, the distributed architecture in edge computing enables organizations to handle large volumes of data and scale their applications in a cost-effective manner.
For instance, self-driving vehicles are equipped with cameras, radars, and lidars: sensors that collect data about the shape, image, size, range, direction of movement, and speed of objects on the road. That data is then processed by the car's onboard software, which helps make decisions such as slowing down, speeding up, steering, or stopping the car.
The edge computing model allows the manufacturers of these vehicles to scale horizontally without worrying about investing massive amounts of money into central data repositories. That's because the cars are equipped with data processing devices, and the cost of those devices is built into the car's price.
4. Maintains Stability Despite the Reduced Role of a Data Center
Edge devices can continue to operate and process data even if they lose connectivity to the network, which improves reliability in remote or challenging environments.
Using multiple edge devices also reduces the risk of a single point of failure. Even though central data repositories and data centers have multiple layers of redundancy, when they are compromised, the whole organization comes to a halt. In the case of edge devices, only the affected devices stop functioning.
For example, if the edge device on one self-driving car is disabled or disconnected from the network, all other self-driving cars by the same company remain unaffected and continue to function as usual.
Or consider a large-scale power plant that relies on a vast network of sensors to monitor its equipment and collect data such as temperature and pressure. By installing edge computing devices at each sensor location, the plant can perform real-time data processing and analysis of the sensor data, allowing for the early detection of potential issues that could cause equipment failures.
If all processing is performed at a central location, a failure at the center means the plant has to halt operations.
By using edge devices that operate independently, the plant will increase its redundancy and improve its overall reliability. If one device fails, the others can continue to operate and ensure that critical functions are maintained.
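The offline-tolerant behavior described above can be sketched as a small, hypothetical edge node that keeps analyzing readings during an outage, buffers the results locally, and flushes the backlog once the central link returns. The class and thresholds are illustrative assumptions, not a real control-system API.

```python
class EdgeNode:
    """Hypothetical sensor node that keeps working when the central link is down."""

    def __init__(self):
        self.connected = True
        self.buffer = []    # results held locally during an outage
        self.uploaded = []  # results that reached the central system

    def record(self, reading):
        # Local analysis happens regardless of connectivity.
        processed = {"reading": reading, "alert": reading > 90}  # assumed limit
        if self.connected:
            self.uploaded.append(processed)
        else:
            self.buffer.append(processed)

    def reconnect(self):
        # Flush the backlog once the central link is restored.
        self.connected = True
        self.uploaded.extend(self.buffer)
        self.buffer.clear()


node = EdgeNode()
node.record(75)
node.connected = False  # central link goes down
node.record(95)         # still analyzed and buffered locally
node.reconnect()
print(len(node.uploaded))  # both readings eventually reach the center
```

The critical point is that the alert on the out-of-range reading is raised at the edge, during the outage, rather than waiting for the data center to come back.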
Because of these benefits, Siemens Energy, a leading supplier of power plant equipment and technologies, has started to use edge technology for power plant management.
5. Enables Real-time Data Analysis and Decision-Making
Edge computing is the fastest way to process and analyze data in real time, provided the workloads do not demand large amounts of processing power or storage capacity.
Consider a video surveillance system that monitors a busy intersection in a city. The system captures live video feeds from multiple cameras and analyzes the data to detect traffic violations, such as speeding and red-light running. Some cameras in the system also measure traffic flow and adjust traffic light timing based on the number of vehicles on the road.
In this example, edge computing is the significantly faster option for processing and analyzing the data in real time. Edge computing devices at each camera location don't need to transmit the video data to a central server for processing. Data is analyzed locally, allowing for the rapid detection of traffic violations, on-the-spot measurement of traffic flow, and timely responses.
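A tiny sketch of the per-camera logic might look like the following. The observations, plate numbers, and speed limit are made up for illustration; the takeaway is that the camera's edge device flags violations locally and only compact violation records, not raw video, ever leave the device.

```python
SPEED_LIMIT_KMH = 50  # assumed posted limit for this intersection


def detect_violations(observations):
    """Flag vehicles exceeding the limit without shipping raw video upstream."""
    return [
        {"plate": plate, "speed": speed}
        for plate, speed in observations
        if speed > SPEED_LIMIT_KMH
    ]


# Hypothetical (plate, measured speed) pairs extracted from one video frame.
frame_data = [("ABC-123", 48), ("XYZ-789", 67), ("JKL-456", 52)]

violations = detect_violations(frame_data)
print(violations)  # only these small records are transmitted from the edge
```

Two of the three vehicles exceed the limit, so two short records are reported; the compliant vehicle generates no upstream traffic at all.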
Stream Chat Used Edge Infrastructure To Reduce Latency by up to 5x
We introduced Edge API infrastructure in 2021 to improve the performance of messaging platforms built with Stream across the world.
We switched to edge computing because we needed to cater to customers located across multiple continents. Sending data to a regional data center located hundreds of miles away, and in some cases on a different continent, can add significant latency. Also, not all users have access to high-speed internet connections, and we wanted to cater to everyone.
By introducing edge servers into our infrastructure, we reduced latency by up to five times and eliminated timeout errors caused by poor internet connections. Stream's Chat API works faster than ever, thanks to the use of edge computing.