Edge computing and cloud computing aren't competing technologies — they're complementary. But knowing which to use where makes the difference between a system that works well and one that's either too slow, too expensive, or both.

The fundamental difference

Cloud computing: Data is sent from the device to a remote data centre for processing. Results come back over the network. You get virtually unlimited compute power, but you pay for bandwidth, and you depend on network connectivity and round-trip latency.

Edge computing: Data is processed on or near the device that generates it. A sensor with a small processor analyses data locally and only sends summarised results to the cloud. Less bandwidth, lower latency, but limited compute power.

When edge computing wins

  • Limited connectivity: Remote sites, underground, offshore, rural Australia. If you can't reliably get data to the cloud, process it locally.
  • Latency matters: Manufacturing quality control, autonomous vehicles, safety systems. When the response needs to happen in milliseconds, a round trip to the cloud is too slow.
  • Bandwidth costs: Sending terabytes of video or sensor data to the cloud is expensive. Processing locally and sending only relevant summaries cuts costs dramatically.
  • Privacy requirements: If data can't leave the premises — medical imaging, certain government data — edge processing keeps it local.
  • Intermittent operations: Systems that need to keep working during network outages. Edge devices can store and forward when connectivity returns.
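The store-and-forward pattern in the last bullet can be sketched in a few lines of Python. This is a minimal illustration under assumptions, not a production queue: the `send` callback and the buffer size are hypothetical, and a real device would persist the buffer to disk so it survives a reboot.

```python
from collections import deque

class StoreAndForward:
    """Buffer readings locally; flush them when connectivity returns."""

    def __init__(self, max_buffer=10_000):
        # Oldest readings are dropped if the device is offline too long.
        self.buffer = deque(maxlen=max_buffer)

    def record(self, reading, online, send):
        """Send immediately if online (draining any backlog first),
        otherwise queue the reading for later."""
        if online:
            self.flush(send)
            send(reading)
        else:
            self.buffer.append(reading)

    def flush(self, send):
        # Drain buffered readings in arrival order.
        while self.buffer:
            send(self.buffer.popleft())
```

Readings recorded while offline are delivered in order as soon as the first online reading comes through, so the cloud sees the full sequence, just late.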

When cloud wins

  • Complex analytics: Training machine learning models, running large-scale data analysis, complex business intelligence. These need more compute than edge devices can provide.
  • Centralised data: When you need to aggregate data from many sources and analyse it together. A fleet of 500 sensors each processing independently can't spot cross-fleet patterns.
  • Elastic scaling: Workloads that spike unpredictably. Cloud infrastructure scales up and down; edge hardware doesn't.
  • Collaboration: When multiple users or systems need access to the same data and processing results. Cloud is inherently shared; edge is inherently local.
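To make the cross-fleet point concrete, here is a minimal sketch of an analysis only the cloud side can run, because it needs every device's summary at once. The device ids, the z-score approach, and the threshold are all illustrative assumptions:

```python
from statistics import mean, stdev

def fleet_outliers(summaries, z_threshold=2.0):
    """Flag devices whose locally computed average deviates from the fleet.

    `summaries` maps device id -> that device's own average reading.
    Each edge device sees only its own value; spotting the odd one out
    requires aggregating the whole fleet in one place.
    """
    values = list(summaries.values())
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [dev for dev, v in summaries.items()
            if abs(v - mu) / sigma > z_threshold]
```

Nine sensors reporting ~20 and one reporting 35 would flag only the tenth, a pattern invisible to any single device processing independently.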

The hybrid approach

Most real-world IoT architectures use both. The pattern is usually:

  1. Edge collects and filters: Sensors gather data. Edge processors filter noise, detect anomalies, and make immediate decisions.
  2. Edge sends summaries to cloud: Instead of raw data, send aggregated metrics, detected events, and exceptions.
  3. Cloud analyses and learns: The cloud runs large-scale analytics, trains models, and generates insights across the full dataset.
  4. Cloud pushes updates to edge: Updated models, new detection rules, and configuration changes flow back down to edge devices.
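Steps 1 and 2 above can be sketched as a single edge-side function. The thresholds and summary field names are illustrative assumptions, not a standard schema:

```python
from statistics import mean

def edge_summarise(readings, lo=0.0, hi=100.0, anomaly_hi=80.0):
    """Filter noise, flag anomalies, and build a compact summary
    to send upstream instead of the raw readings."""
    # Step 1: drop readings outside the sensor's plausible range.
    valid = [r for r in readings if lo <= r <= hi]
    # Anomalies trigger immediate local decisions and go up in full.
    anomalies = [r for r in valid if r > anomaly_hi]
    # Step 2: aggregated metrics replace the raw stream.
    return {
        "count": len(valid),
        "mean": round(mean(valid), 2) if valid else None,
        "max": max(valid, default=None),
        "anomalies": anomalies,
    }
```

A batch of hundreds of raw readings collapses to a handful of fields; the cloud side (steps 3 and 4) then learns from these summaries across every device and pushes updated thresholds back down.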

This gives you the responsiveness and resilience of edge with the analytical power and scale of cloud. It's more complex to build, but it's usually the right architecture for any serious IoT or data-intensive system.

Think of it this way: the edge decides what to do right now. The cloud decides what to do better next time.

Kasun Wijayamanna
Founder & Lead Developer
Postgraduate Researcher (AI & RAG), Curtin University, Western Australia