Edge Computing:
Edge computing is a networking philosophy focused on bringing computing as close to the source of data as possible, in order to reduce latency and bandwidth usage.
In simpler terms, edge computing means running fewer processes in the cloud and moving those processes to local places, such as a user’s computer, an IoT device, or an edge server. Bringing computation to the network’s edge minimizes the amount of long-distance communication that has to happen between a client and a server.
It is important to understand that the edge of the network is geographically close to the device, unlike origin servers and cloud servers, which can be very far from the devices they communicate with.
Cloud computing offers a significant amount of resources (e.g., processing, memory, and storage) for the computation requirements of mobile applications. However, gathering all of the computation resources in a distant cloud environment started to cause problems for applications that are latency-sensitive and bandwidth-hungry.
Akamai, Amazon CloudFront, Cloudflare, and many other edge computing providers offer edge services such as WAF, edge applications, serverless computing, DDoS protection, and edge firewalls.
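To make the serverless-computing part concrete, the sketch below shows what a small edge function can look like. It follows the Cloudflare Workers-style module syntax; the "/health" route and the pass-through to the origin are illustrative assumptions, not any particular provider's recommended setup.

// Minimal sketch of a serverless edge function (Cloudflare Workers-style
// module syntax assumed). Lightweight requests are answered at the edge;
// everything else is forwarded to the distant origin server.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);

    // Example of logic that benefits from running at the edge:
    // respond locally instead of making a long-distance round trip.
    if (url.pathname === "/health") {
      return new Response("ok", { status: 200 });
    }

    // Pass all other requests through to the origin.
    return fetch(request);
  },
};

The point of the example is the placement, not the logic: because the handler runs on an edge server near the client, only the requests that truly need the origin travel the long path to it.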
Fog Computing:
Both fog and edge computing are concerned with performing computation locally rather than pushing it to the cloud. The overall reason for having fog computing is to reduce delay and the bandwidth demand placed on the network.
Most fog computing use cases come from IoT deployments: industrial automation, intelligent transportation, smart grid, and so on.
Edge computing is heavily discussed together with 5G. For real-time applications, having computing resources closer to the source provides faster processing. The main difference between fog computing and edge computing is where the data processing takes place: edge computing runs it on the device itself or on an adjacent edge node, while fog computing runs it on nodes in the local network, such as gateways, that sit between the edge devices and the cloud.
Figure - Edge vs. Fog Computing
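To put a rough number on the latency argument, the short sketch below estimates one-way propagation delay from distance alone. The distances and the two-thirds-of-light-speed factor for fiber are illustrative assumptions; real round-trip times also include queuing, processing, and transmission delays.

// Rough, illustrative comparison of propagation delay to a nearby
// edge/fog node versus a distant cloud region. Numbers are assumptions.
const SPEED_OF_LIGHT_KM_PER_MS = 300;  // ~300,000 km/s
const FIBER_FACTOR = 2 / 3;            // signals travel at roughly 2/3 c in fiber

function oneWayDelayMs(distanceKm: number): number {
  return distanceKm / (SPEED_OF_LIGHT_KM_PER_MS * FIBER_FACTOR);
}

const edgeDistanceKm = 50;     // assumed: nearby edge or fog node
const cloudDistanceKm = 3000;  // assumed: distant cloud region

console.log(`Edge one-way delay : ~${oneWayDelayMs(edgeDistanceKm).toFixed(2)} ms`);
console.log(`Cloud one-way delay: ~${oneWayDelayMs(cloudDistanceKm).toFixed(2)} ms`);
// ~0.25 ms to the edge node vs. ~15 ms to the cloud region (propagation only)

Even this propagation-only estimate shows why latency-sensitive, real-time applications benefit from processing data at the edge or in the fog layer rather than in a distant cloud.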