Edge Computing
Definition
A distributed computing paradigm that processes data near the source of generation rather than in a centralised data centre, reducing latency, bandwidth costs, and data privacy risks. Edge computing is essential for real-time AI applications such as autonomous vehicles, industrial IoT, and point-of-sale analytics.
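The bandwidth-saving idea behind edge computing can be illustrated with a minimal sketch: instead of shipping every raw sensor reading to a central server, an edge node aggregates locally and transmits only a small summary. All names and the simulated data below are hypothetical, not part of any specific edge platform.

```python
import json
import random


def read_sensor_batch(n=100, seed=42):
    """Simulate n raw temperature readings from a local sensor (hypothetical data)."""
    rng = random.Random(seed)
    return [20.0 + rng.random() * 5.0 for _ in range(n)]


def summarise_at_edge(readings):
    """Aggregate raw readings on the edge device so only a small summary crosses the network."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }


readings = read_sensor_batch()
raw_payload = json.dumps(readings)                       # what a cloud-first design would upload
edge_payload = json.dumps(summarise_at_edge(readings))   # what the edge node uploads instead

print(len(raw_payload), len(edge_payload))
```

The summary payload is a small fraction of the raw one, which is the same trade-off that makes edge processing attractive for latency-sensitive and bandwidth-constrained workloads such as industrial IoT.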