Combining Edge Computing and AI for a More Efficient and Secure IoT
Mon, 04 Nov 2019 || By Lachlan Colgrave

In October 2016, the US was hit by a staggering distributed denial-of-service (DDoS) attack that caused extensive outages and interruptions to sites such as Netflix, Twitter, Spotify, and Reddit.[1] The attack was linked to the Internet of Things (IoT): a piece of malware scanned the internet for connected devices, such as cameras and routers, with weak security credentials and hijacked them into an assemblage of bots. These bots clogged the internet with malicious requests, overloading networks and servers, slowing speeds, and even forcing services offline. Today, such cyber-attacks continue to grow in scale and frequency.[2] Centralised cloud computing cannot keep pace with the growth of IoT,[3] whilst “trust and verify” security methods are incapable of matching the scale of today’s cyber threats.[4] Fortunately, technologies have emerged that could bridge these gaps, among them edge computing and artificial intelligence (AI). This article asks how decentralised computing and AI can close the security gap in IoT. It argues that combining edge computing with AI can safeguard an increasingly pervasive IoT future.

IoT, Edge Computing, and AI

IoT is broadly understood to encompass any device assigned an IP address that automatically transfers data over a network without human intervention.[5] In the words of the father of ubiquitous computing, Mark Weiser,[6] “the most profound technologies are those that disappear […] they weave themselves into the fabric of everyday life”. This depiction of future technology describes the current state of IoT, which spans everything from smartphones to heart monitors. However, today’s IoT still relies on centralised computing. This dependence may change with the advent of edge computing, which allows IoT data to be processed close to where it is produced, at “the edge”, instead of being sent across a network.[7] Today, edge computing is used to monitor factory machines, measure moisture in agricultural fields, and process the information that self-driving cars need for real-time analysis.[8] AI, in this context, refers to algorithms that apply human problem-solving techniques to identify and neutralise cyber threats.[9]

An Approach to Securing a Growing IoT

In a future where the number of IoT devices grows exponentially, end-to-end communication with a centralised cloud will result in prohibitively high latency, not to mention an ever greater volume and velocity of accumulated data. This is where edge computing plays an important role. Because data remains “at the edge”, latency is reduced, remedying the real-time data analysis problems predicted to accompany exponential IoT device growth.[10] Admittedly, security remains an important challenge for edge computing, given the complexity of combining multiple technologies such as Network Function Virtualisation and Software Defined Networking. However, security is theoretically better in an edge computing environment because data remains close to its origin rather than being exposed to cyberattack in transit across a network.
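To make the idea concrete, the following is a minimal sketch, in Python, of the edge-gateway pattern described above. The sensor fields, the alert threshold, and the forward_to_cloud function are illustrative assumptions rather than any particular product’s API: raw readings are handled locally, and only a compact summary (or an alert) ever crosses the network.

```python
# Minimal sketch of an edge-gateway pattern: readings are processed where they
# are produced, and only small summaries or alerts leave the site.
# All names here (SensorReading, forward_to_cloud, the threshold) are illustrative.
from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class SensorReading:
    device_id: str
    value: float          # e.g. soil moisture as a percentage


def forward_to_cloud(summary: dict) -> None:
    """Stand-in for the single, small upstream message the edge node sends."""
    print(f"uplink -> {summary}")


def process_at_edge(readings: List[SensorReading], alert_below: float = 20.0) -> None:
    """Handle raw readings locally; only aggregates and alerts cross the network."""
    values = [r.value for r in readings]
    summary = {
        "count": len(values),
        "avg": round(mean(values), 2),
        "alerts": [r.device_id for r in readings if r.value < alert_below],
    }
    forward_to_cloud(summary)  # one small payload instead of every raw reading


if __name__ == "__main__":
    process_at_edge([
        SensorReading("field-probe-1", 34.5),
        SensorReading("field-probe-2", 18.2),   # would trigger a local alert
        SensorReading("field-probe-3", 41.0),
    ])
```

Because most of the raw data never leaves the site, the latency and exposure problems described above shrink to the size of the occasional summary message.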

Edge computing may be more susceptible to physical attacks, yet it is well placed to thwart an attack like the one in 2016: if the cloud is disrupted, an edge server stays online as one of many “micro data centres” independent of the centralised cloud. Given edge computing’s security shortfalls, AI is the natural next step in developing forward-looking cybersecurity methods, which at present are akin to “tweaking and plumbing”.[11] AI can manage blockchains in ways humans cannot: analysing each lodged transaction in real time whilst simultaneously reconfirming user authentication.[12] This ameliorates the weaknesses of current “trust and verify” security, since access can be revoked at any point beyond initial entry, which is particularly effective if a system is compromised early on. Considering the strengths of each, combining the two technologies may result in an efficient and secure IoT environment.
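The sketch below illustrates, in Python, one way such continuous verification could look. It is not drawn from any of the cited systems: the behavioural features, the baseline data, and the use of scikit-learn’s IsolationForest are assumptions chosen to show the principle that every action after login is re-scored, and access is revoked the moment behaviour stops resembling the enrolled baseline.

```python
# Minimal sketch of continuous, AI-assisted verification at an edge node.
# Features, baseline data, and thresholds are illustrative, not a production design.
import numpy as np
from sklearn.ensemble import IsolationForest

# Baseline behaviour observed for a legitimate device:
# [requests per minute, average payload size in KB]
baseline = np.array([
    [12, 1.1], [10, 0.9], [14, 1.3], [11, 1.0], [13, 1.2],
    [12, 1.0], [9, 0.8], [15, 1.4], [11, 1.1], [13, 1.2],
])

model = IsolationForest(contamination=0.1, random_state=0).fit(baseline)


def verify(action: list) -> bool:
    """Return True if the action still resembles the enrolled baseline."""
    return model.predict(np.array([action]))[0] == 1   # +1 = inlier, -1 = outlier


# The session is re-checked on every action, not only at initial entry.
for action in ([12, 1.1], [13, 1.0], [480, 0.2]):      # the last one resembles bot traffic
    if not verify(list(action)):
        print(f"access revoked: anomalous behaviour {list(action)}")
        break
    print(f"action allowed: {list(action)}")
```

In practice the model would run on the edge node itself, where the low latency discussed earlier makes per-action scoring feasible.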

Edge computing offers an agile model capable of meeting the demands of exponential IoT growth. Like cloud computing, edge computing is not perfectly secure. The future of a sufficiently secure IoT rests on AI-driven real-time threat detection, which operates best in the low-latency environments that edge computing can provide. The only question left is: how far away is this future?

Editor: Janitra Haryanto


[1] Statt, N. (2016). How an army of vulnerable gadgets took down the web today. The Verge [online] Available at: https://www.theverge.com/2016/10/21/13362354/dyn-dns-ddos-attack-cause-outage-status-explained [Accessed 21 October 2019].

[2] Abomhara, M. (2015). Cyber security and the internet of things: vulnerabilities, threats, intruders and attacks. Journal of Cyber Security and Mobility, 4(1), pp. 65-88.

[3] Pan, J., and McElhannon, J. (2017). Future edge cloud and edge computing for internet of things applications. IEEE Internet of Things Journal, 5(1), pp. 439-449.

[4] Columbus, L. (2019). Why AI Is the Future of Cybersecurity. Forbes [online] Available at: https://www.forbes.com/sites/louiscolumbus/2019/07/14/why-ai-is-the-future-of-cybersecurity/#e5766b0117eb [Accessed 23 October 2019].

[5] Atzori, L., Iera, A., and Morabito, G. (2017). Understanding the Internet of Things: definition, potentials, and societal role of a fast evolving paradigm. Ad Hoc Networks, 56, pp. 122-140.

[6] Weiser, M. (1991). The Computer for the 21st Century. Scientific American, 265(3), pp. 94-104.

[7] Butler, B. (2017). What is edge computing and how it’s changing the network. NetworkWorld [online] Available at: https://www.networkworld.com/article/3224893/what-is-edge-computing-and-how-it-s-changing-the-network.html [Accessed 23 October 2019].

[8] Ouissal, S. (2019). What is Edge Computing? Forbes [online] Available at: https://www.forbes.com/sites/quora/2019/08/23/what-is-edge-computing/#4c096c4a690c [Accessed 29 October 2019].

[9] Frank, J. (1994). Artificial intelligence and intrusion detection: current and future directions. In Proceedings of the 17th National Computer Security Conference (Vol. 10, pp. 1-12).

[10] Miller, P. (2018). What is edge computing? The Verge [online] Available at: https://www.theverge.com/circuitbreaker/2018/5/7/17327584/edge-computing-cloud-google-microsoft-apple-amazon [Accessed 23 October 2019].

[11] Morel, B. (2011). Artificial intelligence and the future of cybersecurity. In Proceedings of the 4th ACM Workshop on Security and Artificial Intelligence, pp. 93-98. ACM.

[12] Pan, J., and Yang, Z. (2018). Cybersecurity challenges and opportunities in the new edge computing + IoT world. In Proceedings of the 2018 ACM International Workshop on Security in Software Defined Networks and Network Function Virtualization, pp. 29-32. ACM.