The combination of billions of interconnected IoT devices and the upcoming 5G network era is about to drive a profound change in the way computing workloads are deployed. It is expected that there will be more than 5,600 million smart sensors and connected IoT devices across the globe by 2020. Furthermore, the data generated by these devices will be on the order of 5,000+ zettabytes, while the IoT market size is expected to reach $724 billion by the end of 2023. The current state of the art faces the following challenges: (i) a severely fragmented market - since all major players push their own proprietary, closed-source solutions, they face connectivity limitations, interoperability issues and a lack of vendor cooperation; (ii) conventional centralized cloud computing is encountering severe challenges - with the rapid development of the mobile internet and IoT applications, it suffers from high latency, low spectral efficiency, and non-adaptive machine-type communication. Centralized cloud computing has been the most common approach to these issues, but the near-exponential increase in interconnected devices, together with their diversity (a vastly heterogeneous ecosystem), has proven it inefficient.
Over the last few years, one of the dominant concepts for tackling increased latency and delay has been Edge Computing. The fundamental idea of Edge Computing is to minimize the distance between the actual source of the data and the computational resources responsible for handling it. Building on the notions of Edge and Cloud computing, several computing paradigms have already been introduced, such as Fog Computing and Multi-access Edge Computing. Fog Computing is an architecture that distributes computation, communication, control and storage closer to the end users, along the cloud-to-things continuum; it enables edge computation, and its functionality can also be seamlessly extended to the core network. However, it lacks certain security features, such as mechanisms for data integrity and authenticated access to services across its highly distributed architecture, along with a unified monitoring entity that ensures optimal resource management. Tackling this, Multi-access Edge Computing (MEC) is an architectural model for providing cloud computing capabilities paired with an IT service environment at the edge of the mobile network, within the Radio Access Network (RAN) and in proximity to the service subscribers. The networking environment created by this model offers ultra-low latency, high bandwidth, real-time access to radio network and context information, location awareness, efficient network operation and service delivery, thus ensuring a high quality of experience for all interconnected users. This model strongly enables innovation and value creation by letting applications leverage these capabilities, bringing advantages to all stakeholders.
Through the E2DATA project, the Spark Works IoT Platform will enable a novel Edge Analytics platform that processes and analyses data closer to the source where it is collected, being able to: (i) ingest, pre-process and categorize data, essentially acting as a mini data centre, and (ii) deliver the aggregated and pre-processed data to a nearby computational node, located in a gateway networking device, a computer, or an affiliated micro data centre for longer-term storage and further analysis. With the Spark Works IoT Platform, a vast amount of processing power becomes decentralized and is offloaded from cloud service providers, while the end-to-end process consequently overcomes the current limitations of conventional centralized cloud computing. The radical design of the Spark Works IoT Platform: (i) increases the speed of data analysis and (ii) decreases the load placed on internet networks to transmit huge amounts of data. The sketch below illustrates this edge-side step.
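The following minimal Java sketch illustrates the idea of ingesting raw readings at the edge and forwarding only compact aggregates to a nearby node; all class and method names are illustrative assumptions and are not part of the Spark Works API.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Minimal sketch of edge-side pre-processing: raw readings are buffered per
 * sensor, reduced to a compact aggregate, and only the aggregate is handed
 * to the next computational node (here simply printed).
 */
public class EdgeAggregator {

    /** A single raw measurement produced by a sensor. */
    record Reading(String sensorId, double value, long timestampMillis) {}

    /** Compact summary forwarded instead of the raw stream. */
    record Aggregate(String sensorId, int count, double min, double max, double mean) {}

    private final Map<String, List<Reading>> buffer = new HashMap<>();

    /** Ingest and buffer a raw reading (the "mini data centre" step). */
    public void ingest(Reading r) {
        buffer.computeIfAbsent(r.sensorId(), k -> new ArrayList<>()).add(r);
    }

    /** Reduce each sensor's buffered readings to one aggregate and clear the buffer. */
    public List<Aggregate> flush() {
        List<Aggregate> out = new ArrayList<>();
        buffer.forEach((id, readings) -> {
            double min = Double.MAX_VALUE, max = -Double.MAX_VALUE, sum = 0;
            for (Reading r : readings) {
                min = Math.min(min, r.value());
                max = Math.max(max, r.value());
                sum += r.value();
            }
            out.add(new Aggregate(id, readings.size(), min, max, sum / readings.size()));
        });
        buffer.clear();
        return out;
    }

    public static void main(String[] args) {
        EdgeAggregator edge = new EdgeAggregator();
        edge.ingest(new Reading("temp-01", 21.4, 1L));
        edge.ingest(new Reading("temp-01", 22.0, 2L));
        edge.ingest(new Reading("hum-07", 48.5, 3L));
        // The nearby computational node would receive only these aggregates.
        edge.flush().forEach(System.out::println);
    }
}
```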
The Spark Works Analytics platform will provide a generic execution environment tailored to low-end IoT devices. This execution environment allows the execution of code that does not need to be pre-deployed or pre-configured on the devices, as sketched below.
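One way to picture such an environment is a host that accepts processing tasks at runtime rather than requiring them to be baked into the device image. The registry, task names and payload type in this Java sketch are assumptions made for illustration, not the actual Spark Works interface.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

/**
 * Illustrative sketch of an execution environment that accepts processing
 * tasks after deployment instead of requiring them to be pre-installed.
 */
public class RuntimeTaskHost {

    private final Map<String, Function<double[], Double>> tasks = new ConcurrentHashMap<>();

    /** Register (or replace) a task while the infrastructure is already running. */
    public void deploy(String name, Function<double[], Double> task) {
        tasks.put(name, task);
    }

    /** Execute a previously registered task against locally available data. */
    public Double run(String name, double[] data) {
        Function<double[], Double> task = tasks.get(name);
        if (task == null) throw new IllegalArgumentException("Unknown task: " + name);
        return task.apply(data);
    }

    public static void main(String[] args) {
        RuntimeTaskHost host = new RuntimeTaskHost();
        double[] samples = {21.4, 22.0, 20.9};

        // A task pushed to the device at runtime: compute the mean.
        host.deploy("mean", d -> java.util.Arrays.stream(d).average().orElse(Double.NaN));
        System.out.println("mean = " + host.run("mean", samples));

        // The same task can later be replaced with an optimized version.
        host.deploy("mean", d -> java.util.Arrays.stream(d).sum() / d.length);
        System.out.println("mean = " + host.run("mean", samples));
    }
}
```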
The platform allows developers to modify and optimize parts of the source code even after the IoT infrastructure has been deployed, while the platform automatically identifies the optimal location for executing the updated code based on (i) the volume of data required, (ii) the location of that data, and (iii) the available resources of the different components that constitute the system (see the sketch after this paragraph). The Spark Works IoT Platform delivers an environment where domain experts can focus on specifying the rules for data-driven processing and event-based response, while it automatically tunes the edge and cloud infrastructure for high performance and high availability.
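A minimal sketch of such a placement decision is shown below: each candidate node (edge gateway, micro data centre, cloud) is scored against the data volume a task needs, where that data resides, and the node's spare resources. The scoring formula and all names are illustrative assumptions, not the platform's actual policy.

```java
import java.util.Comparator;
import java.util.List;

/** Toy planner that picks the cheapest node to run a task on. */
public class PlacementPlanner {

    record Node(String name, String dataSite, double freeCpu, double networkCostPerMb) {}
    record Task(String name, String dataSite, double dataVolumeMb, double cpuDemand) {}

    /** Lower score is better: penalize moving data and overloading the node. */
    static double score(Task t, Node n) {
        double transferCost = t.dataSite().equals(n.dataSite())
                ? 0.0                                      // data is already local to this node
                : t.dataVolumeMb() * n.networkCostPerMb(); // data must be shipped over the network
        double resourcePenalty = t.cpuDemand() > n.freeCpu() ? 1_000.0 : 0.0; // node cannot host the task
        return transferCost + resourcePenalty;
    }

    static Node place(Task t, List<Node> candidates) {
        return candidates.stream().min(Comparator.comparingDouble(n -> score(t, n))).orElseThrow();
    }

    public static void main(String[] args) {
        List<Node> nodes = List.of(
                new Node("edge-gateway", "site-A", 0.5, 0.02),
                new Node("micro-dc",     "site-A", 4.0, 0.05),
                new Node("cloud",        "remote", 64.0, 0.20));

        // A data-heavy task whose input lives at site-A ends up on the micro data centre.
        Task heavyLocal = new Task("anomaly-detection", "site-A", 800.0, 2.0);
        System.out.println("Run " + heavyLocal.name() + " on: " + place(heavyLocal, nodes).name());
    }
}
```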
The platform is built around a dedicated module for handling all data-related tasks in real time, namely the Continuous Computation Engine. The Spark Works continuous computation engine is designed with an emphasis on speed and data resilience: it can process large amounts of data collected from sensor nodes within seconds, while maintaining its operational efficiency regardless of the underlying hardware platform. The platform's ability to operate efficiently even with limited hardware resources and network connectivity is of paramount importance in the overall context of Edge Computing.
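The Java sketch below conveys the spirit of such a continuous computation loop under modest hardware assumptions: readings arrive on a bounded in-memory queue and are folded into a running summary as they arrive, so results are available within seconds of the data. Class and field names are illustrative only and do not reflect the engine's internals.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

/** Continuously folds incoming readings into a running summary. */
public class ContinuousSummarizer implements Runnable {

    private final BlockingQueue<Double> input = new ArrayBlockingQueue<>(1024);
    private volatile boolean running = true;
    private long count = 0;
    private double sum = 0;

    public void offer(double value) {
        input.offer(value); // drop-on-full keeps the loop responsive on constrained hardware
    }

    @Override
    public void run() {
        while (running) {
            try {
                Double v = input.poll(100, TimeUnit.MILLISECONDS);
                if (v == null) continue;
                count++;
                sum += v;
                if (count % 3 == 0) { // emit an up-to-date result every few readings
                    System.out.printf("running mean after %d readings: %.2f%n", count, sum / count);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                running = false;
            }
        }
    }

    public void stop() { running = false; }

    public static void main(String[] args) throws InterruptedException {
        ContinuousSummarizer engine = new ContinuousSummarizer();
        Thread worker = new Thread(engine);
        worker.start();
        for (double v : new double[]{21.4, 22.0, 20.9, 21.7, 22.3, 21.1}) {
            engine.offer(v);
        }
        Thread.sleep(500); // let the loop drain the queue
        engine.stop();
        worker.join();
    }
}
```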
The Spark Works IoT Platform is designed to enable easy and fast implementation of applications that utilize an IoT infrastructure. It offers extremely high scalability in terms of active users, number of interconnected devices and volume of processed data, while remaining communication protocol-agnostic and able to operate on limited hardware resources. The platform accommodates real-time processing of information collected from mobile sensors and smartphones and offers fast analytic services. The integrated modules, if necessary, will partially depend on cloud infrastructure to offer real-time processing and analysis of unlimited IoT data streams with minimal delay and processing costs; however, the platform is designed to ensure autonomous operability and a native capability for data and information processing at an independent, local layer.
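One common way to achieve protocol independence, sketched below in Java, is to hide every transport (e.g. MQTT, CoAP, plain HTTP) behind a single ingestion interface so the rest of the platform never sees the wire protocol. The interface and adapter names are hypothetical, used only to illustrate the protocol-agnostic property claimed above.

```java
/** Demonstrates wrapping different transports behind one ingestion contract. */
public class ProtocolAgnosticIngestion {

    /** Uniform envelope the platform works with, regardless of transport. */
    record Message(String sensorId, byte[] payload) {}

    /** Single contract every transport adapter must satisfy. */
    interface TransportAdapter {
        String protocolName();
        Message toMessage(String rawFrame); // parse a protocol-specific frame
    }

    /** Example adapter for an MQTT-style "topic:payload" frame. */
    static class MqttLikeAdapter implements TransportAdapter {
        public String protocolName() { return "mqtt"; }
        public Message toMessage(String rawFrame) {
            String[] parts = rawFrame.split(":", 2);
            return new Message(parts[0], parts[1].getBytes());
        }
    }

    /** Example adapter for an HTTP-style "sensorId=...&value=..." frame. */
    static class HttpLikeAdapter implements TransportAdapter {
        public String protocolName() { return "http"; }
        public Message toMessage(String rawFrame) {
            String id = rawFrame.replaceAll(".*sensorId=([^&]+).*", "$1");
            return new Message(id, rawFrame.getBytes());
        }
    }

    public static void main(String[] args) {
        TransportAdapter[] adapters = { new MqttLikeAdapter(), new HttpLikeAdapter() };
        String[] frames = { "temp-01:21.4", "sensorId=hum-07&value=48.5" };
        for (int i = 0; i < adapters.length; i++) {
            Message m = adapters[i].toMessage(frames[i]);
            System.out.println(adapters[i].protocolName() + " -> sensor " + m.sensorId());
        }
    }
}
```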
A high-level architecture diagram is shown in the figure above (for illustration purposes, the term “Edge” is used interchangeably with “Fog”). Designing an IoT infrastructure capable of being deployed at a national level entails a broad range of functional and non-functional requirements. Such a platform will be used by a broad variety of people-centric applications, each with different roles and expectations of it. From a top-down architectural overview, the Spark Works IoT Platform: (i) is completely containerized (currently using Docker containers), allowing seamless horizontal scaling based on the data load; (ii) offers cutting-edge scheduling capabilities by exploiting all features of Docker Swarm; (iii) operates in a fully decentralized manner; (iv) efficiently executes on-the-edge data analysis algorithms; (v) is deployed using an optimized microservices-based scheme; (vi) supports modern and agile software deployment, updating, patching and reconfiguration practices, commonly referred to as CI/CD. Facilitated by the Continuous Computation Engine and enhanced by auxiliary modules that handle dedicated tasks while being deployed under the contemporary “Microservices” paradigm, the Spark Works IoT Platform delivers a series of vital services, beneficial to every over-the-top IoT framework, installation or overall ecosystem: online analytics, storage & replay, end-to-end security, and access management.
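As a small illustration of the microservices side of this architecture, the sketch below shows a containerized service exposing a lightweight health endpoint that an orchestrator such as Docker Swarm can probe before routing traffic or scaling the service horizontally. It uses only the JDK's built-in HTTP server; the service name, port and endpoint path are assumptions for the example, not the platform's actual contract.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

/** Minimal containerizable service with a health-check endpoint. */
public class AnalyticsMicroservice {

    public static void main(String[] args) throws IOException {
        // JDK built-in HTTP server: no external dependencies required inside the container.
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes();
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });

        server.start();
        System.out.println("Analytics service listening on :8080 (probe /health)");
    }
}
```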