Professor Klaus Moessner
Academic and research departments: Institute for Communication Systems, Department of Electrical and Electronic Engineering.
Klaus Moessner is Professor in Cognitive Networks in the 5G Innovation Centre (5GIC). He has been actively involved in the European Community's research framework programmes from FP5 onwards, contributing to more than 20 EU-funded projects as technical manager and project manager. He led the FP7 project SocIoTal and is currently leading the H2020 EU-Japan project iKaaS, the 5G PPP project Speed5G and, from September 2017, the EU-Taiwan project Clear5G.

Klaus' research interests centre on resource management in wireless communication systems and reconfigurability at the different system levels, including reconfiguration management and scheduling in wireless networks as well as network-supported adaptability of multimodal user interfaces. He was founding chair of the IEEE DySPAN working group 6 on Spectrum Sensing Interfaces.

He is actively involved in research and teaching on mobile service provision, including IoT deployments and services. His research covers resource efficiency and mechanisms for dynamic resource allocation, and he contributes to the 5GIC work on system architecture and co-existence, covering dynamic spectrum sharing/access, self-organisation of the radio access network and the regulatory implications of DSA and cognitive radio networks.
His work beyond 5G includes methods for dynamic capacity extension in mobile environments.
Since May 2019, Klaus has also been head of the Professorship of Communications Engineering at the Technische Universität Chemnitz, Germany.
My research interests include cognitive networks, IoT deployments and sensor data based knowledge generation, as well as reconfiguration and resource management.
Research projects

5G promises to deliver throughput and capacity, together with ultra-low latency and reliability, at levels that allow the introduction of hitherto unseen services. At the operational level, 5G systems are expected to support: 1,000 times higher mobile data volume per geographical area, 10 to 100 times more connected devices, and 10 to 100 times higher typical user data rates, alongside a host of further requirements on latency, energy efficiency, device density etc.
It will take a multipronged approach to achieve these targets: allocation of new spectrum bands (including in the mmWave range), integration of multiple Radio Access Technologies (RATs) for efficient utilisation of heterogeneous resources, as well as cell-size reduction and cell densification, will all contribute their share. The Speed-5G project has tackled the capacity challenge through tight integration of different RATs at the MAC layer, spectrum sharing and a much more dynamic use of spectrum and radio resources, leading to significant efficiencies and higher capacity.
The Speed-5G project has defined and evaluated the eDSA (enhanced Dynamic Spectrum Access) approach based on novel solutions for resource aggregation, sharing and dynamic allocation of spectrum and radio resource management, i.e. the hierarchical RRM approach. The introduction of different levels of resource management operating at different timescales, as well as the stratification of the traditional MAC layer, as introduced by Speed-5G, enables support of use-case- and application-specific optimisation of the radio resources.
The Speed-5G project has been funded by the European Commission under Horizon 2020, the EU Framework Programme for Research and Innovation, grant agreement number 671705.

TagItSmart is about a "Smart Tags driven service platform for enabling ecosystems of connected objects".
IoT is about connecting objects, things, devices, billions of them, and not only the obvious ones like smart phones, TVs and cameras, or the usual examples such as cars, homes, fridges, coffee machines, kettles and glasses, all of which can be connected easily and at very low cost. The main challenge is to connect any type of object or consumer good, and to do so at a very low cost: a carton of milk, a package of steak, a basket of apples, a book, a CD etc. Typically, these products are identified by printed tags (barcodes, QR codes). The codes relate directly to the product (type) they tag, but not to the unique unit/object that carries the tag. Once attached to an object, tags are usually static, and the information they provide does not change, regardless of the state or events happening in the immediate environment of that product.
The main emphasis of TagItSmart is on dynamic codes, printed with functional ink, that change according to the context changes to which an object is exposed. Smart phones are used to capture, record and transmit these codes, turning them into context sensors for mass-market products and converting those products into "connected mass-market products" with a unique identity and a unique record of their history.
TagItSmart: Functional ink + optical (area) tags + crowd sourced smart phones + cloud = IoT for mass-market domain across application sectors
Postgraduate research supervision
I am constantly searching for new PhD students for a wide range of research topics including (but not limited to):
- Sensor Networking and Mobile Networking
- IoT deployments and IoT Services
- Machine Learning for Context analysis and knowledge generation (for IoT deployments and network management)
- Resource Management in Networks, Network Architectures
- Distributed Service provision
There is no fixed funding allocation to support PhD students, meaning you will usually have to secure your own funding from national programmes or similar sources. There are many external funding options and I am very happy to help; occasionally, we may also have funding available ourselves to support research students. Independent of the funding situation, if you have research ideas or a research proposal in any of these areas, please do contact me.
Final year and MSc projects:
2020/21 projects for undergraduate and MSc students are available as shown below.
Typically, I offer Final Year and MSc projects in the following areas:
- IoT and mobile applications:
- Mobile services design and development (mobile apps, web apps, ...)
- Internet of Things experimentation and development (sensing, actuation, knowledge derivation)
- Location based services
- IoT based real world knowledge and event detection
- Wireless Communication Systems/5G Systems
- Spectrum sharing and resource scheduling
- Service provision
- Radio context
- Access Network Management
In this article, an enhanced model reference adaptive control (EMRAC) algorithm is used to design a generic lateral-tracking controller for a vehicle. This EMRAC differs from the EMRAC in the literature as it adopts a σ-modification approach to bound the adaptive gain of the switching action. Moreover, an extended Lyapunov theory for discontinuous systems is used to analytically prove the ultimate boundedness of the closed-loop control system when the adaptive gain of the switching action is bounded with a σ-modification strategy. The control algorithm is applied to a vehicle path-tracking problem and its tracking performance is investigated under conditions of: 1) external disturbances such as crosswind; 2) road surface changes; 3) modeling errors; and 4) parameter mismatches in a co-simulation environment based on IPG CarMaker/MATLAB. The simulation studies show that the controller is effective at tracking a given reference path for performing different autonomous highway driving maneuvers while ensuring the boundedness of all closed-loop signals even when the system is subjected to the conditions mentioned above.
The fifth generation (5G) of wireless networks promises to meet the stringent requirements of vehicular use cases that cannot be supported by previous technologies. However, the stakeholders of the automotive industry (e.g., car manufacturers and road operators) are still skeptical about the capability of the telecom industry to take the lead in a market that has been dominated by dedicated intelligent transport systems (ITS) deployments. In this context, this paper constructs a framework where the potential of 5G to support different vehicular use cases is thoroughly examined under a common format from both the technical and business perspectives. From the technical standpoint, a storyboard description is developed to explain when and how different use case scenarios may come into play (i.e., pre-conditions, service flows and post-conditions). Then, a methodology to trial each scenario is developed including a functional architecture, an analysis of the technical requirements and a set of target test cases. From the business viewpoint, an initial analysis of the qualitative value perspectives is conducted considering the stakeholders, identifying the pain points of the existing solutions, and highlighting the added value of 5G in overcoming them. The future evolution of the considered use cases is finally discussed.
There has been increasing interest in deploying Internet of Things (IoT) devices to study human behavior in locations such as homes and offices. Such devices can be deployed in a laboratory or “in the wild” in natural environments. The latter allows one to collect behavioral data that is not contaminated by the artificiality of a laboratory experiment. Using IoT devices in ordinary environments also brings the benefits of reduced cost, as compared with lab experiments, and less disturbance to the participants’ daily routines, which in turn helps with recruiting them into the research. However, in this case, it is essential to have an IoT infrastructure that can be easily and swiftly installed and from which real-time data can be securely and straightforwardly collected. In this article, we present MakeSense, an IoT testbed that enables real-world experimentation for large-scale social research on indoor activities through real-time monitoring and/or situation-aware applications. The testbed features quick setup, flexibility in deployment, the integration of a range of IoT devices, resilience, and scalability. We also present two case studies to demonstrate the use of the testbed: one in homes and one in offices.
This paper considers an optimization problem that maximizes an aggregate utility, formulated as the weighted geometric mean of the "in-context" suitability of a set of radio access technologies (RATs) to support adaptive video streaming, subject to the existence of legacy data transfers. Motivated by the infeasibility of solving the formulated problem centrally when the various RATs are loosely integrated (i.e., at core network (CN) level), a hybrid (i.e., network-assisted, user-driven) strategy is devised to approximate its optimum solution. Unlike previous hybrid approaches, the proposed methodology exploits network assistance to ensure a friendly co-existence between adaptive video streaming clients and legacy users. It operates on different timescales, where the fastest-timescale operation is performed on the video clients according to a policy that is tuned by the network on slower timescales. User tuning on the fastest timescale (i.e., tens of ms) enables adaptation of video streaming depending on the perceived quality-of-experience (QoE) and local components of the context (e.g., remaining credit and battery level). Small-cell tuning on a slower timescale (i.e., hundreds of ms) enables preemption of the resources used by legacy users based on the operating conditions (e.g., load and type of scheduler). Finally, tuning performed by the network on the slowest timescale (i.e., a few seconds) offloads legacy data transfers to unlicensed bands whenever the amount of interference on licensed bands reaches critical levels, which helps to sustain good QoE for all video clients. A cost-benefit analysis reveals that the proposed methodology performs closely to its centralized counterpart with much less control overhead on the radio interface.
The manufacturing industry is regarded as one of the most demanding verticals with respect to URLLC requirements. Device-to-Device (D2D) communication is a key enabler introduced to support URLLC. To enhance spectrum utilisation, D2D links can share radio spectrum resources occupied by cellular users. Spectrum reuse may, however, result in performance degradation due to mutual co-channel interference between co-existing cellular and D2D users. This paper investigates resource sharing in a D2D-based cellular network for wireless industrial applications. The presented schemes aim to maximise the overall system throughput while maintaining the Quality of Service/Experience (QoS/QoE) requirements. Interference is managed by jointly considering admission and power control based on the relative distance between devices. A priced Deferred Acceptance (p-DA) scheme with incentive-based stability is developed to match D2D users to cellular resources. Numerical simulations show that the p-DA scheme achieves better performance than the conventional DA algorithm and comes close to the presented centralised optimisation approach.
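As an illustration of the deferred acceptance idea underlying the p-DA scheme, the sketch below matches D2D pairs to cellular resource blocks using the classic algorithm: D2D pairs propose in preference order, and each resource block tentatively holds its best proposer. This is a generic sketch, not the paper's priced variant; all identifiers and preference lists are hypothetical placeholders.

```python
def deferred_acceptance(d2d_prefs, rb_prefs):
    """Match each D2D pair to at most one resource block (RB).

    d2d_prefs: dict d2d_id -> ordered list of RB ids (most preferred first)
    rb_prefs:  dict rb_id  -> ordered list of d2d ids (most preferred first)
    Returns a dict rb_id -> d2d_id describing the stable matching.
    """
    next_choice = {d: 0 for d in d2d_prefs}   # next RB each pair proposes to
    engaged = {}                              # rb -> d2d currently held
    free = list(d2d_prefs)                    # unmatched D2D pairs
    while free:
        d = free.pop()
        if next_choice[d] >= len(d2d_prefs[d]):
            continue                          # exhausted its preference list
        rb = d2d_prefs[d][next_choice[d]]
        next_choice[d] += 1
        holder = engaged.get(rb)
        if holder is None:
            engaged[rb] = d                   # RB tentatively accepts
        elif rb_prefs[rb].index(d) < rb_prefs[rb].index(holder):
            engaged[rb] = d                   # RB prefers the new proposer
            free.append(holder)               # previous holder is released
        else:
            free.append(d)                    # proposal rejected
    return engaged
```

In the paper's priced variant, the resource-block preference orderings would additionally reflect interference-aware prices rather than fixed rankings.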
In this paper we present and evaluate the performance of a routing and link scheduling algorithm for millimeter wave (mmWave) backhaul networks. The proposed algorithm models end-user behavior as selfish, i.e., it considers users that always aim to maximize their individual utility rather than the global optimization objective. Our system utilizes popular concepts from the economics and fairness literature. Specifically, to forward packets between the access points that comprise the backhaul network, the Shapley value method is applied, which is shown to induce solutions with reduced latency. The performance of the proposed algorithm is evaluated in terms of the total delay, as well as the price of anarchy, which represents the inefficiency of a scheduling policy when users are allowed to adapt their rates in a selfish manner and reach an equilibrium. A relaxed version of the problem is also presented, which provides a lower bound on the value of the optimal solution; this is used for the calculation of the price of anarchy, since finding the optimal solution is NP-hard. According to simulation results, the system employing the proposed algorithm outperforms, in terms of delay and price of anarchy, both a system with a First-In-First-Out (FIFO) packet forwarding policy and a system employing local-search global optimization, under which users aim at optimizing the overall delay in the network.
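The Shapley value at the heart of the forwarding scheme can be computed exactly for small games by averaging each player's marginal contribution over all orderings. The sketch below is a generic textbook computation, not the paper's mmWave-specific formulation; the characteristic function `v` is a placeholder supplied by the caller.

```python
from itertools import permutations

def shapley_values(players, v):
    """Exact Shapley value of each player for characteristic function v.

    v maps a frozenset of players to a real value, with v(empty set) = 0.
    The Shapley value is the average marginal contribution of a player
    over all orderings in which the coalition can be assembled.
    """
    phi = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            phi[p] += v(with_p) - v(coalition)   # marginal contribution of p
            coalition = with_p
    return {p: phi[p] / len(orderings) for p in players}
```

This brute-force version is exponential in the number of players; practical schemes, such as the one the abstract describes, rely on structure or sampling to keep the cost manageable.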
Advanced automation is being adopted by manufacturing facilities, and wireless technologies are set to be a key component in driving the factories of the future. It is expected that private cellular networks and WLAN technologies will be deployed for smart factory operations. Since both wireless technologies can operate on the same channel in unlicensed bands, efficient resource sharing becomes important. When multiple devices compete for the resource, estimating the number of devices contending for the channel can help the design of an efficient resource sharing scheme. This paper addresses the challenge of estimating the number of factory devices contending to transmit over the unlicensed channel. We adopt three machine learning (ML) techniques and develop a novel device-number estimation system by collating and analysing the idle-time intervals between transmissions across the channel. Using NS-3 simulation, the performance of the proposed estimation approach is evaluated. The results presented reveal the significance of the chosen features and the performance of each ML algorithm used.
The 5th generation (5G) mobile networks and beyond need to support massive machine-type communication (MTC) devices with limited available radio resources. In this paper, we study the power-domain non-orthogonal multiple access (NOMA) technology to support energy-efficient massive MTC networks, where MTC devices exchange information using sporadic and low-rate short packets. We investigate the subchannel allocation and power control policy to maximize the achievable effective energy efficiency (EE) for uplink NOMA-based massive MTC networks, taking into account short-packet communication characteristics. We model the subchannel allocation problem as a multi-agent Markov decision process and propose an efficient Q-learning algorithm to solve it. Furthermore, we obtain the optimal transmission power policy by approximating the achievable effective rate of uplink NOMA-based short-packet communications. Compared with the existing OFDMA scheme, simulations validate that the proposed scheme can improve the achievable effective EE of massive MTC networks by 5.93%.
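The tabular Q-learning step behind such a subchannel allocation can be sketched generically. This is the textbook update rule, not the paper's exact multi-agent algorithm; states, actions (here imagined as candidate subchannels) and rewards (e.g., observed effective EE) are placeholders.

```python
# Generic tabular Q-learning sketch. In a multi-agent setting, each MTC
# device would maintain its own Q table over (state, subchannel) pairs.
import random

def choose_action(Q, state, epsilon=0.1):
    """Epsilon-greedy subchannel selection for one agent."""
    if random.random() < epsilon:
        return random.choice(list(Q[state]))          # explore
    return max(Q[state], key=Q[state].get)            # exploit best estimate

def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))"""
    target = reward + gamma * max(Q[next_state].values())
    Q[state][action] += alpha * (target - Q[state][action])
```

In an actual deployment, the reward signal would come from measured transmission outcomes, and convergence depends on how the agents' simultaneous choices interact.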
One of the most fundamental forms of cooperation in any network is the cooperation between network nodes for routing and the subsequent context dissemination. To do so, each node runs an instance of a routing process relying, in many cases, only on partial network information rather than network-wide information. This can lead to instabilities and problematic situations, such as deadlocks or livelocks. Deadlock is a condition where a process stalls, meaning it reaches a state from which there is no exit action. In routing, this would mean the condition where a packet reaches a node and is not forwarded any further because the routing process has reached a state which was not taken into account in its behavioural specification. Livelock is a condition from which a process can exit; however, every exit action eventually leads the process back to the same condition. With respect to routing, this would refer to the existence of loops. In this paper we show how formal verification, and in particular model checking, can be applied in this context to find such problems and also to assess the performance and quantify properties of the overall routing process. As an example case study we use a routing protocol designed for wireless sensor networks, named Adaptive Load Balanced Algorithm Rainbow version, suitable for context dissemination in Wireless Sensor Network environments, where energy-efficient operation is also important.
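The two properties above can be illustrated on a toy transition system: a deadlock is a reachable state with no outgoing transition, while a livelock shows up as a reachable cycle. The sketch below is a minimal explicit-state check, not the model checker used in the paper; the transition relation is a hypothetical example.

```python
def find_deadlocks(init, transitions):
    """Return reachable states with no outgoing transition (deadlocks)."""
    seen, stack = {init}, [init]
    deadlocks = set()
    while stack:
        s = stack.pop()
        succs = transitions.get(s, [])
        if not succs:
            deadlocks.add(s)                  # nowhere to go: deadlock
        for t in succs:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return deadlocks

def has_livelock(init, transitions):
    """Detect a reachable cycle (a livelock/routing-loop candidate) via DFS."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {}
    def dfs(s):
        colour[s] = GREY                      # on the current DFS path
        for t in transitions.get(s, []):
            c = colour.get(t, WHITE)
            if c == GREY:
                return True                   # back edge -> cycle found
            if c == WHITE and dfs(t):
                return True
        colour[s] = BLACK                     # fully explored
        return False
    return dfs(init)
```

Real model checkers verify far richer temporal-logic properties, but both reduce, at the core, to this kind of reachability analysis over the protocol's state space.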
Creating Internet-of-Things (IoT) solutions that can be deployed at scale requires adequate experimentation environments. In the area of experimentation, two trends can be observed. First, there is a shift from lab-based, controlled experiments to experimenting "in the wild": researchers tend to augment the users' natural environments and observe how people integrate a new solution into their everyday lives. Second, when a substantial investment in setting up an experimentation infrastructure has been made, it makes sense to open it to a wide community of researchers; the concept of Experimentation-as-a-Service (EaaS) is emerging along these lines. SmartCampus, an IoT testbed developed at the University of Surrey, fits both trends very well. It involves real users in a natural setting, as IoT devices are deployed in the users' offices. Further, several user-centric experiments conducted in the SmartCampus were driven by external researchers, i.e., people who do not belong to the team that developed the testbed. In this paper we report on lessons learned from such IoT experiments. After a brief overview of SmartCampus and the experiments themselves, we offer a simple experiment stakeholder model, which identifies key actors and the interfaces between them. We then focus on issues related to the external experimenters who take advantage of the experimentation "service". That focus is motivated by our realization that EaaS, while attractive in principle, gives rise to a number of non-trivial challenges. © 2014 Springer Science+Business Media New York
This paper describes a distributed, cooperative and real-time rental protocol for DCA operations in a multi-system and multi-cell context for OFDMA systems. A credit-token-based rental protocol using auctioning is proposed in support of dynamic spectrum sharing between cells. The proposed scheme can be tuned adaptively as a function of the context by specifying the credit token usage in the radio etiquette. The application of the rental protocol is illustrated with an ascending-bid auction. The paper also describes two approaches for BS-BS communications in support of the rental protocol. Finally, it is described how the proposed mechanisms contribute to the current approaches followed in the IEEE 802.16h and IEEE 802.22 standards efforts addressing cognitive radio. © 2006 IEEE.
What activities take place at home? When do they occur, for how long do they last and who is involved? Asking such questions is important in social research on households, e.g., to study energy-related practices, assisted living arrangements and various aspects of family and home life. Common ways of seeking the answers rest on self-reporting, which is either provoked by researchers (interviews, questionnaires, surveys) or non-provoked (time use diaries). Longitudinal observations are also common, but all of these methods are expensive and time-consuming for both the participants and the researchers. The advances of digital sensors may provide an alternative. For example, temperature, humidity and light sensors report on the physical environment where activities occur, while energy monitors report information on the electrical devices that are used to assist the activities. Using sensor-generated data for the purposes of activity recognition is potentially a very powerful means to study activities at home. However, how can we quantify the agreement between what we detect in sensor-generated data and what we know from self-reported data, especially non-provoked data? To give a partial answer, we conduct a trial in a household in which we collect data from a suite of sensors, as well as from a time use diary completed by one of the two occupants. For activity recognition using sensor-generated data, we investigate the application of mean-shift clustering and change-point detection for constructing features that are used to train a Hidden Markov Model. Furthermore, we propose a method, based on the Levenshtein distance, for evaluating the agreement between the activities detected in the sensor data and those reported by the participants. Finally, we analyse the use of different features for recognising different types of activities.
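The Levenshtein distance used for agreement evaluation is the standard edit distance, applied here to sequences of activity labels rather than characters. A minimal sketch follows; the normalisation into an agreement score is an illustrative assumption, not necessarily the paper's exact formula.

```python
def levenshtein(a, b):
    """Edit distance between two sequences (strings or lists of labels)."""
    prev = list(range(len(b) + 1))            # row for the empty prefix of a
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution (0 if equal)
        prev = curr
    return prev[-1]

def agreement(detected, reported):
    """Normalised agreement score in [0, 1] between two activity sequences."""
    if not detected and not reported:
        return 1.0
    return 1.0 - levenshtein(detected, reported) / max(len(detected), len(reported))
```

For example, a detected sequence `["cook", "eat", "tv"]` against a diary entry `["cook", "tv"]` differs by one edit, giving an agreement of about 0.67.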
Dynamic spectrum management is a promising solution for network operators to efficiently utilise the limited radio spectrum and guarantee operators' profit by increasing capacity as well as generating more spectrum opportunities for opportunistic use. This paper presents a novel algorithm for efficient spectrum management to optimise spectrum utilisation between two sharing UMTS cellular operators. It is shown that the proposed approach increases the revenue of the sharing operators without sacrificing the quality of service on either network. A multi-operator UMTS simulation tool is also developed to evaluate the performance of the proposed algorithm. The simulation results show that the proposed algorithm achieves high spectrum utilisation efficiency, with gains of up to 33% for both uniform and non-uniform traffic distributions. © 2011 IEEE.
Spectrum trading allows spectrum incumbents to increase their profit margins by selling portions of unused spectrum to secondary systems. While spectrally efficient, interference from secondary usage degrades the performance of the incumbent network. We propose distributed power control algorithms with incumbent protection for secondary systems, incorporated in the incumbent's pricing mechanism. Pricing is a fundamental part of the spectrum trading economic model; it indicates the value of spectrum to both buyers (SSPs) and sellers (PSPs) and influences the forces of demand and supply. There exists a compromise between the revenue accrued from spectrum trading and the resultant QoS degradation caused by SSPs to PSPs. Interference from SSPs may increase churn within the PSP network; therefore, pricing algorithms need to be efficiently designed to capture the PSP's profit in relation to the magnitude of SSP interference. We consider distributed power control algorithms (PCA) for the SSP network, allow secondary spectrum trading in TV white spaces (TVWS), and demonstrate that a PSP can profit from secondary spectrum trading without obtrusive interference from SSPs. The power control algorithm introduced therefore ensures that the incumbent's interference threshold is never violated, leading to increased profit and reduced churn for the incumbent operator and improved quality of service for the secondary system due to the spectrum purchase. © 2013 IEEE.
In this paper we investigate the impact that the introduction of new Ambient Networks (AN) functionality will have on the usage of system resources and on connection delay. The signalling load for multiple attachment and negotiation procedures is assessed by modelling signalling sequences for a WLAN system enabled with AN technology. The load is computed for varying numbers of users and for users with different levels of "willingness to evaluate and negotiate offers". The results show that the most important parameter is the number of attachment attempts per time unit, which is an indicator of the user activity level. In the investigated scenarios, the relative signalling load is 0.1-1.0% of the transferred user data. The delay depends on the current load situation of the network.
The Web of Things aims to make physical-world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly, especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things, yet challenging by nature in this context, owing to, e.g., the mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, and the need for efficient indexing of historical and real-time data. The research community has developed numerous techniques and methods to tackle these problems, as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and from some EU research projects related to the Web of Things are discussed, and an outlook on future research is presented.
Smart cities use different Internet of Things (IoT) data sources and rely on big data analytics to obtain information or extract actionable knowledge that is crucial for urban planners to efficiently use and plan infrastructure. Big data analytics algorithms often consider the correlation of different patterns and various data types. However, the use of different techniques to measure correlation in smart city data, and the exploitation of correlations to infer new knowledge, are still open questions. This paper proposes a methodology to analyse data streams based on spatio-temporal correlations using different correlation algorithms, and provides a discussion on co-occurrence vs. causation. The proposed method is evaluated using traffic data collected from road sensors in the city of Aarhus in Denmark.
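One simple way to probe temporal correlation between two such data streams is to shift one against the other and compute the Pearson coefficient at each lag. The sketch below is illustrative only and much simpler than the paper's methodology; note that a peak at a non-zero lag indicates co-occurrence with a time offset, not causation.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def lagged_correlation(x, y, max_lag):
    """Correlate stream x with stream y shifted by each lag in 0..max_lag.

    A strong peak at lag k suggests events in y tend to follow events
    in x by k time steps -- a co-occurrence pattern, not a causal claim.
    """
    return {lag: pearson(x[:len(x) - lag], y[lag:]) for lag in range(max_lag + 1)}
```

With real road-sensor streams one would also align timestamps and handle gaps before correlating; both are omitted here for brevity.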
In this paper we present and evaluate the performance of a resource allocation algorithm to enhance the Quality of Service (QoS) provision and energy efficiency of downlink Orthogonal Frequency Division Multiple Access (OFDMA) systems. The proposed algorithm performs resource allocation using information on the downlink packet delay, the average delay and data rate of past allocations, as well as the downlink users' buffer status in order to minimize packet segmentation. Based on simulation results, the proposed algorithm achieves significant performance improvement in terms of packet timeout rate, goodput, fairness, and average delay. Moreover, the effect of poor QoS provision on energy efficiency is demonstrated through the evaluation of the performance in terms of energy consumption per successfully received bit.
In this paper we present and evaluate the performance of a resource allocation algorithm to enhance the Quality of Service (QoS) provision and energy efficiency of uplink Long Term Evolution (LTE) systems. The proposed algorithm considers the main constraints in uplink LTE resource allocation, i.e., the allocation of contiguous sets of resource blocks of the localized Single Carrier – Frequency Division Multiple Access (SC-FDMA) physical layer to each user, and the imperfect knowledge of the users' uplink buffer status and packet waiting time. The optimal resource allocation is formulated as a discrete connected cake-cutting problem, where different agents are allocated consecutive subsequences of a sequence of indivisible items. This problem is NP-hard, therefore a suboptimal algorithm is introduced, which performs resource allocation using information on the estimated uplink packet delay, the average delay and data rate of past allocations, as well as the required uplink power per resource block. Based on simulation results, the proposed algorithm achieves significant performance improvement in terms of packet timeout rate, goodput, and fairness. Moreover, the effect of poor QoS provision on energy efficiency is demonstrated through the evaluation of the performance in terms of energy consumption per successfully received bit.
5G systems are expected to advance on a number of aspects compared with current systems, for example, providing a 1000 times higher capacity, a much lower latency and improved quality of user experience. This paper presents the vision and approach followed by the H2020 project SPEED-5G. The approach is based on densification of small cells, exploitation of Multi-RAT, development of new resource management techniques and a more efficient use of spectrum. A novel 5G system architecture is proposed based on the Network Slicing paradigm, which enables a highly flexible, scalable and backwards compatible architecture. A core aspect is the definition of a new MAC layer that facilitates Multi-RAT access and allows prioritising and allocating traffic across heterogeneous access technologies.
Wireless mesh networks with a delay-throughput tradeoff are considered a practical wireless network solution for providing community broadband Internet access services. An important aspect of network design lies in the capability to simultaneously support multiple independent mesh connections at the intermediate mobile stations. The intermediate mobile stations act as routers by combining network coding with packet forwarding, a scenario usually known as multiple coding unicasts. The problem of efficient network design for such applications based on multipath network coding with delay control on packet servicing is considered. The simulated solution involves a joint consideration of wireless medium access control (MAC) and network-layer multipath selection. Rather than considering general wireless mesh networks, the focus here is on a relatively small-scale mesh network with multiple sources and multiple sinks suitable for multihop wireless backhaul applications within the WiMAX standard. © 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering.
Fifth generation (5G) envisages a “hyperconnected society” with an enormous number of interconnected devices, anywhere and at any time. Edge computing plays a pivotal role in this vision, enabling low latency, large traffic volumes, and improved quality of experience. The advent of 5G and edge computing encourages vertical industries to develop innovative services, which can meet the challenging demands coming from consumers. However, economic feasibility is the ultimate factor that determines the viability of a new service. Hence, effective techniques for the economic assessment of such services are needed. This paper analyzes the provision of immersive media services in crowded events, through a cloud‐enabled small cell network owned by a neutral host, and offered in multitenancy to different mobile network operators. We initially develop a planning model to predict the required compute, storage, and radio resources. Taking into account dynamic factors such as service penetration and price evolution, we then provide a number of economic indices, such as net present value, internal rate of return, and expected payback period to assess the viability of a potential investment in a 5G infrastructure for immersive media services. The presented analysis will guide small cell network operators in the provision of 5G innovative media services.
The ongoing development of mobile communication networks to support a wide range of superfast broadband services has led to massive capacity demand. This problem is expected to be a significant concern during the deployment of the 5G wireless networks. The demand for additional spectrum to accommodate mobile services supporting higher data rates and having lower latency requirements, as well as the need to provide ubiquitous connectivity with the advent of the Internet of Things (IoT) sector, is likely to considerably exceed the supply, based on the current policy of exclusive spectrum allocation to mobile cellular systems. Hence, the imminent spectrum shortage has introduced a new impetus to identify practical solutions to make the most efficient use of the scarce licensed bands in a shared manner. Recently, the concept of dynamic spectrum sharing has received considerable attention from regulatory bodies and governments globally, as it could potentially open new opportunities for mobile operators to exploit spectrum bands whenever they are underutilised by their owners, subject to service level agreements. Although various sharing paradigms have been proposed and discussed, the impact and performance gains of different schemes can be scenario-specific and vary depending on the nature of the sharing parties, the level of sharing and spectrum access scheme. In this survey, we describe the main concepts of dynamic spectrum sharing, different sharing scenarios, as well as the major challenges associated with sharing licensed bands. Finally, we conclude this survey paper with open research challenges and suggest some future research directions.
Cognitive radio is an enabling technology that allows opportunistic users to reuse licensed spectrum in order to overcome the artificial spectrum scarcity. In cognitive radio networks, opportunistic users collaboratively perform spectrum sensing to detect the presence of incumbent users. Collaborative Spectrum Sensing (CSS) performance suffers due to the presence of malicious users. We propose a robust malicious user detection algorithm which exploits inherent statistical moments of the sensing observations from collaborating users to identify malicious users among them. In this way, CSS performance is improved by detecting and marginalizing the effects of malicious users.
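The idea of exploiting statistical moments of sensing observations to expose malicious collaborators can be sketched as follows (the robust z-score test and thresholds here are illustrative assumptions, not the paper's exact detector):

```python
import statistics

def flag_malicious(reports, z_thresh=2.0):
    """Flag collaborating users whose sensing statistics deviate.

    reports: {user_id: [energy observations over sensing rounds]}.
    Each user's mean observation is compared against the population
    of means; users beyond z_thresh robust z-scores (median absolute
    deviation) are flagged as likely malicious.
    """
    means = {u: statistics.fmean(obs) for u, obs in reports.items()}
    centre = statistics.median(means.values())
    spread = statistics.median(abs(m - centre) for m in means.values()) or 1e-9
    return {u for u, m in means.items() if abs(m - centre) / spread > z_thresh}

reports = {
    "u1": [1.0, 1.1, 0.9],
    "u2": [1.05, 0.95, 1.0],
    "u3": [1.1, 1.0, 0.9],
    "u4": [5.0, 5.2, 4.9],   # always-reports-high falsifier
}
print(flag_malicious(reports))  # {'u4'}
```

Marginalizing the flagged users' reports before fusion then restores the collaborative sensing performance.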
IoT data analytics underpins numerous applications; however, the task is still challenging, predominantly due to heterogeneous IoT data streams, unreliable networks, and the ever-increasing size of the data. In this context, we propose a two-layer architecture for analyzing IoT data. The first layer provides a generic interface using a service-oriented gateway to ingest data from multiple interfaces and IoT systems, store it in a scalable manner, and analyze it in real time to extract high-level events, whereas the second layer is responsible for probabilistic fusion of these high-level events. In the second layer, we extend state-of-the-art event processing using Bayesian networks (BNs) in order to take uncertainty into account while detecting complex events. We implement our proposed solution using open-source components optimized for large-scale applications. We demonstrate our solution on a real-world use case in the domain of intelligent transportation systems (ITS), where we analysed traffic, weather and social media data streams from the city of Madrid in order to predict the probability of congestion in real time. The performance of the system is evaluated qualitatively using a web interface, where traffic administrators can provide feedback about the quality of predictions, and quantitatively using the F-measure, achieving an accuracy of over 80%.
The widespread use of IoT devices has opened the possibilities for many innovative applications. Almost all of these applications involve analyzing complex data streams with low latency requirements. In this regard, pattern recognition methods based on Complex Event Processing (CEP) have the potential to provide solutions for analyzing and correlating these complex data streams in order to detect complex events. Most of these solutions are reactive in nature, as CEP acts on real-time data and does not exploit historical data. In our work, we have explored a proactive approach by exploiting historical data using machine learning methods for prediction with CEP. We propose an adaptive prediction algorithm called Adaptive Moving Window Regression (AMWR) for dynamic IoT data and evaluate it using a real-world use case. Our proposed architecture is generic and can be used across different fields for predicting complex events.
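A moving-window regression of this kind can be sketched in a few lines; note that the window-adaptation rule below (shrink on large one-step error) is an assumption for illustration, not AMWR's published adaptation criterion:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b over the window."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) or 1e-12
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return a, my - a * mx

def amwr_predict(series, window=4, err_budget=0.5):
    """Predict the next value of a dynamic IoT series.

    Fits a line over the last `window` points; if the one-step error
    on the most recent point exceeds `err_budget`, the window shrinks
    to track faster dynamics before predicting.
    """
    xs = list(range(len(series)))
    a, b = fit_line(xs[-window:], series[-window:])
    if abs(a * xs[-1] + b - series[-1]) > err_budget and window > 2:
        a, b = fit_line(xs[-(window - 1):], series[-(window - 1):])
    return a * len(series) + b

print(amwr_predict([1, 2, 3, 4]))  # next point of y = x + 1 -> 5.0
```

Feeding such predictions into a CEP engine is what turns the reactive pipeline into a proactive one.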
The heterogeneous, dynamic nature of ubiquitous environments necessitates that all system components that form part of a personalisation framework should be context aware. Personalised service delivery requires that the system must detect and interpret device modality contexts in real time and provide automated adaptation on behalf of the user. Towards this aim, this paper presents the design and implementation of a demonstrator that offers personalised, context sensitive, service and content delivery.
The use of service-oriented computing paradigm in Internet of Things research has recently received significant attention to create a semantic service layer that supports virtualisation of and interaction among "Things". Using service-based solutions will produce a deluge of services that provide access to different data and capabilities exposed by different resources. The heterogeneity of the resources and their service attributes require efficient solutions that can discover services and match them to the data and capability requirements of different users. We propose a distributed hybrid service matchmaking method that combines our previous work on probabilistic service matchmaking using latent semantic analysis with a weighted-link analysis based on logical signature matching. The hybrid method can overcome semantic synonymy in semantic service description which usually presents the biggest challenge for semantic service matchmakers. The results show that the proposed method performs better than existing solutions in terms of precision and normalised discounted cumulative gain measurement values.
Event detection has been studied and researched for many years and has been applied in real-world applications with the aim of characterising a situation in the real world. In order to capture a situation, Wireless Sensor Networks (WSNs) are deployed and sensor nodes are used to sense the entities of interest for the application; sensing the environment results in a large and often continuous stream of raw data. In this context, event detection is used to extract the most relevant and useful information from this large data set, while taking the constraints of the nodes, such as energy, computation, and memory, into account. The environment is observed by a program hosted on a sensor node; machine learning and data mining techniques are embedded in the program to learn from the environment and detect events. Collaborative sensing processes observations from distributed nodes, which enhances the accuracy of deciding whether an anomaly is a fault or a genuine event. This research studied the processing of sensor data to detect events using multiple sensor nodes. A model and/or rules are defined, and outliers are detected by matching sensor data against the model and/or rules; an outlier is then analysed and processed to detect an event. The main contributions of this work are on collaborative sensing across different sensors, including clustering analysis for data labelling and classification analysis to process outliers for event detection.
The Web of Things (WoT) paradigm enables access to physical world things and their data through standard Web protocols. This provides interoperability at the hardware and communication protocol level, but does not add intelligence to the things or facilitate unambiguous interpretation of their data. The evolution of the WoT towards the semantic WoT offers the promise of meeting the interoperability challenge through the use of semantic Web technologies. The W3C Web of Things initiative encourages the use of common vocabularies to ensure interoperability and a common understanding of the domain knowledge. Ontologies provide a structured, common formalism to the disparate elements of the WoT and can form the basis of a common knowledge base. The research community and standardisation bodies have developed numerous ontologies describing the elements of the WoT and associated domains. A comprehensive review of the various proposed ontologies is needed to facilitate the adoption and reuse of the available models. This survey reviews the current state-of-the-art in WoT ontologies, which are presented from two perspectives: cross-domain ontologies which are classified into device, service, data and localisation models, and domain ontologies, which are presented from an environmental and user-oriented perspective.
Semantic modeling for the Internet of Things has become fundamental to resolving the problem of interoperability, given the distributed and heterogeneous nature of the “Things”. Most current research has primarily focused on modeling devices and resources, while paying less attention to the access and utilisation of the information generated by the things. The idea that things are able to expose standard service interfaces coincides with service-oriented computing and, more importantly, represents a scalable means for business services and applications that need context awareness and intelligence to access and consume physical world information. We present the design of a comprehensive description ontology for knowledge representation in the domain of the Internet of Things and discuss how it can be used to support tasks such as service discovery, testing, and dynamic composition.
Even today, mobile Web Services are still provided using servers that usually reside in core networks. The main reason for not providing large and complex Web Services from resource-limited mobile devices is not only the volatility of wireless connections and the mobility of mobile hosts, but also the often limited processing power. Offloading some of the processing tasks is one step towards achieving optimal mobile Web Service provision. This paper presents two frameworks for providing distributed mobile Web Services: one mobile service provision framework is built on the Simple Object Access Protocol (SOAP), while the other implements the Representational State Transfer (REST) architecture. Both frameworks have been extended with offloading functionality, and different types of resource-intensive operations, i.e., process-intensive and bandwidth-intensive services, have been tested. The results show that a REST-based framework leads to better-performing offloading behaviour compared to SOAP-based mobile services: distributed mobile services based on REST consume fewer resources and achieve better performance. The paper describes the approach, evaluation method and findings.
This work presents a Cognitive Management framework for empowering the Internet of Things (IoT). This framework has the ability to dynamically adapt its behaviour, through self-management functionality, taking into account information and knowledge (obtained through machine learning) on the situation (e.g., internal status and status of environment), as well as policies (designating objectives, constraints, rules, etc.). Cognitive technologies constitute a unique and efficient approach for addressing the technological heterogeneity of the IoT and obtaining situation awareness, reliability and efficiency. The paper also presents a first indicative implementation of the proposed framework, comprising real sensors and actuators. The preliminary results of this work demonstrate high potential towards self-reconfigurable IoT. © The Author(s).
The wide field of wireless sensor networks requires that hundreds or even thousands of sensor nodes be maintained and configured. With upcoming initiatives such as the Smart Home and the Internet of Things, we need new mechanisms to discover and manage this number of sensors. In this paper, we describe a middleware architecture that uses context information of sensors to supply a plug-and-play gateway and resource management framework for heterogeneous sensor networks. Our main goals are to minimise the effort for network engineers to configure and maintain the network and to supply a unified interface to access the underlying heterogeneous network. Based on context information such as battery status, routing information, location and radio signal strength, the gateway configures and maintains the sensor network. The sensors are associated to nearby base stations using an approach adapted from the 802.11 WLAN association and negotiation mechanism, providing registration and connectivity services for the underlying sensor devices. This abstracted connection layer can be used to integrate the underlying sensor networks into high-level services and applications such as IP-based networks and Web services.
Dynamic spectrum allocation (DSA) seeks to exploit the variations in the loads of various radio-access networks to allocate the spectrum efficiently. Here, a spectrum manager implements DSA by periodically auctioning short-term spectrum licenses. We analytically solve the problem faced by the operator of a CDMA cell populated by delay-tolerant terminals operating at various data rates on the downlink and representing users with dissimilar "willingness to pay" (WtP). WtP is the most a user would pay for a correctly transferred information bit. The operator finds a revenue-maximising internal pricing and a service priority policy, along with a bid for spectrum. Our clear and specific analytical results apply to a wide variety of physical layer configurations. The optimal operating point can be easily obtained from the frame-success rate function. At the optimum, (with a convenient time scale) a terminal's contribution to revenues is the product of its WtP by its data rate; and the product of its WtP by its channel gain determines its service priority ("revenue per Hertz"). Assuming a second-price auction, the operator's optimal bid for a certain spectrum band equals the sum of the individual revenue contributions of the additional terminals that could be served, if the band is won.
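The pricing and bidding rules stated at the optimum translate directly into code (the numerical values below are illustrative, not from the paper):

```python
def revenue_contribution(wtp, rate):
    # A terminal's contribution to revenue: WtP x data rate.
    return wtp * rate

def service_priority(wtp, gain):
    # Service priority ("revenue per Hertz"): WtP x channel gain.
    return wtp * gain

def optimal_bid(extra_terminals):
    """Second-price auction: bid the sum of revenue contributions of
    the additional terminals the extra band would allow us to serve."""
    return sum(revenue_contribution(t["wtp"], t["rate"]) for t in extra_terminals)

# Two hypothetical extra terminals: WtP in $/bit, rate in bit/s.
extra = [{"wtp": 1e-6, "rate": 2e6}, {"wtp": 5e-7, "rate": 1e6}]
print(optimal_bid(extra))  # 2.0 + 0.5 = 2.5
```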
Even today many papers and presentations make the claim that there is sufficient spectrum available for all new and future services, if only the spectrum could be used more efficiently, inter alia through additional flexibility in the assignment of frequencies to those systems that need the spectrum at a certain point in time at a given location. It still seems common understanding that radio regulation and the rules applicable for the deployment of systems and use of radio spectrum resources are by far too rigid and that regulators are not sufficiently forward-looking. The claim goes that regulation needs to be changed in a way that the rules will allow dynamic access and thus increase efficiency, which is expected to be facilitated by the extreme flexibility provided by the new dynamic spectrum access, flexible spectrum and cognitive radio system technologies. However, this is to a large extent a misconception, as regulation has evolved over the last decade, and regulators have changed the way spectrum is licensed and the conditions under which it can be used. This paper provides an overview of the current spectrum regulatory landscape, covering the developments, in particular in Europe, including the WAPECS and BEM approaches, the impact of E3 on regulation, as well as considering the changes that came along globally. The paper also contains an argument on the real changes that are still required to permit the use of state-of-the-art dynamic spectrum assignment or access technologies and approaches. Copyright © 2010 The authors.
One of the major challenges in cognitive and cooperative communication systems that support advanced coexistence technologies for radio resource usage optimization is efficient decision making. However, the high dynamicity of such communication environments, in conjunction with the existence of an increasing amount of diverse context information, result in decision making becoming a non-trivial issue. In this paper, we present a novel decision making framework for modern wireless communication environments. More specifically, the different steps of the decision making cycle and the role they play in the decision making process are presented, while the different fragments that constitute the notion of context, as well as their role and the way they interact with and influence the decision making steps are described in detail. Moreover, two case-studies, on interference management in heterogeneous networks and noise-robust and energy-efficient sensing, respectively, are presented, highlighting their compliance with the described decision making framework.
Increased spectrum efficiency has been demonstrated with the use of cognitive radios, however with an increased likelihood of interference to the incumbents of the spectrum. Several studies have addressed the interference problem from the transmitter power control perspective, so as to curtail excessive cognitive interference powers, while neglecting the effect of secondary terminal mobility. We show by simulation that such an assumption of terminal immobility in the power control algorithm would fail in time-variant cases, resulting in increased levels of interference to the incumbents as well as serious degradation of QoS within the cognitive radio network. We model the link gain evolution process as a distance-dependent shadow fading process and scale up the target signal-to-interference ratio to cope with time variability. This paper therefore proposes a mobility-driven power control algorithm for cognitive radios based on sensing information, which ensures that the interference limit at the incumbents is unperturbed at all times while concurrently maintaining QoS within the cognitive radio network. © 2011 IEEE.
In this article, an Arrival and Departure Time Predictor (ADTP) for scheduling communication in opportunistic Internet of Things (IoT) is presented. The proposed algorithm learns about temporal patterns of encounters between IoT devices and predicts future arrival and departure times, therefore future contact durations. By relying on such predictions, a neighbour discovery scheduler is proposed, capable of jointly optimizing discovery latency and power consumption in order to maximize communication time when contacts are expected with high probability and, at the same time, saving power when contacts are expected with low probability. A comprehensive performance evaluation with different sets of synthetic and real world traces shows that ADTP performs favourably with respect to previous state of the art. This prediction framework opens opportunities for transmission planners and schedulers optimizing not only neighbour discovery, but the entire communication process.
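The core idea of predicting arrival and departure times from temporal encounter patterns can be sketched as follows; the simple periodic-mean predictor here is an illustrative stand-in for ADTP's richer learning model:

```python
import statistics

def predict_contact(history):
    """Predict the next (arrival, departure) time for one peer device.

    history: list of (arrival_s, departure_s) tuples of past encounters.
    Assumes roughly periodic encounters: the next arrival is the last
    arrival plus the mean inter-arrival gap, and the contact duration
    is the mean of past durations. A scheduler can then probe
    aggressively inside the predicted window and sleep outside it.
    """
    arrivals = [a for a, _ in history]
    gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
    duration = statistics.fmean(d - a for a, d in history)
    nxt = arrivals[-1] + statistics.fmean(gaps)
    return nxt, nxt + duration

# A peer seen roughly every 100 s for ~10 s at a time.
hist = [(0, 10), (100, 112), (200, 208)]
print(predict_contact(hist))  # (300.0, 310.0)
```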
The most common use of formal verification methods and tools so far has been in identifying whether livelock and/or deadlock situations can occur during protocol execution, process, or system operation. In this work we aim to show that an additional, equally important and useful application of formal verification tools can be in protocol design and protocol selection in terms of performance-related metrics. This can be achieved by using the tools in a rather different context compared to their traditional use: not only as model-checking tools to assess the correctness of a protocol in terms of the absence of livelock and deadlock situations, but as tools capable of building profiles of protocol operations, assessing their performance, and identifying operational patterns and possible bottleneck operations. This process can provide protocol designers with insight into the protocols' behavior and guide them towards further protocol design optimizations. It can also assist network operators and service providers in selecting the most suitable protocol for specific network and service configurations. We illustrate these principles by showing how formal verification tools can be applied in this protocol profiling and performance assessment context, using some existing protocols as case studies.
The concept of sensing-as-a-service is proposed to enable a unified way of accessing and controlling sensing devices for many Internet of Things based applications. Existing techniques for Web service computing are not sufficient for this class of services that are exposed by resource-constrained devices. The vast number of distributed and redundantly deployed sensors necessitate specialised techniques for their discovery and ranking. Current research in this line mostly focuses on discovery, e.g., designing efficient searching methods by exploiting the geographical properties of sensing devices. The problem of ranking, which aims to prioritise semantically equivalent sensor services returned by the discovery process, has not been adequately studied. Existing methods mostly leverage the information directly associated with sensor services, such as detailed service descriptions or quality of service information. However, assuming the availability of such information for sensor services is often unrealistic. We propose a ranking strategy by estimating the cost of accessing sensor services. The computation is based on properties of the sensor nodes as well as the relevant contextual information extracted from the service access process. The evaluation results demonstrate not only the superior performance of the proposed method in terms of ranking quality measure, but also the potential for preserving the energy of the sensor nodes.
This work addresses joint transceiver optimization for multiple-input, multiple-output (MIMO) systems. In practical systems, complete knowledge of channel state information (CSI) is hardly available at the transmitter. To tackle this problem, we resort to the codebook approach to precoding design, where the receiver selects a precoding matrix from a finite set of pre-defined precoding matrices based on the instantaneous channel condition and delivers the index of the chosen precoding matrix to the transmitter via a bandwidth-constrained feedback channel. We show that, when the symbol constellation is improper, the joint codebook-based precoding and equalization can be designed accordingly to achieve improved performance compared to the conventional system. © 2012 IEEE.
5G technology has tapped into millimeter-wave (mmWave) spectrum to create additional bandwidth for improved network capacity. The use of mmWave for specific applications, including vehicular networks, has been widely discussed. However, applying mmWave to vehicular networks faces the challenges of high-mobility nodes and narrow coverage along the mmWave beams. In this paper, we focus on a mmWave small cell base station deployed in a city area to support vehicular network applications. We propose profiling vehicle mobility so that a machine learning agent can learn the performance of serving vehicles with different mobility profiles and utilize past experience to select the appropriate mmWave beam to serve a vehicle. Our machine learning agent is based on the multi-armed bandit learning model, where both the classical multi-armed bandit and the contextual multi-armed bandit are used; for the contextual multi-armed bandit, the contexts are vehicle mobility information. We show that the local street layout naturally constrains vehicle movement, creating distinct mobility information for vehicles, and that this mobility information is highly related to communication performance. By using vehicle mobility information, the machine learning agent is able to identify vehicles that can remain within a beam for a longer time period, avoiding frequent handovers.
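A contextual bandit of this kind can be sketched with an epsilon-greedy learner; the class and reward model below are illustrative assumptions, not the paper's exact design:

```python
import random

class BeamBandit:
    """Contextual multi-armed bandit for mmWave beam selection.

    Context = a vehicle's mobility profile (e.g. the street segment it
    follows); arms = candidate beams. Keeps a running mean reward
    (e.g. achieved throughput or beam-residence time) per
    (context, beam) pair and selects epsilon-greedily.
    """
    def __init__(self, beams, epsilon=0.1, seed=0):
        self.beams, self.eps = beams, epsilon
        self.stats = {}              # (context, beam) -> [reward_sum, count]
        self.rng = random.Random(seed)

    def select(self, context):
        if self.rng.random() < self.eps:
            return self.rng.choice(self.beams)   # explore
        return max(self.beams, key=lambda b: self._mean(context, b))

    def update(self, context, beam, reward):
        s = self.stats.setdefault((context, beam), [0.0, 0])
        s[0] += reward
        s[1] += 1

    def _mean(self, context, beam):
        s = self.stats.get((context, beam))
        return s[0] / s[1] if s else 0.0

bandit = BeamBandit(beams=["b0", "b1", "b2"], epsilon=0.0)
for _ in range(20):
    bandit.update("main_street", "b1", reward=1.0)  # b1 serves this profile well
    bandit.update("main_street", "b0", reward=0.2)
print(bandit.select("main_street"))  # b1
```

With distinct mobility profiles as contexts, the agent learns separate beam preferences per street layout rather than one global ranking.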
Automated service discovery enables human users or software agents to form queries and to search and discover the services based on different requirements. This enables implementation of high-level functionalities such as service recommendation, composition, and provisioning. The current service search and discovery on the Web is mainly supported by text and keyword based solutions which offer very limited semantic expressiveness to service developers and consumers. This paper presents a method using probabilistic machine-learning techniques to extract latent factors from semantically enriched service descriptions. The latent factors are used to construct a model to represent different types of service descriptions in a vector form. With this transformation, heterogeneous service descriptions can be represented, discovered, and compared on the same homogeneous plane. The proposed solution is scalable to large service datasets and provides an efficient mechanism that enables publishing and adding new services to the registry and representing them using latent factors after deployment of the system. We have evaluated our solution against logic-based and keyword-based service search and discovery solutions. The results show that the proposed method performs better than other solutions in terms of precision and normalised discounted cumulative gain values.
Localization is crucial for various applications, including resource coordination in small and ultra-small cells as well as the whole range of Location Based Services (LBS). Multilateration is a localization technique based on distance measurements between multiple reference nodes and a target node. This paper introduces a multilateration localization approach that uses Singular Value Decomposition (SVD) for 3D indoor positioning. It also provides a mathematical multilateration formulation which considers the coordinates of the reference nodes and the relative distance between transmitting nodes. In practical deployments, the relative distance can be estimated using RSSI; we apply Kalman filtering to the RSSI measurements to obtain a more accurate RSSI value. The approach is complemented by two selection methods which help choose the best nodes for the multilateration computation. The paper concludes with a discussion of the experimental evaluation results obtained.
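The RSSI-smoothing step can be illustrated with a scalar Kalman filter under a constant-signal model (the noise parameters below are illustrative assumptions):

```python
def kalman_rssi(measurements, q=0.01, r=4.0):
    """Smooth noisy RSSI readings with a scalar Kalman filter.

    State = true RSSI (dBm), q = process noise variance,
    r = measurement noise variance. The filtered RSSI then feeds a
    path-loss distance estimate for the multilateration step.
    """
    x, p = measurements[0], 1.0        # initial state and covariance
    out = [x]
    for z in measurements[1:]:
        p += q                         # predict: signal assumed constant
        k = p / (p + r)                # Kalman gain
        x += k * (z - x)               # correct with measurement z
        p *= (1 - k)
        out.append(x)
    return out

noisy = [-60.0, -58.0, -63.0, -61.0, -59.0]
smooth = kalman_rssi(noisy)
print([round(v, 1) for v in smooth])
```

Each smoothed value is a convex combination of the prediction and the new reading, so the filtered trace stays within the range of the raw measurements while suppressing their jitter.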
Hot spots in a wireless sensor network emerge as locations under heavy traffic load. Nodes in such areas quickly deplete energy resources, leading to disruption in network services. This problem is common for data collection scenarios in which Cluster Heads (CH) have a heavy burden of gathering and relaying information. The relay load on CHs especially intensifies as the distance to the sink decreases. To balance the traffic load and the energy consumption in the network, the CH role should be rotated among all nodes and the cluster sizes should be carefully determined at different parts of the network. This paper proposes a distributed clustering algorithm, Energy-efficient Clustering (EC), that determines suitable cluster sizes depending on the hop distance to the data sink, while achieving approximate equalization of node lifetimes and reduced energy consumption levels. We additionally propose a simple energy-efficient multihop data collection protocol to evaluate the effectiveness of EC and calculate the end-to-end energy consumption of this protocol; yet EC is suitable for any data collection protocol that focuses on energy conservation. Performance results demonstrate that EC extends network lifetime and achieves energy equalization more effectively than two well-known clustering algorithms, HEED and UCR.
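The idea that cluster size should grow with hop distance to the sink can be sketched with a simple sizing rule (the linear rule and parameters here are hypothetical stand-ins for EC's actual criterion):

```python
def cluster_radius(hop_distance, max_radius=4, min_radius=1):
    """Suggested cluster radius (in hops) given distance to the sink.

    Clusters near the sink are kept small so their heads spend less
    energy on intra-cluster gathering and keep capacity for relaying
    others' traffic; clusters far from the sink can be larger.
    """
    return max(min_radius, min(max_radius, min_radius + hop_distance // 2))

print([cluster_radius(h) for h in range(8)])  # [1, 1, 2, 2, 3, 3, 4, 4]
```

Rotating the cluster-head role within each such cluster then equalizes node lifetimes across the network.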
Exploiting path diversity to enhance communication reliability is a key desired property of the Internet. While the existing routing architecture is reluctant to adopt changes, overlay routing has been proposed to circumvent the constraints of native routing by employing intermediary relays. However, selfish inter-domain relay placement may violate local routing policies at intermediary relays and thus affect their economic costs and performance. With the recent advance of the concept of network virtualization, it is envisioned that virtual networks should be provisioned in cooperation with infrastructure providers, in a holistic view, without compromising their profits. In this paper, the problem of policy-aware virtual relay placement is first studied to investigate the feasibility of provisioning policy-compliant multipath routing via virtual relays for inter-domain communication reliability. Evaluation on a real domain-level Internet topology demonstrates that policy-compliant virtual relaying can achieve a protection gain against single link failures similar to that of its selfish counterpart. It is also shown that the presented heuristic placement strategies perform well, approaching the optimal solution.
Establishing wireless networks in urban areas that can provide ubiquitous Internet access to end-users is a central part of the efforts towards defining the Internet of the future. In recent years, Wireless Mesh Network (WMN) backbone infrastructures are proposed as a cost effective technology to provide city-wide Internet access. Studies that evaluate the performance of city-wide mesh network deployments via experiments provide essential information on various challenges of building them. In this survey, we particularly focus on such studies and provide brief conclusions on the problems, benefits, and future research directions of city-wide WMNs.
Energy efficiency is a critical issue for future wireless communication. The European FP7 C2POWER project aims to research, develop and demonstrate energy saving technologies for multi-standard wireless mobile devices, exploiting the combination of Cognitive Radio and cooperative strategies. The basic objective is to establish an energy optimization framework founded upon energy-aware Cognitive Radio nodes which can dynamically optimize energy consumption while satisfying the desired Quality of Service (QoS) for specific radio environments and applications. Context awareness for energy-efficient Cognitive Radio can be seen as a decision-making process that optimizes the energy saving strategy based on energy-related context information. The decision of a single node may have impacts on the energy consumption of other nodes and of the entire Cognitive Radio network; such decision-making processes are interactive. Game theory provides a mathematical basis for the analysis of interactive decision-making processes. This paper presents a context-aware architecture for an energy-efficient Cognitive Radio network from a game-theoretical perspective. © 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering.
With technologies developed in the Internet of Things, embedded devices can be built into every fabric of urban environments and connected to each other, and the data continuously produced by these devices can be processed, integrated at different levels, and made available in standard formats through open services. The data, obviously a form of “big data”, is now seen as the most valuable asset in developing intelligent applications. As the size of IoT data continues to grow, it becomes inefficient to transfer all the raw data to a centralised, cloud-based data centre and to perform efficient analytics, even with state-of-the-art big data processing technologies. To address this problem, this article demonstrates the idea of “distributed intelligence” for sensor data computing, which disperses intelligent computation to much smaller but autonomous units, e.g., sensor network gateways, smartphones or edge clouds, in order to reduce data sizes and to provide high-quality data for data centres. As these autonomous units are usually in close proximity to data consumers, they also offer the potential for reduced latency and improved quality of service. We present our research on designing methods and apparatus for distributed computing on sensor data, e.g., acquisition, discovery, and estimation, and provide a case study on urban air pollution monitoring and visualisation.
In order to satisfy the requirements of future IMT-Advanced mobile systems, the concept of spectrum aggregation is introduced by 3GPP in its new LTE-Advanced (LTE Rel. 10) standards. Spectrum aggregation allows the aggregation of carrier components (CCs) dispersed within and across different bands (intra/inter-band), as well as the combination of CCs having different bandwidths, and is expected to provide a powerful boost to user throughput in LTE-Advanced (LTE-A). However, the introduction of spectrum aggregation, or carrier aggregation (CA) as it is referred to in LTE Rel. 10, has required some changes from the baseline LTE Rel. 8, although each CC in LTE-A remains backward compatible with LTE Rel. 8. This article provides a review of spectrum aggregation techniques, followed by requirements on radio resource management (RRM) functionality in support of CA. On-going research on the different RRM aspects and algorithms to support CA in LTE-Advanced is surveyed. Technical challenges for future research on aggregation in LTE-Advanced systems are also outlined. © 2014 IEEE.
Backup paths are usually pre-installed by network operators to protect against single link failures in backbone networks that use multi-protocol label switching. This paper introduces a new scheme called Green Backup Paths (GBP) that intelligently exploits these existing backup paths to perform energy-aware traffic engineering without adversely impacting their primary role of preventing traffic loss upon single link failures. This is in sharp contrast to most existing schemes that tackle energy efficiency and link failure protection separately, resulting in substantially higher operational costs. GBP works in an online and distributed fashion, where each router periodically monitors its local traffic conditions and cooperatively determines how to reroute traffic so that the highest number of physical links can go to sleep for energy saving. Furthermore, our approach maintains quality-of-service by restricting the use of long backup paths to failure protection only, and therefore GBP avoids substantially increased packet delays. GBP was evaluated on the point-of-presence representation of two publicly available network topologies, namely GÉANT and Abilene, and their real traffic matrices. GBP was able to achieve significant energy saving gains, which are always within 15% of the theoretical upper bound. © 2004-2012 IEEE.
We propose a method to align different ontologies in similar domains and then define correspondence between concepts in two different ontologies using the SKOS model.
5G New Radio (NR) is touted as a pivotal enabling technology for the genuine realization of connected and cooperative autonomous driving. Despite numerous research efforts in recent years, a systematic overview of the role of 5G NR in future connected autonomous communication networks is missing. To fill this gap and to spark more future research, this paper introduces the technology components of 5G NR and discusses the evolution from existing cellular vehicle-to-everything (V2X) technology towards NR-V2X. We primarily focus on the key features and functionalities of the physical layer, sidelink communication and its resource allocation, architecture flexibility, security and privacy mechanisms, and precise positioning techniques. Moreover, we envisage and highlight the potential of machine learning for further performance enhancement in NR-V2X services. Lastly, we show how 5G NR can be configured to support advanced V2X use cases.
Time series analysis aims to extract meaningful information from data that has been generated in sequence by a dynamic process. The modelling of the non-linear dynamics of a signal is often performed using a linear space with a similarity metric which is either linear or attempts to model the non-linearity of the data in the linear space. In this research, a different approach is taken where the non-linear dynamics of the time series are represented using a phase space. Training data is used to construct the phase space in which the data lies on, or close to, a lower-dimensional manifold. The basis of the non-linear manifold is derived from the kernel principal components obtained through kernel principal component analysis, where fewer components are retained in order to identify the lower-dimensional manifold. Data instances are projected onto the manifold, and those with a large distance between the original point and its projection are considered to be derived from a different underlying process. The proposed algorithm is able to perform time series classification on univariate and multivariate data. Evaluations on a large number of real-world data sets demonstrate the accuracy of the new algorithm and how it exceeds state-of-the-art performance.
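The projection-distance idea can be sketched with an off-the-shelf kernel PCA. This is a minimal illustration on synthetic sine-wave windows, not the paper's data or exact pipeline; the component count, kernel and kernel width are our assumptions:

```python
# Illustrative sketch: score how far a point lies from the non-linear
# manifold learned by kernel PCA (synthetic data; parameters are assumptions).
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 20)
# Training windows generated by one underlying process: noisy sine segments.
train = np.array([np.sin(t + p) + 0.05 * rng.standard_normal(20)
                  for p in rng.uniform(0, 2 * np.pi, 100)])

kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True)
kpca.fit(train)

def manifold_distance(x):
    # Project onto the retained kernel principal components, map back to the
    # input space, and measure the distance between point and projection.
    x_hat = kpca.inverse_transform(kpca.transform(x.reshape(1, -1)))
    return float(np.linalg.norm(x - x_hat.ravel()))

d_in = manifold_distance(np.sin(t + 1.0))          # same process as training
d_out = manifold_distance(rng.uniform(-3, 3, 20))  # different process
```

A large projection distance flags an instance as coming from a different underlying process; one such model per class can then serve as the basis for classification.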
This paper presents a framework for including cognitive management functionalities in the spectrum selection process for Opportunistic Networks (ONs). The framework is based on a decision-making functionality interacting with a knowledge management block that stores and processes information about spectrum use. Different approaches for spectrum selection are discussed, covering specific cases including the capability to aggregate different bands and the possibility to jointly select the spectrum and the network interface. Illustrative results of the proposed framework are presented. © 2012 IIMC Ltd.
To improve inter-operability of future 5G systems with existing technologies, this paper proposes a novel context-aware user-driven framework for network selection in multi-RAT environments. It relies on fuzzy logic to cope with the lack of information usually associated with the terminal side and the intrinsic randomness of the radio environment. In particular, a fuzzy logic controller first estimates the out-of-context suitability of each RAT to support the QoS requirements of a set of heterogeneous applications. Then, a fuzzy multiple attribute decision making (MADM) methodology is developed to combine these estimates with the various components of the context (e.g., terminal capabilities, user preferences and operator policies) to derive the in-context suitability level of each RAT. Based on this novel metric, two spectrum selection and spectrum mobility functionalities are developed to select the best RAT in a given context. The proposed fuzzy MADM approach is validated in a dense small cell environment performing context-aware offloading for a mixture of delay-sensitive and best-effort applications. The results reveal that the fuzzy logic component is able to efficiently track changes in the operating conditions of the different RATs, while the MADM component makes it possible to implement an adjustable context-aware strategy. The proposed fuzzy MADM approach results in a significant improvement in achieving the target strategy, while maintaining an acceptable QoS level compared to traditional offloading based on signal strength.
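A toy version of the two-stage idea, fuzzy suitability estimates combined with context weights, might look as follows. The membership functions, attribute names, RAT figures and weights are our illustrative assumptions, not the paper's design:

```python
# Hypothetical sketch: fuzzy per-RAT suitability plus a weighted
# (MADM-style) in-context ranking. All numbers are invented.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def suitability(rat, weights):
    # Out-of-context degrees: how "high" the throughput is, how "low" the delay.
    high_tput = tri(rat["tput_mbps"], 0, 50, 100)   # peaks at 50 Mbps
    low_delay = tri(rat["delay_ms"], 0, 0, 60)      # best near 0 ms
    # In-context score: context-weighted combination of the degrees.
    return weights["tput"] * high_tput + weights["delay"] * low_delay

rats = {
    "wifi": {"tput_mbps": 40, "delay_ms": 30},
    "lte":  {"tput_mbps": 20, "delay_ms": 15},
}
ctx = {"tput": 0.3, "delay": 0.7}          # a delay-sensitive application
best = max(rats, key=lambda r: suitability(rats[r], ctx))
```

With the delay-weighted context above, the lower-latency RAT wins even though it offers less throughput; changing the weights flips the decision, which is the "adjustable strategy" aspect.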
The Internet of Things (IoT) paradigm connects everyday objects to the Internet and enables a multitude of applications with the real world data collected from those objects. In the city environment, real world data sources include fixed installations of sensor networks by city authorities as well as mobile sources, such as citizens’ smartphones, taxis and buses equipped with sensors. This kind of data varies not only along the temporal but also the spatial axis. For handling such frequently updated, time-stamped and structured data from a large number of heterogeneous sources, this paper presents a data-centric framework that offers a structured substrate for abstracting heterogeneous sensing sources. More importantly, it enables the collection, storage and discovery of observation and measurement data from both static and mobile sensing sources.
Dynamic spectrum allocation (DSA) has been cited as a promising mechanism for managing the radio spectrum for coexisting systems. The goal of a DSA scheme is to increase the performance of networks in the shared spectrum by providing a more efficient way of utilisation. This work addresses analytically the impact of multi-cell, multi-operator interference on the overall spectrum when multiple operators co-exist and share a common pool of radio resources. We propose a centralised DSA scheme that is able to capture the interference level and react dynamically to minimise interference and enhance spectrum utilisation while maintaining a satisfactory level of QoS. Furthermore, a concise system model and framework able to describe the interaction among different operators is presented. The DSA algorithm has been investigated for co-located and displaced cellular networks. The simulation results indicate that the proposed DSA algorithm significantly outperforms fixed spectrum allocation (FSA), ensuring a minimum level of interference in the system. The QoS of the overall system is improved under DSA compared to traditional FSA. Moreover, the proposed algorithm enhances spectrum utilisation by 26% while guaranteeing that all operators are given fair access to the shared spectrum. © 2011 IEEE.
In fifth-generation and beyond (5G/B5G) communication, wireless networks are evolving towards offering various services for different use cases and therefore need to span a wide range of requirements. As different services will be supported at the same time, radio resource management needs to account for their differing requirements. In addition, as wireless systems are capable of supporting multi-connectivity, radio resource allocation becomes more challenging. In this context, we introduce a many-to-many matching game and develop a distributed radio resource allocation algorithm supporting multi-connectivity. Simulation results demonstrate that the proposed approach improves the QoS levels of UEs by up to 14.9% considering their service requirements.
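The many-to-many matching structure can be sketched with a deferred-acceptance-style loop: UEs propose to base stations up to a connectivity quota, and each BS keeps its most-preferred proposers up to capacity. All names, preferences and quotas below are invented for illustration; the paper's actual utilities and algorithm may differ:

```python
# Hypothetical sketch of a many-to-many matching between UEs and BSs.
def many_to_many_match(ue_prefs, bs_prefs, ue_quota, bs_cap):
    result = {u: set() for u in ue_prefs}
    free = {u: list(p) for u, p in ue_prefs.items()}   # proposals not yet made
    held = {b: [] for b in bs_prefs}
    changed = True
    while changed:
        changed = False
        for u in ue_prefs:
            # Propose until the UE's multi-connectivity quota is filled.
            while len(result[u]) < ue_quota[u] and free[u]:
                b = free[u].pop(0)
                held[b].append(u)
                # The BS keeps only its top bs_cap proposers.
                held[b].sort(key=lambda x: bs_prefs[b].index(x))
                kept, dropped = held[b][:bs_cap[b]], held[b][bs_cap[b]:]
                held[b] = kept
                result[u].add(b)
                for d in dropped:            # rejected UEs lose this link
                    result[d].discard(b)
                changed = True
    return result

ue_prefs = {"u1": ["b1", "b2"], "u2": ["b1", "b2"], "u3": ["b1"]}
bs_prefs = {"b1": ["u3", "u1", "u2"], "b2": ["u1", "u2", "u3"]}
matches = many_to_many_match(ue_prefs, bs_prefs,
                             ue_quota={"u1": 2, "u2": 2, "u3": 1},
                             bs_cap={"b1": 1, "b2": 2})
```

Here b1's single slot ends up with its favourite, u3, displacing u1, while b2 accommodates both remaining UEs, illustrating how the distributed proposals converge.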
Supporting quality of service (QoS) while fulfilling high efficiency of bandwidth utilization is challenging in wireless mesh networks. To deal with this issue, hybrid medium access control (MAC) protocols are effective candidates because they can achieve QoS support and better resource sharing at the same time. However, in multi-hop communication environments, hybrid MAC protocols suffer from interference and low bandwidth efficiency. To solve these problems, in this paper, we propose a distributed interference-aware admission control algorithm (DIACA) with soft resource allocation for hybrid MAC protocols suitable for IEEE 802.11 wireless mesh networks; a scheme for providing QoS improvement for real-time sessions (RTSNs) while enhancing the efficiency of bandwidth utilization. The proposed DIACA possesses a function for interference probing, making each node recognize their interfering counterparts. Further, through support of interference detection, concurrent transmissions can be achieved by letting non-interfering nodes transmit data simultaneously along a route which improves the efficiency of spatial reuse of bandwidth. In addition, the DIACA can implement soft resource allocation for RTSNs having delay requirements but loose (or low) throughput demands. By using soft resource allocation, a transmission opportunity can be shared by different RTSNs with low data rates and each of the RTSNs can obtain satisfactory QoS. Simulation results indicate that the proposed admission control algorithm can significantly enhance the bandwidth utilization of wireless channel and can improve QoS for RTSNs. © 2012 IEEE.
Energy consumption in ISP backbone networks has been rapidly increasing with the advent of increasingly bandwidth-hungry applications. Network resource optimization through sleeping reconfiguration and rate adaptation has been proposed for reducing energy consumption when the traffic demands are at their low levels. It has been observed that many operational backbone networks exhibit regular diurnal traffic patterns, which offers the opportunity to apply simple time-driven link sleeping reconfigurations for energy-saving purposes. In this work, an efficient optimization scheme called Time-driven Link Sleeping (TLS) is proposed for practical energy management which produces an optimized combination of the reduced network topology and its unified off-peak configuration duration in daily operations. Such a scheme significantly eases the operational complexity at the ISP side for energy saving, but without resorting to complicated online network adaptations. The GÉANT network and its real traffic matrices were used to evaluate the proposed TLS scheme. Simulation results show that up to 28.3% energy savings can be achieved during off-peak operation without network performance deterioration. In addition, considering the potential risk of traffic congestion caused by unexpected network failures based on the reduced topology during off-peak time, we further propose a robust TLS scheme with Single Link Failure Protection (TLS-SLFP) which aims to achieve an optimized trade-off between network robustness and energy efficiency performance.
In this paper we present and evaluate the performance of a resource allocation algorithm to enhance the Quality of Service (QoS) provision and energy efficiency of uplink Long Term Evolution (LTE) systems. The proposed algorithm considers the main constraints in uplink LTE resource allocation, i.e., the allocation of contiguous sets of resource blocks of the Single-Carrier Frequency-Division Multiple Access (SC-FDMA) physical layer to each user, and the imperfect knowledge of the users' uplink buffer status and packet waiting time. Resource allocation is performed using information on the estimated uplink packet delay, the average delay and data rate of past allocations, as well as the required uplink power per resource block. According to simulation results, the proposed algorithm achieves significant performance improvement in terms of packet loss rate, goodput, fairness, and energy efficiency. Moreover, the effect of poor QoS provision on energy efficiency is demonstrated through the evaluation of the performance in terms of energy consumption per successfully received bit.
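The contiguity constraint at the heart of uplink SC-FDMA allocation can be illustrated with a greedy sketch: each user must receive one contiguous run of resource blocks, chosen as the free window with the best summed channel metric. The metrics, demands and priority order are invented for the example; the paper's algorithm additionally uses delay estimates and per-resource-block power:

```python
# Hypothetical sketch of contiguous resource-block allocation (SC-FDMA style).
def allocate_contiguous(metrics, demands, order):
    n_rb = len(next(iter(metrics.values())))
    free = [True] * n_rb
    alloc = {}
    for u in order:                      # e.g. a delay-based priority order
        need, best, best_score = demands[u], None, float("-inf")
        for s in range(n_rb - need + 1):
            if all(free[s:s + need]):    # window must be entirely free
                score = sum(metrics[u][s:s + need])
                if score > best_score:
                    best, best_score = s, score
        if best is not None:
            alloc[u] = list(range(best, best + need))
            for i in alloc[u]:
                free[i] = False
    return alloc

metrics = {"a": [1, 5, 5, 1], "b": [4, 4, 1, 1]}   # per-RB channel metrics
alloc = allocate_contiguous(metrics, demands={"a": 2, "b": 2},
                            order=["b", "a"])
```

Note that once "b" takes blocks 0-1, user "a" cannot cherry-pick its two best blocks unless they are adjacent and free, which is exactly the restriction that makes uplink LTE allocation harder than the downlink case.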
With universal usability geared towards user-focused customisation, a context reasoning engine can derive meaning from the various context elements and facilitate decision-making for applications and context delivery mechanisms. The heterogeneity of available device capabilities means that the recommendation algorithm must be expressed in a formal, effective and extensible form. Moreover, user preferences, capability context and media metadata must be considered simultaneously to determine the appropriate presentation format. Towards this aim, this paper presents a reasoning mechanism that supports service presentation through a rule-based mechanism. The validation of the approach is presented through application use cases.
IEEE 802.11-based wireless technology is widely applied in many areas, supporting communications where wired connections are not available. However, providing satisfactory QoS is still a challenging topic in 802.11-based wireless networks because of problems such as error-prone wireless channel conditions, power consumption, the lack of a centralised facility, mobility, and channel contention. For addressing these issues, one feasible solution is to implement resource reservation for the sessions that require QoS assurances. The responsibility of a resource reservation scheme is to make sure that QoS-sensitive sessions get sufficient bandwidth in order to sustain their high performance. Difficulties have already been identified in designing resource reservation schemes in both the network and MAC layers, yet this kind of QoS mechanism has not been thoroughly investigated. Therefore, in this paper, we produce a comprehensive survey of resource reservation approaches for IEEE 802.11-based wireless networks. The associated research works are summarised and classified. Moreover, both the drawbacks and the merits of each kind of resource reservation scheme are highlighted. © 2013 IEEE.
The integration of things’ data on the Web and Web linking for things’ description and discovery is leading the way towards smart Cyber–Physical Systems (CPS). The data generated in CPS represents observations gathered by sensor devices about the ambient environment that can be manipulated by computational processes of the cyber world. Alongside this, the growing use of social networks offers near real-time citizen sensing capabilities as a complementary information source. The resulting Cyber–Physical–Social System (CPSS) can help to understand the real world and provide proactive services to users. The nature of CPSS data brings new requirements and challenges to different stages of data manipulation, including identification of data sources, processing and fusion of different types and scales of data. To gain an understanding of the existing methods and techniques which can be useful for a data-oriented CPSS implementation, this paper presents a survey of the existing research and commercial solutions. We define a conceptual framework for a data-oriented CPSS and detail the various solutions for building human–machine intelligence.
The Real World Internet, or the Web of Things, has brought an approach to integrate wireless sensor devices in a manner that is natural to the Web, where sensors are exposed as addressable web resources like any other web resource. However, there is still a clear deficiency with regard to managing the mobility of sensor devices in this approach, and to how it affects the service and the users interacting with it. The work presented here addresses this issue and aims to provide an approach towards maintaining service continuity of migrating sensor devices in a framework that builds upon the concept of the 'Web of Things'. © 2011 IEEE.
The IoT is increasingly being used to support smart spaces and physical analytics, and yet much of this smartness is made deliberately invisible to the user, echoing Weiser's vision of calm computing and technology that fades into the background. However, this means that users may not be aware of, or may not understand, how the IoT is being deployed in their area. In other domains we know that a lack of awareness and a lack of understanding can lead to poor user experience and frustration, mistrust, suspicion, an inability to capitalise on benefits, and security vulnerabilities. In this paper we present preliminary work that explores the issue of user awareness of IoT-based data collection.
Energy is a critical resource in the design of wireless networks since wireless devices are usually powered by batteries. Without new approaches to energy saving, 4G mobile users will relentlessly be searching for power outlets rather than network access, becoming once again bound to a single location. To avoid the so-called 4G "energy trap" and to help wireless devices become more environmentally friendly, there is a clear need for disruptive strategies addressing all aspects of power efficiency, from the user devices through to the core infrastructure of the network, and how these devices and equipment interact with each other. The ICT-C2POWER project is the vehicle that will address these issues through cognitive techniques and cooperation. The C2POWER case study is to research, develop and demonstrate energy saving technologies for multi-standard wireless mobile devices, exploiting the combination of cognitive radio and cooperative strategies, while still enabling the required performance in terms of data rate and QoS to support active applications. Copyright © 2010 The authors.
A cognitive radio opportunistically accesses spectrum bands under the constraint that it does not interfere with the licensed users. A cognitive radio performs spectrum sensing to find spectrum opportunities. Although a large number of spectrum sensing algorithms are available in the literature, the majority of them address static cognitive radios. In this paper, we study energy detection based local spectrum sensing in the presence of user mobility. We show that CR mobility improves spectrum sensing performance by exploiting spatial diversity. We propose a framework for local spectrum sensing in which a cognitive radio performs multiple spectrum measurements and makes a decision about the presence of the licensed user. An optimal fusion rule based on likelihood ratios is derived, and based on the test statistics a suitable detector and functional architecture of a cognitive radio are proposed. A closed-form expression for the number of spectrum measurement cycles is also derived under given performance constraints. ©2010 IEEE.
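The measurement-and-fusion loop can be sketched as follows, using a Gaussian approximation of the energy statistic and a sum-of-log-likelihood-ratios fusion rule over the measurement cycles. Sample counts, SNR and the number of cycles are illustrative assumptions, not the paper's parameters:

```python
# Illustrative sketch: energy detection over multiple measurement cycles,
# fused by summing per-cycle log-likelihood ratios (LLRs).
import numpy as np

rng = np.random.default_rng(1)
N, snr = 500, 0.5          # samples per cycle, linear SNR (assumed values)

def energy(signal_present):
    x = rng.standard_normal(N)                      # unit-variance noise
    if signal_present:
        x = x + np.sqrt(snr) * rng.standard_normal(N)  # Gaussian PU signal
    return np.sum(x ** 2)

def llr(e):
    # Gaussian approximation for large N: E ~ Normal(N*s2, 2*N*s2**2),
    # with per-sample variance s2 = 1 under H0 and 1 + snr under H1.
    def logn(e, m, v):
        return -0.5 * np.log(2 * np.pi * v) - (e - m) ** 2 / (2 * v)
    s0, s1 = 1.0, 1.0 + snr
    return logn(e, N * s1, 2 * N * s1**2) - logn(e, N * s0, 2 * N * s0**2)

def decide(present, cycles=8):
    # A mobile CR collects several measurements (at different positions)
    # and declares "licensed user present" when the summed LLR is positive.
    return sum(llr(energy(present)) for _ in range(cycles)) > 0

detect = np.mean([decide(True) for _ in range(50)])
false_alarm = np.mean([decide(False) for _ in range(50)])
```

With independent cycles, the sum-of-LLRs rule is the likelihood-ratio test for the combined measurements, which mirrors the optimal fusion the abstract refers to.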
We consider, in this paper, the maximization of throughput in a dense network of collaborative cognitive radio (CR) sensors with limited energy supply. In our case, the sensors are mixed varieties (heterogeneous) and are battery powered. We propose an ant colony-based energy-efficient sensor scheduling algorithm (ACO-ESSP) to optimally schedule the activities of the sensors to provide the required sensing performance and increase the overall secondary system throughput. The proposed algorithm is an improved version of the conventional ant colony optimization (ACO) algorithm, specifically tailored to the formulated sensor scheduling problem. We also use a more realistic sensor energy consumption model and consider CR networks employing heterogeneous sensors (CRNHSs). Simulations demonstrate that our approach improves the system throughput efficiently and effectively compared with other algorithms.
Roll-out of future cloud systems will be influenced by regulations from the standardisation bodies, if made available across the community. Trends in cloud deployment, operation and management to date have not been guided by any regulatory standards, and resources have been deployed in an ad hoc manner as demanded according to the business objectives of service providers. This is the least costly and most quickly revenue-returning business model. It is not, however, the most cost-effective approach on a long-term basis: as a consequence of this roll-out model to date, the interoperability of resources deployed across clouds managed by different operators is restricted through an inability to allocate workload to them in a regulated and controllable manner. The absence of standardised approaches to cloud management is therefore beginning to be accommodated, such that the cost and performance advantages of interoperable operation may be exploited. In this paper, we review the state-of-the-art in standards across the field and trends in their development. We present a model which defines the drivers for cloud interoperability and the constraints which restrict the extent to which this may realistically occur in future scalable solutions. This is supplemented with discussion on future challenges foreseen with regard to cloud operation and the way in which standards require provision such that cloud interoperation may be accommodated. © 2013 The Science and Information Organization.
Semantic modelling provides a potential basis for interoperating among different systems and applications in the Internet of Things (IoT). However, current work has mostly focused on IoT resource management rather than on the access and utilisation of information generated by the “Things”. We present the design of a comprehensive and lightweight semantic description model for knowledge representation in the IoT domain. The design follows the widely recognised best practices in knowledge engineering and ontology modelling. Users are allowed to extend the model by linking to external ontologies, knowledge bases or existing linked data. Scalable access to IoT services and resources is achieved through a distributed, semantic storage design. The usefulness of the model is also illustrated through an IoT service discovery method.
The requirements of analyzing heterogeneous data streams and detecting complex patterns in near real-time have raised the prospect of Complex Event Processing (CEP) for many internet of things (IoT) applications. Although CEP provides a scalable and distributed solution for analyzing complex data streams on the fly, it is designed for reactive applications as CEP acts on near real-time data and does not exploit historical data. In this regard, we propose a proactive architecture which exploits historical data using machine learning (ML) for prediction in conjunction with CEP. We propose an adaptive prediction algorithm called Adaptive Moving Window Regression (AMWR) for dynamic IoT data and evaluated it using a real-world use case with an accuracy of over 96%. It can perform accurate predictions in near real-time due to reduced complexity and can work along CEP in our architecture. We implemented our proposed architecture using open source components which are optimized for big data applications and validated it on a use-case from Intelligent Transportation Systems (ITS). Our proposed architecture is reliable and can be used across different fields in order to predict complex events.
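The moving-window regression idea can be sketched roughly as below. This is our simplification of the AMWR concept, not the paper's algorithm: a linear fit over a sliding window predicts the next point, and the window shrinks when recent error grows (the dynamics changed) and widens when error is low. Window bounds and tolerance are assumptions:

```python
# Hypothetical sketch of adaptive moving-window regression for streaming data.
import numpy as np

def amwr_predict(series, w_min=4, w_max=20, tol=0.5):
    w, preds = w_max, []
    for i in range(w_max, len(series)):
        window = series[i - w:i]
        t = np.arange(len(window))
        a, b = np.polyfit(t, window, 1)        # linear trend over the window
        pred = a * len(window) + b             # one-step-ahead extrapolation
        preds.append(pred)
        err = abs(pred - series[i])
        # Adapt the window: a large error suggests a regime change, so use
        # a shorter window to forget stale history faster.
        w = max(w_min, w - 2) if err > tol else min(w_max, w + 1)
    return np.array(preds)

series = np.linspace(0, 10, 60)                # a clean linear trend
preds = amwr_predict(series)
```

On a clean linear trend the one-step predictions are exact and the window stays wide; injecting a step change would drive the error up and shrink the window, which is the adaptive behaviour the abstract describes.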
One of the goals that can be achieved by providing adaptive web services from mobile hosts is to allow continuous service provisioning. However, there are limitations in terms of complexity and size of the services that may be executed on mobile hosts. In this paper, two steps are taken towards providing adaptive web services from resource limited mobile devices. The first step is to investigate mechanisms that facilitate distributing the execution of mobile web services; the main mechanisms are offloading and migration. The second step is to integrate these mechanisms with available web service architectures to produce an extended mobile web service framework. In this case we integrated them with both SOAP as well as REST. The paper describes the offloading and migration mechanisms as well as the implementation of a prototype that allows performance evaluation of both extended frameworks. To investigate the load and performance of the distributed services, the prototype implements resource intensive applications. The results presented show that basing distributed mobile-hosted services on REST is more suitable than using SOAP as underlying web service infrastructure. © 2011 IEEE.
To date, implementations of Internet of Things (IoT) architectures have been confined to particular application areas and tailored to meet only the limited requirements of their narrow applications. To overcome technology and sector boundaries this paper proposes a dynamic service creation environment that employs i) orchestration of business services based on re-usable IoT service components, ii) self-management capable components for automated configuration and testing of services for things, and iii) abstraction of the heterogeneity of underlying technologies to ensure interoperability. To ensure reliability and robustness the presented approach integrates self-testing and self-adaptation in all service life cycle phases. The service life cycle management distinguishes the IoT service creation phase (design-time) and the IoT service provision phase (run-time). For test-friendly service creation (1) semantic service descriptions are employed to derive services and related tests semi-automatically, (2) and testing is systematically integrated into a Service Creation Environment. For reliable and robust service provisioning the presented system (3) forces validation tests in a sandbox environment before deployment and (4) enables run-time monitoring for service adaptation. The system under test is modelled by finite state machines (FSM) that are semi-automatically composed of re-usable test components. Then path searching algorithms are applied to derive tests automatically from the FSM model. The resulting tests are specified in the test control notation TTCN-3 and compiled to run the validation tests. © 2012 IIMC Ltd.
The fifth-generation wireless communication networks (5G) facilitate a wide range of newly-emerging applications alongside existing cellular mobile broadband services. One of the key service classes of 5G is Ultra-Reliable and Low-Latency Communications (URLLC), which guarantees the rapid delivery of short packets (within 1 ms) with a success probability of 99.999%. The challenging reliability and latency requirements of URLLC cannot be delivered by existing cellular networks, resulting in the need for significant air interface modifications. This study aims to satisfy the link latency requirements of URLLC applications, and specifically to reduce the latency associated with the Hybrid Automatic Repeat reQuest (HARQ) feedback scheme. To this end, we investigate a supervised learning method to provide early HARQ (E-HARQ) feedback on the decodability status of the coded received signal, ahead of the decoding processing. This strategy allows the transmitter to react faster and minimize the signal round-trip time (RTT). The simulation results demonstrate the capability of the proposed mechanism to speed up feedback release and to enhance prediction accuracy by 12% through the introduction of a new feature derived from the channel state estimation.
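The E-HARQ idea, predicting decoding success from cheap pre-decoding features so that feedback can be sent before the decoder finishes, can be sketched with a toy link model. The repetition "code", the single feature and the labelling are our illustrative assumptions, not the paper's coded system or classifier:

```python
# Hypothetical sketch: supervised prediction of decodability before decoding.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

def sample(n=400, n_bits=32, reps=5):
    X, y = [], []
    for _ in range(n):
        snr_db = rng.uniform(-6, 6)
        sigma = 10 ** (-snr_db / 20)
        bits = rng.integers(0, 2, n_bits)
        # BPSK, repetition-coded transmission over an AWGN channel.
        rx = np.repeat(2 * bits - 1, reps) \
            + sigma * rng.standard_normal(n_bits * reps)
        # "Decoder": sum the soft values per bit and threshold; the label
        # records whether the whole block decoded correctly.
        dec = (rx.reshape(n_bits, reps).sum(axis=1) > 0).astype(int)
        ok = int(np.array_equal(dec, bits))
        # Pre-decoding feature: mean soft-value magnitude (a link-quality cue).
        X.append([np.mean(np.abs(rx))])
        y.append(ok)
    return np.array(X), np.array(y)

X_tr, y_tr = sample()
X_te, y_te = sample()
clf = LogisticRegression().fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

A confident "will fail" prediction lets the transmitter schedule the retransmission early, cutting the HARQ round-trip time, at the cost of occasional unnecessary retransmissions when the predictor errs.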
Ubiquitous services have gained increasing attention in the area of mobile communication, aiming to allow service access anywhere, anytime and anyhow while keeping complexity to a minimum for both users and service providers. Ubiquitous environments feature a wide range and an increasing number of access devices and network technologies. Context-aware content/service adaptation is deemed necessary to ensure the best user experience. We developed an Adaptation Management Framework (AMF) Web Service which manages the complexity of dynamic and autonomous content adaptation and serves as an invisible enabler for ubiquitous service delivery. It remains challenging to manage the tasks involved in the communication between the AMF Web Service and the user's environment, typically represented by various types of intelligent agents. This work presents a middleware which manages those tasks and serves not only as a protocol gateway, but also as a message translator, a service broker, a complexity shield, etc., between AMF Web Services and User Agents.
While ultra-reliable and low latency communication (uRLLC) is expected to cater to emerging services requiring real-time control, such as factory automation and autonomous driving, the design of uRLLC under stringent requirements is very challenging. Among novel solutions to satisfy uRLLC's requirements, interface diversity is widely regarded as an efficient enabler of ultra-reliable connectivity. When mobile devices are connected to multiple base stations (BSs) of different radio access technologies (RATs) and the same data is transmitted via multiple links simultaneously, transmission reliability can be improved. However, duplicate transmission of the same data increases the traffic load, leading to radio resource shortage. Efficient configuration of multi-connectivity (MC) for mobile devices is therefore important. In this paper, a RAT selection scheme including efficient MC configuration is proposed. By adopting distributed reinforcement learning (RL), each device learns a policy for efficient MC configuration and selects appropriate RATs. Simulation results show a 20.8% reliability improvement over the single-connectivity scheme. Compared to the method of configuring MC for all devices all the time, a 37.6% improvement is achieved at high traffic loads.
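The per-device learning loop can be reduced to an epsilon-greedy sketch, a bandit-style simplification of distributed RL rather than the paper's algorithm; the RAT names, success probabilities and hyper-parameters are invented:

```python
# Hypothetical sketch: one device learning which RAT to select, with
# decaying exploration and an incremental value update.
import random

random.seed(3)
rates = {"wifi": 0.6, "nr": 0.9}            # assumed per-RAT success prob.
q = {r: 0.0 for r in rates}                 # learned value per RAT

for step in range(2000):
    eps = max(0.05, 1.0 - step / 1000)      # explore a lot early, less later
    if random.random() < eps:
        rat = random.choice(list(q))        # explore
    else:
        rat = max(q, key=q.get)             # exploit the best-known RAT
    reward = 1.0 if random.random() < rates[rat] else 0.0
    q[rat] += 0.1 * (reward - q[rat])       # incremental Q-style update
```

In the multi-connectivity setting the action space would instead be a set of RATs per device, and the reward would penalise redundant duplicates under high load, but the learn-then-exploit loop is the same.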
Network scenarios beyond 3G assume the cooperation of operators with wireless access networks of different technologies in order to improve scalability and provide enhanced services to their mobile customers. While the selection of an optimised delivery path in such scenarios with multiple access networks is already a challenging task for unicast delivery, the problem becomes more severe for multicast services, where a potentially large group of heterogeneous receivers has to be served simultaneously via shared resources. In this paper we study the problem of selecting the optimal bearer paths for multicast services with groups of heterogeneous receivers in wireless networks with overlapping coverage. We propose an algorithm for bearer selection with different optimisation goals, demonstrating the existing tradeoff between user preference and resource efficiency.
This paper focuses on service clustering and uses service descriptions to construct probabilistic models for service clustering. We discuss how service descriptions can be enriched with machine-interpretable semantics and then investigate how these service descriptions can be grouped in clusters in order to make discovery, ranking, and recommendation faster and more effective. We propose using Probabilistic Latent Semantic Analysis (PLSA) and Latent Dirichlet Allocation (LDA) (i.e. two machine learning techniques used in Information Retrieval) to learn latent factors from the corpus of service descriptions and group services according to their latent factors. By creating an intermediate layer of latent factors between the services and their descriptions, the dimensionality of the model is reduced and services can be searched and linked together based on probabilistic methods in latent space. The model can cluster any newly added service with a direct calculation, without requiring the latent variables to be re-calculated or the model to be re-trained.
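The "direct calculation" for a newly added service can be sketched as follows: once topic-word probabilities (the latent factors) are learned, a new description is assigned to the topic with the highest posterior score, with no retraining. The topic-word values below are hand-set toy numbers, not a trained LDA model.

```python
import math

# Hedged sketch: cluster a new service description against fixed latent
# factors (topic-word probabilities). Values are illustrative, not learned.
topics = {
    "messaging": {"send": 0.4, "message": 0.4, "email": 0.15, "map": 0.05},
    "location":  {"map": 0.45, "route": 0.35, "send": 0.1, "message": 0.1},
}

def cluster(description):
    """Assign argmax_z p(z) * prod_w p(w|z), in log space for stability."""
    prior = 1.0 / len(topics)                 # uniform topic prior
    scores = {}
    for z, word_probs in topics.items():
        log_p = math.log(prior)
        for w in description.split():
            log_p += math.log(word_probs.get(w, 1e-6))  # floor for unseen words
        scores[z] = log_p
    return max(scores, key=scores.get)
```

A new description is folded in with one pass over its words, which is why adding a service does not require re-running the training.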
In pervasive environments, the availability and reliability of a service cannot always be guaranteed. In such environments, automatic and dynamic mechanisms are required to compose services or to compensate for a service that becomes unavailable at runtime. Most of the existing work on service composition does not provide sufficient support for automatic service provisioning in pervasive environments. We propose a Divide and Conquer algorithm that can be used at service runtime to repeatedly divide a service composition request into several simpler sub-requests. The algorithm repeats until, for each sub-request, we find at least one atomic service that meets the requirements of that sub-request. The identified atomic services can then be used to create a composite service. We discuss the technical details of our approach and show evaluation results based on a set of composite service requests. The results show that our proposed method performs effectively in decomposing a composite service request into a number of sub-requests and in finding and matching service components that can fulfill the service composition request.
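The divide-and-conquer idea can be sketched as follows: a composite request (here a set of required capabilities) is split until each sub-request matches an atomic service in a registry. The registry contents and capability strings are hypothetical, for illustration only.

```python
# Hedged sketch of divide-and-conquer service decomposition. The registry
# and capability names are invented examples, not from the paper.
registry = {
    "geocode": {"address->coords"},
    "weather": {"coords->forecast"},
    "notify":  {"forecast->sms"},
}

def decompose(request):
    """Return a list of atomic services covering `request`, or None."""
    # base case: a single-capability sub-request must match an atomic service
    if len(request) == 1:
        for name, caps in registry.items():
            if request <= caps:
                return [name]
        return None
    # divide: split the request in half, conquer each part recursively
    items = sorted(request)
    mid = len(items) // 2
    left = decompose(set(items[:mid]))
    right = decompose(set(items[mid:]))
    if left is None or right is None:
        return None
    return left + right

plan = decompose({"address->coords", "coords->forecast", "forecast->sms"})
```

The recursion bottoms out exactly when every sub-request is covered by at least one atomic service, mirroring the termination condition described in the abstract.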
This paper introduces a new scheme called Green MPLS Fast ReRoute (GMFRR) for enabling energy-aware traffic engineering. The scheme intelligently exploits backup label switched paths, originally used for failure protection, in order to achieve energy saving during the normal failure-free operation period. GMFRR works in an online and distributed fashion where each router periodically monitors its local traffic condition and cooperatively determines how to efficiently reroute traffic onto the backup paths in order to exploit opportunities for power saving through link sleeping in the primary paths. According to our performance evaluations based on the academic network GEANT and its traffic matrices, GMFRR is able to achieve significant power saving gains, which are within 15% of the theoretical upper bound.
In fifth generation and beyond (5G/B5G) communications, wireless networks are evolving towards offering services for various use cases and therefore need to span a wide range of requirements. While different services will be supported at the same time, radio resource management needs to consider their differing requirements. In addition, as wireless systems are capable of supporting multi-connectivity, radio resource allocation becomes more challenging. In this context, we introduce a many-to-many matching game and develop a distributed radio resource allocation algorithm supporting multi-connectivity. Simulation results demonstrate that the proposed approach improves the QoS levels of UEs by up to 14.9% considering their service requirements.
Cities have an ever increasing wealth of sensing capabilities, recently including also internet of things (IoT) systems. However, to fully exploit such sensing capabilities with the aim of offering effective city-sensing-driven applications still presents certain obstacles. Indeed, at present, the main limitation in this respect consists of the vast majority of data sources being served on a “best effort” basis. To overcome this limitation, we propose a “resilient and adaptive IoT and social sensing platform”. Resilience guarantees the accurate, timely and dependable delivery of the complete/related data required by smart-city applications, while adaptability is introduced to ensure optimal handling of the changing requirements during application provision. The associated middleware consists of two main sets of functionalities: (a) formulation of sensing requests: selection and discovery of the appropriate data sources; and (b) establishment and control of the necessary resources (e.g., smart objects, networks, computing/storage points) on the delivery path from sensing devices to the requesting applications. The middleware has the intrinsic feature of producing sensing information at a certain level of detail (geographical scope/timeliness/accuracy/completeness/dependability) as requested by the applications in a given domain. The middleware is assessed and validated at a proof-of-concept level through innovative, dependable and real-time applications expected to be highly reproducible across different cities.
This intelligent multimedia adaptation and delivery framework is tailored to ubiquitous environments, so that users can experience multimedia content using multiple devices in various mobility situations. Multidevice environments offer the potential to enhance the user experience in terms of flexibility and interactivity and will enable novel applications in education, entertainment, collaboration, and communication. We analyzed the different processing steps and defined related framework functionalities such as the generation of the presentation schedule, the computation of the presentation-environment matches, personalization through situation learning, and device(s)-tailored presentation delivery.
The Internet of Things (IoT) paradigm aims to realize heterogeneous physical world objects interacting with each other and with the surrounding environment. In this context, the automatic provisioning of the varied possible interactions and bridging them with the digital world is a key pertinent issue for enabling novel IoT applications. The introduction of description logic-based semantics to provide homogeneous descriptions of object capabilities lowers this heterogeneity and enables a limited set of interactions (such as those with stationary objects with fixed availability) to be deduced using classical reasoning systems. However, the inability of such semantics to capture the dynamics of an IoT system, as well as the scalability issues that reasoning systems encounter if too many descriptions have to be processed, necessitate that such approaches be used in conjunction with others. Towards this aim, this paper proposes an automated rule-based association mechanism for integrating the digital IoT components with physical entities along temporal-spatial-thematic axes. To address the scalability issue, this mechanism is distributed over a federated network of nodes, each embodying a set of objects located in the same geographical area. Nodes covering nearby geographical areas can share their object descriptions, while all nodes are capable of deducing interactions between the descriptions that they are aware of.
In existing energy-efficient clustering algorithms for Wireless Sensor Networks (WSNs), individual nodes usually experience significant differences in lifetime. The issue of some nodes depleting their energy earlier than others is usually referred to as the hot-spot issue in WSNs, which dramatically shortens the stable operation period of a network during which all nodes are alive with residual energy. This paper addresses the hot-spot issue by equalizing individual node lifetimes throughout the network. The probability of a node becoming cluster-head (CH) in this algorithm depends on the node's distance to the sink and is subject to individual node-lifetime equalization. When selecting CHs, the residual node energy is considered as well. Performance evaluation illustrates the effectiveness of our algorithm in terms of extending the stable operation period of clustered WSNs.
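The general shape of a distance- and energy-aware CH election probability can be sketched as below. The specific functional form (and even the sign of the distance dependence) is an illustrative assumption, not the paper's equalization formula.

```python
# Hedged sketch: a toy CH election probability that grows with distance to
# the sink and with residual energy. Both the form of the function and the
# direction of the distance dependence are assumptions for illustration.
def ch_probability(p_opt, dist, max_dist, residual, initial):
    """p_opt scaled by normalised distance and normalised residual energy."""
    return p_opt * (dist / max_dist) * (residual / initial)
```

The intent is only to show the two inputs the abstract names (distance to sink, residual energy) entering a single per-node election probability; a real scheme would calibrate the dependence to equalize node lifetimes.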
Gateways in sensor networks are used to relay, aggregate and communicate information from capillary networks to more capable (e.g. IP-based) networks. However, Gateway-to-Gateway (G2G) communication to exchange and update information among the gateways in large-scale sensor networks, for query processing, data fusion and other similar tasks, has been less discussed in recent works. The requirements of large-scale sensor networks, such as dynamic topology and update strategies to reduce the overall network load, make G2G communication an important aspect of network design. In this paper, we introduce a mediated gossip-based G2G communication mechanism. The proposed solution leverages the publish/subscribe approach and uses high-level context assigned to publish/subscribe channels to enable information discovery and G2G communication. Gateways store/aggregate sensor observation and measurement data according to a specific context which is defined based on features such as spatial and temporal attributes, observed phenomena (i.e. feature of interest) and sensor device features. The gateways communicate with each other to exchange data and also to forward related queries for data aggregation in cases where the data should be aggregated from two different sources. The proposed solution also facilitates reliable sensor service provisioning by enabling gateways to communicate and/or forward requests to other gateways when a resource fails or a sensor node becomes unavailable. We compare our results to probabilistic gossiping algorithms and run benchmarks on different dynamic network topologies based on indicators such as the number of sent messages and dissemination delay.
In this article, a cognitive management framework is proposed for ensuring the exploitation of the Future Internet of Things (FIoT). Cognitive systems offer self-x capabilities and learning. A cognitive system has the ability to dynamically select its behavior through self-management/awareness functionality, taking into account information and knowledge on the context of the operations as well as policies, including the generation of the context itself. The framework is based on the principle that any real-world object and any digital object that is available, accessible, observable or controllable can have a virtual representation in the Future Internet, called a Virtual Object (VO). Basic VOs can be composed into more sophisticated Composite VOs (CVOs), which provide services to high-level applications and end-users. The described paradigm is applied to various application scenarios: smart home, smart office, smart city and smart business. This paper presents some background in IoT, identifies the requirements and challenges, and sets the directions that should be followed.
The ability to manage the distributed functionality of large multi-vendor networks will be an important step towards ultra-dense 5G networks. Managing distributed scheduling functionality is particularly important, due to its influence over inter-cell interference and the lack of standardization for schedulers. In this paper, we formulate a method of managing distributed scheduling methods across a small cluster of cells by dynamically selecting schedulers to be implemented at each cell. We use deep reinforcement learning methods to identify suitable joint scheduling policies, based on the current state of the network observed from data already available in the RAN. Additionally, we also explore three methods of training the deep reinforcement learning based dynamic scheduler selection system. We compare the performance of these training methods in a simulated environment against each other, as well as homogeneous scheduler deployment scenarios, where each cell in the network uses the same type of scheduler. We show that, by using deep reinforcement learning, the dynamic scheduler selection system is able to identify scheduler distributions that increase the number of users that achieve their quality of service requirements in up to 77% of the simulated scenarios when compared to homogeneous scheduler deployment scenarios.
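A much-simplified stand-in for the dynamic scheduler selection idea can be sketched with tabular Q-learning: the agent observes a coarse network state and learns which joint per-cell scheduler assignment maximizes a QoS reward. The state, the two scheduler types, and the toy reward model are all assumptions; the paper uses a deep neural network over RAN observations.

```python
import itertools, random

random.seed(1)
# Hedged sketch: tabular Q-learning stand-in for the deep RL scheduler
# selector. Schedulers, states and the reward model are toy assumptions.
SCHEDULERS = ["PF", "RR"]          # e.g. proportional fair / round robin
CELLS = 2
ACTIONS = list(itertools.product(SCHEDULERS, repeat=CELLS))  # joint policy

def qos_satisfied(action, high_load):
    # toy reward: fraction of cells whose scheduler suits the load level
    # (PF assumed better under high load, RR sufficient otherwise)
    return sum((s == "PF") == high_load for s in action) / CELLS

q = {(hl, a): 0.0 for hl in (False, True) for a in ACTIONS}
for _ in range(2000):
    hl = random.random() < 0.5                          # observed state
    if random.random() < 0.2:                           # epsilon-greedy
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda x: q[(hl, x)])
    q[(hl, a)] += 0.3 * (qos_satisfied(a, hl) - q[(hl, a)])

best_high = max(ACTIONS, key=lambda a: q[(True, a)])
best_low = max(ACTIONS, key=lambda a: q[(False, a)])
```

After training, the learned policy selects a different joint scheduler distribution per network state, which is the core mechanism the abstract describes (here with a lookup table instead of a DNN).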
In this paper we present an analytical framework that aims to improve the energy efficiency of traffic offloading via Wireless Local Area Networks, taking into account the energy consumption for both data transmission and network discovery operations. More specifically, the network scanning period is optimized in order to minimize the energy consumption in a vehicular scenario where a user moves along a road covered by a long range cellular network and a number of randomly deployed Wireless Local Area Networks. The performance of the system that performs periodic network scanning with the optimal period is compared against a sub-optimal system that does not take into consideration the user and network context information when determining the network scanning period. According to performance evaluation results, the use of the optimal network scanning period achieves significant improvement in terms of energy consumption and network detection delay.
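The trade-off being optimized can be sketched with a toy energy model: scanning more often costs scan energy, while scanning rarely means staying on the power-hungry cellular link longer after entering WLAN coverage. The model and all parameter values below are illustrative assumptions, not the paper's analytical framework.

```python
import math

# Hedged sketch: toy per-unit-time energy model for the WLAN scan period T.
# e_scan: energy per scan; rate_wlan: rate of entering WLAN coverage;
# extra_power: extra cellular power while a WLAN goes undetected (~T/2 on
# average). All values are illustrative.
def energy_rate(T, e_scan=0.5, rate_wlan=0.02, extra_power=0.8):
    return e_scan / T + rate_wlan * (T / 2.0) * extra_power

def optimal_period(e_scan=0.5, rate_wlan=0.02, extra_power=0.8):
    # setting d(energy_rate)/dT = 0 gives T* = sqrt(2*e_scan/(rate*power))
    return math.sqrt(2.0 * e_scan / (rate_wlan * extra_power))

t_star = optimal_period()
```

The closed form makes the abstract's point concrete: the optimal period depends on user/network context (how often WLANs are encountered, power gap between interfaces), which a context-oblivious fixed period cannot capture.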
Multimedia services are becoming increasingly popular among mobile users. Ontologies and related technologies have been introduced into the multimedia domain as a means to provide declarative formal representations of domain knowledge and thus to enable intelligent multimedia processing, such as media format adaptation. The range of devices available to access media content is becoming increasingly heterogeneous and, at the same time, ubiquitous. Users expect to access their services and content without restrictions in time or location. Users have many different gadgets/devices with network connectivity at their disposal to receive content, ranging from smart phones and car audio systems to laptops and office PCs. Hence there is a need to link the discovery and description of these ambient devices with multimedia domain knowledge representations in order to facilitate a ubiquitous multimedia experience. The contribution of this work is an approach for mapping device descriptions, obtained via the resource discovery protocol UPnP, to OWL ontology instances. The ontology instances chosen are compliant with the MPEG-21 DIA OWL-formatted ontology. This approach bridges the gap between the non-semantic description mechanisms of the legacy device/service discovery protocol and the semantic multimedia domain knowledge representation.
As sensors are adopted in almost all fields of life, the Internet of Things (IoT) is triggering a massive influx of data. We need efficient and scalable methods to process this data to gain valuable insight and take timely action. Existing approaches which support both batch processing (suitable for analysis of large historical data sets) and event processing (suitable for real-time analysis) are complex. We propose the hut architecture, a simple but scalable architecture for ingesting and analyzing IoT data, which uses historical data analysis to provide context for real-time analysis. We implement our architecture using open source components optimized for big data applications and extend them where needed. We demonstrate our solution on two real-world smart
This paper formulates an optimization problem that maximizes an aggregate utility capturing the "in-context" suitability of available radio access technologies (RATs) to support adaptive video streaming, subject to a single-homing constraint. To efficiently solve the considered problem, a novel network-assisted quality-of-experience (QoE)-driven methodology is devised, and its impact on the end-user devices is evaluated. The proposed approach is evaluated and benchmarked against its distributed and centralized counterparts from a cost-benefit perspective. The results reveal that the proposed strategy significantly outperforms its distributed counterpart, and performs differently with respect to its centralized counterpart depending on the number of video clients. At low loads, it performs similarly with much less control overhead. At high loads, the proposed strategy scales up well, while the centralized approach gets overwhelmed by increasing uplink signaling. A practicality analysis of the proposed strategy for battery-powered devices reveals that its gain in terms of uplink signaling outweighs its cost in terms of processing load, which results in a drastic reduction of the consumed energy. Therefore, the proposed solution provides a win-win situation, where the video clients can sustain good QoE levels at reduced energy consumption, while the network can accommodate more users with existing capacity.
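The structure of the optimization can be sketched as below: each client gets exactly one RAT (single-homing), and the assignment maximizing aggregate utility is selected. The utilities are toy "in-context suitability" scores and capacity effects are ignored; this brute-force search only illustrates the problem shape, not the paper's efficient methodology.

```python
import itertools

# Hedged sketch: aggregate-utility maximization under single-homing.
# utility[client][rat] values are invented illustrative scores.
utility = {
    "c1": {"LTE": 0.9, "WLAN": 0.6},
    "c2": {"LTE": 0.5, "WLAN": 0.8},
    "c3": {"LTE": 0.7, "WLAN": 0.7},
}

def best_assignment(utility):
    """Exhaustively search single-homed assignments for the max total utility."""
    clients = sorted(utility)
    rats = sorted({r for u in utility.values() for r in u})
    best, best_u = None, float("-inf")
    for combo in itertools.product(rats, repeat=len(clients)):
        total = sum(utility[c][r] for c, r in zip(clients, combo))
        if total > best_u:
            best, best_u = dict(zip(clients, combo)), total
    return best, best_u

assignment, total = best_assignment(utility)
```

The exponential search space (|RATs|^|clients|) makes plain why a distributed or network-assisted heuristic, as studied in the paper, is needed at scale.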
With the development of 5G and beyond communication technologies and recent achievements in autonomous driving, technical solutions to improve road safety have attracted great attention. In this paper, we present a collision avoidance system implemented on a 1/10-scale vehicle, as a research platform for autonomous driving connected to a vehicular network. While the collision avoidance system exploits data fusion to make decisions relevant to predicting potential collision events, the effectiveness of fusing data obtained from in-vehicle sensors and vehicular communication is evaluated within a testbed environment.
This paper proposes a novel context-aware, user-driven strategy to efficiently exploit all available bands and licensing regimes in ultra-dense deployments without prior knowledge about each combination. It relies first on fuzzy logic to estimate the suitability of each radio access technology (RAT) to support the requirements of various applications. Then, a fuzzy multiple attribute decision making (MADM) approach is developed to combine these estimates with the heterogeneous context components to assess the in-context suitability. Based on this metric, a spectrum management strategy is proposed to support interactive video sessions for a set of Bronze and Gold subscriptions. The results reveal that the proposed approach always assigns Gold users to the well-regulated licensed band, while switching Bronze users between licensed and unlicensed bands depending on the operating conditions. This results in a significant improvement of the quality-of-experience (QoE) compared to a baseline that exploits only licensed bands. Then, a comparative study is conducted between the available options to exploit unlicensed bands, namely Offloading and Sharing. The results show that the best option strongly depends on the existing load on the WLAN. Therefore, a combined approach is proposed to efficiently switch between both options, which achieves the best QoE for all considered loads.
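A minimal MADM-style scoring step can be sketched as a weighted aggregation of fuzzy membership degrees per attribute. The attributes, membership values and weights below are illustrative assumptions, not the paper's fuzzy inference system.

```python
# Hedged sketch: combine per-attribute fuzzy membership degrees (how well a
# band/RAT satisfies each requirement, in [0, 1]) into one suitability score.
# Memberships and weights are invented for illustration.
def suitability(memberships, weights):
    """Weighted average of fuzzy membership degrees."""
    total_w = sum(weights.values())
    return sum(memberships[k] * w for k, w in weights.items()) / total_w

weights = {"throughput": 0.5, "latency": 0.3, "cost": 0.2}
licensed = suitability({"throughput": 0.9, "latency": 0.9, "cost": 0.4}, weights)
unlicensed = suitability({"throughput": 0.7, "latency": 0.5, "cost": 0.9}, weights)
```

With these toy numbers the licensed band scores higher, consistent with the abstract's observation that QoS-sensitive (Gold) users are kept on the well-regulated licensed band; changing the weights (e.g. raising the cost weight for Bronze users) flips the decision.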
This paper addresses the current regulatory framework, research activities and standardization efforts towards a shared use of radio spectrum in the European Union. Two main research streams are considered: the emerging concept of Licensed Shared Access, which ensures a predictable Quality of Service for secondary users of spectrum, and opportunistic Device-to-Device communications, which represent a recent and enormous socio-technological trend. The paper also presents research results that support the spectrum sharing standardization path in ETSI and 3GPP.
This paper presents a task allocation-oriented framework to enable efficient in-network processing and cost-effective multi-hop resource sharing for dynamic multi-hop multimedia wireless sensor networks with low node mobility, e.g., pedestrian speeds. The proposed system incorporates a fast task reallocation algorithm to quickly recover from possible network service disruptions, such as node or link failures. An evolutional self-learning mechanism based on a genetic algorithm continuously adapts the system parameters in order to meet the desired application delay requirements, while also achieving a sufficiently long network lifetime. Since the algorithm runtime incurs considerable time delay while updating task assignments, we introduce an adaptive window size to limit the delay periods and ensure an up-to-date solution based on node mobility patterns and device processing capabilities. To the best of our knowledge, this is the first study that yields multi-objective task allocation in a mobile multi-hop wireless environment under dynamic conditions. Simulations are performed in various settings, and the results show considerable performance improvement in extending network lifetime compared to heuristic mechanisms. Furthermore, the proposed framework provides noticeable reduction in the frequency of missing application deadlines.
Spectrum sensing is one of the crucial aspects of Cognitive Radio (CR). Fast and accurate spectrum opportunity detection provides interference avoidance for other/licensed users. At the same time, it offers more efficient spectrum utilization by providing accurate sensing information as an input to the intelligent dynamic resource allocation process. Wideband spectrum sensing has been introduced due to higher bandwidth demand and increasing spectrum scarcity, since it provides a better chance of detecting spectrum opportunities. In this paper, the application of wavelet transform techniques for wideband spectrum opportunity detection in CRs is documented. Wavelet analysis is used in the two-step, or multi-resolution, opportunity detection proposed here. Edge detection using wavelet analysis is employed in the first step to identify possibly available subband(s). Fine analysis is done in the second step for each chosen subband using the wavelet transform, in order to detect any non-stationary signal which may be present in the chosen subband(s). With this two-step process, detection time can be reduced while maintaining detection accuracy. The paper presents the research approach and an experimental study, which involves the development of the test platform used to obtain real-time spectrum sensing results and the software tool used for opportunity detection. The experimental results prove the practicality and accuracy of the approach.
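The first step (coarse edge detection on the power spectral density) can be sketched with a Haar-like sliding-window difference: sub-band boundaries show up as large responses where the local average power jumps. The synthetic PSD and threshold are illustrative, not from the paper's testbed.

```python
# Hedged sketch: Haar-like edge detection on a power spectral density (PSD),
# flagging sub-band boundaries where local average power changes sharply.
def haar_edge_response(psd, scale=2):
    """Right-window mean minus left-window mean (a Haar wavelet correlation)."""
    out = []
    for i in range(scale, len(psd) - scale + 1):
        left = sum(psd[i - scale:i]) / scale
        right = sum(psd[i:i + scale]) / scale
        out.append(right - left)
    return out

def detect_edges(psd, scale=2, thresh=6.0):
    resp = haar_edge_response(psd, scale)
    return [j + scale for j, r in enumerate(resp) if abs(r) > thresh]

# synthetic PSD: noise floor, an occupied band (bins 4-7), noise again
psd = [1, 1, 1, 1, 10, 10, 10, 10, 1, 1, 1, 1]
edges = detect_edges(psd)
```

The detected rising and falling edges delimit a candidate sub-band; only that sub-band would then be passed to the finer second-step analysis, which is where the two-step scheme saves detection time.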
In this paper, we present a method that facilitates the Internet of Things (IoT) for building a product passport and data exchange enabling the next stage of the circular economy. SmartTags based on printed sensors (i.e., using functional ink) and a modified GS1 barcode standard enable unique identification of objects on a per-item level (including Fast-Moving Consumer Goods—FMCG), collecting, sensing, and reading of parameters from the environment, as well as tracking a product's lifecycle. The developed ontology is the first effort to define a semantic model for dynamic sensors, including datamatrix and QR codes. The evaluation of decoding and readability of identifiers (QR codes) showed good performance for detection of sensor state printed over and outside the QR code data matrix, i.e., recognition with an image vision algorithm was possible. The evaluation of the decoding performance of the QR code data matrix printed with sensors was also efficient, i.e., the QR code remained decodable by the reader after the reversible and irreversible process of ink (dis)appearing, with a slight drop in performance if the ink density is low.
The range of content and services on the Internet, the diversity of terminals and the heterogeneity of network technologies to access these services make it a necessity to implement some means to adapt both content and services to meet the needs of the communication environment of the service consumer. An adaptation management framework to support content/service adaptation will take away the complexity of the delivery of compound services from the user and content/service provider. This paper introduces such a context-aware content/service adaptation management framework which uses knowledge-based semantic Web and Web service technologies to facilitate the interoperability between the different technologies and domains involved in content/service adaptation, e.g. the user's context, the content/service and the available adaptation tools. This work, being part of Mobile VCE (MVCE) Ubiquitous Services project, advances the knowledge-based approaches by, firstly, proposing a distributed architecture for such a framework to accelerate its real-world deployment, and secondly, providing mechanisms for adaptation decision making and adaptation tools selection. This paper provides an overview of the framework architecture and outlines the functionalities of its constituent components, i.e. Adaptation Manager and Content Adaptor, as well as the communication mechanisms applied.
The immediacy of social media messages means that it can act as a rich and timely source of real world event information. The detected events can provide a context to observations made by other city information sources such as fixed sensor installations and contribute to building ‘city intelligence’. In this work, we propose a novel unsupervised method to extract real world events that may impact city services such as traffic, public transport, public safety etc., from Twitter streams. We also develop a named entity recognition model to obtain the precise location of the related events and provide a qualitative estimation of the impact of the detected events. We apply our developed approach to a real world dataset of tweets collected from the city of London.
Future cellular systems demand higher throughput as an important requirement, along with smaller cell sizes, to characterize the performance of network services. This paper proposes a way to optimize multihop cellular network (MCN) deployment in LTE-A/Mobile WiMAX broadband wireless access systems. A simple way to optimize the MCN is to associate direct and multihop users based on maximum channel quality and allocate the resource blocks dynamically based on traffic load balancing as adjustment variables. The changing traffic demands require dynamic network reconfiguration to maintain proportional fairness in achieving the throughput. A self-optimizing network based on a genetic algorithm (GA) is made to adaptively resize the cell coverage limit and dynamically allocate resources based on active user demands. A policy control scheme to control resource allocation between direct and multihop users can be either fixed resource allocation (FRA) or dynamic resource allocation (DRA).
In this paper we present a co-primary spectrum sharing algorithm for the Quality of Service (QoS) enhancement of uplink Single-Carrier Frequency Division Multiple Access (SC-FDMA) systems. We consider the limitations resulting from the fact that each user can only be provided with contiguous sets of resource blocks (following the constraints of the localized SC-FDMA physical layer), and the effect of limited, or even absent, knowledge of each user's buffer status and packet delays in the uplink. The sharing of available resources is based on the operator spectrum access priority, an estimation of the packet delays in the uplink direction, the average delay and data rate of earlier allocations, and the power per resource block. Simulation results show that the proposed algorithm considerably improves performance in terms of packet loss rate, goodput, and fairness.
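The contiguity constraint of localized SC-FDMA can be sketched as a windowed search: for one user, only contiguous runs of free resource blocks (RBs) are admissible, and the best-quality run of the required width is chosen. The quality values and scoring are illustrative assumptions, not the paper's allocation metric.

```python
# Hedged sketch: pick the best contiguous window of free resource blocks
# for one user, respecting the localized SC-FDMA contiguity constraint.
# Per-RB quality values are invented for illustration.
def best_contiguous(rb_quality, free, width):
    """Return (start_index, score) of the best all-free window, or None."""
    best = None
    for start in range(len(rb_quality) - width + 1):
        window = range(start, start + width)
        if all(free[i] for i in window):          # contiguity: no gaps allowed
            score = sum(rb_quality[i] for i in window)
            if best is None or score > best[1]:
                best = (start, score)
    return best

quality = [3, 5, 9, 9, 2, 8, 8, 1]
free = [True, True, True, False, True, True, True, True]
alloc = best_contiguous(quality, free, 2)
```

Note how the high-quality pair at indices 2-3 is unusable because RB 3 is taken: the contiguity constraint, not raw quality, determines the feasible set, which is exactly the limitation the abstract highlights.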
Architecture Description Languages enable the formalization of the architecture of systems and the execution of preliminary analysis on them, aiming at the identification and resolution of design problems in the early stages of development. Such problems can be incompatibilities and mismatches in the connections between system components and in the format and type of information exchanged between them. Architecture Description Languages were initially developed to validate the correctness of software architectures; however, their applicability has been extended to cover many diverse areas during the past few years. In this paper, we aim to show how Architecture Description Languages can be applied to, and be a useful tool for, validating the correctness of architectures and configurations of future internet networking environments. We do so by using a recently proposed architectural approach and a recently proposed deployment approach, implemented by means of network virtualization, as case studies.
An important emerging capability is for mobile terminals to be dynamically reconfigured. Through ongoing advances in technology such as software defined radio, reconfiguration of mobile terminals will in the near future be achievable across all layers of the protocol stack. However, along with the capability for such wide-ranging reconfiguration comes the need to manage reconfiguration procedures. This is necessary to coordinate reconfigurations, to ensure that there are no negative effects (e.g. interference to other RATs) as a result of reconfigurations, and to leverage maximal potential benefits of reconfiguration and ensuing technologies such as those involving dynamic spectrum access. The IEEE P1900.4 working group is therefore defining three building blocks for reconfiguration management: Network Reconfiguration Management (NRM), Terminal Reconfiguration Management (TRM), and a radio enabler to provide connectivity between the NRM and TRMs. In this paper we concentrate on aspects of the radio enabler, highlighting its relevance in heterogeneous radio access scenarios, its advantages, and some aspects of its technical realization.
Understanding home activities is important in social research to study aspects of home life, e.g., energy-related practices and assisted living arrangements. Common approaches to identifying which activities are being carried out in the home rely on self-reporting, either retrospectively (e.g., interviews, questionnaires, and surveys) or at the time of the activity (e.g., time use diaries). The use of digital sensors may provide an alternative means of observing activities in the home. For example, temperature, humidity and light sensors can report on the physical environment where activities occur, while energy monitors can report information on the electrical devices that are used to assist the activities. One may then be able to infer from the sensor data which activities are taking place. However, it is first necessary to calibrate the sensor data by matching it to activities identified from self-reports. The calibration involves identifying the features in the sensor data that correlate best with the self-reported activities. This in turn requires a good measure of the agreement between the activities detected from sensor-generated data and those recorded in self-reported data. To illustrate how this can be done, we conducted a trial in three single-occupancy households from which we collected data from a suite of sensors and from time use diaries completed by the occupants. For sensor-based activity recognition, we demonstrate the application of Hidden Markov Models with features extracted from mean-shift clustering and change points analysis. A correlation-based feature selection is also applied to reduce the computational cost. A method based on Levenshtein distance for measuring the agreement between the activities detected in the sensor data and that reported by the participants is demonstrated. We then discuss how the features derived from sensor data can be used in activity recognition and how they relate to activities recorded in time use diaries.
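The agreement measure between sensor-detected and diary-reported activity sequences can be sketched with a standard Levenshtein distance, normalised to [0, 1]. The activity labels are invented examples; the paper's exact normalisation may differ.

```python
# Hedged sketch: Levenshtein-based agreement between an activity sequence
# inferred from sensor data and one reported in a time use diary.
def levenshtein(a, b):
    """Standard dynamic-programming edit distance (works on strings or lists)."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution
        prev = cur
    return prev[-1]

def agreement(detected, reported):
    """1.0 = identical sequences, 0.0 = maximally different."""
    if not detected and not reported:
        return 1.0
    return 1.0 - levenshtein(detected, reported) / max(len(detected), len(reported))

detected = ["sleep", "cook", "eat", "tv"]   # e.g. HMM output from sensor data
reported = ["sleep", "cook", "tv"]          # e.g. diary entries
score = agreement(detected, reported)
```

Here the only difference is one spurious detected activity, so one deletion suffices and the agreement is 1 - 1/4 = 0.75; the measure thus rewards sequences that match in both content and order.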
Currently, the IEEE Standards Association is very active in the framework of cognitive radio, with the aim of providing a bridge between research results, implementation, and widespread deployment of this new communication paradigm. This article reports recent developments within the IEEE Dynamic Spectrum Access Networks Standards Committee on dynamic spectrum access networks, with particular consideration of IEEE 1900.6, "Spectrum Sensing Interfaces and Data Structures for Dynamic Spectrum Access and Other Advanced Radio Communication Systems". It outlines the current structure of the IEEE 1900.6 standard and its relationship with other related standardization activities. We briefly present application scenarios and topologies, and discuss open research issues that pose future challenges to the standardization community.
Dynamic Spectrum Access (DSA)/Cognitive Radio (CR) represents a promising and versatile concept to improve the efficiency of spectrum exploitation by allowing unlicensed users to opportunistically access underutilised licensed bands, provided that no harmful interference is caused to the legitimate (licensed) users of the spectrum. This revolutionary spectrum access paradigm can be exploited not only to deploy new radio systems and technologies in the already allocated spectrum, but also to increase the capacity of existing systems. A good example of this application is the extension of Long Term Evolution (LTE) cellular systems in Television (TV) white spaces (i.e., TV channels not used in a certain region), which has received significant attention. Most of the existing studies, however, have focused on the extension of the LTE downlink component. By contrast, this work complements previous studies by considering the LTE uplink component in TV white spaces. By means of system-level simulations, this work analyses the conditions under which such coexistence is feasible and the underlying implications. © VDE VERLAG GMBH.
Autonomous systems and mission-critical applications demand ultra-reliable low-latency communication (URLLC). To build wireless communication networks capable of accommodating such applications, optimization of the air-interface characteristics is vital. This paper leverages recent advances in Artificial Intelligence (AI) to optimize specific aspects of the air interface design so as to satisfy these stringent link reliability and latency requirements. The precise aim of this research is to reduce the link latency caused by the Hybrid Automatic Repeat reQuest (HARQ) mechanism. To this end, we propose a novel deep learning-based algorithm (Deep-HARQ), employing a deep neural network (DNN) with fully connected layers to estimate the decodability of the received coded in-phase and quadrature (I/Q) signals before the bulk of the complex reception tasks are carried out. This enables the receiver to respond faster, reducing the signal round-trip time (RTT). To evaluate Deep-HARQ on a realistic dataset, we collected training and validation samples from a waveform compatible with the 3GPP 5G NR Release 15 standard. The simulation results reveal a faster estimation response, with an accuracy improvement of 12% over relevant algorithms in the literature.
Efficient testing of Internet of Things (IoT)-based services suffers from the heterogeneous nature of the underlying IoT resources, which hinders rapid service creation and deployment. The real-world effects produced by IoT-based services tend to prevent their straightforward execution within the production environment. Current testbed solutions, involving physical or virtual IoT resources, require considerable time and resources. This paper describes a new approach for testing IoT-based services built on a code insertion methodology, which can be derived from the semantic description of the IoT-based service. The proposed IoT resource emulation interface is described from the semantic, architectural and implementation perspectives. The paper compares its applicability and efficiency with classical approaches and demonstrates high emulation capability while minimising the testing effort. © VDE VERLAG GMBH.
Network virtualization has been recognized as a promising solution to enable the rapid deployment of customized services by building multiple Virtual Networks (VNs) on a shared substrate network. While various VN embedding schemes have been proposed to allocate substrate resources to each VN request, little work has been done to provide backup mechanisms in case of substrate network failures. In a virtualized infrastructure, a single substrate failure will affect all the VNs sharing that resource. Provisioning a dedicated backup network for each VN is not efficient in terms of substrate resource utilization. In this paper, we investigate the problem of shared backup network provisioning for VN embedding and propose two schemes: shared on-demand and shared pre-allocation backup. Simulation experiments show that both proposed schemes make better utilization of substrate resources than a dedicated backup scheme without sharing, while each has its own advantages. © 2011 IEEE.
Collaboration towards a goal involves groups of entities collectively possessing characteristics required to accomplish the goal. Facilitating collaborations in pervasive environments requires the automated formation of such groups. The group formation process is especially challenging in decentralised environments where there is no single central entity that can coordinate the formation process. It is also important that the group formation mechanisms are generic in nature so that they can be utilised in heterogeneous target environments regardless of their domain and requirements. This paper proposes a generic approach for automating group formation in decentralised environments. © 2011 IEEE.
The use of semantic Web technologies and the service-oriented computing paradigm in Internet of Things research has recently received significant attention, with the aim of creating a semantic service layer that supports virtualisation of and interaction among "Things". Using service-based solutions will produce a deluge of services that provide access to the different data and capabilities exposed by different resources. The heterogeneity of the resources and their service attributes, and the dynamicity of mobile environments, require efficient solutions that can discover services and match them to the data and capability requirements of different users. The semantic service matchmaking process is the fundamental construct for providing higher-level service-oriented functionalities such as service recommendation, composition, and provisioning in the Internet of Things. However, the scalability of current approaches in dealing with large numbers of services, and the efficiency of logical inference mechanisms in processing huge numbers of heterogeneous service attributes and metadata, are limited. We propose a hybrid semantic service matchmaking method that combines our previous work on probabilistic service matchmaking using latent semantic analysis with a weighted-link analysis based on logical signature matching. The hybrid method can overcome most cases of semantic synonymy in semantic service descriptions, which usually present the biggest challenge for semantic service matchmakers. The results show that the proposed method performs better than existing solutions in terms of precision (P@n) and normalised discounted cumulative gain (NDCG) measurement values. © 2012 IEEE.
Mobile sensing techniques have been increasingly deployed in many Internet of Things based applications because of their cost efficiency, wide coverage and flexibility. However, these techniques are unreliable in many situations due to noise of different kinds, loss of communication, or insufficient energy. As such, datasets created from mobile sensing scenarios are likely to contain large amounts of missing data, which makes further data analysis difficult, inaccurate, or even impossible. We find that the existing estimation models and techniques developed for static sensing do not work well in mobile sensing scenarios. To address the problem, we propose a spatio-temporal method, which is specifically designed for answering queries in such applications. Experiments on a real-world, incomplete mobile sensing dataset show that the proposed method outperforms the state of the art noticeably in terms of estimation errors. More importantly, the proposed model is tolerant of datasets with extremely high missing data rates. Training with the proposed model is also efficient, which makes it suitable for deployment on computationally constrained devices and platforms that need to process massive amounts of data in real time.
In this paper, a novel priority assignment scheme is proposed for priority service networks, in which each link sets its own priority threshold, namely, the lowest priority the link is willing to support for incoming packets without causing any congestion. Aiming at reliable transmission, the source then assigns each originated packet the maximum priority value required along its path, because links may otherwise discard incoming packets that do not meet the corresponding priority requirements. It is shown that if each source sends traffic at a rate that is reciprocal to the specified highest priority, bandwidth max-min fairness is achieved in the network. Furthermore, if each source possesses a utility function of the available bandwidth and sends traffic at a rate such that the associated utility is reciprocal to the highest link priority, utility max-min fairness is achieved. For general networks without priority services, the resulting flow control strategy can be treated as a unified framework to achieve either bandwidth max-min fairness or utility max-min fairness through a link pricing policy. More importantly, the utility function herein is only assumed to be strictly increasing and does not need to satisfy the strict concavity condition; the new algorithms are thus not only suitable for traditional data applications with elastic traffic, but are also capable of handling real-time applications in the Future Internet.
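The bandwidth max-min fairness referred to above can be computed by the classic progressive-filling construction, shown here for illustration (the paper achieves it indirectly through priority assignment rather than this centralized algorithm). The three-flow topology below is made up:

```python
def max_min_fair(links, flows):
    """Progressive filling: raise all unfrozen flow rates equally until
    some link saturates, then freeze the flows crossing that link.
    links: {link: capacity}; flows: {flow: set of links on its path}."""
    rate = {f: 0.0 for f in flows}
    frozen = set()
    cap = dict(links)
    while len(frozen) < len(flows):
        # Number of unfrozen flows crossing each link.
        active = {l: sum(1 for f in flows if f not in frozen and l in flows[f])
                  for l in cap}
        # Smallest equal increment that saturates some link.
        inc = min(cap[l] / active[l] for l in cap if active[l] > 0)
        bottlenecks = {l for l in cap
                       if active[l] > 0 and abs(cap[l] / active[l] - inc) < 1e-12}
        for f in flows:
            if f not in frozen:
                rate[f] += inc
        for l in cap:
            if active[l] > 0:
                cap[l] -= inc * active[l]
        for f in flows:
            if f not in frozen and flows[f] & bottlenecks:
                frozen.add(f)
    return rate

links = {"A": 1.0, "B": 2.0}                          # capacities (made up)
flows = {"f1": {"A", "B"}, "f2": {"A"}, "f3": {"B"}}  # flow -> path links
rates = max_min_fair(links, flows)
```

Here link A is the bottleneck shared by f1 and f2, so each gets half of A's capacity, and f3 receives the remainder of B.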
Mobile networks have gone through various stages of evolution, with each stage aimed at addressing a wide range of challenges and limitations. During the early evolutions of mobile networks - 2G, 2.5G, 3G - the key challenge was to investigate efficient and cost-effective ways of delivering higher data speeds. This led to the proposal and development of 4G LTE networks based on a flat all-IP architecture with Internet-based protocols. However, recent trends indicate that the Internet-like architecture in mobile networks has further enabled Internet-based cloud service providers to provide Over-The-Top (OTT) applications to mobile devices, bypassing and competing with the mobile operator on services such as voice, video, messaging and gaming. This is a key motivation for cloud service providers and mobile operators to explore various opportunities through which they can both leverage their existing infrastructures in order to efficiently deploy cloud services in mobile environments. In this paper we study the challenges and limitations that constrain the efficient deployment of cloud services in mobile environments. We then propose a collaborative approach in which the cloud service provider and mobile operator can dynamically manage the underlying mobile network infrastructure resources in order to optimise the delivery of cloud services in mobile environments. This is achieved by using cloud management approaches with the ability to factor in mobile resources such as mobility and frequency spectrum, while also integrating the cloud service provider's and mobile operator's cloud infrastructures. © 2013 The Science and Information Organization.
Emerging applications in Multi-hop Wireless Networks (MHWNs) require considerable processing power, which may often be beyond the capability of individual nodes. Parallel processing provides a promising solution, which partitions a program into multiple small tasks and executes each task concurrently on independent nodes. However, multi-hop wireless communication is inevitable in such networks and can have an adverse effect on distributed processing. In this paper, an adaptive intelligent task mapping and scheduling scheme based on a genetic algorithm is proposed to provide real-time guarantees. This solution enables efficient parallel processing in a way that considers only feasible node collaborations with cost-effective communications. Furthermore, in order to alleviate the power scarcity of MHWNs, a hybrid fitness function is derived and embedded in the algorithm to extend the overall network lifetime via workload balancing among the collaborative nodes, while still meeting arbitrary application deadlines. Simulation results show significant performance improvement in various testing environments over existing mechanisms.
Developments in (wireless) sensor and actuator networks and the capability to manufacture low-cost and energy-efficient networked embedded devices have led to considerable interest in adding real-world sense to the Internet and the Web. Recent work has raised the idea of combining the Internet of Things (i.e. real-world resources) with semantic Web technologies to design future services and applications for the Web. In this paper we focus on the current developments and discussions on designing the Semantic Sensor Web; in particular, we advocate the idea of semantic annotation with the existing authoritative data published on the semantic Web. Through illustrative examples, we demonstrate how rule-based reasoning can be performed over sensor observation and measurement data and linked data to derive additional or approximate knowledge. Furthermore, we discuss the association between sensor data, the semantic Web, and the social Web, which enables the construction of context-aware applications and services and contributes to the construction of a networked knowledge framework.
This paper proposes an extension to the Session Initiation Protocol (SIP) for contextualized service delivery in a service delivery platform (SDP) that enables device-specific multimedia delivery. SIP separates session establishment from session description and is thus amenable to extension for advanced implementations, which makes it an ideal platform for service creation. Device-specific multimedia delivery needs rich and flexible device descriptions, and our approach provides advanced device descriptions through semantic technologies. The proposed SIP extensions have been implemented on a SIP Application Server which functions as the SDP in the IP Multimedia Subsystem (IMS). The validation of the proposed extensions is shown through an Android SIP client application that acts as a device browser and recommender of different multimedia services to users. An example device user agent (UA) application has also been implemented on a laptop.
The Internet of Things enables human beings to better interact with and understand their surrounding environments by extending computational capabilities to the physical world. A critical driving force behind this is the rapid development and wide deployment of wireless sensor networks, which continuously produce a large amount of real-world data for many application domains. Similar to many other large-scale distributed technologies, interoperability and scalability are the prominent and persistent challenges. The proposal of sensor-as-a-service aims to address these challenges; however, to our knowledge, there are no concrete implementations of techniques to support the idea, in particular, large-scale, distributed sensor service discovery. Based on the distinctive characteristics of the sensor services, we develop a scalable discovery architecture using geospatial indexing techniques and semantic service technologies. We perform extensive experimental studies to verify the performance of the proposed method and its applicability to large-scale, distributed sensor service discovery.
Collaborative spectrum sensing has been widely accepted as a promising approach to improve spectrum sensing performance by exploiting the spatial diversity of cognitive radio users. However, in the presence of malfunctioning or misbehaving users, the performance of collaborative spectrum sensing deteriorates significantly. In this paper, a credibility-based mechanism for collaborative spectrum sensing using the beta reputation system is introduced. Our proposed method works well even if the total number of misbehaving users is unknown. In the proposed scheme, the fusion center assigns a weight to each user's observation based on that user's credibility score. User credibility scores are calculated using the beta reputation system, and simulation results show that the proposed scheme significantly improves the reliability of the aggregated data in the presence of falsified users.
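A minimal sketch of the beta-reputation weighting idea, assuming credibility is tracked as counts of past agreements and disagreements with the fused decision (the paper's exact formulation may differ):

```python
def beta_credibility(agree, disagree):
    # Beta reputation score: the expected value of Beta(agree+1, disagree+1).
    return (agree + 1) / (agree + disagree + 2)

def weighted_fusion(reports, history):
    """reports: {user: 0/1 local sensing decision};
    history: {user: (agreements, disagreements) with past fused decisions}.
    Returns a credibility-weighted vote in [0, 1]; the channel would be
    declared occupied when the vote exceeds 0.5."""
    w = {u: beta_credibility(*history[u]) for u in reports}
    total = sum(w.values())
    return sum(w[u] * reports[u] for u in reports) / total

# Made-up example: two reliable users and one that reports falsified data.
history = {"u1": (9, 1), "u2": (9, 1), "u3": (1, 9)}
reports = {"u1": 1, "u2": 1, "u3": 0}
vote = weighted_fusion(reports, history)
```

Because u3's credibility is low, its falsified report barely moves the fused decision.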
This volume constitutes the revised papers of the 4th European Conference on Smart Sensing and Context, EuroSSC 2009, held in Guildford, UK, in September 2009.
Today's wireless and mobile services are typically monolithic and often centralized in nature, which limits heterogeneous service access and shared service usage. New sources of revenue for providers are expected to include tailored, personalized, and dynamically composed services that are fast to market, cost efficient, and provide compelling user experience. To meet these market needs, the SPICE service platform extends the IP Multimedia Subsystem (IMS) by supporting added-value services that are composed of more basic services. The platform is presented through the architecture and through scenarios showing the interactions between the platform and its service components. We describe the IMS role and functions in SPICE, and the use of ontology and Semantic Web technologies for integrated knowledge management in such mobile service platforms. © 2007 IEEE.
The current research on context-aware systems in ubiquitous environments opens a number of interlinked research challenges. At the lowest level of such systems, discovery mechanisms and flexible semantic descriptions of available devices and services form the basis for end-user service personalization. The challenge is to design a description model that leverages the implicit semantic information obtained, while being cognizant of the resource constraints of a device. To address the different device characteristics and personalization challenges, this paper proposes a device and service description approach that provides a high-level contextual view of device information. The work has been performed as part of the Personal Distributed Environment concept, also described in the paper. Further, a user-centric view of multiple user interface devices for accessing services in a heterogeneous and dynamic networked environment has been implemented by extending UPnP device discovery. A comparison with existing state-of-the-art approaches concludes the work.
Increases in system capacity and data rates can be achieved efficiently in a wireless system by bringing the transmitter and receiver closer to each other. Femtocells deployed in the macrocell significantly improve indoor coverage and provide a better user experience. The femtocell base station, called a Femtocell Access Point (FAP), is fully user deployed and hence reduces the infrastructure, maintenance and operational costs of the operator, while at the same time providing good Quality of Service (QoS) to the end user and high network capacity gains. However, the mass deployment of femtocells faces a number of challenges, among which interference management is of particular importance, as the fundamental limits of capacity and achievable data rates mainly depend on the interference faced by the femtocell network. To cope with the technical challenges faced by femtocells, including interference management, researchers have suggested a variety of solutions. These solutions vary depending on the physical layer technology and the specific scenarios considered. Furthermore, cognitive capabilities as a functionality of femtocells are also discussed in this survey. This article summarises the main concepts of femtocells covered in the literature and the major challenges faced in their large-scale deployment. The main challenge of interference management is discussed in detail, along with the types of interference in femtocells, and the solutions proposed over the years to manage interference are summarised. In addition, an overview of current femtocell standardisation and future research directions for femtocells is also provided.
Data searching and retrieval is one of the fundamental functionalities in many Web of Things applications, which need to collect, process and analyze huge amounts of sensor stream data. The problem in fact has been well studied for data generated by sensors that are installed at fixed locations; however, challenges emerge along with the popularity of opportunistic sensing applications in which mobile sensors keep reporting observation and measurement data at variable intervals and changing geographical locations. To address these challenges, we develop the Geohash-Grid Tree, a spatial indexing technique specially designed for searching data integrated from heterogeneous sources in a mobile sensing environment. Results of the experiments on a real-world dataset collected from the SmartSantander smart city testbed show that the index structure allows efficient search based on spatial distance, range and time windows in a large time series database.
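The Geohash component of the index above rests on a standard encoding in which nearby locations share a common string prefix, which is what makes grid-based indexing possible. A minimal sketch of that standard encoding follows (the Geohash-Grid Tree structure itself is not reproduced here):

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # standard Geohash alphabet

def geohash_encode(lat, lon, precision=9):
    """Standard Geohash: alternately bisect the longitude and latitude
    ranges, recording one bit per bisection, then pack bits into base-32
    characters. Longer prefixes denote smaller, nested grid cells."""
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits = []
    even = True  # even bit positions refine longitude, odd refine latitude
    while len(bits) < precision * 5:
        if even:
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits.append(1); lon_lo = mid
            else:
                bits.append(0); lon_hi = mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits.append(1); lat_lo = mid
            else:
                bits.append(0); lat_hi = mid
        even = not even
    chars = []
    for i in range(0, len(bits), 5):
        idx = 0
        for b in bits[i:i + 5]:
            idx = (idx << 1) | b
        chars.append(BASE32[idx])
    return "".join(chars)
```

Because two points in the same cell share a hash prefix, range queries reduce to prefix lookups, which a tree over the hash strings serves efficiently.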
The energy consumption of backbone networks has become a primary concern for network operators and regulators due to the pervasive deployment of wired backbone networks to meet the requirements of bandwidth-hungry applications. While traditional optimization of IGP link weights has been used in IP based load-balancing operations, in this paper we introduce a novel link weight setting algorithm, the Green Load-balancing Algorithm (GLA), which is able to jointly optimize both energy efficiency and load-balancing in backbone networks. Such a scheme can be directly applied on top of existing link sleeping techniques in order to achieve substantially improved energy saving gains. The contribution is a practical solution that opens a new dimension of energy efficiency optimization, but without sacrificing traditional traffic engineering performance in plain IP routing environments. In order to evaluate the efficiency of the proposed optimization scheme without losing generality, we applied it to a set of recently proposed but diverse algorithms for link sleeping operations in the literature. Evaluation results based on the European academic network topology, GÉANT, and its real traffic matrices show that GLA can achieve significantly improved energy efficiency compared to the original standalone algorithms, while also maintaining near-optimal load-balancing performance.
Providing adaptive web services from mobile hosts is a new approach in mobile web services to cope with the resource scarcity of the mobile network environment. This approach is explored by investigating mechanisms that allow continuous and reliable service provisioning. However, there is a clear limitation in terms of the complexity and size of the services that may be executed on mobile hosts. In this paper, a Simple Partial Offloading mechanism is studied to facilitate mobile web service adaptation by distributing the execution of mobile web services and modelling the transfer of required location-based information. The distribution can be classified into Forward or Bounce offloading, while the transfer modelling is based on either a Frontend or a Backend scheme. Hence, four distinct types of mobile web service frameworks have been implemented; each of these architectures represents a different strategy for achieving adaptive and distributed web services. The paper describes the four prototypes, which allow performance evaluation using resource-intensive applications. The results presented show that basing distributed mobile-hosted services on the Backend Bounce Offload strategy is more suitable for the mobile network environment.
Nowadays, smart environments (e.g., the smart home, the smart city) are built relying heavily on Cloud computing for the coordination and collaboration among smart objects. The Cloud is typically centralized, but smart objects are ubiquitously distributed; thus, data transmission latency (i.e., end-to-end delay or response time) between the Cloud and smart objects is a critical issue, especially for applications with strict delay requirements. To address this concern, a new Fog computing paradigm has recently been proposed by industry. The key idea is to bring computing power from the remote Cloud closer to the users, which further enables real-time interaction and location-based services. In particular, the local processing capability of Fog computing significantly scales down the data volume towards the Cloud, which in turn has great impact on the entire Internet. In this chapter, smart living, as one of the primary elements of smart cities, is conceptualized as EHOPES, namely smart Energy, smart Health, smart Office, smart Protection, smart Entertainment and smart Surroundings. A data flow analysis is then carried out to disclose a variety of data flow characteristics. Based on these studies, a data-centered Fog platform has been developed to support smart living. Case studies are also conducted to validate and evaluate the proposed platform.
The heterogeneous, dynamic nature of current communication environments necessitates that all system components that form part of a personalisation framework should be context aware. To ensure context-enabled interoperation, a shared, formalised specification of devices and services in the ambient environment is a must. With this aim, this paper presents an ontology model that captures the semantics of the multimodal devices and services in the mobile ad-hoc environment. The approach is validated using available metrics and compared to existing approaches, both through subjective feature-based evaluation and metric calculations. This paper also extends the metrics' usability by broadening the analysis to interoperability with application logic and domain capture.
In the IEEE 802.22 standard, the spectrum sensing mechanism is identified as a key functionality of a cognitive radio. Due to channel uncertainty, a single cognitive user in most cases cannot make a reliable decision, and hence collaboration or cooperation among multiple users is required. However, when a large number of cognitive users collaborate with each other, the bandwidth required for sending their results to the fusion centre tends to be very large. In this paper, a metric for spectrum efficiency is defined and used for the optimisation of collaborative spectrum sensing. An optimisation algorithm is presented to calculate the optimal number of collaborating cognitive users, with the aim of maximising overall spectrum efficiency while satisfying certain constraints on the global probability of detection and probability of false alarm. Numerical results show that for maximum spectrum efficiency the collaboration of only a subset of the available cognitive users is required. © 2009 IEEE.
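Under the widely used OR fusion rule (an illustrative assumption; the paper's optimisation is more general), the global detection and false-alarm probabilities both grow with the number of collaborating users, so a smallest satisfactory subset can be found by a simple search:

```python
def smallest_collaborating_set(pd, pf, pd_target, pf_limit, n_max):
    """With the OR fusion rule, n users each having per-user detection
    probability pd and false-alarm probability pf give the global values:
        Qd = 1 - (1 - pd)**n,   Qf = 1 - (1 - pf)**n.
    Return the smallest n with Qd >= pd_target and Qf <= pf_limit,
    or None if no n in [1, n_max] satisfies both constraints."""
    for n in range(1, n_max + 1):
        qd = 1 - (1 - pd) ** n
        qf = 1 - (1 - pf) ** n
        if qd >= pd_target and qf <= pf_limit:
            return n
    return None

# Made-up per-user sensing quality and global constraints:
n_opt = smallest_collaborating_set(pd=0.6, pf=0.05,
                                   pd_target=0.99, pf_limit=0.3, n_max=20)
```

This mirrors the abstract's observation: once the detection target is met, adding further users only inflates the false-alarm probability and reporting bandwidth, so only a subset should collaborate.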
The heterogeneous, dynamic nature of current mobile environments necessitates that all system components that form part of a personalization framework should be context aware. This necessitates the development of a framework of distributed context handlers to derive meaning from context and combine it with application logic. Towards this aim, this paper presents a Service Context Manager (SCM) framework that handles all the stages of context gathering, processing and reasoning to enable personalized service presentation. The validation of the approach is shown through application use cases.
Backbone network energy efficiency has recently become a primary concern for Internet Service Providers and regulators. The common solutions for energy conservation in such an environment include sleep mode reconfigurations and rate adaptation at network devices when the traffic volume is low. It has been observed that many ISP networks exhibit regular traffic dynamicity patterns which can be exploited for practical time-driven link sleeping configurations. In this work, we propose a joint optimization algorithm to compute the reduced network topology and its actual configuration duration during daily operations. The main idea is first to intelligently remove network links using a greedy heuristic, without causing network congestion during off-peak time. Following that, a robust algorithm is applied to determine the window size of the configuration duration of the reduced topology, making sure that a unified configuration with optimized energy efficiency performance can be enforced at exactly the same time period on a daily basis. Our algorithm was evaluated on a Point-of-Presence representation of the GÉANT network and its real traffic matrices. According to our simulation results, the reduced network topology obtained is able to achieve an 18.6% energy reduction during that period without causing significant network performance deterioration. The contribution of this work is a practical but efficient approach to energy savings in ISP networks, which can be directly deployed on legacy routing platforms without requiring any protocol extension. © 2012 IEEE.
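The greedy link-removal step can be sketched as follows, under the simplifying assumption that only connectivity is checked (the paper's heuristic additionally re-routes traffic and verifies that no link becomes congested at off-peak load):

```python
from collections import defaultdict

def connected(nodes, edges):
    # Reachability check over an undirected edge set via depth-first search.
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adj[n] - seen)
    return seen == set(nodes)

def greedy_sleep(nodes, links):
    """links: {(u, v): utilisation}. Try to switch off links in order of
    increasing utilisation, keeping only removals that leave the
    topology connected."""
    kept = set(links)
    for link in sorted(links, key=links.get):
        trial = kept - {link}
        if connected(nodes, trial):
            kept = trial
    return kept

# Made-up 4-node ring with a lightly used chord:
nodes = {1, 2, 3, 4}
links = {(1, 2): 0.5, (2, 3): 0.4, (3, 4): 0.3, (4, 1): 0.2, (1, 3): 0.1}
kept = greedy_sleep(nodes, links)
```

On this toy input the heuristic sleeps the two least-utilised links, leaving a spanning tree that still reaches every node.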
Next generation networks will be increasingly heterogeneous and dynamic. The traditional model for interactions between different networks is relatively static. However, future scenarios involve quite dynamic inter-network relationships established to meet service requirements on-the-fly. In particular, Network Composition will enable dynamic and automatic cooperation between networks, based on agreements tailored to meet the specific needs of the networks concerned. To facilitate such interactions, functionality to advertise and to discover various networks and services within the local and remote scope will be required. In this paper, we address the twin functions of Network Advertisement and Discovery. A scenario-based analysis is used to determine the key requirements of these functions, with special emphasis on the problem of distributing advertisement and discovery messages to a set of unknown target networks. A novel solution called Map of Relaying Messages is presented to enable the efficient and timely distribution of various messages for the purpose of advertisement and discovery.
We investigate a collision-sensitive secondary network that intends to opportunistically aggregate and utilize spectrum of a primary network to achieve higher data rates. In opportunistic spectrum access with imperfect sensing of idle primary spectrum, secondary transmission can collide with primary transmission. When the secondary network aggregates more channels in the presence of imperfect sensing, collisions can occur more often, limiting the performance gained by spectrum aggregation. In this context, we aim to address a fundamental question: how much spectrum aggregation is worthwhile under imperfect sensing? For collision occurrence, we focus on two different types of collision: one imposed by asynchronous transmission and the other by imperfect spectrum sensing. The collision probability expression has been derived in closed form with various secondary network parameters: primary traffic load, secondary user transmission parameters, spectrum sensing errors, and the number of aggregated sub-channels. In addition, the impact of spectrum aggregation on data rate is analysed under a collision probability constraint. We then solve an optimal spectrum aggregation problem and propose a dynamic spectrum aggregation approach to increase the data rate subject to practical collision constraints. Our simulation results show clearly that the proposed approach outperforms a benchmark that passively aggregates sub-channels without collision awareness.
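A toy model (not the paper's closed-form expression) illustrates why aggregation is collision-limited: if each aggregated sub-channel is independently busy with probability equal to the primary load, and sensing misses a busy channel with probability p_miss, the chance of at least one collision grows with the number of aggregated sub-channels:

```python
def collision_probability(load, p_miss, k):
    """Toy model: each of the k aggregated sub-channels is busy with
    probability `load`, and a busy channel is mis-detected as idle with
    probability `p_miss`. The secondary user transmits on every channel
    sensed idle, so it collides on a given channel with probability
    load * p_miss, independently across channels."""
    return 1 - (1 - load * p_miss) ** k

def max_aggregation(load, p_miss, p_coll_limit, k_max=64):
    # Largest number of sub-channels whose aggregate collision
    # probability still satisfies the constraint.
    best = 0
    for k in range(1, k_max + 1):
        if collision_probability(load, p_miss, k) <= p_coll_limit:
            best = k
    return best

# Made-up parameters: 30% primary load, 10% miss-detection, 20% collision cap.
k_star = max_aggregation(load=0.3, p_miss=0.1, p_coll_limit=0.2)
```

Even this simplified model reproduces the qualitative trade-off in the abstract: beyond some number of aggregated sub-channels, the collision constraint, not the available spectrum, caps the achievable rate.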
As computing becomes ubiquitous, mobile communication platforms are increasingly oriented towards integration with the Web, benefiting from the large amount of information available there and enabling the creation of new types of value-added services. Semantic and ontology technologies are seen as being able to advance the seamless integration of the mobile and Web worlds. We provide background information on the Semantic Web field, discuss other research fields that bring semantics into play for reaching the ontology-enabled ubiquitous mobile communication vision, and exemplify the state of the art of ontology development and use in telecommunication projects.
The economic and social impact of poor air quality in towns and cities is increasingly being recognised, together with the need for effective ways of creating awareness of real-time air quality levels and their impact on human health. With local authority maintained monitoring stations being geographically sparse and the resultant datasets also featuring missing labels, computational data-driven mechanisms are needed to address the data sparsity challenge. In this paper, we propose a machine learning-based method to accurately predict the Air Quality Index (AQI), using environmental monitoring data together with meteorological measurements. To do so, we develop an air quality estimation framework that implements a neural network that is enhanced with a novel Non-linear Autoregressive neural network with exogenous input (NARX) model, especially designed for time series prediction. The framework is applied to a case study featuring different monitoring sites in London, with comparisons against other standard machine-learning based predictive algorithms showing the feasibility and robust performance of the proposed method for different kinds of areas within an urban region.
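As a toy illustration of the NARX idea (lagged target values plus lagged exogenous inputs as regressors), the sketch below fits a purely linear model by least squares on synthetic data. The paper's framework uses a neural network rather than a linear fit, and the variable names (`aqi`, `temp`) are illustrative only:

```python
import numpy as np

def narx_features(y, x, lags=3):
    """Build lagged target (y) and exogenous (x) regressors.
    Row t holds [y[t-lags..t-1], x[t-lags..t-1]]; target is y[t]."""
    rows, targets = [], []
    for t in range(lags, len(y)):
        rows.append(np.concatenate([y[t - lags:t], x[t - lags:t]]))
        targets.append(y[t])
    return np.array(rows), np.array(targets)

def fit_predict(y, x, lags=3):
    """Least-squares fit of a linear NARX-style predictor."""
    X, t = narx_features(y, x, lags)
    w, *_ = np.linalg.lstsq(X, t, rcond=None)
    return X @ w, t

# Synthetic series: "AQI" driven by its own history and "temperature".
rng = np.random.default_rng(0)
temp = rng.normal(15, 5, 200)
aqi = np.zeros(200)
for t in range(1, 200):
    aqi[t] = 0.8 * aqi[t - 1] + 0.3 * temp[t - 1] + rng.normal(0, 0.1)

pred, actual = fit_predict(aqi, temp)
rmse = float(np.sqrt(np.mean((pred - actual) ** 2)))
```

Because the synthetic process is itself linear in the lagged regressors, the residual error here collapses to roughly the injected noise level; a neural NARX earns its keep when the dependence on past values is non-linear.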
This paper describes a linked-data platform to publish sensor data and link it to existing resources on the semantic Web. The linked sensor data platform, called Sense2Web, supports flexible and interoperable descriptions and provides association of different sensor data ontologies to resources described on the semantic Web and the Web of data. Current advancements in (wireless) sensor networks and the ability to manufacture low-cost and energy-efficient hardware for sensors have led to much interest in integrating physical world data into the Web. Wireless sensor networks employ various types of hardware and software components to observe and measure physical phenomena and make the obtained data available through different networking services. Applications and users are typically interested in querying various events and requesting measurement and observation data from the physical world. Using a linked data approach enables data consumers to access sensor data, query the data and its relations to obtain information, and integrate data from various sources. Global access to sensor data can support a wide range of applications in different domains such as geographical information systems, healthcare, smart homes, and business applications and scenarios. In this paper we focus on publishing linked data to annotate sensors and link them to other existing resources on the Web.
We consider resource allocation with aggregation for different types of traffic in heterogeneous networks, including WLANs. As mobile data traffic is expected to increase, efficient management of multiple bands, including the unlicensed band, becomes increasingly important. In this context, we formulate a resource allocation problem using utility functions for heterogeneous traffic and propose a novel algorithm that considers the estimated UE speed, traffic types and channel quality. Simulation results illustrate the performance of the proposed algorithm in terms of higher utility value and fairness, even at high traffic loads. Additional improvements in resource utilization through estimating UE speed and allocating low-mobility UEs to Wi-Fi are shown.
Due to the heterogeneity and complexity of user environments, multimedia services and multimedia content in the communication domain, adaptation is of paramount importance to interoperability. Adaptation decisions at the different stages of multimedia service delivery to the end user depend on contextual information, i.e. metadata that characterises the situation of the entities involved in the interaction between the user and multimedia services. This paper presents how the Adaptation Manager processes and models contextual information, and how it complements the decision-taking framework defined by MPEG-21 DIA. The use of important standards and technologies such as MPEG-21 DIA, XML, Description Logics and OWL is explained.
We consider resource allocation with aggregation of multiple bands, including the unlicensed band, for heterogeneous traffic. While mobile data traffic, including a high volume of video traffic, is expected to increase significantly, efficient management of radio resources from multiple bands is required to guarantee the quality of service (QoS) of different traffic types. In this context, we formulate an optimal resource allocation using different utility functions for heterogeneous traffic and propose a two-step resource allocation algorithm that includes resource grouping. Simulation results demonstrate that the proposed algorithm enhances connection robustness and achieves higher utility values for inelastic traffic, even at high traffic loads, by steering elastic traffic to the unlicensed band.
The energy consumption of backbone networks has risen exponentially during the past decade with the advent of various bandwidth-hungry applications. To address this serious issue, network operators are keen to identify new energy-saving techniques to green their networks. Up to this point, the optimization of IGP link weights has only been used for load-balancing operations in IP-based networks. In this paper, we introduce a novel link weight setting algorithm, the Green Load-balancing Algorithm (GLA), which is able to jointly optimize both energy efficiency and load-balancing in backbone networks without any modification to the underlying network protocols. The distinct advantage of GLA is that it can be directly applied on top of existing link-sleeping based Energy-aware Traffic Engineering (ETE) schemes in order to achieve substantially improved energy saving gains, while at the same time maintain traditional traffic engineering objectives. In order to evaluate the performance of GLA without losing generality, we applied the scheme to a number of recently proposed but diverse ETE schemes based on link sleeping operations. Evaluation results based on the European academic network topology GÉANT and its real traffic matrices show that GLA is able to achieve significantly improved energy efficiency compared to the original standalone algorithms, while also achieving near-optimal load-balancing performance. In addition, we further consider end-to-end traffic delay requirements since the optimization of link weights for load-balancing and energy savings may introduce substantially increased traffic delay after link sleeping. In order to solve this issue, we modified the existing ETE schemes to improve their end-to-end traffic delay performance. The evaluation of the modified ETE schemes together with GLA shows that it is still possible to save a significant amount of energy while achieving substantial load-balancing within a given traffic delay constraint. 
This paper describes the End-to-End Reconfigurability (E2R II) research framework and focuses on the current status of the activities inside the consortium. E2R II is a partly funded project inside the Sixth Framework Programme of the European Community. The E2R II project is part of a wider programme, started with the E2R I project in 2004. In this programme, concepts and solutions to enable, manage and control end-to-end connectivity in highly heterogeneous environments are developed, taking into account the different Radio Access Technologies potentially active (2G/3G cellular, 4G/B3G, IEEE 802.xx, broadcast, etc.). The key objective of the E2R II project is to devise, develop, trial and showcase architectural designs of reconfigurable devices and supporting system functions, with the aim of offering an extensive set of operational choices to users, application and service providers, operators, manufacturers and regulators in the context of heterogeneous systems.
This paper proposes a novel spectrum utility (SU) metric that assesses the efficiency of spectrum usage by a set of heterogeneous applications. Unlike the traditional spectrum efficiency (SE), the proposed metric does not blindly consider the achievable bit-rate, but captures the most relevant performance metrics for each of the considered applications. Specifically, it is formulated as an aggregated utility that combines the satisfaction levels with respect to the various requirements, together with an innovative pricing model built on them, to derive the total revenue generated for the spectrum owner. To gain insight into the usefulness of the proposed metric, the methodology is instantiated for an illustrative use case, where a mixture of delay-sensitive (i.e., interactive video) and delay-tolerant (i.e., file transfer) applications are established in dense indoor deployments. The obtained results reveal that the proposed SU significantly outperforms the legacy SE in assessing how efficiently a limited frequency spectrum is utilised from the perspective of total revenue, particularly when the quality-of-experience (QoE) perceived during video sessions is degraded. This calls for a novel SU-aware ecosystem, where spectrum sharing models, billing policies and resource allocation mechanisms (e.g., medium access control (MAC) and radio resource management (RRM)) are jointly revisited to maximise the overall SU.
As the number of Linked Data sets increases with more and more interconnections defined between them, querying a single data set is no longer enough for users who need data from mixed domains. The requirement to query data from different data sets motivates the research into federated queries. Network latency is one of the key factors which affect the performance of a federated query. The influence of network latency can be minimised by decreasing the number of remote requests, which is related to the number of joins. In this paper, we provide a mechanism for federated querying based on subject and sameAs grouping techniques. Exploiting the benefits of proposed grouping methods, the number of joins during a federated query has been reduced, thus improving the performance of the entire query. We have evaluated our approach against other existing approaches, using an existing benchmark suite and found that our approach performs better than comparable approaches for queries that are not highly selective.
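A minimal sketch of the subject-grouping idea: triple patterns that share a subject can be shipped to a remote endpoint as one request, so the number of cross-endpoint joins drops with the number of groups. The pattern data below is made up for illustration, and real federation engines also handle `owl:sameAs` grouping and endpoint selection, which this omits:

```python
from collections import defaultdict

def group_by_subject(triple_patterns):
    """Group SPARQL-style (subject, predicate, object) patterns by
    subject, so each group can be evaluated at a remote endpoint in
    a single request instead of one request per pattern."""
    groups = defaultdict(list)
    for s, p, o in triple_patterns:
        groups[s].append((s, p, o))
    return dict(groups)

# Hypothetical query: films, their labels, and the directors' birthplaces.
patterns = [
    ("?film", "dbo:director", "?dir"),
    ("?film", "rdfs:label", "?title"),
    ("?dir", "dbo:birthPlace", "?place"),
]
groups = group_by_subject(patterns)
# Two subject groups -> two remote requests instead of three,
# and only one cross-group join (?film group against ?dir group).
```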
The vision of service personalization for mobile communication environments entails context-sensitive service provisioning. The realization of such customizable smart spaces necessitates the acquisition and processing of modality context information from a variety of devices in the ambient environment. The heterogeneity of available device capabilities and description formats brings new challenges for a context reasoning engine that formulates content delivery decisions. Specifically, to ensure interoperability with existing application logic, the enabling components should support semantic queries. Secondly, situations where variously formatted context input may not provide enough information to answer queries should be intelligently handled. Towards this aim, this paper discusses a context reasoning and query interface component as part of a Service Context Manager (SCM) framework that supports semantic querying and handles incomplete context information through a rule-based mechanism. The validation of the approach is provided by showing the mapping of disparate UAProf and UPnP descriptions into the framework and the querying of supported modality services.
We present an emerging approach of Mobile Social Spaces (MOSS) that intends to improve the ways in which people communicate in the modern world. Pervasive content and service creation and provisioning, in particular for dynamically changing social groups, is a complex task and subject to varying locations of individuals, of the complete group and its context. MOSS tries to remove some of the obstacles in this area and defines a range of functionalities that will support dynamic ubiquitous creation and instantiation of community content and services.
The most common use of formal verification methods so far has been in identifying whether livelock and/or deadlock situations can occur during protocol execution, process, or system operation. In this work, we aim to show that an additional equally important and useful application of formal verification methods can be in protocol design in terms of performance-related metrics. This can be achieved by using the methods in a rather different context compared with their traditional use, that is, not only as model checking tools to assess the correctness of a protocol in terms of lack of livelock and deadlock situations but rather as tools capable of building profiles of protocol operations, assessing their performance, and identifying operational patterns and possible bottleneck operations. This process can provide protocol designers with an insight about the protocols’ behavior and guide them toward further optimizations. It can also assist network operators and service providers to assess the protocols’ relative performance and select the most suitable protocol for specific deployment scenarios. We illustrate these principles by showing how formal verification tools can be applied in this protocol profiling and performance assessment context using some existing protocol implementations in mobile and wireless environments as case studies.
Collaborative spectrum sensing has attracted significant research attention in the last few years and is widely accepted as a viable approach to improve spectrum sensing reliability. Fusing data from multiple Opportunistic users (OUs) in order to produce reliable sensing results implies a reliance on the OU to provide correct information. In the presence of malfunctioning or selfish users, performance of collaborative spectrum sensing deteriorates significantly. In this article, we propose mechanisms for the detection and suppression of such deleterious opportunistic users (DOUs) for hard and soft decision fusion. More specifically, a credibility based mechanism for hard decision fusion using a beta reputation system (HDC-BR) is introduced. Our proposed method does not require knowledge of the total number of deleterious users in advance. In HDC-BR, the fusion center assigns and updates weights to each user’s decisions based on an individual user credibility score which is calculated using the beta reputation system. The presence of DOUs in soft decision based collaborative spectrum sensing has even more adverse effects on system performance. We also propose a scheme for the case of soft decision fusion to detect and eliminate falsified user observations at the fusion centre using a Modified Grubbs Test; we refer to it as SDC-MG. We compare the performance of the proposed methods with malicious user detection schemes proposed in the literature as well as with the case where no DOU suppression scheme is implemented, and conclude that SDC-MG performs much better than HDC-BR in a low Signal to Noise Ratio (SNR) regime.
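A minimal sketch of the beta-reputation weighting behind a scheme like HDC-BR, under the assumption that each user's credibility is the mean of a Beta posterior over its history of correct and incorrect reports; the fusion threshold and update rule here are illustrative, not the paper's exact formulation:

```python
def credibility(correct, incorrect):
    """Beta-reputation credibility: the expected value of a
    Beta(correct+1, incorrect+1) posterior over user reliability."""
    return (correct + 1) / (correct + incorrect + 2)

def fuse(decisions, scores, threshold=0.5):
    """Weighted hard-decision fusion: each user's binary vote is
    weighted by its credibility; declare the channel occupied (1)
    if the weighted vote share exceeds the threshold."""
    weighted = sum(d * s for d, s in zip(decisions, scores))
    return int(weighted / sum(scores) > threshold)

# Three reliable users (9 correct, 1 incorrect reports) agree the
# channel is busy; one deleterious user (1 correct, 9 incorrect)
# reports it idle. The weighted fusion suppresses the outlier.
scores = [credibility(9, 1)] * 3 + [credibility(1, 9)]
decision = fuse([1, 1, 1, 0], scores)
```

The key property is that a user who is frequently wrong sees its vote weight decay towards zero without the fusion centre needing to know in advance how many deleterious users exist.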
Spectrum sensing, in particular detecting the presence of incumbent users in licensed spectrum, is one of the pivotal tasks for cognitive radios (CRs). In this paper, we provide solutions to the spectrum sensing problem by using statistical test theory, and thus derive novel spectrum sensing approaches. We apply the classical Kolmogorov-Smirnov (KS) test to the problem of spectrum sensing under the assumption that the noise probability distribution is known. In practice, the exact noise distribution is unknown, so a sensing method for Gaussian noise with unknown noise power is proposed. We then show that the proposed sensing scheme is asymptotically robust and can be applied to non-Gaussian noise distributions. We compare the performance of the sensing algorithms with the well-known Energy Detector (ED) and the Anderson-Darling (AD) sensing proposed in recent literature. Our paper shows that the proposed sensing methods outperform both ED and AD based sensing, especially in the most important case when the received Signal to Noise Ratio (SNR) is low.
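The known-noise KS test the abstract starts from can be sketched as follows: compare the empirical CDF of the received samples against the hypothesised Gaussian noise CDF, and declare a signal present when the maximum deviation exceeds a critical value. This is a simplified illustration (large-sample 5% critical value, known noise power), not the paper's unknown-noise-power method:

```python
import math
import numpy as np

def ks_statistic(samples, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the maximum deviation
    between the empirical CDF and a hypothesised CDF."""
    x = np.sort(np.asarray(samples))
    n = len(x)
    f = np.array([cdf(v) for v in x])
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return float(max(np.max(ecdf_hi - f), np.max(f - ecdf_lo)))

def ks_sense(samples, noise_std=1.0, threshold=None):
    """Declare 'signal present' (True) when the KS statistic against
    zero-mean Gaussian noise exceeds the large-sample 5% critical
    value 1.36/sqrt(n), or a caller-supplied threshold."""
    thr = threshold if threshold is not None else 1.36 / math.sqrt(len(samples))
    cdf = lambda v: 0.5 * (1 + math.erf(v / (noise_std * math.sqrt(2))))
    return ks_statistic(samples, cdf) > thr

rng = np.random.default_rng(1)
noise_only = rng.normal(0, 1, 500)     # incumbent absent
with_signal = noise_only + 0.5         # incumbent shifts the distribution
```

Unlike the energy detector, this decision uses the whole sample distribution rather than only its second moment, which is what gives distribution tests their low-SNR advantage.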
This work addresses joint transceiver optimization for multiple-input, multiple-output (MIMO) systems. In practical systems, complete knowledge of channel state information (CSI) is rarely available at the transmitter. To tackle this problem, we resort to the codebook approach to precoding design, where the receiver selects a precoding matrix from a finite set of pre-defined precoding matrices based on the instantaneous channel condition and delivers the index of the chosen precoding matrix to the transmitter via a bandwidth-constrained feedback channel. We show that, when the symbol constellation is improper, joint codebook-based precoding and equalization can be designed accordingly to achieve improved performance compared to the conventional system.
Test Case Diversity investigations promise to reduce the number of Test Cases to be executed, thereby addressing one of the drawbacks of automated model-based testing. Based on the assumption that more diverse Test Cases have a higher probability of failing, algorithms for distance analysis and search-based minimisation techniques can help to enhance the quality of selection. This work discusses the application of Hamming Distance and Levenshtein Distance to compute similarity scores, and outlines how Random Search and Hill Climbing can be applied to the problem of group optimisation based on pairwise Test Case similarity scores. The evaluation results, obtained with a test framework for automated test derivation and execution for IoT-based services, indicate that the proposed Group Hill Climbing algorithm can outperform Random Search while utilising less computation time. The inclusion of the sequence-based Levenshtein algorithm shows advantages over the set-based Hamming-inspired scoring methodology.
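The distance computation underlying this kind of selection can be sketched directly. Below, Levenshtein distance is computed between test cases represented as step sequences, and a group's diversity is scored as the sum of its pairwise distances; the search step (Random Search or Group Hill Climbing over candidate groups) would then maximise this score. The representation of test cases as strings is an assumption for illustration:

```python
def levenshtein(a, b):
    """Edit distance between two test-case step sequences, computed
    with the standard single-row dynamic programme."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def pairwise_diversity(test_cases):
    """Sum of pairwise Levenshtein distances: the objective a
    diversity-driven group search would try to maximise."""
    total = 0
    for i in range(len(test_cases)):
        for j in range(i + 1, len(test_cases)):
            total += levenshtein(test_cases[i], test_cases[j])
    return total
```

Because Levenshtein respects step order, two test cases that execute the same steps in a different sequence score as dissimilar, which a set-based Hamming-style score cannot distinguish.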
Providing adaptive, resource-intensive Web services from mobile hosts needs to be done in a rather lightweight manner to allow continuous service provisioning. Processing and communication will drain the battery rapidly; hence both should be kept to a minimum. This paper describes the outcomes of an investigation into offloading and migration mechanisms that facilitate the provision of adaptive and distributed mobile Web services. The investigation goes through three phases. The first phase integrates these mechanisms with the Simple Object Access Protocol (SOAP) and Representational State Transfer (REST) architectures, producing extended mobile Web service frameworks. This phase is achieved by the implementation of a prototype that allows performance evaluation of both extended frameworks. The evaluation of the load and performance of the distributed services takes place using resource-intensive applications. The results presented show that basing distributed mobile-hosted services on REST is more suitable than using SOAP as the underlying Web service infrastructure. The second phase relies on the outperforming REST-based framework to examine four distinct strategies for mobile Web service distribution mechanisms. In the last phase, the evaluation results of the second phase are interpreted as Fuzzy Logic rules. These rule sets are used to trigger and control offloading schemes.
We consider a multiple femtocell deployment in a small area which shares spectrum with the underlaid macrocell. We design a joint energy and radio spectrum scheme which aims not only for co-existence with the macrocell, but also for an energy-efficient implementation of the multi-femtocells. In particular, aggregate energy usage on dense femtocell channels is formulated taking into account the cost of both the spectrum and energy usage. We investigate an energy-and-spectral efficient approach to balance between the two costs by varying the number of active sub-channels and their energy. The proposed scheme is addressed by deriving closed-form expressions for the interference towards the macrocell and the outage capacity. Analytically, discrete regions under which the most promising outage capacity is achieved by the same size of active sub-channels are introduced. Through a joint optimization of the sub-channels and their energy, properties can be found for the maximum outage capacity under realistic constraints. Using asymptotic and numerical analysis, it can be noticed that in a dense femtocell deployment, the optimum utilization of the energy and the spectrum to maximize the outage capacity converges towards a round-robin scheduling approach for a very small outage threshold. This is the inverse of the traditional greedy approach.
This paper extends traditional dynamic adaptive streaming over HTTP (DASH) to efficiently exploit all available bands and licensing regimes in a given context. A novel objective quality-of-experience (QoE) metric is proposed to capture the most relevant factors that impact user perception during streaming sessions. Based on it, a QoE-driven adaptation strategy is devised to jointly select the best radio access technology (RAT) and quality for each video segment depending on the various components of the context. It relies first on fuzzy logic to estimate the QoE provided by each available RAT, subject to the uncertainty level associated with DASH clients. Then, a fuzzy multiple attribute decision making (MADM) methodology is developed to combine the QoE estimates with the heterogeneous components of the context to assess the in-context suitability levels. The proposed approach is applied to adapt video streaming across available RATs in dense deployments for a set of Bronze and Gold subscriptions. The results reveal that the proposed strategy always assigns Gold clients to the well-regulated licensed band, while switching Bronze clients between licensed and unlicensed bands depending on the operating conditions. It strikes a balance between maximising video quality and reducing playback stalling, which significantly improves the perceived QoE compared to the traditional DASH approach.
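To give a feel for the MADM step, here is a crisp simple-additive-weighting sketch: each RAT is scored by min-max normalising its attribute values and taking a weighted sum. The paper uses fuzzy MADM with uncertainty handling, which this does not reproduce, and the attribute values and weights below are invented for illustration:

```python
def saw_scores(options, weights):
    """Simple additive weighting over benefit attributes: min-max
    normalise each attribute column across options, then take the
    weighted sum per option. Higher score = more suitable."""
    cols = list(zip(*options.values()))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    scores = {}
    for name, vals in options.items():
        norm = [(v - l) / (h - l) if h > l else 1.0
                for v, l, h in zip(vals, lo, hi)]
        scores[name] = sum(w * n for w, n in zip(weights, norm))
    return scores

# Hypothetical attributes per RAT: [estimated QoE, band reliability],
# weighted 70/30 in favour of QoE.
rats = {"licensed": [0.9, 0.4], "unlicensed": [0.6, 0.9]}
scores = saw_scores(rats, [0.7, 0.3])
best = max(scores, key=scores.get)
```

A fuzzy MADM variant would replace the crisp attribute values with membership functions before aggregation, so that uncertain QoE estimates do not flip the ranking on small perturbations.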
The previous chapters mainly examined methods to save energy at the mobile handset, either by using short-range cooperation between mobile terminals, or by performing smart vertical handovers between heterogeneous radio access technologies. These techniques can be beneficial to mobile systems, but they have to be performed based on informed decisions; meaning that mobile devices need to be cognitive. Modern devices already collect significant amounts of information, but they have limited capability to exploit such context/information, and handover decisions are merely based on signal strength, or are network controlled and based on network load. In this chapter, we aim to go beyond the state of the art by envisioning mobile terminals with the capability to make informed decisions based on a reservoir of context information made available through context providers; namely, what are referred to as smart phones. We include a survey of the current state of the art for context extraction and management in context-aware systems; besides listing the current context extraction techniques and research efforts, we pinpoint the important properties of good context extraction techniques. Thereafter, we discuss how context information can be exploited to save energy when performing network or node discovery mechanisms, by instructing the mechanisms to scan for certain nodes/networks which are known to be in the vicinity. Finally, we discuss the range of context information that can be used to make informed decisions to save power.
Most IoT applications are distributed in nature, generating large data streams which have to be analyzed in near real-time. Solutions based on Complex Event Processing (CEP) have the potential to extract high-level knowledge from these data streams, but the use of CEP for distributed IoT applications is still at an early stage and involves many drawbacks. The manual setting of rules for CEP is one of the major drawbacks: these rules are based on threshold values, and currently there are no automatic methods to find optimized threshold values. In real-time, dynamic IoT environments, the context of the application is always changing, and the performance of current CEP solutions is not reliable for such scenarios. In this regard, we propose an automatic and context-aware method based on clustering for finding optimized threshold values for CEP rules. We have developed a lightweight CEP engine to run on low-processing hardware which can update its rules at runtime. We have demonstrated our approach using a real-world use case of an Intelligent Transportation System (ITS) to detect congestion in near real-time.
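The clustering idea can be sketched in a few lines: split recent sensor readings into two regimes with a tiny 1-D k-means, and set the CEP rule threshold at the midpoint between the regime centres, recomputing as the context changes. This is an illustrative reconstruction, not the paper's exact algorithm, and the speed values are invented:

```python
def kmeans_1d(values, k=2, iters=50):
    """Tiny 1-D k-means used to split readings into regimes
    (e.g. 'free-flowing' vs 'congested' traffic speeds)."""
    centers = sorted(values[::max(1, len(values) // k)])[:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[idx].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

def congestion_threshold(speeds):
    """Context-aware CEP rule threshold: the midpoint between the two
    regime centres; a reading below it would fire a congestion event."""
    lo, hi = kmeans_1d(speeds, k=2)
    return (lo + hi) / 2

# Hypothetical recent vehicle speeds (km/h) on one road segment.
speeds = [5, 7, 6, 8, 55, 60, 58, 62]
threshold = congestion_threshold(speeds)
```

Re-running this over a sliding window lets the rule adapt automatically: the same road at night, with different speed regimes, yields a different threshold without any manual retuning.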
Delivering individualized services that conform to the user's current situation will form the focus of ubiquitous environments. Describing the networked environment at a semantic level necessitates contextually oriented knowledge acquisition methods. This in turn engenders unique challenges for the crucial step of resource discovery. A number of service discovery protocols exist to perform this role. In this paper, we identify the requirements inherent to such an environment and investigate the suitability of the available protocols against them. A suitable candidate solution is proposed, together with an implementation featuring semantic extensions and reference points for further enhancements.