Dr Abdelrahim Mohamed
Academic and research departments: Institute for Communication Systems, Department of Electrical and Electronic Engineering, Faculty of Engineering and Physical Sciences.
Dr. Abdelrahim Mohamed received the B.Sc. degree (first class) in Electrical and Electronics Engineering from the University of Khartoum, Sudan, in 2011, the M.Sc. degree (distinction) in Mobile and Satellite Communications from the University of Surrey, U.K., in 2013, and the Ph.D. degree in Electronics Engineering from the University of Surrey, U.K., in 2016. He secured first place in the Electrical and Electronic Engineering Department of the University of Surrey during his M.Sc. studies. In addition, he was awarded the Sentinels of Science 2016 Award. He is currently a Postdoctoral Research Fellow at the Institute for Communication Systems (ICS), University of Surrey, U.K., where he is involved in the System Architecture and Co-existence work area and the New Physical Layer work area of the 5G Innovation Centre at Surrey. He is the developer of the 5G New Radio System Level Simulator. His research contributed to the QSON project, the FP7 CoRaSat project, the EPSRC Stepping Towards the Industrial 6th Sense project, the DCMS 5G Planning Tool, and the Energy Proportional eNodeB for LTE-Advanced and Beyond project. His main areas of research interest include radio access networks with control/data plane separation, energy efficiency, mobility management, and cognitive radio. He is endorsed by the Royal Academy of Engineering and the Home Office as an Exceptional Talent/Promise.
Areas of specialism
Affiliations and memberships
Main developer of C++ 3GPP-compliant simulators that provide a benchmark implementation of the standard while also serving as testing and evaluation tools. Calibrated against 3GPP results and other commercial and non-commercial simulators.
A DCMS (Department for Digital, Culture, Media & Sport) project involving the 5GIC, Ordnance Survey and the Met Office to develop a planning tool for a 5G system operating in the millimetre-wave band in Bournemouth.
Details available at: www.gov.uk/government/publications/ordnance-survey-5g-planning-and-mmwa…
Huawei/5GIC project that proposed predictive models and algorithms for eNodeB sleep modes, coupled with detailed component and sub-component profiling
Interdisciplinary EPSRC project integrating 5G URLLC/mMTC with machine learning for fault detection/prediction in intelligent refineries and chemical process plants
CENC/5GIC project investigating the use of LEO satellites for full 5G coverage
Member and Researcher of 5GIC Integrated Solution, 5GIC New Physical Layer work area, 5GIC MAC, RRM and RAN Management work area
Postgraduate research supervision
PhD Candidate: Ruben Alexandre Guerra Borralho
Coverage Enhancement in Cellular Networks
EEEM032 Advanced Satellite Communication Techniques
Courses I teach on
Multi-band and multi-tier network densification is considered the most promising solution to the capacity crunch problem of cellular networks. In this direction, small cells (SCs) are being deployed within the macro cell (MC) coverage to offload some of the users associated with the MCs. This deployment scenario raises several problems. Among others, signalling overhead and mobility management will become critical considerations. Frequent handovers (HOs) in ultra-dense SC deployments could lead to a dramatic increase in signalling overhead. This suggests a paradigm shift towards a signalling-conscious cellular architecture with smart mobility management. In this regard, the control/data separation architecture (CDSA) with dual connectivity is being considered for the future radio access. Considering the CDSA as the radio access network (RAN) architecture, we quantify the reduction in HO signalling w.r.t. the conventional approach. We develop analytical models which compare the signalling generated during various HO scenarios in the CDSA and conventionally deployed networks. New parameters are introduced which, at their optimum values, can significantly reduce the HO signalling load. The derived model includes HO success and HO failure scenarios, along with specific derivations for continuous and non-continuous mobility users. Numerical results show promising CDSA gains in terms of savings in HO signalling overhead.
This contribution introduces the development of an intelligent monitoring and control framework for chemical processes, integrating the advantages of Industry 4.0 technologies, cooperative control and fault detection via wireless sensor networks. Using information on the process’ structure and behaviour, equipment information, and expert knowledge, the system is able to detect faults. The integration with the monitoring system facilitates the detection and optimises the controller’s actions. The results indicate that the proposed approach achieves high fault detection accuracy based on plant measurements, while the cooperative controllers improve the control of the process.
Recently, the fifth-generation (5G) cellular system has been standardised. As opposed to legacy cellular systems geared towards broadband services, the 5G system identifies key use cases for ultra-reliable and low latency communications (URLLC) and massive machine-type communications (mMTC). These intrinsic 5G capabilities enable promising sensor-based vertical applications and services such as industrial process automation. The latter includes autonomous fault detection and prediction, optimised operations and proactive control. Such applications enable equipping industrial plants with a sixth sense (6S) for optimised operations and fault avoidance. In this direction, we introduce an inter-disciplinary approach integrating wireless sensor networks with machine learning-enabled industrial plants to build a step towards developing this 6S technology. We develop a modular-based system that can be adapted to the vertical-specific elements. Without loss of generality, exemplary use cases are developed and presented, including a fault detection/prediction scheme and a sensor density-based boundary between orthogonal and non-orthogonal transmissions. The proposed schemes and modelling approach are implemented in a real chemical plant for testing purposes, and a high fault detection and prediction accuracy is achieved coupled with optimised sensor density analysis.
In the research community, a new radio access network architecture with a logical separation between the control plane (CP) and the data plane (DP) has been proposed for future cellular systems. It aims to overcome limitations of the conventional architecture by providing high data rate services under the umbrella of a coverage layer in a dual connection mode. This configuration could provide significant savings in signalling overhead. In particular, mobility robustness with minimal handover (HO) signalling is considered one of the most promising benefits of this architecture. However, the DP mobility remains an issue that needs to be investigated. We consider predictive DP HO management as a solution that could minimise the out-of-band signalling related to the HO procedure. Thus, we propose a mobility prediction scheme based on Markov chains. The developed model predicts the user's trajectory in terms of a HO sequence in order to minimise the interruption time and the associated signalling when the HO is triggered. Depending on the prediction accuracy, numerical results show that the predictive HO management strategy could significantly reduce the signalling cost as compared with the conventional non-predictive mechanism.
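The Markov chain idea behind such trajectory prediction can be sketched as a transition-matrix lookup. The sketch below is illustrative only: the 4-cell layout and all probabilities are hypothetical, and a greedy most-likely-next-cell rule stands in for the paper's full model.

```python
# Hypothetical 4-cell network; P[i][j] is the estimated probability that a
# user currently served by cell i performs its next handover (HO) to cell j,
# as would be learned by counting observed HO transitions.
P = [
    [0.0, 0.7, 0.2, 0.1],
    [0.3, 0.0, 0.5, 0.2],
    [0.1, 0.6, 0.0, 0.3],
    [0.2, 0.3, 0.5, 0.0],
]

def predict_ho_sequence(current_cell, steps):
    """Greedily predict the most likely sequence of future HO target cells."""
    sequence = []
    cell = current_cell
    for _ in range(steps):
        # Most probable next cell from the current row of the chain.
        cell = max(range(len(P[cell])), key=lambda j: P[cell][j])
        sequence.append(cell)
    return sequence

print(predict_ho_sequence(0, 3))  # e.g. [1, 2, 1] for the matrix above
```

The predicted sequence can then be used to prepare target cells in advance, so that signalling and interruption time are reduced when the HO is actually triggered.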
Frequent handovers (HOs) in dense small cell deployment scenarios could lead to a dramatic increase in signalling overhead. This suggests a paradigm shift towards a signalling-conscious cellular architecture with intelligent mobility management. In this direction, a futuristic radio access network with a logical separation between control and data planes has been proposed in the research community. It aims to overcome limitations of the conventional architecture by providing high data rate services under the umbrella of a coverage layer in a dual connection mode. This approach enables signalling-efficient HO procedures, since the control plane remains unchanged when the users move within the footprint of the same umbrella. Considering this configuration, we propose a core-network-efficient radio resource control (RRC) signalling scheme for active state HO and develop an analytical framework to evaluate its signalling load as a function of network density, user mobility and session characteristics. In addition, we propose an intelligent HO prediction scheme with advance resource preparation in order to minimise the HO signalling latency. Numerical and simulation results show promising gains in terms of reduction in HO latency and signalling load as compared with conventional approaches.
Most of the wireless systems such as the long term evolution (LTE) adopt a pilot symbol-aided channel estimation approach for data detection purposes. In this technique, some of the transmission resources are allocated to common pilot signals which constitute a significant overhead in current standards. This can be traced to the worst-case design approach adopted in these systems where the pilot spacing is chosen based on extreme condition assumptions. This suggests extending the set of the parameters that can be adaptively adjusted to include the pilot density. In this paper, we propose an adaptive pilot pattern scheme that depends on estimating the channel correlation. A new system architecture with a logical separation between control and data planes is considered and orthogonal frequency division multiplexing (OFDM) is chosen as the access technique. Simulation results show that the proposed scheme can provide a significant saving of the LTE pilot overhead with a marginal performance penalty.
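The pilot-density adaptation above rests on the two-dimensional sampling theorem: pilots must sample the channel at least twice as fast as its maximum Doppler shift (in time) and maximum delay spread (in frequency). A back-of-envelope sketch, with hypothetical LTE-like parameter values rather than the paper's chosen design, is:

```python
# Illustrative maximum pilot spacing from the 2-D sampling theorem.
# All numbers are assumed LTE-like values, not taken from the paper.
symbol_duration = 71.4e-6    # OFDM symbol duration in seconds (normal CP)
subcarrier_spacing = 15e3    # subcarrier spacing in Hz

def max_pilot_spacing(doppler_hz, delay_spread_s):
    # Time direction: pilot spacing (in OFDM symbols) bounded by Doppler.
    n_time = int(1.0 / (2.0 * doppler_hz * symbol_duration))
    # Frequency direction: pilot spacing (in subcarriers) bounded by delay spread.
    n_freq = int(1.0 / (2.0 * delay_spread_s * subcarrier_spacing))
    return n_time, n_freq

# Low-mobility user in a mildly dispersive channel: sparse pilots suffice.
print(max_pilot_spacing(10.0, 1e-6))    # -> (700, 33)
# High-mobility user with a longer delay spread: a dense grid is needed.
print(max_pilot_spacing(300.0, 5e-6))   # -> (23, 6)
```

The gap between the two cases is exactly the overhead the worst-case design wastes on benign channels, which is what an adaptive pilot pattern recovers.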
Conventional cellular systems are dimensioned according to a worst case scenario, and they are designed to ensure ubiquitous coverage with an always-present wireless channel irrespective of the spatial and temporal demand of service. A more energy conscious approach will require an adaptive system with a minimum amount of overhead that is available at all locations and all times but becomes functional only when needed. This approach suggests a new clean slate system architecture with a logical separation between the ability to establish availability of the network and the ability to provide functionality or service. Focusing on the physical layer frame of such an architecture, this paper discusses and formulates the overhead reduction that can be achieved in next generation cellular systems as compared with the Long Term Evolution (LTE). Considering channel estimation as a performance metric whilst conforming to time and frequency constraints of pilots spacing, we show that the overhead gain does not come at the expense of performance degradation.
In this paper, we present an analytical framework to model the sleep mode power consumption of a base station (BS) as a function of its sleep depth. The sleep depth is made up of the BS deactivation latency, actual sleep period and activation latency. Numerical results demonstrate a close match between our proposed approach and the actual sleep mode power consumption for selected BS types. As an application of our proposed approach, we analyze the optimal sleep depth of a BS, taking into consideration the increased power consumption during BS activation, which exceeds its no-load power consumption. We also consider the power consumed during BS deactivation, which also exceeds the power consumed when the actual sleep level is attained. From the results, we can observe that the average total power consumption of a BS monotonically decreases with the sleep depth as long as the ratio between the actual sleep period and the transition latency (deactivation plus reactivation latency) exceeds a certain threshold.
We consider an idealistic scenario where the vacation (no-load) period of a typical base station (BS) is known in advance such that its vacation time can be matched with a sleep depth. The latter is the sum of the deactivation latency, actual sleep period and reactivation latency. Noting that the power consumed during the actual sleep period is a function of the deactivation latency, we derive an accurate closed-form expression for the optimal deactivation latency for deterministic BS vacation time. Further, using this expression, we derive the optimal average power consumption for the case where the vacation time follows a known distribution. Numerical results show that significant power consumption savings can be achieved in the sleep mode by selecting the optimal deactivation latency for each vacation period. Furthermore, our results also show that deactivating the BS hardware is sub-optimal for BS vacation less than a particular threshold value.
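The trade-off in the two sleep-mode abstracts above can be captured in a toy energy model: deactivation and reactivation transients draw more power than idling, so sleeping only pays off when the actual sleep period is long enough relative to the transition latency. All power and latency figures below are illustrative assumptions, not measured BS values.

```python
# Toy BS sleep-mode energy model over a known vacation (no-load) period.
# All figures are illustrative assumptions.
P_IDLE = 100.0    # no-load power when the BS stays on (W)
P_SLEEP = 20.0    # power in the attained sleep level (W)
P_TRANS = 150.0   # power during (de)activation transients (W)
T_DEACT = 0.5     # deactivation latency (s)
T_REACT = 0.5     # reactivation latency (s)

def avg_sleep_power(vacation):
    """Average power over the vacation period if the BS is put to sleep."""
    sleep = vacation - T_DEACT - T_REACT       # actual sleep period
    if sleep <= 0:
        return P_TRANS                         # never reaches the sleep level
    energy = P_TRANS * (T_DEACT + T_REACT) + P_SLEEP * sleep
    return energy / vacation

def sleep_worthwhile(vacation):
    """Sleeping beats idling only if it lowers the average power draw."""
    return avg_sleep_power(vacation) < P_IDLE

print(sleep_worthwhile(1.2))   # short vacation: staying idle is better
print(sleep_worthwhile(5.0))   # long vacation: sleeping saves energy
```

This is the threshold behaviour both abstracts describe: below a certain ratio of sleep period to transition latency, deactivating the hardware is sub-optimal.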
Nowadays, the system architecture of the fifth generation (5G) cellular system is attracting increasing interest. To reach the ambitious 5G targets, a dense base station (BS) deployment paradigm is being considered. In this case, the conventional always-on service approach may not be suitable due to the linear energy/density relationship when the BSs are always kept on. This suggests a dynamic on/off BS operation to reduce the energy consumption. However, this approach may create coverage holes, and the BS activation delay in terms of hardware transition latency and software reloading could result in service disruption. To tackle these issues, we propose a predictive BS activation scheme under the control/data separation architecture (CDSA). The proposed scheme exploits user context information, network parameters, BS sleep depth and measurement databases to send timely predictive activation requests in advance, before the connection is switched to the sleeping BS. An analytical model is developed and closed-form expressions are provided for the predictive activation criteria. Analytical and simulation results show that the proposed scheme achieves a high BS activation accuracy with low errors w.r.t. the optimum activation time.
This contribution introduces a framework for the fault detection and healing of chemical processes over wireless sensor networks. The approach considers the development of a hybrid system which consists of a fault detection method based on machine learning, a wireless communication model and an ontology-based multi-agent system with a cooperative control for the process monitoring.
Future cellular systems need to cope with a huge amount of data and diverse service requirements in a flexible, sustainable, green and efficient way with minimal signalling overhead. This calls for network densification, short-length wireless links, efficient and proactive control signalling and the ability to switch off power-consuming devices when they are not in use. In this direction, the conventional always-on service and worst-case design approach has been identified as the main source of inefficiency, and a paradigm shift towards adaptive and on-demand systems is seen as a promising solution. However, the conventional radio access network (RAN) architecture limits the achievable gains due to the tight coupling between network and data access points, which in turn imposes strict coverage and signalling requirements irrespective of the spatio-temporal service demand, channel conditions or mobility profiles.
Cognitive radio (CR) is a potentially promising solution to the spectrum crunch problem that faces both future terrestrial and satellite systems. This paper discusses the applicability of CR in satellite/terrestrial spectrum sharing scenarios by modelling interference relations between these systems. It analyses the relative impact of several design parameters that can be tuned in order to reach a particular interference target. A realistic path loss model is considered and gain patterns of directional antennas are taken into account which are found to be efficient in minimising the interference. A generic model that is not restricted to particular systems is developed, and typical parameters are used to analyse the co-existence feasibility in a realistic sharing scenario. The results show that both satellite and terrestrial systems could potentially operate in the same band without degrading each other’s performance if appropriate considerations are taken into account and an appropriate design of the interfering system is carried out.
As soon as 2020, network densification and spectrum extension will be the dominant themes to support enormous capacity and massive connectivity. However, this approach may not guarantee wide-area coverage due to the poor propagation capabilities of high frequency bands. In addition, energy efficiency and signalling overhead will become critical considerations in ultra-dense deployment scenarios. This calls for a futuristic two-layer RAN architecture with dual connectivity, where the high frequency bands are used for data services, complemented by a coverage layer at conventional cellular bands. This separation of control and data planes will enable a transition from always-on to always-available systems and could result in order-of-magnitude savings in energy and signalling overhead.
Conventional cellular systems are designed to ensure ubiquitous coverage with an always present wireless channel irrespective of the spatial and temporal demand of service. This approach raises several problems due to the tight coupling between network and data access points, as well as the paradigm shift towards data-oriented services, heterogeneous deployments and network densification. A logical separation between control and data planes is seen as a promising solution that could overcome these issues, by providing data services under the umbrella of a coverage layer. This article presents a holistic survey of existing literature on the control-data separation architecture (CDSA) for cellular radio access networks. As a starting point, we discuss the fundamentals, concept and general structure of the CDSA. Then, we point out limitations of the conventional architecture in futuristic deployment scenarios. In addition, we present and critically discuss the work that has been done to investigate potential benefits of the CDSA, as well as its technical challenges and enabling technologies. Finally, an overview of standardisation proposals related to this research vision is provided.
The ambitious fifth generation (5G) cellular system requirements and performance targets motivated standardisation bodies to consider wide bandwidth allocations for 5G in the mm-wave band. Nevertheless, parts of the considered band are already allocated to satellite services in several regions. We tackle this challenge by proposing a co-existence framework for 5G and fixed satellite services (FSS). We focus on the uplink of both systems and consider realistic 5G deployment scenarios with multiple users and multiple radio access network (RAN) cells. We propose a generic and controllable co-existence constraint applicable to different 5G numerologies and configurations. In addition, we derive a protection distance to guarantee the co-existence constraint and utilise several 5G system features to define soft constraints. The 5G/FSS co-existence is investigated based on the performance of the 5G user plane. Simulation results show that the 5G deployment scenario is a key factor in setting the protection distance. In addition, the FSS elevation has a significant effect on the identified distance. The results suggest that both systems can operate in the same band without a very large protection distance, at the controllable expense of a small, e.g., 1%-5%, performance loss.
Network densification with small cell deployment is being considered as one of the dominant themes in the fifth generation (5G) cellular system. Despite the capacity gains, such deployment scenarios raise several challenges from a mobility management perspective. The small cell size, which implies a short cell residence time, will increase the handover (HO) rate dramatically. Consequently, the HO latency will become a critical consideration in the 5G era. The latter requires an intelligent, fast and light-weight HO procedure with minimal signalling overhead. In this direction, we propose a memory-full context-aware HO scheme with mobility prediction to achieve the aforementioned objectives. We consider a dual connectivity radio access network architecture with logical separation between control and data planes because it offers relaxed constraints in implementing the predictive approaches. The proposed scheme predicts future HO events along with the expected HO time by combining radio frequency performance with physical proximity and the user context in terms of speed, direction and HO history. To minimise the processing and storage requirements whilst improving the prediction performance, a user-specific prediction triggering threshold is proposed. The prediction outcome is utilised to perform advance HO signalling whilst suspending the periodic transmission of measurement reports. Analytical and simulation results show that the proposed scheme provides promising gains over the conventional approach.
Seamless and ubiquitous coverage are key factors for future cellular networks. Despite capacity and data rates being the main topics under discussion when envisioning the Fifth Generation (5G) and beyond of mobile communications, network coverage remains one of the major issues, since coverage quality highly impacts the system performance and end-user experience. The increasing number of base stations and user terminals is anticipated to negatively impact the network coverage due to increasing interference. Furthermore, the "ubiquitous coverage" use cases, including rural and isolated areas, present a significant challenge for mobile communication technologies. This survey presents an overview of the concept of coverage, highlighting the ways it is studied and measured, and how it impacts the network performance. Additionally, an overview of the most important key performance indicators influenced by coverage, which may affect the envisioned use cases with respect to throughput, latency, and massive connectivity, is provided. Moreover, the main existing developments and deployments which are expected to augment the network coverage, in order to meet the requirements of the emerging systems, are presented, as well as implementation challenges.
A novel Multiple-Input and Multiple-Output (MIMO) transmission scheme termed Space-Time Block Coded Quadrature Spatial Modulation (STBC-QSM) is proposed. It amalgamates the concepts of Quadrature Spatial Modulation (QSM) and Space-Time Block Coding (STBC) to exploit the diversity benefits of STBC while relying on sparse Radio Frequency (RF) chains. In the proposed STBC-QSM scheme, the conventional constellation points of the STBC structure are replaced by QSM symbols, hence the information bits are conveyed both by the antenna indices as well as by conventional STBC blocks. Furthermore, an efficient Bayesian Compressive Sensing (BCS) algorithm is developed for the proposed STBC-QSM system. Both our analytical and simulation results demonstrate that the proposed scheme is capable of providing considerable performance gains over existing schemes. Moreover, the proposed BCS detector is capable of approaching the Maximum Likelihood (ML) detector's performance while imposing only a complexity similar to that of the Minimum Mean Square Error (MMSE) detector in the high Signal to Noise Ratio (SNR) regions.
The parameters of Physical (PHY) layer radio frame for 5th Generation (5G) mobile cellular systems are expected to be flexibly configured to cope with diverse requirements of different scenarios and services. This paper presents a frame structure and design which is specifically targeting Internet of Things (IoT) provision in 5G wireless communication systems. We design a suitable radio numerology to support the typical characteristics, that is, massive connection density and small and bursty packet transmissions with the constraint of low cost and low complexity operation of IoT devices. We also elaborate on the design of parameters for Random Access Channel (RACH) enabling massive connection requests by IoT devices to support the required connection density. The proposed design is validated by link level simulation results to show that the proposed numerology can cope with transceiver imperfections and channel impairments. Furthermore, results are also presented to show the impact of different values of guard band on system performance using different subcarrier spacing sizes for data and random access channels, which show the effectiveness of the selected waveform and guard bandwidth. Finally, we present system level simulation results that validate the proposed design under realistic cell deployments and inter-cell interference conditions.
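The numerology design logic sketched above reduces to two classic OFDM constraints: the cyclic prefix must exceed the maximum delay spread (to avoid inter-symbol interference), and the Doppler shift must be small relative to the subcarrier spacing (to keep inter-carrier interference low). A rough feasibility check, with illustrative parameter values and a common rule-of-thumb Doppler margin rather than the paper's chosen design, might look like:

```python
# Rough OFDM numerology feasibility check for an IoT scenario.
# Rule of thumb (assumed here): Doppler below ~2% of the subcarrier spacing.
def numerology_ok(scs_hz, cp_s, delay_spread_s, speed_mps, carrier_hz):
    doppler = speed_mps * carrier_hz / 3e8   # max Doppler shift (Hz)
    cp_ok = cp_s > delay_spread_s            # CP absorbs the channel echoes
    doppler_ok = doppler < 0.02 * scs_hz     # ICI kept small
    return cp_ok and doppler_ok

# 15 kHz spacing, 4.7 us CP, pedestrian IoT device at a 2 GHz carrier:
print(numerology_ok(15e3, 4.7e-6, 1e-6, 3.0, 2e9))      # True
# Same numerology for a fast user at a 28 GHz carrier: Doppler dominates.
print(numerology_ok(15e3, 4.7e-6, 1e-6, 30.0, 28e9))    # False
```

The second case illustrates why higher carrier frequencies push designs towards wider subcarrier spacings, one of the flexibility axes the 5G frame structure exposes.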
The fifth-generation (5G) new radio (NR) cellular system promises a significant increase in capacity with reduced latency. However, the 5G NR system will be deployed along with legacy cellular systems such as the long-term evolution (LTE). Scarcity of spectrum resources in low frequency bands motivates adjacent-/co-carrier deployments. This approach comes with a wide range of practical benefits and improves spectrum utilization by re-using the LTE bands. However, such deployments restrict the 5G NR flexibility in terms of frame allocations to avoid the most critical mutual adjacent-channel interference. This in turn prevents achieving the promised 5G NR latency figures. In this paper, we tackle this issue by proposing to use the mini-slot uplink feature of 5G NR to perform uplink acknowledgement and feedback, reducing the frame latency, with selective blind retransmission to overcome the effect of interference. Extensive system-level simulations under realistic scenarios show that the proposed solution can reduce the peak frame latency for feedback and acknowledgement by up to 33% and for retransmission by up to 25%, at a marginal cost of an up to 3% reduction in throughput.
A novel Multiple-Input and Multiple-Output (MIMO) transmission scheme termed Generalized Quadrature Spatial Modulation (G-QSM) is proposed. It amalgamates the concepts of Quadrature Spatial Modulation (QSM) and spatial multiplexing for the sake of achieving a high throughput despite relying on a low number of Radio Frequency (RF) chains. In the proposed G-QSM scheme, the conventional constellation points of the spatial multiplexing structure are replaced by QSM symbols, hence the information bits are conveyed both by the antenna indices as well as by the classic Amplitude/Phase Modulated (APM) constellation points. The upper bounds of the Average Bit Error Probability (ABEP) of the proposed G-QSM system in high throughput massive MIMO configurations are derived. Furthermore, an Efficient Multipath Orthogonal Matching Pursuit (EMOMP) based Compressive Sensing (CS) detector is developed for the proposed G-QSM system. Both our analytical and simulation results demonstrate that the proposed scheme is capable of providing considerable performance gains over existing schemes in massive MIMO configurations.
Nowadays, dense network deployment is being considered as one of the effective strategies to meet capacity and connectivity demands of the fifth generation (5G) cellular system. Among several challenges, energy consumption will be a critical consideration in the 5G era. In this direction, base station on/off operation, i.e., sleep mode, is an effective technique to mitigate the excessive energy consumption in ultra-dense cellular networks. However, current implementation of this technique is unsuitable for dynamic networks with fluctuating traffic profiles due to coverage constraints, quality-of-service requirements and hardware switching latency. In this direction, we propose an energy/load proportional approach for 5G base stations with control/data plane separation. The proposed approach depends on a multi-step sleep mode profiling, and predicts the base station vacation time in advance. Such a prediction enables selecting the best sleep mode strategy whilst minimizing the effect of base station activation/reactivation latency, resulting in significant energy saving gains.
This paper investigates the 28 GHz band sharing between fixed satellite services (FSS) and fifth generation (5G) new radio (NR) cellular system. In particular, it focuses on modelling a sharing scenario between the uplink of the FSS system and the uplink of the 5G NR enhanced mobile broadband (eMBB) cellular system. Such a scenario could generate interference from the FSS terminals towards the 5G base station, known as next generation Node-B (g-NodeB). We provide detailed interference modelling, sharing constraint derivations and performance analysis under realistic path loss models and antenna radiation patterns based on the latest system characteristics of the third generation partnership project (3GPP) 5G NR Release 15. Several scenarios for seamless coexistence of the two systems are considered by evaluating the efficiency and the signal-to-interference-plus-noise ratio (SINR) at the NR g-NodeB, and using the block error rate (BLER) as a sharing constraint. A single FSS terminal is considered and the impact of several parameters, such as the distance to the g-NodeB and FSS elevation angle, on the g-NodeB spectrum efficiency are evaluated. In addition, the impact of the g-NodeB antenna array size on reducing the FSS/g-NodeB protection distance is evaluated and a dynamic beam steering is proposed to minimise the protection distance.
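The protection-distance idea in the two co-existence abstracts above can be illustrated with a back-of-envelope link-budget inversion: find the smallest FSS-to-gNodeB separation at which the received interference falls below a target threshold. The sketch assumes free-space path loss (the papers use more realistic models and antenna patterns), and all EIRP, gain and threshold figures are illustrative.

```python
import math

# Free-space path loss in dB for distance in metres and frequency in Hz:
# FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c) ~ ... - 147.55
def fspl_db(distance_m, freq_hz):
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def protection_distance(eirp_dbm, rx_gain_dbi, i_max_dbm, freq_hz):
    """Smallest separation such that EIRP + Grx - FSPL <= I_max."""
    pl_required = eirp_dbm + rx_gain_dbi - i_max_dbm   # required path loss (dB)
    # Invert the FSPL formula for distance.
    return 10 ** ((pl_required + 147.55 - 20 * math.log10(freq_hz)) / 20)

# Illustrative 28 GHz figures: FSS terminal off-axis EIRP towards the
# gNodeB, gNodeB antenna gain, and an assumed interference threshold.
d = protection_distance(eirp_dbm=30.0, rx_gain_dbi=10.0,
                        i_max_dbm=-110.0, freq_hz=28e9)
print(f"protection distance ~ {d/1000:.1f} km")
```

This also shows why the abstracts highlight beam steering and antenna array size: reducing the effective EIRP or gain towards the victim directly shrinks the required path loss, and hence the protection distance.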
Conventional mobility management schemes tend to hit the core network with increased signaling load as the cell size shrinks and the user mobility speed increases. To mitigate this problem, the research community has proposed various intelligent mobility management schemes that take advantage of the predictability of users' mobility patterns. However, most of the proposed solutions focus only on active-state signaling (i.e., handover signaling), while proposals for improving idle-state signaling have been limited and not well received by industrial practitioners. This paper first surveys the major shortcomings of the existing proposals for idle mode mobility management and then proposes a new architecture, namely predictive mobility management (PrMM), to mitigate the identified challenges. An analytical framework is developed and a closed-form solution for the expected signaling overhead of the PrMM is presented. The results of numerical evaluations confirm that, depending on user mobility and network configuration, the PrMM efficiency can surpass the long term evolution (LTE) 4G signaling scheme by over 90%. Analysis of the results shows that the best performance is achieved at highly dense paging areas and lower cell crossing rates.