Dynamic Network Slice Scaling Assisted by Attention-Based Prediction in 5G Core Network

Network slicing is a key technology in fifth-generation (5G) networks that allows network operators to create multiple logical networks over a shared physical infrastructure to meet the requirements of diverse use cases. Among the core functions needed to implement network slicing, resource management and scaling are particularly challenging. Network operators must ensure the Service Level Agreement (SLA) requirements for latency, bandwidth, resources, etc., for each network slice while utilizing the limited resources efficiently, i.e., optimal resource assignment and dynamic resource scaling for each network slice. Existing resource scaling approaches can be classified into reactive and proactive types. The former makes a resource scaling decision when the resource usage of virtual network functions (VNFs) exceeds a predefined threshold, and the latter forecasts the future resource usage of VNFs in network slices by utilizing classical statistical models or deep learning models. However, both face a trade-off between assurance and efficiency. For instance, a lower threshold in the reactive approach or a larger prediction margin in the proactive approach can meet the requirements more reliably, but may cause unnecessary resource wastage. To overcome this trade-off, we first propose a novel and efficient proactive resource forecasting algorithm. The proposed algorithm introduces an attention-based encoder-decoder model for multivariate time series forecasting to achieve high short-term and long-term prediction accuracy. It helps network slices scale up and down effectively and reduces the costs of SLA violations and resource overprovisioning. Using the attention mechanism, the model attends to every hidden state of the sequential input at every time step to select the time steps that most affect the prediction results. We also design an automated resource configuration mechanism responsible for monitoring resources and automatically adding or removing VNF instances.
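
A minimal sketch of an attention-based encoder-decoder for multivariate time-series forecasting is shown below; the GRU layers, additive attention, forecast horizon, and feature layout are illustrative assumptions, not the authors' exact model.

```python
# Minimal sketch of an attention-based encoder-decoder for multivariate
# time-series forecasting (e.g., per-VNF CPU/memory/traffic histories).
# Illustrative only: GRU sizes, additive attention, and the horizon are
# assumptions, not the authors' exact architecture.
import torch
import torch.nn as nn

class AttnSeq2Seq(nn.Module):
    def __init__(self, n_features, hidden=64, horizon=12):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.decoder = nn.GRUCell(n_features, hidden)
        self.attn = nn.Linear(2 * hidden, 1)       # additive attention score
        self.out = nn.Linear(2 * hidden, n_features)
        self.horizon = horizon

    def forward(self, x):                           # x: (batch, T, n_features)
        enc_out, h = self.encoder(x)                # enc_out: (batch, T, hidden)
        h = h.squeeze(0)                            # decoder state: (batch, hidden)
        step = x[:, -1, :]                          # last observed time step
        preds = []
        for _ in range(self.horizon):
            # Score every encoder hidden state against the current decoder state.
            q = h.unsqueeze(1).expand_as(enc_out)
            scores = self.attn(torch.cat([enc_out, q], dim=-1)).squeeze(-1)
            weights = torch.softmax(scores, dim=1)  # importance of each time step
            context = (weights.unsqueeze(-1) * enc_out).sum(dim=1)
            h = self.decoder(step, h)
            step = self.out(torch.cat([h, context], dim=-1))
            preds.append(step)
        return torch.stack(preds, dim=1)            # (batch, horizon, n_features)
```

In a scaling pipeline of this kind, the forecast over the horizon would feed the decision to add or remove VNF instances before resource usage crosses its limits.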

View this article on IEEE Xplore


ML-Based Classification of Device Environment Using Wi-Fi and Cellular Signal Measurements

Future spectrum sharing rules will very likely be based on device environment: indoors or outdoors. For example, the 6 GHz rules created different power regimes for unlicensed devices to protect incumbents: “indoor” devices, subject to lower transmit powers but not required to access an Automated Frequency Coordination (AFC) database to obtain permission to use a channel, and “outdoor” devices, allowed to transmit at higher power but required to consult the database to determine channel availability. However, since there are no reliable means of determining whether a wireless device is indoors or outdoors, other restrictions were mandated: reduced power for client devices, and indoor access points that cannot be battery powered, have detachable antennas, or be weatherized. These constraints lead to sub-optimal spectrum usage and the potential for misuse. Hence, there is a need for robust identification of device environments to enable spectrum sharing. In this paper, we study automatic indoor/outdoor classification based on the radio frequency (RF) environment experienced by a device. Using a custom Android app, we first create a labeled data set of a number of parameters of Wi-Fi and cellular signals in various indoor and outdoor environments, and then evaluate the classification performance of various machine learning (ML) models on this data set. We find that tree-based ensemble ML models can achieve greater than 99% test accuracy and F1-score, thus allowing devices to self-identify their environment and adapt their transmit power accordingly.
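
A minimal sketch of the kind of tree-based ensemble classifier evaluated in such a study is shown below; the feature names, CSV layout, and hyperparameters are hypothetical stand-ins, not the authors' data set or tuned models.

```python
# Minimal sketch of indoor/outdoor classification with a tree-based ensemble.
# The file name, feature names, and label column are hypothetical stand-ins
# for Wi-Fi/cellular measurements, not the paper's actual data set.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("rf_environment_measurements.csv")   # hypothetical file
features = ["wifi_rssi_mean", "wifi_ap_count", "lte_rsrp", "lte_rsrq", "lte_sinr"]
X, y = df[features], df["is_indoor"]                   # 1 = indoor, 0 = outdoor

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("F1-score:", f1_score(y_test, pred))
```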

View this article on IEEE Xplore


A Comprehensive Review of the COVID-19 Pandemic and the Role of IoT, Drones, AI, Blockchain, and 5G in Managing its Impact

The unprecedented outbreak of the 2019 novel coronavirus, termed COVID-19 by the World Health Organization (WHO), has placed numerous governments around the world in a precarious position. The impact of the COVID-19 outbreak, earlier witnessed by the citizens of China alone, has now become a matter of grave concern for virtually every country in the world. The scarcity of resources to endure the COVID-19 outbreak, combined with the fear of overburdened healthcare systems, has forced a majority of these countries into a state of partial or complete lockdown. The number of laboratory-confirmed coronavirus cases has been increasing at an alarming rate throughout the world, with reportedly more than 3 million confirmed cases as of 30 April 2020. Adding to these woes, numerous false reports, misinformation, and unfounded fears regarding the coronavirus have been circulating regularly since the outbreak of COVID-19. In response, we draw on various reliable sources to present a detailed review of all the major aspects associated with the COVID-19 pandemic. In addition to the direct health implications of the outbreak, this study highlights its impact on the global economy. In closing, we explore the use of technologies such as the Internet of Things (IoT), Unmanned Aerial Vehicles (UAVs), blockchain, Artificial Intelligence (AI), and 5G, among others, to help mitigate the impact of the COVID-19 outbreak.

View this article on IEEE Xplore

Absorption of 5G Radiation in Brain Tissue as a Function of Frequency, Power and Time

The rapid release of 5G wireless communications networks has spurred renewed concerns regarding the interactions of higher radiofrequency (RF) radiation with living species. We examine RF exposure and absorption in ex vivo bovine brain tissue and a brain-simulating gel at three frequencies, 1.9 GHz, 4 GHz, and 39 GHz, which are relevant to the current (4G) and upcoming (5G) spectra. We introduce a highly sensitive thermal method for the assessment of radiation exposure, experimentally derive accurate relations between the temperature rise (ΔT), specific absorption rate (SAR), and incident power density (F), and tabulate the coefficients ΔT/ΔF and Δ(SAR)/ΔF as a function of frequency, depth, and time. This new method provides both ΔT and SAR in the frequency ranges below and above 6 GHz, as shown at 1.9, 4, and 39 GHz, and demonstrates the most sensitive experimental assessment of brain tissue exposure to millimeter-wave radiation to date, with a detection limit of 1 mW. We examine beam penetration, absorption, and thermal diffusion at representative 4G and 5G frequencies and show that RF heating increases rapidly with frequency, owing to the decreasing RF source wavelength and the increasing power density at the same incident power and exposure time. We also show the temperature effects of continuous waves, rapid pulse sequences, and single pulses with varying pulse duration, and we employ electromagnetic modeling to map the field distributions in the tissue. Finally, using this new methodology, we measure the thermal diffusivity of ex vivo bovine brain tissue experimentally.
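
As a rough illustration of how a thermal measurement relates temperature rise to SAR, the sketch below applies the standard calorimetric relation SAR ≈ c·dT/dt over the initial heating interval; the specific-heat value and the temperature trace are assumed for illustration and are not the paper's measurements or tabulated coefficients.

```python
# Minimal sketch of estimating SAR from a measured temperature rise, using the
# standard calorimetric relation SAR ≈ c * dT/dt evaluated over the initial,
# nearly linear heating interval. The specific-heat value and the sample
# temperature trace below are assumptions, not the paper's data.
import numpy as np

C_TISSUE = 3630.0          # J/(kg*K), typical soft-tissue specific heat (assumed)

def estimate_sar(times_s, temps_c):
    """Fit dT/dt over the initial heating interval and convert to SAR in W/kg."""
    slope, _ = np.polyfit(times_s, temps_c, 1)   # K/s
    return C_TISSUE * slope

# Hypothetical 10 s exposure sampled once per second, heating at 4 mK/s.
t = np.arange(0, 10, 1.0)
delta_t = 0.004 * t
print(f"SAR ≈ {estimate_sar(t, delta_t):.2f} W/kg")
```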

View this article on IEEE Xplore

Complex Systems: A Communication Networks Perspective Towards 6G


Over the last few years, the analysis and modeling of networks, as well as of networked dynamical systems, have attracted considerable interdisciplinary interest, especially through complex systems theory. These efforts are driven by the fact that systems as diverse as genetic networks and the Internet can be effectively described as complex networks. In contrast, despite the unprecedented evolution of technology, basic issues and fundamental principles related to the structural and evolutionary properties of communication networks remain largely unaddressed. The situation is even more complicated when we attempt to model mobile communication networks, especially the fifth generation (5G) and, eventually, the forthcoming sixth generation (6G). In this work, we review basic models of complex networks from a communication networks perspective, focusing on their structural and evolutionary properties. Based on this review, we aim to identify the complex network models that may apply when modeling 5G and 6G mobile communication networks. Furthermore, we hope to encourage collaboration between complex systems and networking theorists toward meeting the challenging demands of 5G networks and beyond.
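
As a small illustration of the kind of complex network model such a review covers, the sketch below generates a preferential-attachment (Barabási-Albert) graph and inspects two structural properties; the parameters are illustrative and not tied to any network studied in the article.

```python
# Minimal sketch of one classical complex-network model: preferential
# attachment (Barabási-Albert), which produces the heavy-tailed degree
# distributions observed in many real networks. Parameters are illustrative.
import networkx as nx
import numpy as np

G = nx.barabasi_albert_graph(n=1000, m=3, seed=42)   # 1000 nodes, 3 links per new node

degrees = np.array([d for _, d in G.degree()])
print("mean degree:", degrees.mean())
print("max degree:", degrees.max())                  # hubs: a signature of scale-free growth
print("average clustering:", nx.average_clustering(G))
```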

View this article on IEEE Xplore

Smart Power Control for Quality-Driven Multi-User Video Transmissions: A Deep Reinforcement Learning Approach


Device-to-device (D2D) communications have been regarded as a promising technology to meet the dramatically increasing video data demand in 5G networks. In this paper, we consider the power control problem in a multi-user video transmission system. Due to the non-convex nature of the optimization problem, it is challenging to obtain an optimal strategy. In addition, many existing solutions require instantaneous channel state information (CSI) for each link, which is hard to obtain in resource-limited wireless networks. We develop a multi-agent deep reinforcement learning-based power control method, in which each agent adaptively controls its transmit power based on the observed local states. The proposed method aims to maximize the average quality of the received videos of all users while satisfying each user's quality requirement. After offline training, the method can be implemented in a distributed manner such that all users can reach their target state from any initial state. Compared with conventional optimization-based approaches, the proposed method is model-free, does not require CSI, and is scalable to large networks.
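
A minimal sketch of the per-agent decision step in such a distributed controller is shown below; the state features, discrete power levels, and network sizes are assumptions for illustration, not the authors' design or training procedure.

```python
# Minimal sketch of the per-agent decision step in a multi-agent deep RL power
# controller: each transmitter maps its locally observed state to one of a few
# discrete power levels via a small Q-network. All dimensions and power levels
# below are illustrative assumptions.
import torch
import torch.nn as nn

POWER_LEVELS_DBM = [-10.0, 0.0, 10.0, 20.0]   # discrete action set (assumed)

class PowerAgent(nn.Module):
    def __init__(self, state_dim=6, hidden=64, n_actions=len(POWER_LEVELS_DBM)):
        super().__init__()
        self.q_net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_actions))

    def act(self, local_state):
        """Greedy action from local observations (e.g., measured SINR, queue
        length, last video quality); no instantaneous CSI is required."""
        with torch.no_grad():
            q_values = self.q_net(torch.as_tensor(local_state, dtype=torch.float32))
        return POWER_LEVELS_DBM[int(q_values.argmax())]

agent = PowerAgent()
print(agent.act([12.0, 0.3, 4.1, 0.8, 35.0, 1.0]))   # hypothetical local state
```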

View this article on IEEE Xplore

Most Popular Article of 2017: 5G Cellular User Equipment: From Theory to Practical Hardware Design

Research and development on next-generation wireless systems, namely 5G, has experienced explosive growth in recent years. In the physical layer, massive multiple-input multiple-output (MIMO) and the use of high-GHz frequency bands are two promising trends for adoption. Millimeter-wave (mmWave) bands, such as 28, 38, 64, and 71 GHz, which were previously considered unsuitable for commercial cellular networks, will play an important role in 5G. Currently, most 5G research deals with the algorithms and implementations of modulation and coding schemes, new spatial signal processing technologies, new spectrum opportunities, channel modeling, 5G proof-of-concept systems, and other system-level enabling technologies. In this paper, we first investigate contemporary wireless user equipment (UE) hardware design and unveil the critical 5G UE hardware design constraints on circuits and systems. Building on this investigation and the associated design trade-off analysis, we propose a new, highly reconfigurable system architecture for 5G cellular user equipment, namely distributed phased array based MIMO (DPA-MIMO). Finally, link budget calculations and data throughput numerical results are presented to evaluate the proposed architecture.
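
To illustrate the kind of link-budget arithmetic such an evaluation involves, the sketch below computes free-space path loss and received power at a mmWave carrier; all numbers are assumed for illustration and are not the paper's DPA-MIMO parameters.

```python
# Minimal sketch of a mmWave link-budget calculation: free-space path loss plus
# transmit power and antenna/array gains. All values below are illustrative
# assumptions, not the paper's DPA-MIMO parameters.
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8
    return 20.0 * math.log10(4.0 * math.pi * dist_m * freq_hz / c)

freq = 28e9           # 28 GHz carrier (assumed)
dist = 100.0          # 100 m link (assumed)
tx_power_dbm = 23.0   # UE transmit power (assumed)
tx_array_gain_db = 15.0
rx_array_gain_db = 25.0

rx_power_dbm = tx_power_dbm + tx_array_gain_db - fspl_db(freq, dist) + rx_array_gain_db
print(f"FSPL at {dist:.0f} m / 28 GHz: {fspl_db(freq, dist):.1f} dB")
print(f"Received power: {rx_power_dbm:.1f} dBm")
```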

View this article on IEEE Xplore