Frontier Papers Interpreting 5G Application Areas — Library Frontier Literature Thematic Recommendation Service (2)
2020-03-09
5G construction is in full swing and has developed in many areas of national life. By application, 5G can be divided into three scenarios: enhanced mobile broadband (eMBB); ultra-reliable, low-latency communications for the Internet of Vehicles (URLLC); and massive machine-type communications for the Internet of Things (mMTC). Typical key 5G technologies — massive antenna arrays, dense networks, new waveform multiplexing and channel coding, millimetre waves, and network virtualization and slicing — continue to be iterated and refined. Meanwhile, advances in artificial intelligence (AI), new matrix-computing materials, and edge computing are gradually being integrated into communication networks in the traditional sense, giving rise to more, and broader, methodologies and lines of research in communication technology. Here we recommend, from leading international academic journals, a number of research results to researchers working in related fields, in the hope of sparking academic exchange and resonance and thereby further advancing the development and prosperity of 5G technology.
In this issue we bring you four articles that interpret the development of 5G from different angles!
The strong uptake in the Internet of Things (IoT) due to more affordable computing, storage and sensors has led to the generation of massive amounts of data, creating both challenges and opportunities for industry. For instance, global mobile data traffic is increasing at two times the rate of IP (internet protocol) traffic, driven by a proliferating number of connected devices and an intense thirst for rich media services and experiences. In 2020, 0.5 zettabytes (10²¹ bytes) of mobile data are expected to be used. As a result, today’s networks will not be able to handle the variety and volume of data that are emerging. 5G is the next generation of wireless mobile broadband networks and is expected to meet this challenge. 5G is not simply an evolution of 4G. It requires major improvements over 4G throughput and capacity, and is the first wireless protocol to address the inclusion of the massive number of machines/things in the network. It will optimize the network for human–human communication, and will deliver a solution for machine-type communication.
Wireless communication technologies such as fifth-generation mobile networks (5G) will not only carry a 1000-fold increase in Internet traffic in the next decade but will also provide the underlying technologies for entire industries to support Internet of Things (IoT) applications. Compared with existing mobile communication techniques, 5G has more varied applications, and its corresponding system design is more complicated. The resurgence of artificial intelligence (AI) techniques offers an alternative that may outperform traditional design approaches. Typical and potential research directions related to the promising contributions achievable through AI must be identified, evaluated, and investigated. To this end, this study provides an overview that first combs through several promising research directions in AI for 5G technologies, based on an understanding of the key technologies in 5G. The study then focuses on design paradigms including 5G network optimization, optimal resource allocation, 5G physical-layer unified acceleration, and end-to-end physical-layer joint optimization, among others.
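Among the design paradigms listed above, "optimal resource allocation" has a classical closed-form baseline against which learned (AI-based) allocators are usually benchmarked: water-filling power allocation across parallel channels. A minimal pure-Python sketch follows; the channel gains, power budget, and bisection tolerance are illustrative choices, not values from the paper.

```python
def water_filling(gains, total_power, tol=1e-9):
    """Classic water-filling power allocation.

    Each channel i receives p_i = max(0, mu - 1/gains[i]), where the
    "water level" mu is found by bisection so that the allocations
    sum to total_power. Stronger channels (higher gain) get more power.
    """
    lo = 0.0
    hi = total_power + max(1.0 / g for g in gains)  # used(hi) >= total_power
    while hi - lo > tol:
        mu = (lo + hi) / 2.0
        used = sum(max(0.0, mu - 1.0 / g) for g in gains)
        if used > total_power:
            hi = mu  # water level too high
        else:
            lo = mu  # water level too low
    mu = (lo + hi) / 2.0
    return [max(0.0, mu - 1.0 / g) for g in gains]

# Three channels with decreasing quality: the weakest gets shut off.
alloc = water_filling([4.0, 1.0, 0.25], total_power=3.0)
```

A learned allocator for the same task would typically be trained to match or exceed this baseline's sum rate while running in constant time per decision.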
Sensing, communication, computing and networking technologies continue to generate more and more data, and this trend is set to continue. Future Internet of Things (IoT) devices could, in particular, be widely deployed for tasks such as environmental monitoring, city management, and medicine and healthcare, requiring data processing, information extraction and real-time decision making. Cloud computing alone cannot support such ubiquitous deployments and applications because of infrastructure shortcomings such as limited communication bandwidth, intermittent network connectivity, and strict delay constraints. To address this challenge, and ensure timely data processing and flexible services, multi-tier computing resources are required, which can be deployed and shared along the continuum from the cloud to things.
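The cloud-to-things continuum described above can be caricatured as a placement decision: run a task on the tier (device, edge, cloud) that minimizes network round-trip plus compute time while still meeting the task's deadline. The tier names, latencies, and speed-up factors below are invented placeholders for illustration, not measurements from the article.

```python
# Toy multi-tier placement model. Each tier trades network latency for
# compute capacity; all numbers are illustrative assumptions.
TIERS = [
    # (name, network round-trip in ms, compute speed relative to device)
    ("device", 0.0, 1.0),
    ("edge", 5.0, 10.0),
    ("cloud", 50.0, 100.0),
]

def place_task(workload_ms, deadline_ms):
    """Return (tier, total_latency_ms) with the lowest total latency
    that meets the deadline, or None if no tier can.

    workload_ms is the compute time on the device tier (speed 1.0).
    """
    best = None
    for name, rtt, speed in TIERS:
        total = rtt + workload_ms / speed
        if total <= deadline_ms and (best is None or total < best[1]):
            best = (name, total)
    return best

# Tiny tasks stay local; medium ones go to the edge; heavy ones to the cloud.
```

Real multi-tier schedulers also weigh bandwidth, energy, and intermittent connectivity, which this sketch deliberately omits.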
Memristor crossbars offer reconfigurable non-volatile resistance states and could remove the speed and energy efficiency bottleneck in vector-matrix multiplication, a core computing task in signal and image processing. Using such systems to multiply an analogue-voltage-amplitude vector by an analogue-conductance matrix at a reasonably large scale has, however, proved challenging due to difficulties in device engineering and array integration. Here we show that reconfigurable memristor crossbars composed of hafnium oxide memristors on top of metal-oxide-semiconductor transistors are capable of analogue vector-matrix multiplication with array sizes of up to 128 × 64 cells. Our output precision (5–8 bits, depending on the array size) is the result of high device yield (99.8%) and the multilevel, stable states of the memristors, while the linear device current–voltage characteristics and low wire resistance between cells lead to high accuracy. With the large memristor crossbars, we demonstrate signal processing, image compression and convolutional filtering, which are expected to be important applications in the development of the Internet of Things (IoT) and edge computing.
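The operation the crossbar accelerates is easy to state in software: applying a voltage vector to the rows and reading the currents at the columns computes I_j = Σ_i G[i][j]·V[i] in a single analogue step (Ohm's law per cell, Kirchhoff's current law per column). A minimal sketch follows, with the memristors' multilevel conductance states modeled as quantization; the 64-level count and all numeric values are illustrative assumptions, not the paper's device parameters.

```python
def quantize(x, levels, x_max):
    """Model a memristor's finite conductance states: snap x in
    [0, x_max] onto one of `levels` evenly spaced values."""
    step = x_max / (levels - 1)
    return round(x / step) * step

def crossbar_vmm(G, V):
    """Analogue vector-matrix multiply on a crossbar: the current
    collected at column j is I_j = sum_i G[i][j] * V[i]."""
    rows, cols = len(G), len(G[0])
    return [sum(G[i][j] * V[i] for i in range(rows)) for j in range(cols)]

# 2x2 example: conductances in siemens, voltages in volts -> currents in amps.
G_ideal = [[1e-4, 2e-4],
           [3e-4, 4e-4]]
# Programming the array introduces quantization error at each cell.
G_prog = [[quantize(g, levels=64, x_max=4e-4) for g in row] for row in G_ideal]
I = crossbar_vmm(G_prog, [0.2, 0.1])
```

The quantized result stays close to the ideal product, which mirrors the paper's observation that multilevel, stable conductance states are what set the output precision of the array.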