

Blog on Selected Ideas in Communications
Written By:

Petar Popovski, Editor in Chief of IEEE JSAC

Published: 23 Dec 2022

This blog was started in January 2022, with the original intention of selecting and highlighting one article each month from each newly published IEEE JSAC issue. In the meantime, the blog has also included other types of articles, such as a conversation with the Guest Editors of an upcoming special issue. Another thing that has happened in the meantime is the rising popularity of ChatGPT; we have therefore also asked it about its understanding of the papers published in IEEE JSAC.

This is the 12th issue for this year, so the monthly pace has been kept, but the articles from the issues published in September 2022 and later have not yet been featured. Below is a brief highlight for each of those months. In addition to our brief description, we have also asked ChatGPT to summarize each paper abstract in less than 100 words. As a side note, ChatGPT was not always successful in staying under the 100-word limit, so multiple attempts were needed for some of the abstracts.

September 2022

This is the final issue of the Series on Machine Learning in Communications and Networks, from which the highlighted article is:

K. Kang, Q. Hu, Y. Cai, G. Yu, J. Hoydis and Y. C. Eldar, "Mixed-Timescale Deep-Unfolding for Joint Channel Estimation and Hybrid Beamforming," in IEEE Journal on Selected Areas in Communications, vol. 40, no. 9, pp. 2510-2528, Sept. 2022, doi: 10.1109/JS

The paper treats the downlink of a massive MIMO system that operates in Time Division Duplex (TDD) mode, where the users also have multiple antennas. One key assumption is that the number of RF chains is smaller than the number of antennas, which necessitates hybrid analog-digital beamforming. The work uses an end-to-end deep-unfolding neural network for joint channel estimation and hybrid beamforming, aiming to maximize the sum rate of the system. The mixed timescale refers to the different time scales on which the analog and the digital beamformers are optimized.

ChatGPT <100-word summary: The authors propose a deep-unfolding neural network algorithm for joint channel estimation and hybrid beamforming in massive multiple-input multiple-output systems. The algorithm maximizes the system sum rate in time-division duplex massive MIMO systems and reduces computational complexity and signaling overhead. The authors use the recursive least-squares and stochastic successive convex approximation algorithms for channel estimation and hybrid beamforming, respectively, and consider a mixed-timescale hybrid beamforming scheme to reduce signaling overhead. The proposed algorithm significantly outperforms conventional algorithms with reduced computational complexity and signaling overhead.
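To give a rough feel for the setup described above, here is a minimal sketch (not the paper's deep-unfolding algorithm) that evaluates the sum rate of a hybrid analog-digital precoder for a toy narrowband downlink with single-antenna users; the dimensions, the random phase-only analog precoder, and the zero-forcing digital part are all illustrative assumptions.

```python
import numpy as np

# Toy illustration (not the paper's algorithm): sum rate of a hybrid
# analog-digital precoder in a narrowband massive MIMO downlink.
rng = np.random.default_rng(0)
Nt, Nrf, K = 64, 8, 4          # BS antennas, RF chains, single-antenna users (assumed)
snr_lin = 10 ** (10 / 10)      # 10 dB transmit SNR

# i.i.d. Rayleigh channel, K users x Nt antennas
H = (rng.standard_normal((K, Nt)) + 1j * rng.standard_normal((K, Nt))) / np.sqrt(2)

# Analog precoder: phase-only entries (constant-modulus constraint of hybrid arrays)
F_rf = np.exp(1j * rng.uniform(0, 2 * np.pi, (Nt, Nrf))) / np.sqrt(Nt)

# Digital precoder: zero-forcing on the reduced-dimension effective channel H @ F_rf
H_eff = H @ F_rf                                   # K x Nrf
F_bb = np.linalg.pinv(H_eff)                       # Nrf x K
F = F_rf @ F_bb
F = F / np.linalg.norm(F, 'fro')                   # total power normalization

# Per-user SINR and sum rate
G = np.abs(H @ F) ** 2                             # |h_k^H f_j|^2
signal = np.diag(G)
interference = G.sum(axis=1) - signal
sinr = snr_lin * signal / (1 + snr_lin * interference)
print("sum rate [bit/s/Hz]:", np.log2(1 + sinr).sum())
```

In the paper, both the channel estimation and the two beamforming stages are learned by unfolding iterative optimization algorithms; the fixed random/zero-forcing choice above is only a baseline to make the sum-rate objective concrete.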

October 2022

The special issue was dedicated to “Antenna Array Enabled Space/Air/Ground Communications and Networking” and the highlighted article is:

B. Shen, Y. Wu, J. An, C. Xing, L. Zhao and W. Zhang, "Random Access With Massive MIMO-OTFS in LEO Satellite Communications," in IEEE Journal on Selected Areas in Communications, vol. 40, no. 10, pp. 2865-2881, Oct. 2022, doi: 10.1109/JSAC.2022.3196128.

Orthogonal Time-Frequency Space (OTFS) is a recent transmission technique that operates in the delay-Doppler domain and is suitable for high-mobility scenarios that exhibit a significant Doppler effect. In that sense, communication over the dynamic links of LEO satellites is very well suited to OTFS modulation. The paper treats the problem of grant-free, massive random access and addresses the issues of channel estimation and device detection through the novel perspective provided by the delay-Doppler domain.

ChatGPT <100-word summary: This paper proposes algorithms for joint channel estimation and device activity detection in grant-free random access systems for communication between a large number of Internet of Things devices and a low-earth orbit satellite using massive multiple-input multiple-output with orthogonal time-frequency space modulation. The algorithms exploit the sparsity of the channel in the delay-Doppler-angle domain and use sparse Bayesian learning and covariance-free methods for channel estimation, and the generalized approximate message passing algorithm combined with sparse Bayesian learning and two-dimensional convolution for decreasing computation. Simulation results show that the proposed algorithms outperform conventional methods.
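As a small companion to the description above, the following sketch shows the core delay-Doppler mapping that OTFS relies on, namely the inverse symplectic finite Fourier transform (ISFFT) and its inverse; the grid sizes and symbols are arbitrary, and none of the paper's massive MIMO random-access machinery is reproduced here.

```python
import numpy as np

# Toy illustration of the OTFS domain mapping: the ISFFT takes symbols placed
# on an N x M delay-Doppler grid to the time-frequency grid, and the SFFT
# inverts it. Grid sizes and the QPSK-like symbols below are made up.
N, M = 16, 64                      # Doppler bins (time slots) x delay bins (subcarriers)
rng = np.random.default_rng(1)
x_dd = (2 * rng.integers(0, 2, (N, M)) - 1) + 1j * (2 * rng.integers(0, 2, (N, M)) - 1)

def isfft(x):
    # delay-Doppler -> time-frequency: IDFT across Doppler, DFT across delay
    return np.fft.fft(np.fft.ifft(x, axis=0, norm="ortho"), axis=1, norm="ortho")

def sfft(X):
    # time-frequency -> delay-Doppler (inverse of the above)
    return np.fft.ifft(np.fft.fft(X, axis=0, norm="ortho"), axis=1, norm="ortho")

X_tf = isfft(x_dd)
assert np.allclose(sfft(X_tf), x_dd)   # the two transforms are inverses
```

The point of the delay-Doppler view is that a high-mobility channel, which looks rapidly time-varying in the time-frequency domain, becomes sparse and slowly varying on this grid, which is exactly the sparsity the paper exploits for joint channel estimation and device detection.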

November 2022

While the main waves of the pandemic seem to be behind us, there are still aftershocks from its impact, and this special issue is titled “Edge-Based Wireless Communications Technologies to Counter Communicable Infectious Diseases”. The featured paper is:

X. Li, S. Guo, H. -N. Dai and D. Li, "Infectious Probability Analysis on COVID-19 Spreading With Wireless Edge Networks," in IEEE Journal on Selected Areas in Communications, vol. 40, no. 11, pp. 3239-3254, Nov. 2022, doi: 10.1109/JSAC.2022.3211534.

The paper develops a model for predicting the infection probability based on the recorded detention time and the density of individuals within a wireless edge network. Here, the “detention time” is the time during which an individual accesses a given edge network. The model uses stochastic geometry and, based on the results, the authors discuss some possible countermeasures.

ChatGPT <100-word summary: This paper proposes a method for predicting the infectious probability of individuals in a wireless edge network and proposes countermeasures against the spread of COVID-19 using wireless edge networks. The method is based on stochastic geometry and takes into account the recorded detention time and the density of individuals within the network, as well as three types of mobility models and a static model of individuals. Numerical results show that the proposed method is accurate and offers useful insights for mitigating the spread of infectious diseases through the use of wireless edge networks.
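For readers unfamiliar with this kind of stochastic-geometry reasoning, here is a deliberately simplified toy model (not the model from the paper): individuals form a homogeneous Poisson point process, a fraction of them is infectious, and each infectious person within a contact radius infects independently with a probability that grows with the detention time. All parameter values and the functional form are illustrative assumptions.

```python
import numpy as np

# Toy stochastic-geometry illustration (not the paper's model):
# - individuals are a homogeneous Poisson point process with density lam (per m^2),
# - a fraction p_inf of them is infectious,
# - each infectious person within radius r infects independently with
#   probability 1 - exp(-beta * t), where t is the detention time.
# By the thinning/void probability of a Poisson process, the overall
# infection probability has a closed form.
def infection_probability(lam, p_inf, r, beta, t):
    q = 1.0 - np.exp(-beta * t)                          # per-contact infection probability
    return 1.0 - np.exp(-lam * p_inf * np.pi * r**2 * q)

# Example: 0.05 persons/m^2, 2% infectious, 2 m contact radius, 30 min detention time
print(infection_probability(lam=0.05, p_inf=0.02, r=2.0, beta=0.1, t=30.0))
```

The attraction of such closed-form expressions is that they make the roles of density and detention time explicit, which is what allows the authors to reason about countermeasures at the level of the edge network.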

December 2022

The year that has passed was by no means optimal for blockchain technology, due to the turbulence in cryptocurrency exchanges. Yet, blockchain technology remains relevant for applications beyond cryptocurrency, and its communication aspects are essential for attaining high performance in terms of processing and verification of transactions. The December special issue is on “Intelligent Blockchain for Future Communications and Networking: Technologies, Trends, and Applications” and the selected article is:

X. Wang, X. Jiang, Y. Liu, J. Wang and Y. Sun, "Data Propagation for Low Latency Blockchain Systems," in IEEE Journal on Selected Areas in Communications, vol. 40, no. 12, pp. 3631-3644, Dec. 2022, doi: 10.1109/JSAC.2022.3213330.

The problem addressed in this paper is efficient broadcasting, an essential operation in blockchain systems. The main goal is to reduce the propagation latency while minimizing redundant transmissions. This is achieved by optimizing the broadcasting process through unsupervised learning.

ChatGPT <100-word summary: In this paper, the authors propose a new broadcasting mechanism for blockchain systems called Swift. Swift optimizes the peer-to-peer topology construction and broadcast algorithm in a structured network using unsupervised learning and a greedy algorithm, reducing the propagation latency of the blockchain network and avoiding redundant bandwidth usage. The authors implemented a prototype of Swift and tested it on a network with 1000 blockchain nodes, finding that it reduces propagation latency by 19.8% and increases throughput by 18% while maintaining low latency and stable redundant bandwidth usage.
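To illustrate the latency-versus-redundancy trade-off that Swift targets, here is a toy flooding broadcast over a randomly built peer-to-peer graph (this is plain flooding, not the Swift mechanism); it counts the propagation latency in hops and the redundant transmissions that an optimized topology and broadcast algorithm would try to cut down. The network size and node degree are made-up values.

```python
import random
from collections import deque

# Toy flooding broadcast on a random peer-to-peer graph: every node forwards a
# new block to all of its neighbours, so we can measure latency in hops and the
# number of redundant deliveries (blocks sent to nodes that already have them).
random.seed(0)
n_nodes, degree = 200, 8
peers = {v: set() for v in range(n_nodes)}
for v in range(n_nodes):
    while len(peers[v]) < degree:           # ensure a minimum degree per node
        u = random.randrange(n_nodes)
        if u != v:
            peers[v].add(u)
            peers[u].add(v)

received_at = {0: 0}                        # node 0 mines the block at time 0
redundant = 0
queue = deque([0])
while queue:
    v = queue.popleft()
    for u in peers[v]:
        if u in received_at:
            redundant += 1                  # u already has the block: wasted bandwidth
        else:
            received_at[u] = received_at[v] + 1
            queue.append(u)

print("max propagation latency (hops):", max(received_at.values()))
print("redundant transmissions:", redundant)
```

In naive flooding almost every edge carries a duplicate copy of the block; shaping the overlay topology and the forwarding order, as the paper does with unsupervised learning and a greedy algorithm, is what brings both the latency and this redundancy down.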

Looking into 2023 and Beyond

The 12 special issues that will be published in 2023 are already lined up, along with several special issues that will be published in 2024. They are reflected in the calls for papers, some of them already closed and others still open for submission. Two of the special issues that will be receiving submissions until mid-next year clearly reach out to communities beyond the usual community of communication engineering and networking. The first one is “The Quantum Internet: Principles, Protocols, and Architectures”, featuring Guest Editors from both the communication and the quantum communities (all female, supposedly for the first time in the history of IEEE JSAC). The second one is on “Space Communications New Frontiers: From Near Earth to Deep Space”, with Guest Editors from academia, research institutions, ESA and NASA.

In 2023 we are looking forward to new special issue proposals that are at the frontier of communication and networking, as well as new research articles submitted and published in IEEE JSAC.

Statements and opinions given in a work published by the IEEE or the IEEE Communications Society are the expressions of the author(s). Responsibility for the content of published articles rests upon the author(s), not the IEEE or the IEEE Communications Society.
