Military Communications Conference
28 November – 2 December 2022 // National Capital Region, USA
Celebrating 40 Years of MILCOM - Transforming Decision Making through JADC2

Tutorials

Monday, 28 November 9:00 - 12:15 (AM Break at 10:30 - 10:45)

TUT-01: Hybrid Quantum-Classical Computing for Future Network Optimization
TUT-02: 5G and Beyond: The Path towards 6G Networks
TUT-03: Software Defined Radio Implementation of Reinforcement Learning for Interference Mitigation within an OFDM-based Wireless Communication System

Monday, 28 November 12:15 - 13:30 Box Lunch

Monday, 28 November 13:30 - 16:45 (PM Break at 15:00 - 15:15) 

TUT-04: IoT Supply Chain Security Risk Assessment and Mitigation: Methodologies and Computational Tools
TUT-05: Covert and LPD Communications in the Era of 5G and 6G Wireless
TUT-06: Intelligent Learning Algorithms: Building Next-Generation Military Networks with Artificial Intelligence

Friday, 02 December 9:00 - 12:15 (AM Break at 10:30 - 10:45)

TUT-07: Fresh and Welcome Capabilities of Multi-Channel Polyphase Analysis and Synthesis Filter Banks
TUT-08: In-Band-Full-Duplex Radio for Integrated Access/Backhaul and Integrated Sensing/Communications in 6G Networks

Friday, 02 December 12:15 - 13:30 Box Lunch

Friday, 02 December 13:30 - 16:45 (PM Break at 15:00 - 15:15) 

TUT-09: QoS-Driven Promising Techniques for 6G Multimedia Mobile Wireless Networks
TUT-10: Recent Advances in 5G Sidelink Technologies
TUT-11: Holographic Radio: A New Paradigm for Ultra-Massive MIMO
TUT-12: Beyond Identification: HF RFID and NFC for Digital Twins


Monday, 28 November 9:00 - 12:15
Room: Brookside A/Lower Level
TUT-01: Hybrid Quantum-Classical Computing for Future Network Optimization

Presenter(s): Lei Fan and Zhu Han (Dept. of Engineering Technology, University of Houston)

This tutorial will give a detailed introduction to quantum computing and its applications in network communications. It will first introduce the basics of quantum computing and quantum parallelism. It will then present the hybrid quantum-classical computing paradigm in more detail, including its special case for mixed-binary linear programming problems. Third, the tutorial will discuss where the proposed network resource optimization paradigm can be applied, including network function virtualization (NFV), multi-access edge computing, fog/cloud computing, and the cloud radio access network (C-RAN). Finally, it will conclude with the open problems and future directions that arise when designing and implementing hybrid quantum-classical algorithms.
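
As a rough illustration of the hybrid decomposition idea (a toy sketch, not the presenters' specific algorithm), the example below splits a tiny mixed-binary problem into a binary subproblem, which a quantum sampler such as QAOA or an annealer would handle, and a continuous subproblem solved classically; here a brute-force enumeration stands in for the quantum solver, and all problem data are invented for illustration.

    import numpy as np
    from itertools import product

    # Toy mixed-binary linear program:
    #   min  c_b.z + c_x*x   s.t.  A_b z + A_x x <= b,  z in {0,1}^2,  x >= 0
    # Hybrid split: the continuous part is a trivial classical LP for fixed z;
    # the search over z is the piece a quantum sampler (QAOA / annealer) would
    # handle.  A brute-force stand-in plays that role here.
    c_b, c_x = np.array([3.0, -2.0]), 1.0
    A_b = np.array([[1.0, 2.0], [2.0, 1.0]])
    A_x = np.array([1.0, 1.0])
    b = np.array([4.0, 5.0])

    def classical_lp(b_eff):
        """min c_x*x s.t. A_x*x <= b_eff, x >= 0 (one variable, so closed form)."""
        ub = np.min(b_eff / A_x)
        if ub < 0:
            return None, np.inf              # this binary choice is infeasible
        x = 0.0 if c_x >= 0 else ub
        return x, c_x * x

    def quantum_subproblem_stub(score, m=2):
        """Stand-in for the quantum binary solver: exhaustively pick the best z."""
        return np.array(min(product([0, 1], repeat=m), key=score))

    def objective(z):
        z = np.asarray(z, float)
        _, lp_cost = classical_lp(b - A_b @ z)
        return float(c_b @ z + lp_cost)

    z_opt = quantum_subproblem_stub(objective)
    x_opt, _ = classical_lp(b - A_b @ z_opt)
    print("z* =", z_opt, " x* =", x_opt, " objective =", objective(z_opt))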

Lei Fan

Lei Fan (M’15-SM’20) received the B.S. degree in electrical engineering from the Hefei University of Technology, Hefei, China, in 2009, and the Ph.D. degree in industrial and systems engineering from the University of Florida, Gainesville, FL, USA, in 2015. He was an Application Engineer with General Electric from 2015 to 2017 and a Software Engineer with Siemens Industry. Currently, he is an assistant professor at the University of Houston. His research interests include quantum computing and the optimization and planning of complex system operations.

Zhu Han

Zhu Han (S’01–M’04-SM’09-F’14) received the B.S. degree in electronic engineering from Tsinghua University in 1997, and the M.S. and Ph.D. degrees in electrical and computer engineering from the University of Maryland, College Park, in 1999 and 2003, respectively. From 2000 to 2002, he was an R&D Engineer at JDSU, Germantown, Maryland. From 2003 to 2006, he was a Research Associate at the University of Maryland. From 2006 to 2008, he was an assistant professor at Boise State University, Idaho. Currently, he is a John and Rebecca Moores Professor in the Electrical and Computer Engineering Department as well as in the Computer Science Department at the University of Houston, Texas. His research interests include wireless resource allocation and management, wireless communications and networking, game theory, big data analysis, security, and smart grid. Dr. Han received an NSF CAREER Award in 2010, the Fred W. Ellersick Prize of the IEEE Communications Society in 2011, the EURASIP Best Paper Award for the Journal on Advances in Signal Processing in 2015, the IEEE Leonard G. Abraham Prize in the field of Communications Systems (the best paper award in IEEE JSAC) in 2016, and several best paper awards at IEEE conferences. Dr. Han was an IEEE Communications Society Distinguished Lecturer from 2015 to 2018 and has been an AAAS Fellow since 2019 and an ACM Distinguished Member since 2019. He has been among the top 1% of highly cited researchers according to Web of Science since 2017. Dr. Han is also the winner of the 2021 IEEE Kiyo Tomiyasu Award for outstanding early- to mid-career contributions to technologies holding the promise of innovative applications, with the following citation: "for contributions to game theory and distributed management of autonomous communication networks."

Monday, 28 November 9:00 - 12:15
Room: Brookside B/Lower Level
TUT-02: 5G and Beyond: The Path towards 6G Networks

Presenter(s): Jack Burbank (Sabre Systems, Inc.)

The 5G and emerging 6G landscape is extremely complex, with many competing and complementary technologies, standardization efforts, spectrum usage models, and industry-driven consortium developments that are all evolving rapidly. Furthermore, the 5G and emerging 6G landscape is evolving very differently around the globe, in many cases without a unified global vision of next-generation wireless networks. The goal of this tutorial is to simplify that complexity so that attendees can walk away with a solid understanding of key trends and technologies in the 5G landscape and of what researchers and standards engineers are working toward for future 6G architectures. The tutorial aims to provide attendees with a solid understanding of the 5G wireless communications architecture and the emerging 6G architecture. It will discuss the various technologies that comprise the overall 5G network architecture and how these technologies are either complementary or competitive in nature. It will provide attendees with a strong familiarity with 1) the 5G cellular standards as defined in 3GPP Releases 15, 16, and 17, and 2) key IEEE 802.11 technologies that support 5G usage cases, with particular focus on IEEE 802.11ax, or "WiFi 6." The tutorial will then examine key "Beyond 5G" research and standardization activities (Release 18 and beyond), identifying key technology trends that will likely make up the future 6G network architecture. Discussion will also include emerging WiFi 7 technologies, such as IEEE 802.11be.

Jack Burbank

Jack L. Burbank earned his Bachelor of Science and Master of Science degrees in Electrical Engineering from North Carolina State University in 1994 and 1998, respectively. Mr. Burbank is currently a senior wireless network engineer at Sabre Systems, where he helps design, develop, and evaluate next-generation wireless capabilities for the tactical Army community. He is an expert in the areas of wireless networking, modeling and simulation, wireless system development, and wireless network security. He has published over 50 technical papers on wireless networking (both terrestrial-based and space-based) and has contributed to multiple books related to wireless networking. He has authored books on the subjects of wireless networking and modeling and simulation. Mr. Burbank is active within the IEEE, serving as a technical reviewer, organizer, and chair for numerous IEEE conferences and periodicals. He is editor of the Wiley-IEEE Press book series on IEEE standards and has previously served as an Associate Technical Editor of IEEE Communications Magazine. Mr. Burbank previously taught courses on networking and wireless networking within The Johns Hopkins University Engineering for Professionals Program and is a senior member of the IEEE. He has successfully led tutorials at numerous previous MILCOM conferences, with his tutorials often being among the highest attended and best reviewed of those programs.

Monday, 28 November 9:00 - 12:15
Room:  Glen Echo/Lower Level
TUT-03: Software Defined Radio Implementation of Reinforcement Learning for Interference  Mitigation within an OFDM-based Wireless Communication System

Presenter(s): Alyse Jones and William Headley (Virginia Tech National Security Institute)

Afflicted heavily by spectrum congestion, the unpredictable, dynamic conditions of the radio frequency (RF) spectrum have increasingly become a major obstacle for communication devices today. More specifically, a significant threat in this kind of environment is interference caused by collisions, which is increasingly unavoidable in an overcrowded spectrum. This is also true for interference that tends to be malicious and intentional in a military setting. Thus, these devices require a way to avoid such events. Cognitive radios (CRs) were proposed as a solution through their transmission adaptability and decision-making capabilities within a radio. Through spectrum sensing, a CR can capture the current condition of the RF spectrum and, based on its decision-making strategy, interpret these results to make an informed decision on what to do next to optimize its own communication. With the emergence of artificial intelligence, one such decision-making strategy CRs can utilize is Reinforcement Learning (RL). Unlike standard adaptive radios, CRs equipped with RL can predict the conditions of the RF spectrum and, using these predictions, determine what they must do in the future to operate optimally. Recognizing the usefulness of RL in hard-to-predict environments such as the RF spectrum, research on RL within CRs has become more popular over the past decade, especially for interference mitigation. Therefore, for this tutorial, a representative real-time OFDM transmit/receive chain is implemented within the GNU Radio framework. The system, operating over the air through USRPs, leverages reinforcement learning, e.g., Q-Learning, to avoid interference with other spectrum users. However, operating RL over real radios is not as straightforward as simulation-based implementations and requires additional effort and decision-making during implementation. Thus, the objectives of this tutorial include the following:

  • Provide an overview of reinforcement learning for use in cognitive radios, particularly for interference mitigation.
  • Highlight the additional considerations and design decisions that must be made when transitioning RL from software to hardware.
  • Demonstrate, step-by-step, what is required for RL to operate in a real-time wireless communication system, as designed in GNU Radio.

As such, the primary motivation is to demonstrate to researchers and engineers in this field how to implement reinforcement learning within GNU Radio with an existing modern communication framework, such as OFDM. At the end of the tutorial, through hands-on interaction with GNU Radio, the audience will understand how reinforcement learning can be integrated in wireless communications for over-the-air applications, and as a result, use what they have learned to further the deployment of RL in commercial and military communication systems.
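
To give a flavor of the decision-making loop involved, the sketch below is a stripped-down, simulation-only illustration of the kind of Q-learning agent the tutorial wires into a GNU Radio OFDM flowgraph: the agent's action is the sub-band it transmits on, and the reward is 1 when it dodges a (randomly hopping) interferer and 0 otherwise. Channel counts, learning rates, and the interferer model are invented for illustration; the over-the-air USRP implementation covered in the tutorial involves many additional considerations.

    import numpy as np

    rng = np.random.default_rng(0)
    N_CHANNELS = 4                     # selectable OFDM sub-bands (illustrative)
    ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration

    # State = channel the interferer occupied last step; action = our channel.
    Q = np.zeros((N_CHANNELS, N_CHANNELS))
    interferer = rng.integers(N_CHANNELS)

    for step in range(5000):
        state = interferer
        # epsilon-greedy action selection
        if rng.random() < EPS:
            action = rng.integers(N_CHANNELS)
        else:
            action = int(np.argmax(Q[state]))
        # Interferer hops: stays put most of the time, occasionally moves
        interferer = interferer if rng.random() < 0.8 else rng.integers(N_CHANNELS)
        reward = 1.0 if action != interferer else 0.0   # collision-free transmission?
        # Q-learning update toward the observed reward plus discounted future value
        Q[state, action] += ALPHA * (reward + GAMMA * np.max(Q[interferer]) - Q[state, action])

    print("Learned policy (best channel to use given the interferer's last channel):")
    print(np.argmax(Q, axis=1))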

Alyse Jones

Alyse Jones is a Research Associate in the Spectrum Dominance Division at the Virginia Tech National Security Institute. She received her bachelor’s degree in electrical engineering from Louisiana Tech University in 2016 and her master’s degree in electrical engineering, with an emphasis on wireless communications, from Virginia Tech in August of 2022. Her master’s thesis, titled "Considerations of Reinforcement Learning within Real-Time Wireless Communication Systems", investigated the limitations and considerations that must be made when implementing reinforcement learning within communication systems designed for real-time deployment. The findings of her thesis can help RF engineers gain a better understanding of how radios with RL will behave in a realistic situation and how RL affects a wireless communication system in order to move closer to a system that can be readily deployable. Ms. Jones is continuing her education at Virginia Tech by pursuing a PhD in electrical engineering. For her PhD, she plans on furthering her work in reinforcement learning for RF for national security purposes.

William “Chris” Headley

Dr. William “Chris” Headley is the Associate Director for the Spectrum Dominance Division at the Virginia Tech National Security Institute, where he has served as a principal or co-principal investigator on a multitude of government and commercial projects totaling over $18M. Within the division he primarily oversees the Radio Frequency Machine Learning (RFML) portfolio which is at the forefront of this emerging field. Through his courtesy appointment within Virginia Tech’s Electrical and Computer Engineering department, he also serves as a mentor and advisor to both undergraduate and graduate student researchers, providing them with hands-on research opportunities through these projects as well as guiding them towards their degree requirements. Dr. Headley earned his BS/MS/PhD in Electrical Engineering at Virginia Tech. He has written over 40 conference/journal publications and holds an active TS-SCI clearance. His current research interests include spectrum sensing, radio frequency machine learning, and virtual reality educational opportunities.

Monday, 28 November 13:30 - 16:45
Room:  Brookside A/Lower Level
TUT-04: IoT Supply Chain Security Risk Assessment and Mitigation: Methodologies and Computational Tools

Presenter(s): Junaid Farooq (University of Michigan) and Quanyan Zhu (New York University)

The IoT is becoming indispensable across industry verticals such as energy, transportation, communications, emergency services, public administration, and defense, and these deployments are burgeoning in scale and complexity. However, this cyber-physical integration is also opening doors for malicious cyber activity to sabotage performance and/or operation. Furthermore, the IoT is composed of many interconnected components that may be designed, manufactured, and operated by different entities located in different parts of the world. This adds an additional threat vector relating to the supply chain of the IoT ecosystem, with possible attacks through backdoors and stealthy channels. Since the incapacitation or destruction of infrastructure systems can have a debilitating effect on national security, the economy, public health, and safety, it is imperative to understand risks in IoT systems and take the necessary steps to mitigate them. This tutorial is aimed at identifying and categorizing the different types of security risks in IoT systems, from the network layer to the supply chain layer. It will also provide an overview of the potential strategies that can be employed to avoid the possibility of large-scale coordinated attacks from network entities or supply chain actors. Finally, an overview of possible research directions relating to the security and resilience of IoT systems will be provided.

This tutorial is intended for IEEE MILCOM 2022 attendees from academia, military research organizations, and industry. We believe it will help researchers from academia and research organizations understand and explore methodologies for cyber risk assessment and risk propagation in IoT systems. Industry participants, in turn, will be trained to use a software tool, referred to as iSCRAM, for risk assessment of infrastructure systems and to use its features for making risk-informed decisions relating to the supply chain. A hands-on interactive demo of iSCRAM will be provided to attendees.

Junaid Farooq

Junaid Farooq received the B.S. degree in electrical engineering from the National University of Sciences and Technology (NUST), Pakistan in 2013, the M.S. degree in electrical engineering from the King Abdullah University of Science and Technology (KAUST), Saudi Arabia in 2015, and the Ph.D. degree in electrical engineering from New York University in 2020. He was a Research Assistant with the Qatar Mobility Innovations Center (QMIC), Qatar Science and Technology Park (QSTP), Doha, Qatar from 2015 to 2016. Currently, he is an assistant professor with the department of electrical and computer engineering at the University of Michigan-Dearborn, Dearborn, MI, USA. His research interests include modeling, analysis, optimization, and security of wireless communication systems, cyber-physical systems, and the Internet of things. He is a recipient of the President's Gold Medal for academic excellence from NUST, the Athanasios Papoulis Award for teaching excellence, and the Dante Youla Award for research excellence from the department of Electrical & Computer Engineering (ECE) at NYU Tandon School of Engineering. He also received the NYU university wide Outstanding Dissertation Award in 2021.

Quanyan Zhu

Quanyan Zhu received B. Eng. in Honors Electrical Engineering from McGill University in 2006, M. A. Sc. from the University of Toronto in 2008, and Ph.D. from the University of Illinois at Urbana-Champaign (UIUC) in 2013. After stints at Princeton University, he is currently an associate professor at the Department of Electrical and Computer Engineering, New York University (NYU). He is an affiliated faculty member of the Center for Urban Science and Progress (CUSP) and Center for Cyber Security (CCS) at NYU. He is a recipient of many awards, including NSF CAREER Award and INFORMS Koopman Prize. He spearheaded and chaired INFOCOM Workshop on Communications and Control on Smart Energy Systems (CCSES), Midwest Workshop on Control and Game Theory (WCGT), and ICRA workshop on Security and Privacy of Robotics. His current research interests include game theory, machine learning, cyber deception, network optimization and control, Internet of Things, and cyber-physical systems. He is a co-author of three recent books published by Springer: Cyber-Security in Critical Infrastructures: A Game-Theoretic Approach (with S. Rass, S. Schauer, and S. König), Game Theory for Cyber Deception (with J. Pawlick), and Cybersecurity in Robotics (with S. Rass, B. Dieber, V. M. Vilches).

Monday, 28 November 13:30 - 16:45
Room:  Brookside B/Lower Level
TUT-05: Covert and LPD Communications in the Era of 5G and 6G Wireless

Presenter(s): Amitav Mukherjee (Tiami Networks)

This half-day tutorial aims to provide a comprehensive overview of the state-of-the-art in how next-generation mobile broadband systems such as 5G New Radio (NR) can be enhanced to provide covert and low probability of detection (LPD) communications capabilities. NR supports up to 20 Gbps peak throughputs using multiple input multiple output (MIMO) and wideband channels, operation across 30 GHz of RF spectrum, a data reliability of 99.999% or higher, a connection density of 10^6 devices per sq. km, and an air interface latency of 0.5 ms. These capabilities make 5G attractive for use in tactical, logistical, and battlefield communications. However, the broadband 5G NR waveform does not support conventional LPD methods, which makes it susceptible to adversarial countermeasures. This tutorial will commence by defining information-theoretic and signal processing-based LPD metrics and giving an overview of conventional LPD techniques such as fast frequency hopping and spread spectrum. We then introduce and evaluate 5G signal processing techniques for LPD. The tutorial concludes with a look at LPD approaches for future 6G wireless systems in the terahertz regime.
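
For orientation, the information-theoretic covertness framework behind such LPD metrics can be summarized in one line (a standard textbook result, quoted here for context rather than taken from the presenter's material). If P_0 and P_1 denote the distributions of the warden's n-sample observation without and with a transmission, then bounding the relative entropy bounds the warden's detection performance, and the number of covert bits grows only with the square root of the blocklength (the "square-root law"):

    V_T(P_0, P_1) \;\le\; \sqrt{\tfrac{1}{2}\, D(P_1 \,\|\, P_0)} \;\le\; \epsilon
    \quad\Longrightarrow\quad
    P_{\mathrm{FA}} + P_{\mathrm{MD}} \;\ge\; 1 - \epsilon,
    \qquad
    L(n) \;=\; \mathcal{O}\!\left(\sqrt{n}\right) \ \text{covert bits over } n \text{ channel uses,}

where V_T is the total variation distance, D the Kullback-Leibler divergence, and P_FA and P_MD the warden's false-alarm and missed-detection probabilities.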

Amitav Mukherjee

Dr. Amitav Mukherjee, the founder and President of Tiami Networks, has over twelve years of R&D experience in wireless communications systems design. He received a Ph.D. in Electrical and Computer Engineering from UC Irvine in 2012. He holds a BSEE from the University of Kansas, an MSEE from Wichita State University, and an MBA from the University of Illinois at Urbana-Champaign.

Tiami Networks' mission is to build the world's highest-performing, secure 5G networks for dual-use applications. Prior to founding Tiami, Amitav served as a Director of Wireless R&D at Charter Communications, where he was the team lead and chief architect for all radio access network performance modeling and PHY layer R&D activities. He also served as a Charter standards delegate to 3GPP RAN1 and the NextG Alliance for 5G and 6G wireless. Amitav has previously worked on 4G and 5G wireless standards and services at Verizon, Ericsson, and Hitachi. At Verizon, he was the radio architect for the Onsite private LTE/5G network service launched in 2020. At Ericsson, he led the R&D teams that designed the world’s first 4G cellular systems that operate in unlicensed spectrum (MulteFire and LAA-LTE).

Monday, 28 November 13:30 - 16:45
Room: Glen Echo/Lower Level
TUT-06: Intelligent Learning Algorithms: Building Next-Generation Military Networks with Artificial Intelligence

Presenter(s): Julia Andrusenko (JHU/APL); Jack Burbank (Sabre Systems, Inc.) and June Gordon (Sabre Systems, Inc.)

This tutorial aims to provide attendees with practical knowledge of advanced intelligent learning algorithms and how they can be applied to communications and networking problems. Attendees will first be given an introduction to intelligent algorithms as well as a crash course on some of the theory behind them. The tutorial will then provide an overview of many of the key types of learning algorithms, including machine learning algorithms, genetic algorithms, bio-inspired algorithms, and deep learning, and then discuss the emerging field of multi-agent learning algorithms. Particular attention will be given to neural-network deep learning techniques, providing a deep dive into the subject of neural networks. Strengths and weaknesses of the various types of algorithms will be presented, developing a detailed taxonomy of the mainstream forms of intelligent learning. The tutorial will highlight various areas of communications and networking in which intelligent learning algorithms have played a key role in technology development, spanning all layers of the protocol stack. Key technology and policy standardization activities taking place across the world’s regulatory and standards bodies as they relate to intelligent algorithms will be discussed. The tutorial will present existing machine learning tools and libraries available to today’s practitioners, such as TensorFlow, Python scikit-learn, and PyTorch, including demonstrations of these tools as applicable. Lastly, the tutorial will discuss key open research areas in machine learning.
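
As a taste of the kind of tool demonstration mentioned above, the minimal scikit-learn example below trains a small neural network (multi-layer perceptron) on synthetic data of the sort one might derive from link measurements; the features, labels, and network size are purely illustrative and not drawn from the tutorial's own materials.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # Synthetic "link quality" data: the features could stand in for SNR, delay
    # spread, and interference level; the label for a binary link-adaptation choice.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    # Standardize the features, then fit a small fully connected neural network.
    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=500, random_state=0))
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))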

Julia Andrusenko

Julia Andrusenko received her bachelor's and master's degrees in electrical engineering from Drexel University, Philadelphia, PA. She is a senior communications engineer at the Johns Hopkins University Applied Physics Laboratory (JHU/APL) and is the Chief Engineer of the Mission Critical Communications group at JHU/APL. Ms. Andrusenko has over 20 years of experience in communications theory, wireless networking, satellite communications, radio frequency (RF) propagation prediction, communications systems vulnerability, computer simulation of communications systems, evolutionary computation, genetic algorithms/programming, MIMO, and millimeter wave technologies. She also has substantial experience developing electronic warfare methodologies for various advanced commercial communications systems and military data links. Ms. Andrusenko is a published author of many technical papers and has co-authored two books: “Wireless Internetworking: Understanding Internetworking Challenges” through Wiley/IEEE Press and “Cognitive Electronic Warfare: An Artificial Intelligence Approach” through Artech House. Ms. Andrusenko is a senior member of the IEEE, a member of the IEEE Communications Society, and a voting member of the IEEE 1900.5 Working Group (WG) on Policy Language and Architectures for Managing Cognitive Radio for Dynamic Spectrum Access Applications. She has served as a session chair and organizer, technical reviewer, invited speaker, and panelist for various conferences. She is also on the steering committee for the annual IEEE/APL 5G Technologies for First Responder and Tactical Networks Workshop.

Jack Burbank

Jack L. Burbank earned his Bachelor of Science and Master of Science degrees in Electrical Engineering from North Carolina State University in 1994 and 1998, respectively. Mr. Burbank is currently a senior wireless network engineer at Sabre Systems, where he helps design, develop, and evaluate next-generation wireless capabilities for the tactical Army community. He is an expert in the areas of wireless networking, modeling and simulation, wireless system development, and wireless network security. He has published over 50 technical papers on wireless networking (both terrestrial-based and space-based) and has contributed to multiple books related to wireless networking. He has authored books on the subjects of wireless networking and modeling and simulation. Mr. Burbank is active within the IEEE, serving as a technical reviewer, organizer, and chair for numerous IEEE conferences and periodicals. He is editor of the Wiley-IEEE Press book series on IEEE standards and has previously served as an Associate Technical Editor of IEEE Communications Magazine. Mr. Burbank previously taught courses on networking and wireless networking within The Johns Hopkins University Engineering for Professionals Program and is a senior member of the IEEE. He has successfully led tutorials at numerous previous MILCOM conferences, with his tutorials often being among the highest attended and best reviewed of those programs.

Friday, 02 December 9:00 - 12:15
Room:  Brookside A/Lower Level
TUT-07: Fresh and Welcome Capabilities of Multi-Channel Polyphase Analysis and Synthesis Filter Banks

Presenter(s): Fred Harris (University of California San Diego)


Polyphase Channelizers in Modern Communication Systems

We learn to design filters and to apply them in the sampled-data domain under their often-repeated constraint: Linear Time Invariance (LTI)! The body of tools with which we are armed in LTI is remarkable: transfer functions, impulse response, superposition, reciprocity, commutability, and so on. Our intuition and understanding of sampled-data filters fail us when we change the playing field to Linear Time Varying (LTV); all of our tools vanish! In this presentation we will explain how an LTI filter is changed to an LTV filter and the three reasons we choose to do this: to reduce cost, to improve performance, and to have fun being creative. We take our audience on a trip through Alice's looking glass, where things seem to operate backwards and accomplish what appears to be applied magic. We learn how to form an M-path polyphase analysis filter bank and its dual, an M-path polyphase synthesis filter bank. These are amazing processing engines that perform their tasks by using spectral aliasing, caused by a sample rate change, to move spectral bands between baseband and selected center frequencies and then separate these aliases by their distinct phase profiles. Remarkably, they accomplish this with a single prototype filter and an inverse FFT that performs channelization of all the filters in the filter bank. Strangely, the same filter is centered at multiple center frequencies simultaneously. Even more remarkable are the capabilities offered by a cascade of the analysis and synthesis filter banks. How about channelizers with multiple simultaneous bandwidths and arbitrary center frequencies? Would an order of magnitude reduction in processing workload be of interest to you? This presentation is low on math and high in comprehension.
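
To make the structure concrete, here is a minimal numpy sketch of a critically sampled M-path analysis channelizer in the form the talk describes: a single prototype lowpass filter partitioned into M polyphase branches, an input commutator, and an inverse FFT across the branches. Channel-ordering and phase conventions vary between texts; this follows one common choice rather than the presenter's own code, and the prototype filter design is only illustrative.

    import numpy as np
    from scipy.signal import firwin

    def analysis_channelizer(x, h, M):
        """Critically sampled M-channel polyphase analysis filter bank.
        x: input samples; h: prototype lowpass filter (length a multiple of M).
        Returns an array of shape (n_blocks, M): one output per channel per
        block of M input samples (i.e., each channel is decimated by M)."""
        K = len(h) // M
        E = h[:K * M].reshape(K, M).T                        # branches E[m, k] = h[k*M + m]
        n_blocks = len(x) // M
        xb = x[:n_blocks * M].reshape(n_blocks, M)[:, ::-1]  # commutator ordering
        v = np.stack([np.convolve(xb[:, m], E[m])[:n_blocks] for m in range(M)], axis=1)
        return M * np.fft.ifft(v, axis=1)                    # IFFT separates the aliases

    # Quick check: a tone at channel 2's center frequency lands in output bin 2.
    M, fs = 8, 8000.0
    h = firwin(8 * M, 1.0 / M)                               # prototype lowpass, cutoff ~ fs/(2M)
    n = np.arange(4096)
    x = np.exp(2j * np.pi * (2 * fs / M) * n / fs)           # complex tone at 2*(fs/M)
    y = analysis_channelizer(x, h, M)
    print("per-channel power:", np.round(np.mean(np.abs(y) ** 2, axis=0), 3))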

Fred Harris

Fred is a faculty member in the ECE Department at the University of California San Diego. He left the College of Engineering at San Diego State University after 50 years of teaching there and maintains strong ties with faculty in his former college. At UCSD he continues to teach courses in digital signal processing and communication systems. He holds a number of patents on digital receiver and DSP technology and lectures throughout the world on DSP applications. He consults for organizations requiring high-performance, cost-effective DSP solutions. He is a former adjunct member of the Center for Communications Research in Princeton and at Imperial College London.

Friday, 02 December 9:00 - 12:15
Room:  Brookside B/Lower Level
TUT-08: In-Band-Full-Duplex Radio for Integrated Access/Backhaul and Integrated Sensing/Communications in 6G Networks

Presenter(s): Tharmalingam Ratnarajah (Institute for Digital Communications, The University of Edinburgh)

In-band full-duplex (IBFD) operation is an emerging paradigm for 6G wireless networks in which the two communication directions simultaneously use the same frequency band. By using antenna, analog, and digital interference cancellation techniques to mitigate the ensuing self-interference, the feasibility of standalone IBFD wireless links has recently been demonstrated. Furthermore, IBFD radios allow simultaneous transmission and sensing, opening up avenues for new random-access schemes. The objective of this tutorial is to provide an overview of the following topics:

1) To present recent advances in IBFD radio design in the Frequency Range 2 (FR2) band (≥ 25.250 GHz); specifically, we review antenna-domain cancellation, wideband optical-domain analog cancellation, and digital-domain cancellation. We will also provide wideband hardware impairment models and hardware nonlinearity models (a toy sketch of digital self-interference cancellation follows this list).

2) To describe the design and analysis of IBFD transmission in the recently proposed 3GPP integrated access and backhaul (IAB) networks. Here we provide a 3GPP-inspired design for the IBFD-IAB networks in the FR2 band, which can enhance the spectral efficiency and coverage while reducing the latency 

3) To lay out the basic concepts of IBFD integrated sensing and communications (ISAC) and summarize the key advantages. We consider a multi-vehicle scenario and perform tracking and prediction using an extended Kalman filter at the IBFD-ISAC nodes.

4) To give a vision for IBFD research for IAB and ISAC in 6G networks. We also describe the implementation constraints, research challenges, opportunities, and potential solutions.
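
As a toy illustration of the digital-domain stage mentioned in item 1 above (not the presenters' wideband design), the sketch below models the residual self-interference as a short FIR channel applied to the known transmit samples, estimates that channel by least squares, and subtracts the reconstructed self-interference from the received signal; tap counts, signal powers, and signal models are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(2)
    N, L = 4096, 5                                   # samples, SI channel taps (illustrative)

    tx = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)   # known transmit samples
    h_si = (rng.normal(size=L) + 1j * rng.normal(size=L)) * 0.5        # unknown SI channel
    soi = 0.01 * (rng.normal(size=N) + 1j * rng.normal(size=N))        # weak signal of interest
    rx = np.convolve(tx, h_si)[:N] + soi                               # received = SI + SOI

    # Build the convolution matrix of known transmit samples (each column a delayed copy)
    # and estimate the SI channel taps by least squares.
    A = np.column_stack([np.concatenate([np.zeros(k, complex), tx[:N - k]]) for k in range(L)])
    h_hat, *_ = np.linalg.lstsq(A, rx, rcond=None)

    residual = rx - A @ h_hat                        # digital self-interference cancellation
    def db(p): return 10 * np.log10(p)
    print("power before cancellation: %.1f dB" % db(np.mean(np.abs(rx) ** 2)))
    print("residual power after cancellation: %.1f dB" % db(np.mean(np.abs(residual) ** 2)))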

Tharmalingam Ratnarajah

Prof. Tharm Ratnarajah is currently with the Institute for Digital Communications, The University of Edinburgh, Edinburgh, U.K., as a Professor in Digital Communications and Signal Processing. He was the Head of the Institute for Digital Communications during 2016-2018. Prior to this, he held various positions at McMaster University, Hamilton, Canada (1997-1998), Nortel Networks, Ottawa, Canada (1998-2002), the University of Ottawa, Canada (2002-2004), and Queen’s University Belfast, U.K. (2004-2012). His research interests include signal processing and information-theoretic aspects of beyond-5G wireless networks, full-duplex radio, mmWave communications, random matrix theory, interference alignment, statistical and array signal processing, and quantum information theory. He has published over 400 peer-reviewed publications in these areas and holds four U.S. patents. He has supervised 16 PhD students and 21 post-doctoral research fellows and raised over $11 million USD of research funding. He was the coordinator of the E.U. projects ADEL (3.7M €) in the area of licensed shared access for 5G wireless networks and HARP (4.6M €) in the area of highly distributed MIMO, as well as the E.U. Future and Emerging Technologies projects HIATUS (3.6M €) in the area of interference alignment and CROWN (3.4M €) in the area of cognitive radio networks. Dr. Ratnarajah was an associate editor of IEEE Transactions on Signal Processing (2015-2017) and Technical co-chair of the 17th IEEE International Workshop on Signal Processing Advances in Wireless Communications, Edinburgh, U.K., 3-6 July 2016. Prof. Ratnarajah is a Fellow of the Higher Education Academy (FHEA).

 

Friday, 02 December 13:30 - 16:45
Room:  Brookside A/Lower Level
TUT-09: QoS-Driven Promising Techniques for 6G Multimedia Mobile Wireless Networks

Presenter(s): Xi Zhang (Department of Electrical and Computer Engineering, Texas A&M University)
 

Abstract – Background, Objectives, and Motivations

While 5G is being deployed around the world, efforts and initiatives from academia, the military industry, and standards bodies have started to look beyond 5G, conceptualize 6G mobile wireless networks, and propose various promising 6G candidate techniques. Although it is widely recognized that various multimedia services such as video/audio streaming and even 3D immersive media (e.g., XR – AR/MR/VR) will continue to dominate wireless traffic in 6G networks, how to efficiently support statistical delay- and error-rate-bounded QoS provisioning for wireless multimedia transmissions over 6G wireless networks remains one of the most difficult challenges, because real-time big-data multimedia services are both highly spectrum-/computation-intensive and time-sensitive, and deterministic delay-bounded guarantees are practically infeasible over randomly time-varying wireless channels and interference. To overcome these difficulties, academia and industry have made a great deal of effort in developing various 6G standards and promising candidate techniques spanning theories, architectures, protocols, and techniques. Toward this end, focusing on 6G’s multimedia traffic, this tutorial will define and introduce the new 6G-standard service class, Massive Ultra-Reliable and Low Latency Communications (mURLLC), and will then address 6G’s mURLLC-enabling fundamental pillar techniques, including statistical delay- and error-rate-bounded quality-of-service (QoS) provisioning, Edge Artificial Intelligence (Edge-AI), Terahertz (THz) Wireless Nano-Networks, Cell-Free massive MIMO (CF m-MIMO), Finite Blocklength Coding (FBC), Age of Information (AoI), Intelligent Reflecting Surfaces (IRS), Unmanned Aerial Vehicles (UAV), Non-Orthogonal Multiple Access (NOMA), Simultaneous Wireless Information and Power Transfer (SWIPT) and Energy Harvesting (EH), information-centric networking (ICN), network functions virtualization (NFV), and software-defined networks (SDN), and how these techniques can be integrated to efficiently support statistical delay- and error-rate-bounded QoS provisioning for multimedia wireless communications over 6G mobile wireless networks. Furthermore, we will also discuss several open problems/challenges and future research directions in 6G mobile wireless networks.
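
One concrete anchor for the statistical delay- and error-rate-bounded QoS discussion is the finite blocklength coding (FBC) component: for short, latency-constrained packets, the maximal coding rate at blocklength n and block error probability ε is governed by the well-known normal approximation (the standard Polyanskiy-Poor-Verdú result, quoted here for orientation rather than taken from the presenter's material):

    R^{*}(n, \epsilon) \;\approx\; C \;-\; \sqrt{\frac{V}{n}}\; Q^{-1}(\epsilon) \;+\; \frac{\log_2 n}{2n},

where C is the channel capacity, V the channel dispersion, and Q^{-1}(·) the inverse Gaussian tail function; the square-root term quantifies the rate penalty paid for finite blocklengths and hence for tight delay and error-rate bounds.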

Xi Zhang

Xi Zhang (S'89-SM'98-F'15) received the B.S. and M.S. degrees from Xidian University, Xi’an, China, the M.S. degree from Lehigh University, Bethlehem, PA, USA, all in Electrical Engineering and Computer Science, and the Ph.D. degree in Electrical Engineering and Computer Science (Electrical Engineering -- Systems) from The University of Michigan, Ann Arbor, MI, USA. He is currently a Full Professor and the Founding Director of the Networking and Information Systems Laboratory, Department of Electrical and Computer Engineering, Texas A&M University, College Station, TX, USA.

He is a Fellow of the IEEE for contributions to quality of service (QoS) theory in mobile wireless networks. He was with the Networks and Distributed Systems Research Department, AT&T Bell Laboratories, Murray Hill, NJ, USA, and AT&T Laboratories Research, Florham Park, NJ, in 1997. He was a Research Fellow with the School of Electrical Engineering, University of Technology, Sydney, Australia, and the Department of Electrical and Computer Engineering, James Cook University, Australia. He has published more than 400 research articles on wireless networks and communications systems, network protocol design and modeling, statistical communications, random signal processing, information theory, and control theory and systems. He is a Distinguished Lecturer of the IEEE Communications Society and the IEEE Vehicular Technology Society. He received the TEES Select Young Faculty Award for Excellence in Research Performance from the Dwight Look College of Engineering at Texas A&M University, College Station, in 2006, and the Outstanding Faculty Award from the Department of Electrical and Computer Engineering, Texas A&M University, in 2020. He received the U.S. National Science Foundation CAREER Award in 2004 for his research in the areas of mobile wireless and multicast networking and systems. He received six best paper awards, at IEEE GLOBECOM 2020, IEEE ICC 2018, IEEE GLOBECOM 2014, IEEE GLOBECOM 2009, IEEE GLOBECOM 2007, and IEEE WCNC 2010. One of his IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS papers has been listed as the IEEE Best Readings Paper (receiving the highest citation rate among all IEEE TRANSACTIONS/Journal articles in the area) on wireless cognitive radio networks and statistical QoS provisioning over mobile wireless networking.

Prof. Zhang is serving or has served as an Editor for IEEE TRANSACTIONS ON COMMUNICATIONS, IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, and IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING. He served twice as a Guest Editor for IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, for the Special Issue on “Broadband Wireless Communications for High Speed Vehicles” and the Special Issue on “Wireless Video Transmissions.” He was an Associate Editor for IEEE COMMUNICATIONS LETTERS. He served twice as the Lead Guest Editor for IEEE Communications Magazine, for the Special Issue on “Advances in Cooperative Wireless Networking” and the Special Issue on “Underwater Wireless Communications and Networks: Theory and Applications.” He served as a Guest Editor for IEEE Wireless Communications Magazine for the Special Issue on “Next Generation CDMA vs. OFDMA for 4G Wireless Applications.” He served as an Editor for Wireless Communications and Mobile Computing (Wiley), Journal of Computer Systems, Networks, and Communications, and Security and Communication Networks (Wiley), and as an Area Editor for Computer Communications (Elsevier), among many other editorial roles. He is serving or has served as the TPC Chair for IEEE GLOBECOM 2011, the TPC Vice-Chair for IEEE INFOCOM 2010, the TPC Area Chair for IEEE INFOCOM 2012, the Panel/Demo/Poster Chair for ACM MobiCom 2011, the General Chair for IEEE WCNC 2013, and the TPC Chair for the IEEE INFOCOM 2017–2019 Workshops on “Integrating Edge Computing, Caching, and Offloading in Next Generation Networks."

Friday, 02 December 13:30 - 16:45
Room:  Brookside B/Lower Level
TUT-10: Recent Advances in 5G Sidelink Technologies

Presenter(s): Vijitha Weerackody, Arnab Das, and Kent Benson (JHU/APL)

The sidelink is a 3GPP-standardized technology that provides direct UE-to-UE communication links, and it is gaining significant interest in many commercial and military applications, including ultra-reliable low latency communications (URLLC), enhanced Mobile Broadband (eMBB), vehicle-to-everything (V2X) networks, and AR/VR technologies. The direct UE-to-UE communication support of the sidelink is attractive in scenarios where cellular network coverage is unavailable or unreliable. Because of the extensive commercial interest in sidelink-based applications, the 3GPP has undertaken an effort to introduce advanced capabilities to the sidelink to support the increased data rates and low latency requirements of these applications. The following sidelink-based technologies are currently under development at the 3GPP: support at mmWave bands, including beamforming and beam maintenance; technologies for unlicensed bands such that they coexist fairly with incumbent technologies; operations to support multiple carriers; protocols to support relaying via a UE; and positioning and localization technologies for the sidelink.

Although the lower-layer technologies used by the sidelink are similar to the base station-based technologies, there are some key differences in the underlying technologies that are used in the sidelink. The objective of this tutorial is to present an overview of the current sidelink technology, including the key physical layer and MAC layer issues, and address the sidelink technologies that will be developed at the 3GPP in the near future.

Vijitha Weerackody

Dr. Vijitha Weerackody is a member of the Principal Professional Staff at the Johns Hopkins University Applied Physics Laboratory. His expertise is in the general area of signal processing and communications, with many patents and peer-reviewed publications in these areas. He is the co-author of several ITU-R Recommendations and Reports on spectrum sharing techniques for satellite systems and is actively contributing to 3GPP TSG RAN WG1 on lower-layer issues of 5G technologies. Before joining JHU/APL he worked at the research divisions of Bell Laboratories and Telcordia Technologies, and at several startup companies. He also served as an adjunct faculty member at the University of Pennsylvania, Department of Electrical and Systems Engineering. Additionally, he participates in numerous IEEE publication activities, including serving as a Senior Editor for the IEEE Transactions on Aerospace and Electronic Systems. He holds a Ph.D. in electrical engineering from the University of Pennsylvania, Philadelphia, PA.

 

 

 Arnab Das

Dr. Arnab Das has been a member of the Senior Professional Staff at The Johns Hopkins University Applied Physics Laboratory (JHU/APL) since 2009. He has worked on tasks encompassing a broad range of communications and networking topics spanning the physical layer to the application layer and ranging from theoretical analysis to field testing exercises, including modeling, simulation, and algorithm development; analysis and development of spectrum-sharing methods and techniques; development of tactical communications architectures; test and evaluation of tactical RF communications systems; and technology strategy, vision, and roadmap studies focused on RF communications. His research interests include performance analysis of RF communications systems, mathematical modeling of communications networks using game theory and graph theory, constrained optimization techniques applied to communications systems, and applications of machine learning to communications and networking. Arnab was appointed as a faculty member in the Johns Hopkins University Engineering for Professionals (JHU EP) program in Fall 2019, where he teaches the Communication Systems Engineering course in the Department of Electrical and Computer Engineering (ECE). He holds a B.S. in Electrical Engineering from the University of Illinois at Urbana-Champaign and a Ph.D. in Electrical Engineering from The Pennsylvania State University.
 

 Kent Benson

Dr. Kent Benson has been a member of the Senior Professional Staff at the Johns Hopkins University Applied Physics Laboratory since 2012. He has over 20 years of experience in wired and wireless communications, signal processing, data networking, and systems engineering. He has worked in research and advanced development at companies in both the telecommunications and defense industries, where he has served as the Lead Systems Engineer or Principal Investigator for a number of program efforts. His areas of interest include waveform analysis and design, modeling and simulation, and communications system analysis. Most recently he has been working on projects in HF communications, cellular technologies, and satellite systems. Dr. Benson has published in several IEEE journals and conferences. He received his Ph.D. in electrical engineering from the University of Wisconsin.

 

 

Friday, 02 December 13:30 - 16:45
Room: White Oak A/Lower Level
TUT-11: Holographic Radio: A New Paradigm for Ultra-Massive MIMO

Presenter(s): Boya Di and Lingyang Song (Peking University, Beijing, China); Hongliang Zhang (Princeton University, Princeton, NJ); and Zhu Han (University of Houston, TX)

Ultra-massive multiple-input multiple-output (MIMO) is one of the key enablers of the forthcoming sixth generation (6G) networks, providing revolutionary mobile connectivity and high-speed data services by exploiting spatial diversity. However, widely used phased arrays rely on costly components, making the practical implementation of ultra-massive MIMO prohibitive from both cost and power consumption perspectives. Recently developed reconfigurable holographic surfaces (RHSs), composed of densely packed sub-wavelength metamaterial elements, can achieve holographic beamforming without costly hardware components. By leveraging the holographic principle, the RHS serves as an ultra-thin and lightweight surface antenna integrated with the transceiver, providing a promising alternative to phased arrays for realizing ultra-massive MIMO. In this tutorial, we will first provide a basic introduction to RHSs. We will then comprehensively introduce the unique features of RHSs that enable both communication and sensing. Related design, analysis, optimization, and signal processing techniques will be presented. Typical RHS-based applications for wireless communications and radio-frequency sensing will be explored. Implementation issues, along with our developed prototypes and experiments, will also be discussed, as will several up-to-date challenges and potential research directions.
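
For readers new to the holographic principle, one common formulation in the RHS literature (paraphrased here; the presenters' notation may differ) builds each element's real-valued amplitude weight from the interference between the feed's reference wave and the desired object (radiated) wave, so that beams are steered with amplitude-only control rather than phase shifters:

    \Psi_{\mathrm{intf}}(\mathbf{r}_n) \;=\; \Psi_{\mathrm{obj}}(\mathbf{r}_n, \boldsymbol{\theta}_0)\, \Psi_{\mathrm{ref}}^{*}(\mathbf{r}_n),
    \qquad
    m_n \;=\; \frac{\operatorname{Re}\{\Psi_{\mathrm{intf}}(\mathbf{r}_n)\} + 1}{2} \;\in\; [0, 1],

where r_n is the position of the n-th metamaterial element, Ψ_ref the reference wave launched by the feed, Ψ_obj the desired object wave toward direction θ_0, and m_n the normalized amplitude weight applied by element n.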

Boya Di

Boya Di (S’17-M’19) obtained her Ph.D. degree from the Department of Electronics, Peking University, China, in 2019. Prior to that, she received a B.S. degree in electronic engineering from Peking University in 2014. She was a postdoctoral researcher at Imperial College London and is now an assistant professor at Peking University. Her current research interests include holographic radio, reconfigurable intelligent surfaces, multi-agent systems, edge computing, and aerial access networks. She has published more than seven journal papers on reconfigurable holographic surface aided communications and sensing. She received the best doctoral thesis award from the China Education Society of Electronics in 2019 and the 2021 IEEE ComSoc Asia-Pacific Outstanding Paper Award. She has served as an associate editor for IEEE Transactions on Vehicular Technology since June 2020 and as a workshop co-chair for IEEE WCNC 2020 and 2021.

Lingyang Song

Lingyang Song (S’03-M’06-SM’12-F'19) received his PhD from the University of York, UK, in 2007, where he received the K. M. Stott Prize for excellent research. He worked as a research fellow at the University of Oslo, Norway, until rejoining Philips Research UK in March 2008. In May 2009, he joined the School of Electronics Engineering and Computer Science, Peking University, and is now a Boya Distinguished Professor. His main research interests include wireless communications, mobile computing, and machine learning. Dr. Song is a co-recipient of many awards, including the IEEE Leonard G. Abraham Prize in 2016, best paper awards at IEEE ICC 2014, IEEE ICC 2015, and IEEE Globecom 2014, and the best demo award at ACM MobiHoc 2015. He received the National Science Fund for Distinguished Young Scholars in 2017 and the First Prize of the Natural Science Award of the Ministry of Education of China in 2017. Dr. Song has served as an IEEE ComSoc Distinguished Lecturer (2015-2018), an Area Editor of IEEE Transactions on Vehicular Technology (2019-), and Co-chair of the IEEE Communications Society Asia Pacific Board Technical Affairs Committee (2020-). He is a Clarivate Analytics Highly Cited Researcher.

Hongliang Zhang

Hongliang Zhang (S’15-M’19) received his B.S. and Ph.D. degrees from the School of Electrical Engineering and Computer Science at Peking University in 2014 and 2019, respectively. He was a Postdoctoral Fellow in the Electrical and Computer Engineering Department at the University of Houston, Texas, and is currently a Postdoctoral Associate in the Department of Electrical and Computer Engineering at Princeton University, New Jersey. His current research interests include reconfigurable intelligent surfaces, aerial access networks, optimization theory, and game theory. He received the best doctoral thesis award from the Chinese Institute of Electronics in 2019 and was recognized as an exemplary reviewer for IEEE Transactions on Communications in 2020. He also received the 2021 IEEE ComSoc Heinrich Hertz Award for Best Communications Letters and the 2021 IEEE ComSoc Asia-Pacific Outstanding Paper Award. He has served as a TPC member for many IEEE conferences, such as Globecom, ICC, and WCNC. He is currently an Editor for IEEE Communications Letters, IET Communications, and Frontiers in Signal Processing, and has served as a Guest Editor for several journals, such as the IEEE Internet of Things Journal and the Journal of Communications and Networks.

Zhu Han

Zhu Han (S’01–M’04-SM’09-F’14) received a B.S. degree in electronic engineering from Tsinghua University in 1997, and M.S. and Ph.D. degrees in electrical engineering from the University of Maryland, College Park, in 1999 and 2003, respectively. From 2000 to 2002, he was an R&D Engineer at JDSU, Germantown, Maryland. From 2003 to 2006, he was a Research Associate at the University of Maryland. From 2006 to 2008, he was an assistant professor at Boise State University, Idaho. Currently, he is a Professor in the Electrical and Computer Engineering Department as well as the Computer Science Department at the University of Houston, Texas. His research interests include wireless resource allocation and management, wireless communications and networking, game theory, wireless multimedia, security, and smart grid communication. Dr. Han received an NSF CAREER Award in 2010, the Fred W. Ellersick Prize of the IEEE Communications Society in 2011, the EURASIP Best Paper Award for the Journal on Advances in Signal Processing in 2015, the IEEE Kiyo Tomiyasu Award in 2021, and several best paper awards at IEEE conferences. Dr. Han has been among the top 1% of highly cited researchers according to Web of Science since 2017 and has been an AAAS Fellow since 2019.

 

Friday, 02 December 13:30 - 16:45
Room: White Oak B/Lower Level
TUT-12: Beyond Identification: HF RFID and NFC for Digital Twins

Presenter(s): Hongzhi Guo (Norfolk State University) and Amitangshu Pal (Indian Institute of Technology Kanpur)

Digital Twin is an emerging technology that has attracted attention from industry, healthcare, and education, among others. It can create, process, and maintain virtual representations of physical objects using sensing data. Virtual representations are softwarized objects that can be used to simulate, optimize, and inform physical activities. To update the virtual representations, Digital Twin requires persistent surveillance of the physical environment using large-scale, ultra-dense sensors. Wireless communication and networking are essential technologies for providing reliable, low-latency, and high-rate data exchange between virtual representations and physical objects. Existing Internet of Things (IoT) technologies can be readily used for Digital Twin applications, but a few challenges still require innovation, such as realizing an Internet of Everything (IoE) that can build Digital Twins for any physical object. This tutorial will focus on ultra-dense IoT applications where Digital Twins are used to model a large number of densely placed objects. It will introduce recent progress in High-Frequency (HF, 3 MHz to 30 MHz) RFID (Radio Frequency Identification) and NFC (Near Field Communication), which can provide ultra-dense, reliable wireless networking, sensing, and identification for Digital Twin. HF RFID and NFC devices are lightweight, battery-free, and economical to deploy. Also, the short communication range of HF RFID and NFC allows ultra-dense deployment without competing for spectrum with most existing IoT applications. Recent studies have demonstrated their high reliability and flexibility; they can be attached to nearly anything to create Digital Twins.

This tutorial aims to introduce recent solutions and future research challenges for HF RFID and NFC that can provide ultra-dense persistent sensing for Digital Twins. Fundamental knowledge of Digital Twin and basic HF RFID and NFC protocols will be introduced. Technical issues, including long-range HF RFID and NFC and near-field beamforming using antenna arrays, are addressed. Last, Digital Twin applications such as military use cases, food supply chain monitoring, precision agriculture, and warehouse management will be introduced.

Hongzhi Guo

Dr. Hongzhi Guo is an Assistant Professor of Electrical Engineering at Norfolk State University. He received his Ph.D. degree from the University at Buffalo, The State University of New York, in 2017, and his MS degree from Columbia University in 2013, both in Electrical Engineering. His broad research agenda is to develop foundations for wireless sensor networks and networked robotics to automate dangerous, dirty, and dull tasks in extreme environments, such as underground and underwater. He received the NSF CAREER Award in 2022, the NSF HBCU-UP RIA award in 2020, the Jeffress Trust Awards Program in Interdisciplinary Research in 2020, and the Best Demo Award at IEEE INFOCOM 2017. He has been a reviewer for IEEE, ACM, and Elsevier publications and received the Best Reviewer recognition for IEEE Transactions on Wireless Communications in 2017. He is a Technical Committee Member for IEEE WiSee, MILCOM, and WCNEE. He served as the publicity co-chair for ACM MSWiM 2022.

Amitangshu Pal

Dr. Amitangshu Pal received the B.E. degree in computer science and engineering from Jadavpur University, in 2008, and the Ph.D. degree in electrical and computer engineering from The University of North Carolina at Charlotte, in 2013. He is currently an Assistant Professor in Computer Science and Engineering department at IIT Kanpur. His current research interests include wireless sensor networks, reconfigurable optical networks, smart health-care, cyber-physical systems, mobile and pervasive computing, and cellular networks.
