Feature Topic: Future Trends in Fog/Edge Computing and Networking

Call for Papers

Over the past decade, cloud computing has played a dominant role in supporting the applications we rely on today, with mobile networks acting mostly as communication pipes connecting users to the cloud and to each other. As we evolve toward the Internet of Things (IoT), our 5G/6G and future mobile networks must support a much wider range of applications, including vehicular networking, automated manufacturing, smart cities, drones, smart grids, e-health, and the many emerging AI-enabled applications such as Virtual Reality (VR) and Augmented Reality (AR). The cloud-computing-plus-communication-pipe model is no longer adequate for supporting these emerging applications. For example, many IoT applications cannot tolerate the delays incurred by cloud computing. Endpoints are creating a vast and ever-growing amount of data that needs to be processed locally, because sending all of it to the cloud will often be infeasible due to network bandwidth constraints and regulatory restrictions. Connecting every device directly to the cloud can also be impractical due to limited resources on the devices, software and management complexity, limited network agility and cognition, and system scalability. In such scenarios, users will want local services. Many resource-constrained devices will likewise require local services to help perform tasks that they cannot perform by themselves, ranging from computationally intensive user applications to security tasks that require heavy processing or information the devices do not have. Future mobile networks will also require computing capabilities inside or close to the radio access networks (RANs) to enable advanced networking capabilities, such as establishing radio connections in a more timely manner and adjusting radio channel coding dynamically in response to changing user needs and communication environments, and to allow user applications to be hosted in RANs closer to the users.

These and many other new requirements call for a new computing paradigm: fog/edge computing and networking. Fog/edge computing technologies envision an open, horizontal architecture for distributing functions (from computing to storage to control to networking) closer to users, not to any specific type of network edge device but to anywhere along the cloud-to-thing continuum that can best meet user requirements. Fog will integrate with the cloud to provide a seamless end-to-end computing platform along the cloud-to-thing continuum. Fog/edge services and user applications can be deployed anywhere along this computing continuum. The same function or application can be deployed, and subsequently moved, anywhere along the continuum: in the cloud, in the fog, or even onto the endpoints, to best meet user requirements. Computing resources distributed along the cloud-to-thing continuum can be pooled together to support a user application. Fog/edge nodes will work autonomously when connectivity to the cloud is unavailable, and they can collaborate with each other to carry out tasks for users.

Fog/edge technologies will play key roles in future computing and networking systems. Researchers are investigating fog/edge computing and networking technologies to optimize the use of resources that are virtualized, pooled, and shared in unpredictable ways. Fog networking revisits the role of clients in network architectures, treating a client not just as an end-user device but also as an integral part of the control plane that monitors, measures, and manages the network. This rewrites the traditional practice of using heavy-duty, dedicated network elements for network measurement and management. Fog/edge computing and networking combine the study of mobile communications, distributed systems, and big data analytics into an exciting new area.

With fog/edge computing and networking technologies, many emerging services, such as vehicle-to-vehicle (V2V) communication in vehicular telematics, autonomous driving, Industry 4.0, and e-healthcare, could be realized easily and economically. Fog/edge computing can also serve as a core engine for many services in Internet of Things (IoT) applications. Vertical markets and applications will be critical for 5G/6G systems, and there are opportunities in applying fog/edge technologies to facilitate the operation of vertical applications with integrated computing and communications design.

More recently, edge AI has emerged as the next frontier and a cornerstone for future intelligent networks. Edge AI has multiple levels. At the base level, edge devices use AI/ML models created somewhere else (typically in the cloud) to tackle complex tasks. At the top level, edge devices learn from their local data (“edge learning”) to help create (train) AI/ML-based network functions and user applications. Edge learning is becoming necessary and essential because sending the massive amount of data created at the network edge to the cloud for ML model training is increasingly impractical.
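
To make the edge-learning idea above concrete, the following is a minimal, hypothetical sketch of one common approach, federated averaging, in which edge devices train on their local data and share only model parameters with an aggregator rather than sending raw data to the cloud. The data, model, and parameter choices here are illustrative assumptions and are not part of this Feature Topic.

```python
# Illustrative sketch of "edge learning" via federated averaging.
# Each simulated edge device fits a linear model on its own local data;
# only model weights (never raw data) are sent to an aggregator, which
# averages them weighted by local sample counts. All data and parameters
# below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.01, epochs=5):
    """Run a few gradient-descent steps on one device's local data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(updates, counts):
    """Aggregate device weights, weighted by local sample counts."""
    total = sum(counts)
    return sum(w * (n / total) for w, n in zip(updates, counts))

# Synthetic "ground truth" model and per-device local datasets.
true_w = np.array([2.0, -1.0, 0.5])
devices = []
for _ in range(4):                               # four edge devices
    n = int(rng.integers(50, 200))
    X = rng.normal(size=(n, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    devices.append((X, y))

global_w = np.zeros(3)
for _ in range(20):                              # communication rounds
    updates = [local_update(global_w, X, y) for X, y in devices]
    counts = [len(y) for _, y in devices]
    global_w = federated_average(updates, counts)

print("learned weights:", np.round(global_w, 3))
```

In this sketch the aggregator could run in the cloud or on a fog node; only the model weights cross the network, which is the property that makes edge learning attractive when bandwidth or regulation prevents centralizing raw data.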

This Feature Topic aims to cover a wide variety of recent advances and future directions in fog/edge computing, including trends toward 6G fog/edge, cutting-edge fog/edge research contributions, experiments with and performance of fog/edge computing systems, challenges and opportunities for fog/edge, novel business models, and killer applications. We welcome viewpoints and contributions from academia and industry. The topics of interest for this Feature Topic include, but are not limited to, the following:

  • Visions toward future fog/edge evolution
  • Fog/edge technologies for 6G
  • Fog/edge based IoT services
  • Fog/edge computing and networking for mission-critical services
  • Edge AI and edge learning
  • Management and orchestration for fog/edge systems
  • Data analytics and machine learning in the fog/edge computing environment
  • Security and trust in fog/edge systems
  • Key organizations or consortia of fog/edge activities
  • Standards and future directions of fog/edge systems
  • Experience sharing from fog/edge testbeds and deployments
  • Fog/edge platforms for vertical industries (e.g., manufacturing, transportation)

Submission Guidelines

Manuscripts should conform to the standard format as indicated in the Information for Authors section of the Manuscript Submission Guidelines. Please check these guidelines carefully before submitting, since submissions not complying with them will be administratively rejected without review.

All manuscripts to be considered for publication must be submitted by the deadline through Manuscript Central. Select the “FT–2221 / Future Trends in Fog/Edge Computing and Networking” topic from the drop-down menu of Topic/Series titles. Please observe the dates specified below, noting that there will be no extension of the submission deadline.

Important Dates

Manuscript Submission Deadline: 15 December 2022
Acceptance Notification: 15 April 2023
Final Manuscript Due: 1 May 2023
Publication Date: July 2023

Guest Editors

Hung-Yu Wei (Lead Guest Editor)
National Taiwan University, Taiwan

Tao Zhang
National Institute of Standards and Technology, USA

Russell Hsing
National Chiao Tung University, Taiwan

Doug Zuckerman
Peraton Labs, USA