Journal Information
Computer Communications
Impact Factor:

Call For Papers
Last updated by Jayden Hunter on 2020-02-18
Special Issues
Special Issue on Smart Green Computing for Wireless Sensor Networks
Submission Date: 2020-02-28

The past two decades have seen Wireless Sensor Networks (WSNs) flourish in both academia and industry. In a WSN, numerous sensor nodes are deployed and networked to monitor a specified region, so that data of interest can be sensed, processed, stored and collected. WSNs bridge the physical world to computing systems, forming the basis for advanced smart applications. Applications of WSNs have been explored in smart homes, green buildings, environmental engineering, healthcare, industry, and the military. The major obstacle to pervasive deployment of WSNs is the mismatch between the diverse functionality demanded by applications and the limited energy supply of sensor nodes, a problem that only worsens at large network scale. This special issue (SI) therefore examines WSNs through the lens of green computing. On one hand, we solicit contributions on energy-efficient cross-layer protocols that prolong the network lifespan. In particular, deploying WSNs in real-world applications requires analyzing the trade-off between system performance and energy efficiency, adapting sensing and networking functionality to the energy budget. On the other hand, we welcome new approaches for rationally supplying energy to sensor nodes, for instance sensor nodes equipped with mechanisms for harvesting energy from the surrounding environment. Both lines of work can improve the sustainability and performance of WSNs, so it is pivotal to investigate thoroughly how to run a WSN in a green manner.

Topics of primary interest include, but are not limited to, the following:
- Architectures of intelligent green computing technology for WSNs
- Smart energy harvesting/charging and power management techniques
- Long-life sensor node deployment and topology control
- Energy-efficient smart computing protocol design
- AI-based scheduling algorithms for sensor networks
- Energy-efficient sensing techniques
- New applications of self-sustainable sensor networks
- Smart data routing, processing and storage strategies
- Network modelling and performance analysis
- Artificial intelligence approaches for coordinating devices in WSNs
- AI-based resource orchestration in WSNs
- Wireless sensor networks in the Internet of Things
- State-of-the-art reviews on smart green computing technology trends for WSNs
- Experimental results and test-beds for smart green computing systems for WSNs
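The trade-off between sensing/networking functionality and the energy budget can be illustrated with a simple duty-cycling calculation; all current-draw and battery figures below are hypothetical, not taken from any particular sensor platform:

```python
# Sketch: trade-off between sensing duty cycle and node lifetime,
# using illustrative (hypothetical) current-draw figures.

def lifetime_days(duty_cycle, battery_mah=2000.0,
                  active_ma=20.0, sleep_ma=0.005, harvest_ma=0.0):
    """Estimated lifetime of a sensor node in days.

    duty_cycle -- fraction of time the radio/sensor is active (0..1)
    harvest_ma -- average current replaced by energy harvesting
    """
    avg_ma = duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma
    net_ma = max(avg_ma - harvest_ma, 1e-9)  # harvesting offsets the drain
    return battery_mah / net_ma / 24.0

# A higher duty cycle improves responsiveness but shortens lifetime:
for dc in (0.01, 0.05, 0.10):
    print(f"duty cycle {dc:.0%}: {lifetime_days(dc):.0f} days")
```

Energy harvesting enters the same model as a term that offsets the average drain, which is why even a modest harvested current can extend lifetime substantially at low duty cycles.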
Last updated by Dou Sun on 2019-10-14
Special Issue on Intelligent Edge: When Machine Learning Meets Edge Computing
Submission Date: 2020-03-15

The explosion of big data generated by ubiquitous edge devices has motivated the emergence of a new computing paradigm: edge computing, which has attracted attention from both academia and industry in recent years. In edge computing, computations are deployed mainly at the local network edge rather than at remote central computing infrastructures, considerably reducing latency and potentially improving computation efficiency. This computing model has been applied in many areas such as mobile access networks, the Internet of Things (IoT), and microservices, enabling novel applications that drastically change our daily lives. As a second trend, a new era of Artificial Intelligence (AI) research has delivered novel machine learning techniques that are used in applications such as healthcare, industry, environmental engineering, transportation, smart home and building automation, all of which rely heavily on technologies that can be deployed at the network's edge. Intuitively, then, marrying machine learning techniques with edge computing has high potential to further boost the proliferation of truly intelligent edges. In light of these observations, this special issue seeks original work on intelligent edge computing that addresses the particular challenges of this field. On one hand, conventional machine learning techniques usually require powerful computing infrastructures (e.g., cloud computing platforms), while entities at the edge may have only limited resources for computation and communication. This suggests that machine learning algorithms, or at least their implementations, must be revisited for edge computing, which is at once a considerable risk and challenge. On the other hand, deploying adapted machine learning algorithms at the edge enables "smartification" across different layers, e.g., from network communications to applications.
This in turn allows new applications of machine learning and artificial intelligence, opening up new opportunities. The goal of this special issue is to offer a venue for researchers from both academia and industry to present their solutions for re-designing machine learning algorithms compatible with edge computing, and for building an intelligent edge with machine learning techniques, possibly revealing new, compelling use cases.

Relevant topics include, but are not limited to:
- System architectures of intelligent edge computing
- Modeling, analysis and measurement of intelligent edge computing
- Machine learning algorithms and systems for edge computing
- Machine learning-assisted networking and communication protocols for or using edge computing
- Intelligent mobile edge computing
- Architectures, techniques and applications of intelligent edge cloud
- Resource management for intelligent edge computing
- Security and privacy of intelligent edge computing
- Data management and analytics of intelligent edge computing
- Intelligent edge-cloud collaborations
- Programming models and toolkits for intelligent edge computing
- Distributed machine learning algorithms for edge computing
- Smart applications of edge computing
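One direction implied above, revisiting how learning algorithms are implemented when edge entities have limited resources, can be illustrated with a minimal federated averaging sketch: each device trains on its own data and only model parameters are exchanged. All data, the toy linear model, and the hyperparameters below are illustrative assumptions:

```python
# Minimal sketch of federated averaging, one way ML training can be adapted
# to resource-limited edge devices: raw data never leaves a device; an edge
# server only averages locally updated model parameters.

def local_sgd_step(w, data, lr=0.1):
    """One gradient step of 1-D linear regression y ~ w*x on one device."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fedavg_round(w_global, device_datasets):
    """One communication round: devices train locally, server averages."""
    local_weights = [local_sgd_step(w_global, d) for d in device_datasets]
    return sum(local_weights) / len(local_weights)

# Three edge devices each hold noisy samples of y = 2x.
devices = [[(1.0, 2.1), (2.0, 4.0)], [(1.5, 3.1)], [(0.5, 0.9), (3.0, 6.2)]]
w = 0.0
for _ in range(50):
    w = fedavg_round(w, devices)
print(round(w, 2))  # settles near 2.0 (slightly biased by sample noise)
```

The communication cost per round is one scalar (in general, one parameter vector) per device, independent of the dataset sizes, which is the property that makes such schemes attractive at the edge.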
Last updated by Dou Sun on 2019-11-19
Special Issue on Autonomous Learning-Based Algorithm for Heterogeneous Cellular Networks
Submission Date: 2020-03-30

This special issue considers heterogeneous cellular networks in which the spectrum bands of multiple base stations comprise sets of orthogonal wireless channels. In the spectrum-usage scenarios of interest, device-to-device (D2D) pairs either transmit over dedicated frequency bands or operate on shared cellular channels. The goal of each device pair is to jointly select a wireless channel and a power level so as to maximize its reward, defined as the difference between the achieved throughput and the cost of power consumption, subject to its minimum tolerable signal-to-interference-plus-noise ratio (SINR) requirement. This problem can be formulated as a stochastic non-cooperative game with multiple players, in which each player is a learning agent whose task is to learn its best strategy, leading to a fully autonomous multi-agent Q-learning algorithm that converges to a mixed-strategy Nash equilibrium. Such learning algorithms show relatively fast convergence and near-optimal performance after a small number of iterations.

Potential topics include, but are not limited to:
- Reinforcement learning for self-organization and power control of two-tier heterogeneous networks
- Optimal new site deployment algorithms for heterogeneous cellular networks
- Energy cost minimization in heterogeneous cellular networks with hybrid energy supplies
- Configuration algorithms for service scalability in heterogeneous cellular networks
- Q-learning based heterogeneous network selection algorithms
- Bayesian reinforcement learning-based algorithms for heterogeneous cellular networks
- Machine learning paradigms for next-generation communication networks
- Online distributed user association for heterogeneous radio access networks
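The multi-agent learning loop described above can be sketched in miniature. The environment dynamics, the SINR-like reward shaping, and all hyperparameters below are illustrative stand-ins, not the exact formulation from any particular paper:

```python
import random

# Sketch of the learning setup above: each D2D pair is an independent
# Q-learning agent choosing a (channel, power) action; its reward is a
# throughput-like term minus a power cost. Everything here is hypothetical.

CHANNELS, POWERS = 2, 2
ACTIONS = [(c, p) for c in range(CHANNELS) for p in range(POWERS)]

def reward(joint_actions, agent):
    """Higher power helps, but sharing a channel causes interference."""
    ch, pw = joint_actions[agent]
    interference = sum(1 for j, (c, _) in enumerate(joint_actions)
                       if j != agent and c == ch)
    throughput = (pw + 1) / (1 + interference)  # crude SINR-like term
    return throughput - 0.3 * pw                # minus power cost

def train(n_agents=2, episodes=2000, alpha=0.1, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0] * len(ACTIONS) for _ in range(n_agents)]  # stateless Q-tables
    for _ in range(episodes):
        # epsilon-greedy action selection per agent
        acts = [rng.randrange(len(ACTIONS)) if rng.random() < eps
                else max(range(len(ACTIONS)), key=q[i].__getitem__)
                for i in range(n_agents)]
        joint = [ACTIONS[a] for a in acts]
        for i in range(n_agents):  # independent Q-value updates
            q[i][acts[i]] += alpha * (reward(joint, i) - q[i][acts[i]])
    return q

q = train()
```

With two agents and two channels, the learned greedy policies tend to spread onto different channels, since sharing a channel halves the throughput term for both agents.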
Last updated by Dou Sun on 2019-09-07
Special Issue on Internet of Things and Augmented Reality in the age of 5G
Submission Date: 2020-03-30

Over the past few decades, great efforts have been devoted to the Internet of Things (IoT), making it applicable in various fields, including home robotics, intelligent cities and Augmented Reality (AR). These applications have captured the attention and raised the aspirations of researchers in machine vision, computer graphics and computer vision. The 5G network is intrinsically characterized by high bandwidth, ultra-low latency and high speed in wireless communication. In particular, 5G will help establish the IoT as an essential part of our lives by laying the foundation for releasing its full potential. With huge improvements over the present capabilities of 4G, 5G promises a more IoT-friendly ecosystem. Though there is a long way to go before 5G becomes mainstream, companies should begin to develop and re-imagine products and services that exploit 5G's capabilities. By generating digital twins, 5G combined with the IoT will bring items on the shelf to the internet: with billions of hardware-connected devices, ordinary consumer goods with digital twins are more likely to become components of the new IoT. Augmented Reality is a crucial technology that drives a major paradigm shift in the way users interact with data, and has only recently been recognized as a feasible solution to various critical needs. AR technology can be used to visualize data from hundreds of sensors concurrently, overlaying relevant and actionable information on the user's environment via a headset. With much higher data throughput, 5G makes AR much faster; being easier and more accessible to use, it is more likely to be widely applied in many different settings, including video gaming.
In conclusion, the convergence of IoT and AR in the age of 5G is a forthcoming wave in which large-scale data storage will enable an AR lens into real-world scenes, offering almost immediate insight at a depth previously unthinkable. This special issue therefore intends to present the latest findings on Internet of Things and Augmented Reality technologies for different applications in the age of 5G, and to encourage technologists to accelerate technical progress.

Topics include, but are not limited to:
- Novel IoT techniques
- Human, IoT and AI communication protocols
- 5G and its applications
- Augmented reality
- 5G-based video transfer techniques for IoT
- Novel IoT devices
- IoT for augmented reality
- IoT device search in the era of 5G
- Knowledge-based discovery of devices, data and services in the IoT
- Real-world applications of IoT: security, healthcare, advertising, and government
Last updated by Dou Sun on 2019-09-07
Special Issue on Network Intelligence
Submission Date: 2020-04-01

Network Intelligence considers the embedding of Artificial Intelligence (AI) in future networks to speed up service delivery and operations, improve Quality of Experience (QoE) and guarantee service availability, while also allowing better agility, resiliency, faster customization and security. The concept inherits the solid background of autonomic networking, cognitive management, and artificial intelligence, and is envisioned as mandatory for managing, piloting and operating forthcoming networks built upon SDN, NFV and the cloud. The main goal of this special issue is to present state-of-the-art research results and experience reports in the area of network intelligence, addressing topics such as artificial intelligence techniques and models for network and service management; smart service orchestration and delivery; dynamic Service Function Chaining; intent- and policy-based management; centralized vs. distributed control of SDN/NFV-based networks; analytics and big data approaches; and knowledge creation and decision making. Attention is particularly focused on the application of machine learning tools to the optimization of next-generation networks. Machine and deep learning techniques have become increasingly popular and nowadays achieve remarkable success in many application domains, e.g., speech recognition, bioinformatics and computer vision. Machine learning can exploit the hidden relationships between voluminous input data and complicated system outputs, especially with advanced techniques such as deep learning. Moreover, other techniques, e.g., reinforcement learning, can further adapt the learned results to new environments and evolve automatically. These features match well the complex, dynamic and time-varying nature of today's networking systems. This special issue will be devoted to both theoretical and practical evaluations related to the design, analysis and implementation of network intelligence techniques.
Some of the relevant topics include, but are not limited to, the following:
- Deep and reinforcement learning for networking and communications
- Data mining and big data analytics in networking
- Design and optimization in intelligent networks
- Adaptive networking algorithms
- Intent- and policy-based management for intelligent networks
- Innovative architectures and infrastructures for intelligent networks
- AI/ML for network management and orchestration
- AI/ML for network slicing optimization
- AI/ML for service placement and dynamic Service Function Chaining
- AI/ML for C-RAN resource management and medium access control
- Decision making mechanisms
- Routing optimization based on flow prediction in network systems
- Data analytics for network and wireless measurement mining
- Methodologies for network problem diagnosis, anomaly detection and prediction
- Network security based on AI/ML techniques
- AI/ML for multimedia networking
- AI/ML support for ultra-low latency applications
- AI/ML for IoT
- Open-source networking optimization tools for AI/ML applications
- Experiences and best practices using machine learning in operational networks
- Novel context-aware, emotion-aware networking services
- Machine learning for user behavior prediction
- Modeling and performance evaluation of intelligent networks
- Transfer learning and reinforcement learning for network management
- Intelligent network management
- Fault-tolerant network protocols using machine learning
- Big data analysis for network monitoring
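As an illustration of one listed topic, routing optimization based on flow prediction, the following sketch predicts each link's load with an exponential moving average and routes new flows over the path with the lowest predicted bottleneck. The topology, link names, and load samples are hypothetical:

```python
# Sketch: route selection driven by predicted link load. Each observed
# load sample updates an exponential moving average (EMA) per link.

class LinkPredictor:
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.ema = {}                        # link name -> predicted load

    def observe(self, link, load):
        prev = self.ema.get(link, load)      # first sample seeds the EMA
        self.ema[link] = (1 - self.alpha) * prev + self.alpha * load

    def path_load(self, path):
        # predicted bottleneck = most loaded link on the path
        return max(self.ema.get(link, 0.0) for link in path)

def pick_route(predictor, candidate_paths):
    """Choose the candidate path with the lowest predicted bottleneck."""
    return min(candidate_paths, key=predictor.path_load)

pred = LinkPredictor()
for load in (0.9, 0.8, 0.95):    # link A-B is consistently busy
    pred.observe("A-B", load)
for load in (0.2, 0.3, 0.25):    # link A-C is lightly loaded
    pred.observe("A-C", load)
route = pick_route(pred, [["A-B", "B-D"], ["A-C", "C-D"]])
```

A learned predictor (e.g., a recurrent model) could replace the EMA without changing the routing logic, which is the usual appeal of this decomposition.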
Last updated by Dou Sun on 2019-11-24
Special Issue on Intelligent Resource Management in Mobile Edge Computing for IIoT applications
Submission Date: 2020-04-30

Scope and Motivation: The involvement of Wireless Sensor Networks (WSNs) in Internet of Things (IoT) and Industrial IoT (IIoT) applications has become a new hotspot for researchers and industry. Artificial Intelligence (AI) and Machine Learning (ML) are emerging technologies that have proven to have great potential in communications, e.g., for signal classification, channel estimation, and performance optimization. In the current era of 5G and Mobile Edge Computing (MEC), however, cooperative and heterogeneous communication scenarios are becoming complex and large-scale across all smart applications, especially in IIoT. These technologies play a vital role in route establishment, network resource optimization and energy-efficient computing, mainly operating in the network, transport and upper layers of the OSI model. Power allocation and resource management in these networks remain challenging due to the miniature size, limited battery life and dynamic movement of sensor nodes in industrial applications. Therefore, for efficient power management and the necessary optimization of IIoT applications, AI, Deep Learning (DL) and other Neural Network (NN) based approaches emerge as solutions for green communication. These techniques suit the MEC paradigm for IIoT applications thanks to its merits of real-time response, resource capacity and low processing cost. The 5G system specifications and the 3GPP ecosystem draw attention to edge computing enablers, and MEC systems build on the opportunities identified by the ETSI Multi-access Edge Computing group (MEC ISG). Recently, the International Telecommunication Union (ITU) has also indicated that MEC services and resource management can be offered both by mobile network operators and by third-party vendors.
Thus, the scope of these paradigms is growing rapidly towards the design, deployment, maintenance and security of MEC for Industry 4.0 applications. This special issue will focus on intelligent resource management in MEC, covering both the theory and the practical aspects of recent outcomes and developments in MEC IIoT applications. The issue invites researchers and industry practitioners to submit high-quality unpublished technical articles highlighting the scope and challenges of resource allocation in MEC IIoT applications.

The topics of interest include, but are not limited to:
- Distributed artificial intelligence based computation models for MEC
- Resource optimization in IIoT applications
- Network set-up and security issues with MEC IIoT
- Resource allocation techniques for green MEC IIoT services
- Data offloading, traffic and cloud services for industrial MEC
- Energy-efficient user scheduling and resource allocation for Industry 4.0
- Congestion control and low latency for MEC IIoT
- Neural network based ambient intelligence for IIoT
- AI based cloud virtualization in industrial MEC
- Cloud and edge security for IIoT applications
- Fog and mist computing for Industry 4.0
- Architecture and scalability for AI enabled mobile IIoT applications
Last updated by Dou Sun on 2019-11-24
Special Issue on Optimization of Cross-layer Collaborative Resource Allocation for Mobile Edge Computing, Caching and Communication
Submission Date: 2020-05-31

With the rapid development of mobile communications and the explosive use of mobile devices (smart phones, laptops, tablets, etc.), the mobile Internet provides a pervasive and powerful platform for more and more emerging applications. However, many mobile devices have limited computation capability and battery power. Migrating computational tasks from the distributed devices to infrastructure-based cloud servers has the potential to address these issues, but cloud servers are usually located in the core network, far away from users, which may cause delay fluctuations and additional transmission energy cost. Mobile Edge Computing (MEC) is an emerging paradigm that seeks to provide better services by moving infrastructure-based cloud resources (computation, storage, bandwidth, etc.) to the edge of the network. MEC is rapidly becoming a key technology of 5G, helping to achieve 5G's key technical indicators, such as ultra-low latency, ultra-high energy efficiency and ultra-high reliability. Unlike the traditional cloud, MEC is close to mobile users, which reduces access delay and the cost of using cloud services. However, scheduling the limited and heterogeneous MEC resources (computation and network resources) raises many challenges. Firstly, it is essential to design a cross-layer optimization policy for MEC that jointly optimizes the application layer, network layer, data link layer and physical layer of the protocol stack using an application-oriented objective function, while satisfying different user service requirements (e.g., energy saving, reduced execution delay, reduced price). Secondly, a theoretical framework of cross-layer optimization is needed that balances the efficiency and fairness of MEC resource allocation while maximizing the profit of MEC service providers.
Thirdly, cross-layer collaborative distributed resource management systems must be designed to meet the harsh requirements of MEC, such as latency, scalability and mobility support. In addition, it is essential to jointly optimize the allocation of computation and communication resources of both the mobile users and the MEC service provider so as to minimize total energy consumption subject to the users' latency constraints. MEC is widely expected to be a key technology in 5G and future networks; more importantly, it can improve user experience and service quality. As noted above, joint cross-layer collaborative resource optimization for mobile edge computing is not only important but also faces many challenges, which must be addressed to realize the MEC-based evolution of our daily lives. This topic is very promising and will attract great interest from readers, including researchers from academia and industry, general readers, mobile application developers and students engaged in this area. This Special Section in Computer Communications invites researchers to report state-of-the-art advances in joint cross-layer collaborative resource optimization. Authors are invited to submit original practical work and survey papers.
Topics of interest include (but are not limited to):
- Theoretical modeling and performance analysis of resource optimization for MEC
- Joint cross-layer resource allocation for MEC
- New integrated resource management architectures for cloud, MEC and users
- Cross-layer service discovery and service recommendation for MEC
- Multi-user computation offloading for MEC
- Multi-edge-server collaboration for MEC
- Delay-minimizing service provision for MEC
- Cross-layer collaborative distributed systems for MEC
- Cross-layer collaborative MEC applications, such as smart city, smart grid, and Intelligent Transportation Systems
- Software-defined MEC
- Software-defined offloading for MEC
- Mobility management for MEC
- Security, privacy, and trust of MEC
- MEC for vehicular networks

We also highly recommend the submission of multimedia with each article, as it significantly increases the visibility, downloads, and citations of articles.
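The joint energy/latency trade-off described above (minimize device energy subject to a latency constraint) can be sketched as a simple per-task offloading decision. All device, CPU and channel parameters below are hypothetical, and the model ignores server-side energy for simplicity:

```python
# Sketch: choose local execution vs. offloading to the MEC server to
# minimize the device's energy subject to a task deadline.

def offload_decision(cycles, data_bits, deadline_s,
                     f_local=1e9,    # local CPU speed (cycles/s)
                     f_mec=10e9,     # MEC server CPU speed (cycles/s)
                     kappa=1e-27,    # effective switched capacitance
                     rate_bps=20e6,  # uplink rate (bits/s)
                     p_tx=0.5):      # transmit power (W)
    """Return ('local' | 'offload' | None, device energy in joules)."""
    # Local execution: time and energy spent on the device CPU.
    t_local = cycles / f_local
    e_local = kappa * f_local ** 2 * cycles
    # Offloading: upload the input data, then the MEC server computes.
    t_off = data_bits / rate_bps + cycles / f_mec
    e_off = p_tx * data_bits / rate_bps   # device only pays for transmission
    feasible = [(e, name) for e, t, name in
                [(e_local, t_local, "local"), (e_off, t_off, "offload")]
                if t <= deadline_s]
    if not feasible:
        return None, float("inf")         # deadline cannot be met either way
    energy, choice = min(feasible)
    return choice, energy

# A compute-heavy task with a small input favors offloading:
print(offload_decision(cycles=5e8, data_bits=1e6, deadline_s=1.0)[0])
```

A cross-layer policy in the sense of this special issue would additionally treat `rate_bps` and `p_tx` as decision variables (physical/link layer) alongside the offloading choice (application layer), rather than as fixed inputs.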
Last updated by Dou Sun on 2019-11-24
Related Journals
CCF | Full Name | Impact Factor | Publisher | ISSN
    | Computer Communication Review | | ACM | 0146-4833
    | Journal of Complex Analysis | | Hindawi | 2314-4963
    | International Journal of Computer Networks & Communications | | AIRCC | 0975-2293
    | International Journal of Wireless and Mobile Networking | | AR Publication | 2347-9078
    | Communications in Mobile Computing | | Springer | 2192-1121
c   | IET Communications | 0.624 | IET | 1751-8628
    | AI Communications | 0.837 | IOS Press | 0921-7126
b   | Speech Communication | 1.661 | ELSEVIER | 0167-6393
    | China Communications | 0.424 | China Communications Magazine, Co., Ltd. | 1673-5447
    | IEICE Transactions on Communications | | IEICE |
Related Conferences
CCF | CORE | QUALIS | Short | Full Name | Submission | Notification | Conference
a   | a*   | a1 | INFOCOM | International Conference on Computer Communications | 2019-07-24 | 2019-12-14 | 2020-04-27
    |      |    | MMCTSE | International Conference on Mathematical Methods & Computational Techniques in Science & Engineering | 2017-12-31 | 2018-01-20 | 2018-02-16
b   | a    | a1 | MobiHoc | International Symposium on Mobile Ad Hoc Networking and Computing | 2019-11-30 | 2020-03-29 | 2020-06-30
    |      |    | CVCI | International Conference Computer Vision and Computational Intelligence | 2019-11-15 | 2019-12-05 | 2020-01-17
c   | a    | b1 | ICCCN | International Conference on Computer Communication Networks | 2020-03-02 | 2020-04-27 | 2020-08-03
    |      |    | EEETEM | International Conference on Electrical and Electronic Engineering, Telecommunication Engineering, and Mechatronics | 2015-09-27 | | 2015-10-27
    |      |    | D2ME | International Conference on Design, Mechanical and Material Engineering | 2020-04-15 | 2020-05-20 | 2020-09-25
    |      |    | TechAAL | IET International Conference on Technologies for Active and Assisted Living | 2018-12-21 | 2019-01-18 | 2019-03-25
a   |      |    | NeurIPS | Conference on Neural Information Processing Systems | 2019-05-16 | 2019-09-04 | 2019-12-08
c   | c    |    | PCM | Pacific-Rim Conference on Multimedia | 2017-05-23 | 2017-07-01 | 2017-09-28