Journal Information
Computer Communications
Call For Papers
Computer and communication networks are key infrastructures of the information society with high socio-economic value, as they contribute to the correct operation of many critical services (from healthcare to finance and transportation). The Internet is the core of today's computer-communication infrastructures. It has been transformed from a robust network for data transfer between computers into a global, content-rich communication and information system where content is increasingly generated by users and distributed according to human social relations. Next-generation network technologies, architectures and protocols are therefore required to overcome the limitations of the legacy Internet and add new capabilities and services. The future Internet should be ubiquitous, secure, resilient, and closer to human communication paradigms.

Computer Communications is a peer-reviewed international journal that publishes high-quality scientific articles (both theory and practice) and survey papers covering all aspects of future computer communication networks (on all layers, except the physical layer), with special attention to the evolution of the Internet architecture, protocols, services, and applications. Topics include, but are not limited to:

    Emerging technologies for next-generation networks
    Future Internet architecture, protocols and services
    Content- and service-centric architectures
    Mobile and ubiquitous networks
    Self-organizing/autonomic networking
    Green networking
    Internet content search
    QoS and multimedia networking
    Opportunistic networking
    Online social networks
    Internet of Things
    Public safety communication networks
    Network applications (web, multimedia streaming, VoIP, gaming, etc.)
    Trust, security and privacy in computer and communication networks
    Modeling, measurement and simulation
    Complex network models
    Internet socio-economic models
    Experimental test-beds and research platforms
    Algorithmic aspects of communication networks
    Network scaling and limits
Last updated by Dou Sun in 2020-02-25
Special Issues
Special Issue on Autonomous Learning-Based Algorithm for Heterogeneous Cellular Networks
Submission Date: 2020-03-30

The spectrum bands of multiple base stations comprise sets of orthogonal wireless channels; in the scenarios considered, device-to-device (D2D) pairs either transmit over dedicated frequency bands or operate on shared cellular channels. The goal of each D2D pair is to jointly select a wireless channel and a power level that maximize its reward, defined as the difference between the achieved throughput and the cost of power consumption, subject to its minimum tolerable signal-to-interference-plus-noise ratio (SINR) requirement. We formulate this problem as a stochastic non-cooperative game with multiple players, where each player becomes a learning agent whose task is to learn its best strategy, and develop a fully autonomous multi-agent Q-learning algorithm that converges to a mixed-strategy Nash equilibrium. The learning algorithm shows relatively fast convergence and near-optimal performance after a small number of iterations. Potential topics include, but are not limited to:

    Reinforcement learning for self-organization and power control of two-tier heterogeneous networks
    Optimal new site deployment algorithms for heterogeneous cellular networks
    Energy cost minimization in heterogeneous cellular networks with hybrid energy supplies
    Configuration algorithms for service scalability in heterogeneous cellular networks
    Q-learning based heterogeneous network selection algorithms
    Bayesian reinforcement learning-based algorithms for heterogeneous cellular networks
    Machine learning paradigms for next-generation communication networks
    Online distributed user association for heterogeneous radio access networks
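The kind of channel-and-power selection via Q-learning described above can be sketched for a single learning agent as follows. This is a minimal illustration, not the algorithm sought by the call: the link gain, noise level, SINR threshold and power price are all assumed values, and the interference from other players is folded into a constant.

```python
import math
import random

# Sketch of one learning agent (a D2D pair) in the stochastic game described
# above: it picks a (channel, power) action via epsilon-greedy Q-learning and
# earns reward = throughput - power cost, with SINR violations penalized.
# All numeric parameters here are illustrative assumptions.

CHANNELS = [0, 1]                 # orthogonal wireless channels
POWERS = [0.1, 0.5, 1.0]          # candidate transmit power levels (W)
ACTIONS = [(c, p) for c in CHANNELS for p in POWERS]

GAIN = 0.8                        # D2D link gain (assumed)
NOISE = 0.05                      # noise-plus-interference power (assumed)
SINR_MIN = 1.0                    # minimum tolerable SINR
POWER_PRICE = 0.5                 # cost per watt in the reward

def reward(channel, power):
    """Throughput minus power cost; an infeasible SINR earns a penalty."""
    sinr = GAIN * power / NOISE
    if sinr < SINR_MIN:
        return -1.0
    return math.log2(1.0 + sinr) - POWER_PRICE * power

def q_learning(episodes=2000, alpha=0.1, epsilon=0.1, seed=1):
    rng = random.Random(seed)
    q = {a: 0.0 for a in ACTIONS}
    for _ in range(episodes):
        if rng.random() < epsilon:
            action = rng.choice(ACTIONS)       # explore
        else:
            action = max(q, key=q.get)         # exploit current estimate
        # Stateless (bandit-style) Q update: Q <- Q + alpha * (r - Q)
        q[action] += alpha * (reward(*action) - q[action])
    return q

q_values = q_learning()
best_channel, best_power = max(q_values, key=q_values.get)
```

In the full multi-agent setting each D2D pair runs such an update concurrently, and the interference term hidden in `NOISE` depends on the other players' actions; it is that coupling which turns the problem into a game with a mixed-strategy equilibrium rather than a simple bandit.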
Last updated by Dou Sun in 2019-09-07
Special Issue on Internet of Things and Augmented Reality in the age of 5G
Submission Date: 2020-03-30

In the past few decades, great efforts on the Internet of Things (IoT) have made it applicable in various fields, including home robotics, intelligent cities and Augmented Reality (AR). These applications have captured the attention and raised the aspirations of researchers in machine vision, computer graphics and computer vision. The 5G network is a wireless communication technology intrinsically featuring high bandwidth, ultra-low latency and high speed. In particular, 5G will help establish the Internet of Things as an essential part of our lives by laying the foundation for releasing its full potential. With huge improvements over the present functions of 4G, 5G promises a more IoT-friendly ecosystem. Though it remains a long way before 5G becomes mainstream, companies should begin to develop and re-imagine products and services that exploit its improved capabilities. Through the generation of digital twins, 5G combined with IoT will bring items on the shelf to the Internet: with billions of connected devices, regular consumer goods with digital twins are increasingly likely to become components of the new Internet of Things. Augmented Reality is a crucial technology that promotes a major paradigm shift in the way users interact with data, and has only recently become a feasible solution to various critical needs. AR can be used to visualize data from hundreds of sensors concurrently, overlaying relevant and actionable information over the environment via a headset. With much higher data rates, 5G makes AR much faster; easier and more accessible to use, it is likely to be widely applied in many different settings (including video gaming).
In conclusion, the convergence of IoT and AR in the age of 5G is a forthcoming wave in which large-scale data storage will enable an AR lens into real-world scenes, offering almost immediate insight at a depth unthinkable before. This special issue therefore intends to present the latest findings on Internet of Things and Augmented Reality technologies for different applications in the age of 5G, and to encourage technologists to accelerate technical progress. Topics include, but are not limited to:

    Novel IoT techniques
    Human, IoT and AI communication protocols
    5G and its applications
    Augmented reality
    5G-based video transfer techniques for IoT
    Novel IoT devices
    IoT for augmented reality
    IoT device search in the era of 5G
    Knowledge-based discovery of devices, data and services in the IoT
    Real-world applications of IoT: security, healthcare, advertising, and government
Last updated by Dou Sun in 2019-09-07
Special Issue on Network Intelligence
Submission Date: 2020-04-01

Network Intelligence considers the embedding of Artificial Intelligence (AI) in future networks to speed up service delivery and operations, improve Quality of Experience (QoE) and guarantee service availability, while also allowing better agility, resiliency, faster customization and security. The concept inherits the solid background of autonomic networking, cognitive management, and artificial intelligence, and is seen as essential for managing, piloting and operating forthcoming networks built upon SDN, NFV and the cloud. The main goal of this special issue is to present state-of-the-art research results and experience reports in the area of network intelligence, addressing topics such as artificial intelligence techniques and models for network and service management; smart service orchestration and delivery; dynamic Service Function Chaining; intent- and policy-based management; centralized vs. distributed control of SDN/NFV-based networks; analytics and big data approaches; and knowledge creation and decision making. Attention is particularly focused on the application of machine learning tools to the optimization of next-generation networks. Machine and deep learning techniques have become increasingly popular and achieve remarkable success in many application domains, e.g., speech recognition, bioinformatics and computer vision. Machine learning is capable of exploiting the hidden relationships between voluminous input data and complicated system outputs, especially with advanced techniques such as deep learning. Moreover, other techniques, e.g., reinforcement learning, can further adapt learned policies to new environments automatically. These features match the complex, dynamic and time-varying nature of today's networking systems well. This special issue will be devoted to both theoretical and practical evaluations related to the design, analysis and implementation of network intelligence techniques.
Some of the relevant topics include, but are not limited to, the following:

    Deep and reinforcement learning for networking and communications
    Data mining and big data analytics in networking
    Design and optimization in intelligent networks
    Adaptive networking algorithms
    Intent- and policy-based management for intelligent networks
    Innovative architectures and infrastructures for intelligent networks
    AI/ML for network management and orchestration
    AI/ML for network slicing optimization
    AI/ML for service placement and dynamic Service Function Chaining
    AI/ML for C-RAN resource management and medium access control
    Decision making mechanisms
    Routing optimization based on flow prediction in network systems
    Data analytics for network and wireless measurements mining
    Methodologies for network problem diagnosis, anomaly detection and prediction
    Network security based on AI/ML techniques
    AI/ML for multimedia networking
    AI/ML support for ultra-low latency applications
    AI/ML for IoT
    Open-source networking optimization tools for AI/ML applications
    Experiences and best practices using machine learning in operational networks
    Novel context-aware, emotion-aware networking services
    Machine learning for user behavior prediction
    Modeling and performance evaluation for intelligent networks
    Transfer learning and reinforcement learning for network management
    Intelligent network management
    Fault-tolerant network protocols using machine learning
    Big data analysis for network monitoring
Last updated by Dou Sun in 2019-11-24
Special Issue on Intelligent Resource Management in Mobile Edge Computing for IIoT applications
Submission Date: 2020-04-30

Scope and Motivation: The involvement of Wireless Sensor Networks (WSNs) in Internet of Things (IoT) and Industrial IoT (IIoT) applications has become a new hotspot for researchers and industry. Artificial Intelligence (AI) and Machine Learning (ML) are emerging technologies that have proven to have great potential in communications, such as signal classification, channel estimation, and performance optimization. In the current era of 5G and Mobile Edge Computing (MEC), however, cooperative and heterogeneous communication scenarios are growing in complexity and scale across all smart applications, especially IIoT. These technologies play a vital role in route establishment, network resource optimization and energy-efficient computing, operating mainly in the network, transport and upper layers of the OSI model. Power allocation and resource management in these networks remain a challenge due to the miniature size, limited battery life and dynamic movement of sensor nodes in industrial applications. Therefore, for efficient power management and the necessary optimization of IIoT applications, AI, Deep Learning (DL) and other Neural Network (NN) based approaches emerge as solutions for green communication. These technologies can be used in the MEC paradigm for IIoT applications thanks to its merits of real-time response, resource capacity and low processing power. The ETSI Multi-access Edge Computing group (MEC ISG), with its 5G system specifications, and the 3GPP ecosystem have drawn attention to edge computing enablers, and the International Telecommunication Union (ITU) has also noted that MEC services and resource management can be offered both by mobile network operators and by third-party vendors.
Thus, the scope of these paradigms is growing rapidly towards the design, deployment, maintenance and security of MEC for Industry 4.0 applications. This special issue will focus on intelligent resource management in MEC, covering both theoretical and practical aspects of recent outcomes and developments in MEC IIoT applications. The issue invites researchers and industry practitioners to submit high-quality unpublished technical articles highlighting the scope and challenges of resource allocation in MEC IIoT applications. The topics of interest include, but are not limited to:

    Distributed Artificial Intelligence based computation models for MEC
    Resource optimization in IIoT applications
    Network set-up and security issues with MEC IIoT
    Resource allocation techniques for green MEC IIoT services
    Data offloading, traffic and cloud services for industrial MEC
    Energy-efficient user scheduling and resource allocation for Industry 4.0
    Congestion control and low latency for MEC IIoT
    Neural network based ambient intelligence for IIoT
    AI-based cloud virtualization in industrial MEC
    Cloud and edge security for IIoT applications
    Fog and mist computing for Industry 4.0
    Architecture and scalability for AI-enabled mobile IIoT applications
Last updated by Dou Sun in 2019-11-24
Special Issue on Security and Privacy in Internet of Medical Things: Problems and Solutions
Submission Date: 2020-04-30

The recent adoption of the Internet of Things (IoT) in medical applications has generated huge interest among researchers in academia and industry. The Internet of Medical Things (IoMT) forms an ecosystem of connected sensors, wearable medical devices and clinical systems capable of improving the quality of medical treatment with reduced cost and timely delivery of responses. Experts forecast that almost 40% of IoT-related technology will be health-related, more than any other category, and that the IoMT market will exceed $136.8 billion by 2021. Driven largely by advances in artificial intelligence, wireless technologies, sensors, mobile devices, big data, and fog computing, IoMT is rapidly enhancing the efficiency of diagnosis, treatment and patient monitoring. While IoMT offers enormous benefits, the ubiquitously connected devices also raise serious security and privacy concerns. Recent reports reveal alarming statistics, with 72% of malicious traffic targeted at healthcare providers and 81% of healthcare providers acknowledging that one or more of their systems have been compromised. With most healthcare applications collecting, processing and transmitting sensitive and critical medical information, it is essential to secure the network from potential attacks: attacks on the network and on medical devices can cause significant physical harm and life-threatening damage to patients. This special issue focuses on the potential threats and problems faced by networks, connected devices, applications and systems in IoMT, and encourages authors to submit papers that highlight the security and privacy concerns in IoMT applications along with possible solutions.
Topics of interest for this special issue include, but are not limited to:

    Authorization and access control for IoMT
    Privacy preservation techniques for medical data in IoMT
    Wearable and implantable medical device security
    Big data analytics and privacy-preserving data mining
    Secure communication techniques for IoMT
    Vulnerabilities, threat models and risk management
    Blockchain for healthcare security in IoMT
    Intrusion detection and mitigation techniques for IoMT
    Privacy-enhancing cryptographic techniques in IoMT
    Middleware for privacy protection in IoMT applications
    Resilience models for advanced threats in IoMT
    Future perspectives of security and privacy issues in IoMT applications
Last updated by Dou Sun in 2020-02-25
Special Issue on Optimization of Cross-layer Collaborative Resource Allocation for Mobile Edge Computing, Caching and Communication
Submission Date: 2020-05-31

With the rapid development of mobile communications and the explosive usage of mobile devices (smart phones, laptops, tablets, etc.), the mobile Internet provides a pervasive and powerful platform for more and more emerging applications. However, many mobile devices have limited computation capability and battery power. Migrating computational tasks from distributed devices to infrastructure-based cloud servers has the potential to address these issues, but cloud servers are usually located in the core network, far away from users, which may cause delay fluctuation and additional transmission energy cost. Mobile Edge Computing (MEC) is an emerging paradigm that pursues better services by moving infrastructure-based cloud resources (computation, storage, bandwidth, etc.) to the edge of the network. MEC is rapidly becoming a key technology of 5G, helping to achieve key 5G technical indicators such as ultra-low latency, ultra-high energy efficiency and ultra-high reliability. Unlike the traditional cloud, MEC is close to mobile users, which reduces both access delay and the cost of using cloud services. However, scheduling the limited and heterogeneous MEC resources (computation and network resources) raises many challenges. Firstly, it is essential to design a cross-layer optimization policy for MEC that jointly optimizes the application layer, network layer, data link layer and physical layer of the protocol stack using an application-oriented objective function while satisfying different user service requirements (e.g., energy saving, reduced execution delay, reduced price). Secondly, a theoretical framework of cross-layer optimization is needed that balances the efficiency and fairness of MEC resource allocation while maximizing the profit of MEC service providers.
Thirdly, cross-layer collaborative distributed resource management systems must be designed to meet the harsh requirements of MEC such as latency, scalability and mobility support. In addition, it is essential to jointly optimize the allocation of computation and communication resources of both the mobile users and the MEC service provider so as to minimize total energy consumption subject to the users' latency constraints. MEC is widely expected to be a key technology in 5G and future networks; more importantly, it can improve user experience and service quality. As pointed out above, joint cross-layer collaborative resource optimization for mobile edge computing is not only important but also faces many challenges, which must be addressed to realize MEC-based evolution in everyday life. The topic is promising and should attract great interest from readers, including researchers from academia and industry, general readers, mobile application developers, and students engaged in this area. This Special Section of Computer Communications invites researchers to report state-of-the-art advances in joint cross-layer collaborative resource optimization. Authors are invited to submit original practical work and survey papers.
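The core trade-off behind the joint energy/latency optimization discussed above can be sketched as a single-user offloading decision. This is a minimal illustration under assumed parameters (CPU speeds, uplink rate, and the standard dynamic CPU energy model), not the cross-layer framework the call solicits.

```python
from dataclasses import dataclass

# Minimal sketch of the offloading decision behind the energy-minimization
# problem described above: execute a task locally or offload it to an MEC
# server, minimizing device energy subject to a latency deadline.
# All parameters (CPU speeds, uplink rate, energy coefficient) are assumptions.

@dataclass
class Task:
    cycles: float     # CPU cycles the task requires
    data_bits: float  # input data to upload when offloading

def local_cost(task, f_local=1e9, kappa=1e-27):
    """Local execution: dynamic CPU energy model kappa * f^2 * cycles."""
    return kappa * f_local ** 2 * task.cycles, task.cycles / f_local

def offload_cost(task, rate=5e6, p_tx=0.5, f_edge=4e9):
    """Offloading: the device spends energy only on the uplink transmission."""
    t_up = task.data_bits / rate
    return p_tx * t_up, t_up + task.cycles / f_edge

def decide(task, deadline):
    """Pick the deadline-feasible option with the lowest device energy."""
    options = {"local": local_cost(task), "offload": offload_cost(task)}
    feasible = {k: (e, t) for k, (e, t) in options.items() if t <= deadline}
    if not feasible:
        return None  # no option satisfies the latency constraint
    return min(feasible, key=lambda k: feasible[k][0])

choice = decide(Task(cycles=1e9, data_bits=1e6), deadline=1.5)
```

A cross-layer scheme would extend this per-user comparison with joint choices of transmit power, link-layer scheduling, and server-side resource shares across many users, which is where the optimization becomes genuinely hard.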
Topics of interest include (but are not limited to):

    Theoretical modeling and performance analysis of resource optimization for MEC
    Joint cross-layer resource allocation for MEC
    New integrated resource management architectures spanning cloud, MEC and users
    Cross-layer service discovery and service recommendation for MEC
    Multi-user computation offloading for MEC
    Multi-edge-server collaboration for MEC
    Delay-minimizing service provision for MEC
    Cross-layer collaborative distributed systems for MEC
    Cross-layer collaborative MEC applications, such as smart city, smart grid, and Intelligent Transportation Systems
    Software-defined MEC
    Software-defined offloading for MEC
    Mobility management for MEC
    Security, privacy, and trust of MEC
    MEC for vehicular networks

We also highly recommend the submission of multimedia with each article, as it significantly increases the visibility, downloads, and citations of articles.
Last updated by Dou Sun in 2019-11-24
Special Issue on AI-Driven Sensing and Computing for Cyber-Physical Systems
Submission Date: 2020-07-01

Cyber-physical systems (CPS) are entering our daily life and business process management. Emerging CPS must be robust and responsive, implemented in coordinated, distributed, and connected ways, and future CPS are expected to far exceed today's systems on a variety of characteristics, for example capability, adaptability, resiliency, safety, security, and usability. With the rapid development of computing and sensing technologies, such as ubiquitous wireless sensor networks, the amount of data from dissimilar sensors and social media has increased tremendously. Conventional data fusion algorithms such as registration, association, and fusion are not effective for massive datasets, so new research opportunities and challenges for content analysis on CPS networks have arisen. Making sense of these volumes of Big Data requires cutting-edge tools that can analyze and extract useful knowledge from vast and diverse data streams. Current research in intelligent sensing addresses the following issues: AI-driven sensing as a novel methodology for user-centered research; development of new services and applications based on human sensing, computation, and problem solving; engineering of improved AI-driven sensing platforms including quality control mechanisms; incentive design of work; usage of participatory sensing for professional business; and theoretical frameworks for evaluation. This opens a vast space of opportunities to extend current networks, communications, and computer applications to more pervasive and mobile applications. The purpose of this special issue is to provide a forum for researchers and practitioners to exchange ideas and progress in related areas; we invite articles on innovative research addressing the challenges of analytics and applications of AI-driven sensing and computing for cyber-physical systems.
Topics of interest include, but are not limited to:

    Distributed processing of sensor data in CPS networks
    Approximate reasoning and pattern recognition for CPS networks
    AI in mobile networking
    AI-driven analytics for social media and sensor data integration
    AI platforms for efficient integration with CPS networks
    Virtualized and cloud-oriented resources for big data processing in CPS networks
    Machine learning algorithms for CPS networks
    Visual analytics on CPS networks
Last updated by Dou Sun in 2020-03-06
Special Issue on Industrial communication networks in smart factory 4.0
Submission Date: 2020-08-30

With the rapid development of electronics, information technology and advanced manufacturing technology, the production mode of manufacturing enterprises is shifting from digital to intelligent. These exponentially growing developments have accelerated the emergence of a new era of manufacturing that combines the virtual and physical worlds through Cyber-Physical Systems (CPSs). Such intelligent manufacturing technology gives the ability to respond rapidly to design changes and innovation, a huge competitive advantage over traditional manufacturing processes. The Europe 2020 strategy, the Industry 4.0 strategy and Made in China 2025 have been proposed as countries focus their attention on this new technology, and the United States has gradually accelerated reindustrialization and manufacturing reflow. The transformation to intelligent manufacturing systems will have a profound and lasting worldwide impact on the future of manufacturing. Industry 4.0 settings need to handle challenges in ICT tooling, cost efficiency, fault tolerance, autonomous decision-making, full-lifecycle traceability control, business-intelligence capabilities, new forms of human-machine interaction, higher data-processing capacity, energy-efficiency demands, and cooperative tasks. The use of information and communication technologies such as the Internet of Things (IoT), augmented and virtual reality, fog and edge computing, together with wireless sensor networks, will enable novel cyber-secured, resilient, human-centric, and context-aware applications to face these upcoming challenges.
Potential topics include, but are not limited to:

    Architecture and protocol design for smart factory 4.0
    Resource management in industrial IoT systems
    5G for future industrial automation
    Security, safety and privacy issues in industrial wireless networks and applications
    Performance evaluation, simulation, RF measurements, and modeling of industrial IoT systems
    Intelligent machine-to-machine communications in industrial IoT
    Cognitive industrial systems
    Cloud-based industrial Internet of Things solutions
Last updated by Dou Sun in 2020-03-18