Journal Information
The Journal of Systems Architecture: Embedded Software Design (JSA)
https://www.sciencedirect.com/journal/journal-of-systems-architecture
Impact Factor:
4.1
Publisher:
Elsevier
ISSN:
1383-7621
Call For Papers
Aims & Scope

The Journal of Systems Architecture: Embedded Software Design (JSA) is a journal covering all design and architectural aspects related to embedded systems and software. It ranges from the microarchitecture level via the system software level up to the application-specific architecture level. Aspects such as real-time systems, operating systems, programming languages, communications (limited to analysis and the software stack), mobile systems, parallel and distributed architectures as well as additional subjects in the computer and system architecture area will fall within the scope of this journal. Technology will not be a main focus, but its use and relevance to particular designs will be. Case studies are welcome but must contribute more than just a design for a particular piece of software.

Design automation of such systems including methodologies, techniques and tools for their design as well as novel designs of software components fall within the scope of this journal. Novel applications that use embedded systems are also central in this journal. While JSA does not focus on hardware design, hardware/software co-design techniques with an emphasis on software are also relevant here.

We invite you to convert your open source software into an additional journal publication in Software Impacts, a multi-disciplinary open access journal. Software Impacts provides a scholarly reference to software that has been used to address a research challenge. The journal disseminates impactful and re-usable scientific software through Original Software Publications (OSP) which describe the application of the software to research and the published outputs.
Last updated by Dou Sun on 2026-01-20
Special Issues
Special Issue on AI-Driven Real-Time Distributed Computing for the Edge-Cloud Continuum
Submission Date: 2026-01-31

Guest editors:
Daniel Casini, PhD, Scuola Superiore Sant'Anna, Pisa, Italy
Pascal Berthou, PhD, University of Toulouse III, UPS, CNRS-LAAS, Toulouse, France
Mustafa Al Lail, PhD, Texas A&M International University, Laredo, TX, USA
Akram Hakiri, PhD, University of Pau & Pays de l’Adour, France
Aniruddha S. Gokhale, PhD, Vanderbilt University, USA
Thierry Gayraud, PhD, LAAS-CNRS, University of Toulouse, CNRS, UPS, France

Special issue information:
The widespread evolution of Artificial Intelligence (AI)-driven Internet of Things (IoT) applications, such as autonomous systems, smart cities, and industrial automation, has significantly altered the landscape of real-time distributed computing. These advancements lay the foundation for future innovations that ensure system performance, dependability, testability, reliability, flexibility, scalability, and autonomous computing. These qualities are particularly important as we move toward 2030, when themes such as AI-driven networks, 6G connectivity, massive twinning, the metaverse, self-autonomous robots, and smart autonomous systems will demand robust, real-time processing and intelligent decision-making across a distributed network infrastructure.

This special issue solicits high-quality papers on all aspects of object-, component-, and service-oriented real-time and distributed computing technology that address the growing challenges of real-time distributed computing, particularly through case studies and applications that demonstrate the efficacy of proposed methods in real-world distributed systems. The aim is to meet the demands of next-generation edge AI-enabled IoT applications such as autonomous vehicles, smart cities, intelligent transportation systems, industrial automation and Industry 4.0, smart grids, avionics, space and underwater systems, consumer electronics, and multimedia processing, with an emphasis on scalability, security, and integration with modern technologies.

The specific SI focus areas include, but are not limited to:
Distributed and/or Real-Time Image, Video, and Stream Processing
Emerging Next-Gen Software-Defined Embedded Systems and Networks
Federated Learning, TinyML, Edge ML, Generative AI, and Fog Computing
Real-Time Data Analytics, Management, and Monitoring
Middleware, Cloud Connectivity, and Microservices
DevOps for Distributed Real-Time Computing
Optimization Algorithms, Metaheuristics, and Graphs for the Edge-Cloud Continuum
Sustainable and Green Computing Transformation
Formal Methods, Verification, and Model Checking
Ontology-Based Knowledge Modeling
Dependability, Fault Tolerance, and Resilience
AI/ML Algorithms for Real-Time Analytics
Operating Systems, Middleware, System Software, and Software Architectures
Blockchain and Security Enhancements
Digital Twins for Distributed and/or Real-Time IoT Systems and Applications

Manuscript submission information:
Prospective authors should submit their manuscripts following The Journal of Systems Architecture (JSA) guidelines. Details can be found at: Guide for authors - Journal of Systems Architecture. Solicited original submissions must not be currently under consideration for publication in other venues. All manuscripts and any supplementary material should be submitted through the Submission site for Journal of Systems Architecture. Please select the “VSI:AI4ORC” option as the article type of the paper. All submissions deemed suitable by the editors to be sent for peer review will be reviewed by at least two independent reviewers.
Once your manuscript is accepted, it will go into production to be published in the special issue. The special issue anticipates receiving extended papers from the IEEE ISORC 2025 conference.
Manuscript Submission Deadline: January 31, 2026
Last updated by Dou Sun on 2026-01-20
Special Issue on Reliable Software Technologies (AEiC2026)
Submission Date: 2026-02-13

The 30th Ada-Europe International Conference on Reliable Software Technologies (AEiC 2026) will take place in Västerås, Sweden. The conference schedule comprises a journal track, a regular track, an industrial track, a work-in-progress track, a vendor exhibition, parallel tutorials, and satellite workshops. Following a journal-first model, this edition of the conference includes a "Journal Track" which seeks original and high-quality papers that describe mature research work on the conference topics. Accepted journal-track papers will be published in the "Reliable Software Technologies (AEiC2026)" Special Issue of the Journal of Systems Architecture (JSA).

Guest editors:
Professor Axel Jantsch, TU Wien, Austria, axel.jantsch@tuwien.ac.at
Associate Professor António Casimiro, University of Lisbon, Portugal, casim@ciencias.ulisboa.pt

Special issue information:
Topics:
Formal and model-based engineering of critical systems: Formal specification; formal verification; integrated methods for engineering reliable software-intensive systems; formal architectural patterns; multi-aspect modelling and verification; model-based engineering of safety-critical systems.
High-Integrity Systems and Reliability: Theory and practice of high-integrity systems: medium to large-scale distribution, fault tolerance, security, reliability, trust and safety, language vulnerabilities, assurance cases; software architectures for reliable systems: design patterns, frameworks, architecture-centered development, component-based design and development; methods and techniques for quality software development and maintenance: requirements engineering, re-engineering and reverse engineering, reuse, software management issues, compilers, libraries, support tools.
AI for High-Integrity Systems Engineering: AI for code generation, AI for test generation, AI for refactoring of code, AI for code comprehension, AI for program analysis; evaluation of how well AI solutions for software engineering perform (efficiency, accuracy, …); reliable/responsible/robust AI.
Real-Time Systems: Design and implementation of real-time and embedded systems: real-time scheduling, design methods and techniques, architecture modelling, HW/SW co-design, reliability and performance; design and implementation of mixed-criticality systems: scheduling methods, architectures, design methods, analysis methods.
Ada Language: Ada language and technologies: compilation issues, runtimes, Ravenscar, profiles, distributed systems, SPARK; experiences with Ada: reviews of the Ada 2012 or 2022 language features, implementation and use issues, positioning in the market and in the software engineering curriculum, lessons learned on Ada education and training activities with bearing on any of the conference topics.
Domain Applications: Mainstream and emerging applications with reliability requirements: manufacturing, robotics, avionics, space, health care, transportation, cloud environments, smart energy systems, serious games, etc.; experience reports in reliable system development: case studies and comparative assessments, management approaches, qualitative and quantitative metrics.

Manuscript submission information:
General information for submitting papers to JSA can be found at Guide for authors - Journal of Systems Architecture. Submissions should be made online at Submission site for Journal of Systems Architecture. Please select the “VSI:AEiC2026” option as the article type of the paper.
JSA has adopted the Virtual Special Issue model to speed up the publication process: Special Issue papers are published in regular issues but marked as SI papers. Acceptance decisions are made on a rolling basis; therefore, authors are encouraged to submit papers early and need not wait until the submission deadline.
Important Dates:
Open date: 13-Jan-2026
Submission deadline: 13-Feb-2026
Acceptance deadline: 30-Jun-2026
Last updated by Dou Sun on 2026-01-20
Special Issue on Sustainable Computing Algorithm, Architecture, and Applications for LLM in Embedded Computing Systems
Submission Date: 2026-03-20

The rapid advancement of large language models (LLMs) has revolutionized natural language processing (NLP), enabling breakthroughs in tasks like text generation, summarization, and code completion. Building on this progress, multimodal large language models (MLLMs) have further extended these capabilities to integrate and reason across text, images, audio, and video, transforming fields such as computer vision, robotics, and human-computer interaction. However, the increasing scale of LLMs, driven by the need for higher performance and generalization, results in massive computational demands, including trillions of FLOPs, extensive memory requirements, and significant energy consumption.

To address these challenges, the research community is focusing on developing energy-efficient system architectures that maintain high performance in embedded computing systems. Key innovations include LLM-specific hardware accelerators, such as TPUs, NPUs, and neuromorphic chips, designed to optimize transformer-based architectures and reduce inference latency. Heterogeneous computing architectures, integrating CPUs, GPUs, and LLM-specific accelerators, enable dynamic task allocation and significantly enhance energy efficiency during both training and inference. Advanced packaging techniques, such as 3D stacking and chiplet-based designs, are being adopted to improve memory bandwidth and reduce energy consumption in LLM deployments. Additionally, distributed LLM frameworks leveraging edge computing and cloud-edge collaboration enable efficient task offloading and reduce latency for real-time applications like conversational AI. Furthermore, LLM-driven resource management, utilizing reinforcement learning for energy-aware scheduling, is optimizing power usage during both training and inference phases.

This special issue seeks to bring together insights and breakthroughs in sustainable system architectures for LLMs, focusing on scalable and energy-efficient solutions for training and deployment in embedded computing systems. We invite contributions that explore innovative hardware solutions, software optimizations, heterogeneous computing, memory architectures, and resource management strategies for large-scale LLM deployment. The objective is to advance scalable, high-performance LLM systems that balance computational efficiency with environmental sustainability.

Guest editors:
Prof. Xiaokang Wang, School of Computer Science and Artificial Intelligence, Zhengzhou University
Prof. Parimala Thulasiraman, University of Manitoba, Canada
Dr. Tongfeng Weng, National University of Singapore, Singapore
Prof. Hai Jiang, Beijing University of Posts and Telecommunications, China

Special issue information:
In this special issue, we solicit original work exclusively on emerging topics or advances in sustainable system architectures for large language models (LLMs) and multimodal large language models (MLLMs). The list of possible topics includes, but is not limited to:
Energy-Efficient Architectures for LLM/MLLM Models
Lightweight Multimodal Fusion Techniques for Edge-Deployed LLMs
Sustainable Hardware Accelerators for Edge LLM/MLLM Workloads
Sustainable Training and Inference of LLM/MLLM on Heterogeneous Edge Devices
Heterogeneous System Architectures for Scalable LLM/MLLM
On-Device Fine-Tuning of LLMs for Edge Computing: Energy-Aware Algorithms
Energy-Aware Edge-Cloud Collaboration for Sustainable LLM Deployment
AI Algorithm Co-Design for Energy-Constrained Embedded Systems
Scalable Solutions for Real-Time AI Processing on Edge Devices
Memory and Data Transfer Architectures Optimized for Embedded AI
AI/ML Algorithms for Embedded Systems
Distributed Real-Time LLM/MLLM Models for Embedded Systems
End-Edge-Cloud Collaboration for LLM/MLLM
Sustainable Parallel & Distributed Computing System Architectures for LLM/MLLM
Scalable Algorithms for Energy-Efficient Computation in LLM Models
High-Performance Network Communication for Edge AI Systems
Energy-Efficient Hardware and Software Design for AI Systems
Federated Learning Architectures for LLM/MLLM
AI-Driven Optimizations for Supply Chain Management in AI System Production
Techniques for Efficient Model Compression and Pruning in Edge AI
Tools and Frameworks for Energy Monitoring and Optimization in Edge AI Pipelines
Edge-Native LLMs: Redefining Model Architectures for Sustainable Embedded Systems
Sustainable LLM Applications in Smart Cities: Multimodal and Edge-Collaborative Approaches

Manuscript submission information:
General information for submitting papers to JSA can be found at Guide for authors - Journal of Systems Architecture. Submissions should be made online at Submission site for Journal of Systems Architecture. Please select the “VSI:Sustainable Computing 3A” option as the article type of the paper.
Final Manuscript Submission Deadline: 20th March 2026
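To make the cloud-edge collaboration and energy-aware scheduling themes of this call concrete, the minimal Python sketch below decides whether a single LLM inference request should run on the edge device or be offloaded to the cloud. It is an illustration only, not part of the call: the Request, EdgeProfile, and CloudProfile fields and all numeric figures are hypothetical placeholders, and a realistic system would replace the closed-form rule with a learned (for example, reinforcement-learning-based) scheduler of the kind the call describes.

    # A minimal sketch of energy-aware edge-cloud offloading for one LLM inference
    # request. All names and numbers are hypothetical placeholders, not the API of
    # any real framework.
    from dataclasses import dataclass

    @dataclass
    class Request:
        prompt_tokens: int      # size of the input prompt
        max_new_tokens: int     # requested generation length
        deadline_ms: float      # application latency budget

    @dataclass
    class EdgeProfile:
        ms_per_token: float     # measured on-device decode latency per token
        mj_per_token: float     # measured on-device energy per token (millijoules)
        battery_mj: float       # remaining device energy budget

    @dataclass
    class CloudProfile:
        rtt_ms: float           # network round-trip time to the cloud endpoint
        ms_per_token: float     # server-side decode latency per token
        uplink_mj_per_kb: float # radio energy cost of shipping the prompt

    def choose_placement(req: Request, edge: EdgeProfile, cloud: CloudProfile) -> str:
        """Return 'edge' or 'cloud', minimizing device energy under the deadline."""
        tokens = req.prompt_tokens + req.max_new_tokens
        edge_latency = tokens * edge.ms_per_token
        edge_energy = tokens * edge.mj_per_token

        prompt_kb = req.prompt_tokens * 4 / 1024                 # rough 4 bytes/token
        cloud_latency = cloud.rtt_ms + req.max_new_tokens * cloud.ms_per_token
        cloud_energy = prompt_kb * cloud.uplink_mj_per_kb        # device pays only the uplink

        edge_ok = edge_latency <= req.deadline_ms and edge_energy <= edge.battery_mj
        cloud_ok = cloud_latency <= req.deadline_ms
        if edge_ok and cloud_ok:
            return "edge" if edge_energy <= cloud_energy else "cloud"
        if edge_ok:
            return "edge"
        return "cloud"  # offload (or best effort) when the device cannot meet the deadline

    if __name__ == "__main__":
        req = Request(prompt_tokens=512, max_new_tokens=128, deadline_ms=2000.0)
        edge = EdgeProfile(ms_per_token=9.0, mj_per_token=2.5, battery_mj=50000.0)
        cloud = CloudProfile(rtt_ms=120.0, ms_per_token=1.5, uplink_mj_per_kb=0.8)
        print(choose_placement(req, edge, cloud))  # prints "cloud" for these numbers

A production scheduler would of course also weigh accuracy trade-offs (a quantized on-device model versus a full cloud model) and adapt to time-varying network and battery conditions, which is where the learning-based resource management mentioned in the call comes in.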
Last updated by Dou Sun on 2026-01-20
Special Issue on Security and Efficiency for LLM-Based Edge Intelligence
Submission Date: 2026-06-30

With the rapid development of artificial intelligence, big data, and machine learning, Large Language Models (LLMs) have advanced rapidly in recent years and are now applied across a wide range of domains. In parallel, Edge Intelligence (EI) has emerged as a new and active research area: it reduces latency and lowers bandwidth usage by processing data on local devices. The integration of LLMs with Edge Intelligence has created a new field of research, LLM-Based Edge Intelligence. Despite these tremendous advances, LLM-Based Edge Intelligence introduces considerable new security risks. LLMs are trained on massive volumes of data collected from online repositories, forums, and websites. This training process inevitably incorporates private information, including user credentials and secrets, and because of their powerful memorization capacity, LLMs are prone to reproducing such sensitive information. At the same time, there is considerable room to improve the efficiency and scalability of LLM-Based Edge Intelligence with new optimization techniques tailored to resource-constrained edge environments.

Guest editors:
Prof. Meikang Qiu, Augusta University, USA
Prof. Wenqi Wei, Fordham University, USA

Special issue information:
This special issue serves as a forum for gathering original research outcomes on security and privacy protection for new architectures, approaches, and applications of LLM-Based Edge Intelligence, with the ultimate goal of advancing secure, efficient, robust, and reliable LLM-based Edge Intelligent systems.

Scope of the special issue:
Security for LLM-based Edge Intelligent systems
Privacy protection for LLM-based Edge Intelligent systems
Storage and memory systems for LLM-based Edge Intelligent systems
Energy-efficient computing for LLM-based Edge Intelligent systems
Intelligent resource provisioning and load balancing for LLM-based Edge Intelligent systems
Protocols and architectures for LLM-based Edge Intelligent systems
Advanced multi-cloud and hybrid cloud for LLM-based Edge Intelligent systems
Environmental impact and evaluation for LLM-based Edge Intelligent systems
New robust algorithm design for LLM-based Edge Intelligent systems
Reliable design and frameworks for LLM-based Edge Intelligent systems

Manuscript submission information:
General information for submitting papers to JSA can be found at Guide for authors - Journal of Systems Architecture. Submissions should be made online at Submission site for Journal of Systems Architecture. Please select the “VSI: LLMEI” option as the article type of the paper.
Final Manuscript Submission Deadline: 30th June 2026
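As a small illustration of the privacy-protection theme above (LLMs reproducing memorized credentials or personal data), the Python sketch below shows an output filter that an edge inference gateway might apply before any model response leaves the device. It is a hypothetical example, not drawn from the call: the redaction rules, the generate() stub, and the answer() wrapper are placeholders, and real deployments would combine such filters with training-time defenses such as data de-duplication or differential privacy.

    # A minimal sketch of an on-device output filter that redacts credential-like
    # strings from LLM responses before they leave the edge node. Patterns and the
    # generate() stub are hypothetical placeholders.
    import re

    REDACTION_RULES = [
        (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED_EMAIL]"),
        (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[REDACTED_CARD]"),
        (re.compile(r"(?i)\b(api[_-]?key|token|password)\s*[:=]\s*\S+"), r"\1=[REDACTED]"),
    ]

    def redact(text: str) -> str:
        """Apply every redaction rule to the model output."""
        for pattern, replacement in REDACTION_RULES:
            text = pattern.sub(replacement, text)
        return text

    def generate(prompt: str) -> str:
        """Stand-in for an on-device LLM call; a real deployment would invoke a
        local inference runtime here."""
        return "Contact admin@example.com, password: hunter2"

    def answer(prompt: str) -> str:
        # The filter runs on the device, so raw model output never reaches the client.
        return redact(generate(prompt))

    if __name__ == "__main__":
        print(answer("How do I reach support?"))
        # -> Contact [REDACTED_EMAIL], password=[REDACTED]

Pattern-based filtering is only a first line of defense; the architectural, algorithmic, and framework-level topics listed in this call target the same risks at the other layers of the stack.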
Last updated by Dou Sun on 2026-01-20