Journal Information
Neurocomputing

Impact Factor: 6.5
Publisher: Elsevier
ISSN: 0925-2312
Views: 133442
Followers: 194

Call for Papers
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Its essential topics are neurocomputing theory, practice, and applications.

NEW! Neurocomputing's Software Track lets you share your complete software work with the community through a new publication format: the Original Software Publication.

Overview:

Neurocomputing welcomes theoretical contributions aimed at gaining a deeper understanding of neural networks and learning systems, including, but not restricted to, architectures, learning methods, analysis of network dynamics, theories of learning, self-organization, biological neural network modelling, sensorimotor transformations and interdisciplinary topics with artificial intelligence, artificial life, cognitive science, computational learning theory, fuzzy logic, genetic algorithms, information theory, machine learning, neurobiology and pattern recognition.

Neurocomputing covers practical aspects with contributions on advances in hardware and software development environments for neurocomputing, including, but not restricted to, simulation software environments, emulation hardware architectures, models of concurrent computation, neurocomputers, and neurochips (digital, analog, optical, and biodevices).

Neurocomputing reports on applications in different fields, including, but not restricted to, signal processing, speech processing, image processing, computer vision, control, robotics, optimization, scheduling, resource allocation and financial forecasting.

Types of publications:

Neurocomputing publishes reviews of literature about neurocomputing and affine fields.

Neurocomputing reports on meetings, including, but not restricted to, conferences, workshops and seminars.

NEW! The Neurocomputing Software Track

Neurocomputing Software Track publishes a new format, the Original Software Publication (OSP), to disseminate exciting and useful software in the areas of neural networks and learning systems, including, but not restricted to, architectures, learning methods, analysis of network dynamics, theories of learning, self-organization, biological neural network modelling, sensorimotor transformations and interdisciplinary topics with artificial intelligence, artificial life, cognitive science, computational learning theory, fuzzy logic, genetic algorithms, information theory, machine learning, neurobiology and pattern recognition. We encourage high-quality original software submissions that contain non-trivial contributions in the above areas related to the implementation of algorithms, toolboxes, and real systems. The software must be released under a recognized legal license, such as an OSI-approved license.

Importantly, an OSP is a full, peer-reviewed publication that can capture your software updates as they are released. To fully acknowledge the authors' and developers' work, your software will be fully citable as an Original Software Publication, archived and indexed, and available as a complete online "body of work" for other researchers and practitioners to discover.
Last updated by Dou Sun on 2025-09-22
Special Issues
Special Issue on Scalable Machine Learning on High-Performance Computing Platforms
Submission deadline: 2026-04-30

In recent years, machine learning has become an essential computational tool in many areas of science and engineering. At the same time, the increasing scale of data, model size, and computational complexity has made high-performance computing a fundamental enabler for modern machine learning research and applications. Conversely, machine learning techniques are being increasingly explored to improve efficiency, scalability, and robustness in high-performance computing systems and scientific workflows.

This Special Issue aims to provide a focused forum for recent research at the intersection of machine learning and high-performance computing, with an emphasis on scalable methods, algorithm–system interaction, and application-driven studies. Rather than promoting new paradigms, the Special Issue seeks to collect solid, well-founded contributions that address practical challenges arising from large-scale learning and computation.

The scope includes methodological developments, system-level studies, and representative applications where machine learning and high-performance computing are closely coupled.

Guest editors:

Prof. Yiqin Lu (Executive Guest Editor)
South China University of Technology, Guangzhou, China
Email: qin@scut.edu.cn

Prof. Guo Chen
Hunan University, Changsha, China
Email: guochen@hnu.edu.cn

Assoc. Prof. Rui Yin
University of Florida, Gainesville, Florida, United States
Email: ruiyin@ufl.edu

Special issue information:

Topics of interest include, but are not limited to:

Machine Learning on High-Performance Computing Systems

Distributed and parallel training of machine learning models
Model and data parallelism for large-scale learning
Communication-efficient and scalable learning algorithms
Performance analysis and optimization of learning workloads

High-Performance Computing Architectures and Systems

GPU- and accelerator-based computing for machine learning
Heterogeneous computing platforms and programming models
Memory management and data movement for large-scale learning
Energy-efficient and reliable computing for machine learning workloads

Machine Learning in Scientific and Engineering Computing

Machine learning for numerical simulation and modeling
Surrogate and reduced-order modeling
Hybrid data-driven and physics-based approaches
Uncertainty quantification and reliability analysis

Machine Learning for HPC System Management

Resource allocation and scheduling
Performance modeling and autotuning
Fault detection and system monitoring
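To make the first topic cluster above concrete, here is a minimal sketch of data-parallel training with gradient averaging. The least-squares task, worker count, and learning rate are illustrative assumptions, and the sequential loop stands in for a real all-reduce collective (e.g. MPI or NCCL):

```python
import numpy as np

# Toy data-parallel SGD: each "worker" computes a gradient on its own data
# shard, then an all-reduce averages the gradients so every replica applies
# the same update. Workers are simulated sequentially here.

def local_grad(w, X, y):
    # gradient of the least-squares loss 0.5 * ||X @ w - y||^2 on one shard
    return X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
w_true = rng.normal(size=5)
y = X @ w_true

n_workers, lr = 4, 0.1
shards = list(zip(np.array_split(X, n_workers), np.array_split(y, n_workers)))

w = np.zeros(5)
for step in range(200):
    grads = [local_grad(w, Xs, ys) for Xs, ys in shards]  # parallel in practice
    g = np.mean(grads, axis=0)                            # the "all-reduce"
    w -= lr * g                                           # identical update on every replica

print(np.allclose(w, w_true, atol=1e-3))  # True: the averaged updates recover w_true
```

With equal-sized shards the averaged gradient equals the full-batch gradient, so the replicas stay in lockstep; communication-efficient variants in the topics above compress or delay exactly this averaging step.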

Manuscript submission information:

Important Dates:

Submission Open Date: January 31, 2026
Manuscript Submission Deadline: April 30, 2026
First-round Decisions: June 30, 2026
Revised Manuscript Due: August 31, 2026
Final Decisions: September 30, 2026
Publication: before December 31, 2026

Prospective authors should be invited by the Guest Editor team; they should follow the standard author instructions for Neurocomputing and submit their manuscripts online via the Editorial Manager system.

Please refer to the Guide for Authors to prepare your manuscript.

For any further information, the authors may contact the Guest Editors.

Keywords:

Machine Learning; High-Performance Computing

https://www.sciencedirect.com/special-issue/329987/scalable-machine-learning-on-high-performance-computing-platforms
Last updated by Dou Sun on 2026-03-11
Special Issue on Positive Noise Learning
Submission deadline: 2026-06-15

Noise has become a popular keyword in recent years. Noise-based models have attracted growing attention in the artificial intelligence community, including but not limited to random forests, dropout in neural networks (a kind of structural positive noise), generative adversarial networks, adversarial training, noisy augmentation, positive-incentive noise, diffusion models, and flow matching models. Although most of these models do not explicitly claim to learn noise, they in fact utilize positive noise implicitly. Many recent studies point out that noise can also be beneficial to large models. Noise should therefore no longer be regarded as a merely harmful component; the positivity of noise deserves more systematic study.

Although there are plenty of noise-related models, scientific studies of beneficial noise are still somewhat lacking: most noise-based models simply use positive noise in a heuristic way. This Special Issue calls for papers that study several attractive, natural, and urgent questions: (1) how a model can learn positive noise in a controllable manner; (2) what kind of noise is beneficial to specific models/tasks; (3) the theoretical bounds of positive noise.
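As a minimal illustration of how a standard technique injects beneficial noise implicitly, the sketch below implements inverted dropout (the "structural positive noise" mentioned above) in NumPy; the drop rate and array sizes are illustrative assumptions:

```python
import numpy as np

def dropout(x, p=0.5, train=True, rng=None):
    """Inverted dropout: randomly zero units during training and rescale
    the survivors so the expected activation is unchanged."""
    if not train or p == 0.0:
        return x  # the layer is deterministic at inference time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)       # rescale to preserve E[output]

rng = np.random.default_rng(0)
x = np.ones(100_000)
y = dropout(x, p=0.5, train=True, rng=rng)
print(y.mean())                       # close to 1.0: the noise is zero-mean on average
print(dropout(x, train=False) is x)   # True: no noise is injected at inference
```

The mask is pure multiplicative noise, yet it regularizes training; the questions above ask when and why such injected noise is provably beneficial rather than merely heuristic.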

Guest editors:

Dr. Hongyuan Zhang (Executive Guest Editor)
The University of Hong Kong, Hong Kong
Email: hyzhang98@gmail.com
Research Interests: Noise Learning, Representation Learning, Generative AI, Embodied AI

Prof. Xuelong Li
Chief Technology Officer (CTO) and Chief Scientist of China Telecom, China
Email: xuelong_li@ieee.org
Research Interests: Noise Analysis, Computer Vision, Imaging

Special issue information:

This Special Issue seeks to cover a wide range of topics related to positive noise learning and analysis, including but not limited to:

1. Positive-incentive noise;
2. Noise-based generative models such as GAN, diffusion models, and flow matching;
3. Positive noisy and uncertain structure in deep learning models;
4. Noisy model training such as positive noisy labels and adversarial training;
5. Noisy augmentations in diverse fields such as representation learning and signal detection;
6. Positive noise in large models;
7. Positive noise in data acquisition;
8. Explainable analysis for beneficial noise.

Manuscript submission information:

Important Dates:

Submission Open Date: August 20, 2025
Submission Deadline: May 15, 2026
Final Acceptance Deadline: January 15, 2027

Prospective authors should follow standard author instructions for Neurocomputing and submit their manuscripts online at Editorial Manager system. Authors must select "VSI: Positive Noise Learning" when they reach the "Article Type" step.

Please refer to the Guide for Authors to prepare your manuscript.

For any further information, the authors may contact the Guest Editors.

Keywords:

Noise Learning, Information Theory, Explainability, Generative Models
Last updated by Dou Sun on 2025-09-22
Special Issue on Neural Dynamics in Intelligent Computing and Applications
Submission deadline: 2026-10-30

In recent decades, neural dynamics has emerged as a fundamental mechanism for understanding and enhancing intelligent computing, offering powerful tools for modeling, prediction, optimization, and adaptive decision-making. With the rapid growth of computational resources and the increasing complexity of real-world tasks, neural dynamics has found wide-ranging applications in domains such as autonomous systems, healthcare, financial technology, robotics, and smart manufacturing. By capturing the temporal evolution of neural states and interactions, neural dynamics provides adaptive, data-driven solutions with strong nonlinear modeling and representation capabilities, enabling intelligent systems to achieve improved accuracy, robustness, and generalization in dynamic and uncertain environments.

Neural dynamics emphasizes the time-varying processes, feedback mechanisms, and emergent behaviors of neural systems, thereby offering a novel perspective on learning, adaptation, and control. The integration of neural dynamics with control theory, optimization, reinforcement learning, and neuromorphic computing has opened up new research frontiers, such as interpretable dynamic modeling, safe and robust decision-making under uncertainty, and real-time adaptive control in rapidly changing environments.
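A representative instance of such time-varying dynamics is the zeroing neural network, in which the state evolves so that an error function is driven exponentially to zero. The sketch below is a minimal illustration; the scalar problem a(t)·x(t) = 1, the gain lam, and the step size are all assumptions chosen for demonstration:

```python
import math

# Zeroing neural dynamics for the time-varying equation a(t) * x(t) = 1.
# Design formula: de/dt = -lam * e with error e(t) = a(t) * x(t) - 1,
# which yields the state dynamics dx/dt = (-lam * e - da * x) / a.

def simulate(lam=10.0, dt=1e-3, T=5.0):
    a  = lambda t: 2.0 + math.sin(t)        # time-varying coefficient
    da = lambda t: math.cos(t)              # its analytic derivative
    x, t = 0.0, 0.0                         # deliberately poor initial state
    while t < T:
        e  = a(t) * x - 1.0                 # residual error to be zeroed
        dx = (-lam * e - da(t) * x) / a(t)  # enforces exponential error decay
        x += dt * dx                        # forward-Euler integration
        t += dt
    return x, abs(a(T) * x - 1.0)

x, err = simulate()
print(x, err)   # x tracks the moving solution 1 / a(5) with a small residual
```

Because the error obeys de/dt = -lam·e exactly in continuous time, the state tracks the moving solution rather than converging to a static one, which is the feedback mechanism emphasized above.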

Given the significant progress and growing body of research in this area, this Special Issue now specifically invites survey/review articles that comprehensively summarize and critically evaluate recent advances in neural dynamics for intelligent computing. Survey/review papers should provide insightful syntheses of existing work, identify key challenges and emerging trends, and propose future research directions.

The aim of this Special Issue is to gather high-quality survey/review articles on the latest developments in neural dynamics, with particular emphasis on theoretical foundations, innovative modeling approaches, algorithmic design, and applications in emerging technologies. We encourage contributions that review the integration of neural dynamics with control systems, robotics, natural language processing, and real-time decision-making in complex environments.

Guest editors:

Prof. Long Jin (Executive Guest Editor)
Lanzhou University, Lanzhou, China
Email: jinlong@lzu.edu.cn
Areas of Expertise: dynamics, neural networks, robotics, and optimization

Prof. Shuai Li
University of Oulu, Oulu, Finland
Email: shuai.li@oulu.fi
Areas of Expertise: dynamics, neural networks, robotics, and intelligent control

Special issue information:

Topics for this call for survey/review papers include, but are not restricted to:

Theoretical foundations of neural dynamics in intelligent computing (robustness, convergence, generalization, interpretability)
Novel neural dynamics architectures and learning paradigms for intelligent computing
Neural dynamics-based adaptive control, predictive control, and robust optimization
Neural dynamics for multi-agent systems, distributed intelligence, and collaborative decision-making
Neural dynamics for robotics: motion planning, autonomous navigation, and human-robot interaction
Neural dynamics in medical diagnosis, bio-signal analysis, and healthcare decision support
Neural dynamics in financial forecasting, recommender systems, risk management, and smart manufacturing
Integration of neural dynamics with reinforcement learning, evolutionary algorithms, and signal processing techniques
Real-time learning and deployment of neural dynamics in time-varying, uncertain, or safety-critical environments
Emerging applications of neural dynamics-based intelligent computing in brain-computer interfaces, autonomous driving, soft robotics, and industrial intelligence

Manuscript submission information:

Important Dates:

Submission Open Date: December 20, 2025
Submission Deadline: October 30, 2026
Final Acceptance Deadline: December 31, 2026

Prospective authors should follow standard author instructions for Neurocomputing and submit their manuscripts online at Editorial Manager system. Authors must select "VSI: Neural Dynamics" when they reach the "Article Type" step.

Please refer to the Guide for Authors to prepare your manuscript.

For any further information, the authors may contact the Guest Editors.

Keywords:

Neural dynamics, intelligent computing

https://www.sciencedirect.com/special-issue/328893/neural-dynamics-in-intelligent-computing-and-applications
Last updated by Dou Sun on 2026-03-11
Related Journals
CCF  Full Name                                 Impact Factor  Publisher  ISSN
c    Neurocomputing                            6.5            Elsevier   0925-2312
a    IEEE Transactions on Services Computing   5.8            IEEE       1939-1374
     Cluster Computing                         4.1            Springer   1386-7857
c    Journal of Grid Computing                 2.9            Springer   1570-7873
     Computing                                 2.8            Springer   0010-485X
c    The Journal of Supercomputing             2.7            Springer   0920-8542
c    Soft Computing                            2.5            Springer   1432-7643
     Memetic Computing                         2.3            Springer   1865-9284
b    Neural Computation                        2.1            MIT Press  0899-7667
c    Natural Computing                         1.6            Springer   1567-7818
Related Conferences
Ranks (CCF/CORE/QUALIS)  Abbrev     Full Name                                         Deadline    Notification  Conference Date
b/a/a2                   ICS        International Conference on Supercomputing        2026-02-02  2026-04-06    2026-07-06
a/a*/a1                  STOC       ACM Symposium on Theory of Computing              2025-11-04  2026-02-01    2026-06-22
c                        ISC'       International Supercomputing Conference           2024-12-10  2025-02-28    2025-06-10
c/a/b1                   SCC        International Conference on Services Computing    2022-03-01  2022-04-15    2022-07-10
b2                       ICSC       International Conference on Semantic Computing    2020-10-12  2020-11-25    2021-01-27
b/a2                     ICAC       International Conference on Autonomic Computing   2019-02-22  2019-04-08    2019-06-16
c                        GrC        International Conference on Granular Computing    2015-08-25  2015-09-04    2015-10-29
b2                       ICOMP      International Conference on Internet Computing    2015-04-15  2015-04-30    2015-07-27
b4                       SocialCom  International Conference on Social Computing      2014-04-30  2014-06-05    2014-08-04
a/a2                     GRID       International Conference on Grid Computing        2012-04-25  2012-05-15    2012-09-20