Journal Information
Neurocomputing
Impact Factor: 6.5
Publisher: Elsevier
ISSN: 0925-2312
Views: 133441
Followers: 194
Call for Papers
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Theory, practice, and applications are the essential topics covered.
NEW! Neurocomputing's Software Track allows you to expose your complete software work to the community through a novel publication format: the Original Software Publication.
Overview:
Neurocomputing welcomes theoretical contributions aimed at winning further understanding of neural networks and learning systems, including, but not restricted to, architectures, learning methods, analysis of network dynamics, theories of learning, self-organization, biological neural network modelling, sensorimotor transformations and interdisciplinary topics with artificial intelligence, artificial life, cognitive science, computational learning theory, fuzzy logic, genetic algorithms, information theory, machine learning, neurobiology and pattern recognition.
Neurocomputing covers practical aspects with contributions on advances in hardware and software development environments for neurocomputing, including, but not restricted to, simulation software environments, emulation hardware architectures, models of concurrent computation, neurocomputers, and neurochips (digital, analog, optical, and biodevices).
Neurocomputing reports on applications in different fields, including, but not restricted to, signal processing, speech processing, image processing, computer vision, control, robotics, optimization, scheduling, resource allocation and financial forecasting.
Types of publications:
Neurocomputing publishes reviews of literature about neurocomputing and affine fields.
Neurocomputing reports on meetings, including, but not restricted to, conferences, workshops and seminars.
NEW! The Neurocomputing Software Track
The Neurocomputing Software Track publishes a new format, the Original Software Publication (OSP), to disseminate exciting and useful software in the areas of neural networks and learning systems, including, but not restricted to, architectures, learning methods, analysis of network dynamics, theories of learning, self-organization, biological neural network modelling, sensorimotor transformations and interdisciplinary topics with artificial intelligence, artificial life, cognitive science, computational learning theory, fuzzy logic, genetic algorithms, information theory, machine learning, neurobiology and pattern recognition. We encourage high-quality original software submissions that contain non-trivial contributions in the above areas related to implementations of algorithms, toolboxes, and real systems. The software must adhere to a recognized legal license, such as an OSI-approved license.
Importantly, the software will be a fully peer-reviewed publication that is able to capture your software updates once they are released. To fully acknowledge the authors'/developers' work, your software will be fully citable as an Original Software Publication, archived, indexed, and available as a complete online "body of work" for other researchers and practitioners to discover.
Last updated by Dou Sun on 2025-09-22
Special Issues
Special Issue on Scalable Machine Learning on High-Performance Computing Platforms
Submission Deadline: 2026-04-30
In recent years, machine learning has become an essential computational tool in many areas of science and engineering. At the same time, the increasing scale of data, model size, and computational complexity has made high-performance computing a fundamental enabler for modern machine learning research and applications. Conversely, machine learning techniques are increasingly being explored to improve efficiency, scalability, and robustness in high-performance computing systems and scientific workflows.
This Special Issue aims to provide a focused forum for recent research at the intersection of machine learning and high-performance computing, with an emphasis on scalable methods, algorithm–system interaction, and application-driven studies. Rather than promoting new paradigms, the Special Issue seeks to collect solid, well-founded contributions that address practical challenges arising from large-scale learning and computation.
The scope includes methodological developments, system-level studies, and representative applications where machine learning and high-performance computing are closely coupled.
Guest editors:
Prof. Yiqin Lu (Executive Guest Editor)
South China University of Technology, Guangzhou, China
Email: qin@scut.edu.cn
Prof. Guo Chen
Hunan University, Changsha, China
Email: guochen@hnu.edu.cn
Assoc. Prof. Rui Yin
University of Florida, Gainesville, Florida, United States
Email: ruiyin@ufl.edu
Special issue information:
Topics of interest include, but are not limited to:
Machine Learning on High-Performance Computing Systems
Distributed and parallel training of machine learning models
Model and data parallelism for large-scale learning
Communication-efficient and scalable learning algorithms
Performance analysis and optimization of learning workloads
High-Performance Computing Architectures and Systems
GPU- and accelerator-based computing for machine learning
Heterogeneous computing platforms and programming models
Memory management and data movement for large-scale learning
Energy-efficient and reliable computing for machine learning workloads
Machine Learning in Scientific and Engineering Computing
Machine learning for numerical simulation and modeling
Surrogate and reduced-order modeling
Hybrid data-driven and physics-based approaches
Uncertainty quantification and reliability analysis
Machine Learning for HPC System Management
Resource allocation and scheduling
Performance modeling and autotuning
Fault detection and system monitoring
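As a toy illustration of the distributed-training topics above, the following sketch simulates data-parallel SGD with gradient averaging across workers. It is purely illustrative: the linear-regression task, shard count, and learning rate are invented for this example, and a real deployment would average gradients with an all-reduce across machines rather than an in-process loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression task: y = X @ w_true + noise (invented data).
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(512, 2))
y = X @ w_true + 0.01 * rng.normal(size=512)

def local_gradient(w, X_shard, y_shard):
    """Mean-squared-error gradient computed on one worker's data shard."""
    residual = X_shard @ w - y_shard
    return 2.0 * X_shard.T @ residual / len(y_shard)

# Split the data across 4 simulated workers (data parallelism).
n_workers = 4
shards = list(zip(np.array_split(X, n_workers), np.array_split(y, n_workers)))

w = np.zeros(2)
lr = 0.1
for step in range(200):
    # Each worker computes a local gradient; averaging the per-worker
    # gradients is the in-process analogue of an all-reduce.
    grads = [local_gradient(w, Xs, ys) for Xs, ys in shards]
    w -= lr * np.mean(grads, axis=0)

print(w)  # converges near w_true = [2, -1]
```

Because the loss is a convex quadratic, the averaged-gradient updates converge to the same least-squares solution a single-worker run would find, which is the correctness property communication-efficient variants then try to preserve under compression or asynchrony.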
Manuscript submission information:
Important Dates:
Submission Open Date: January 31, 2026
Manuscript Submission Deadline: April 30, 2026
First-round Decisions: June 30, 2026
Revised Manuscript Due: August 31, 2026
Final Decisions: September 30, 2026
Publication: before December 31, 2026
Prospective authors, who should be invited by the Guest Editor team, must follow the standard author instructions for Neurocomputing and submit their manuscripts online via the Editorial Manager system.
Please refer to the Guide for Authors to prepare your manuscript.
For any further information, the authors may contact the Guest Editors.
Keywords:
Machine Learning; High-Performance Computing
https://www.sciencedirect.com/special-issue/329987/scalable-machine-learning-on-high-performance-computing-platforms
Last updated by Dou Sun on 2026-03-11
Special Issue on Positive Noise Learning
Submission Deadline: 2026-06-15
Noise has become an increasingly popular keyword in recent years. Noise-based models have attracted growing attention in the artificial intelligence community, including but not limited to random forests, dropout in neural networks (a kind of structural positive noise), generative adversarial networks, adversarial training, noisy augmentation, positive-incentive noise, diffusion models, and flow matching models. Although most of these models do not explicitly claim to learn noise, they implicitly exploit positive noise. Many recent studies point out that noise can also be beneficial to large models. Therefore, noise should no longer be regarded simply as a harmful component; the positivity of noise deserves more systematic study.
Although there are plenty of noise-related models, scientific studies of beneficial noise learning are still somewhat lacking: most noise-based models use positive noise only heuristically. This Special Issue calls for papers that address several attractive, natural, and urgent questions: (1) how a model can learn positive noise in a controllable manner; (2) what kind of noise is beneficial to specific models or tasks; and (3) the theoretical bounds of positive noise.
Guest editors:
Dr. Hongyuan Zhang (Executive Guest Editor)
The University of Hong Kong, Hong Kong
Email: hyzhang98@gmail.com
Research Interests: Noise Learning, Representation Learning, Generative AI, Embodied AI
Prof. Xuelong Li
Chief Technology Officer (CTO) and Chief Scientist of China Telecom, China
Email: xuelong_li@ieee.org
Research Interests: Noise Analysis, Computer Vision, Imaging
Special issue information:
This Special Issue seeks to cover a wide range of topics related to positive noise learning and analysis, including but not limited to:
1. Positive-incentive noise;
2. Noise-based generative models such as GAN, diffusion models, and flow matching;
3. Positive noisy and uncertain structure in deep learning models;
4. Noisy model training such as positive noisy labels and adversarial training;
5. Noisy augmentations in diverse fields such as representation learning and signal detection;
6. Positive noise in large models;
7. Positive noise in data acquisition;
8. Explainable analysis for beneficial noise.
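As a minimal illustration of "structural positive noise" (topics 3 and 4 above), the sketch below implements inverted dropout: each unit is zeroed at random and the survivors are rescaled so the expected activation is unchanged. The layer size and drop rate are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(42)

def inverted_dropout(x, p_drop, rng):
    """Zero each unit with probability p_drop; rescale survivors by
    1/(1 - p_drop) so the expected output equals the input, E[out] = x.
    The injected randomness acts as structural noise during training."""
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

# Invented example: 100,000 unit activations, all equal to 1.0.
x = np.ones(100_000)
out = inverted_dropout(x, p_drop=0.3, rng=rng)

# Roughly 30% of units are zeroed, yet the mean activation stays near 1.0,
# which is the "unbiased noise injection" property dropout relies on.
print(out.mean())
```

This expectation-preserving property is what lets such noise regularize training without biasing the forward signal, one concrete sense in which noise can be "positive".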
Manuscript submission information:
Important Dates:
Submission Open Date: August 20, 2025
Submission Deadline: May 15, 2026
Final Acceptance Deadline: January 15, 2027
Prospective authors should follow the standard author instructions for Neurocomputing and submit their manuscripts online via the Editorial Manager system. Authors must select "VSI: Positive Noise Learning" when they reach the "Article Type" step.
Please refer to the Guide for Authors to prepare your manuscript.
For any further information, the authors may contact the Guest Editors.
Keywords:
Noise Learning, Information Theory, Explainability, Generative Models
Last updated by Dou Sun on 2025-09-22
Special Issue on Neural Dynamics in Intelligent Computing and Applications
Submission Deadline: 2026-10-30
In recent decades, neural dynamics has emerged as a fundamental mechanism for understanding and enhancing intelligent computing, offering powerful tools for modeling, prediction, optimization, and adaptive decision-making. With the rapid growth of computational resources and the increasing complexity of real-world tasks, neural dynamics has found wide-ranging applications in domains such as autonomous systems, healthcare, financial technology, robotics, and smart manufacturing. By capturing the temporal evolution of neural states and interactions, neural dynamics provides adaptive, data-driven solutions with strong nonlinear modeling and representation capabilities, enabling intelligent systems to achieve improved accuracy, robustness, and generalization in dynamic and uncertain environments.
Neural dynamics emphasizes the time-varying processes, feedback mechanisms, and emergent behaviors of neural systems, thereby offering a novel perspective on learning, adaptation, and control. The integration of neural dynamics with control theory, optimization, reinforcement learning, and neuromorphic computing has opened up new research frontiers, such as interpretable dynamic modeling, safe and robust decision-making under uncertainty, and real-time adaptive control in rapidly changing environments.
Given the significant progress and growing body of research in this area, this Special Issue now specifically invites survey/review articles that comprehensively summarize and critically evaluate recent advances in neural dynamics for intelligent computing. Survey/review papers should provide insightful syntheses of existing work, identify key challenges and emerging trends, and propose future research directions.
The aim of this Special Issue is to gather high-quality survey/review articles covering the latest developments in neural dynamics, with particular emphasis on theoretical foundations, innovative modeling approaches, algorithmic design, and applications in emerging technologies. We encourage contributions that explore, from a survey/review perspective, the integration of neural dynamics with control systems, robotics, natural language processing, and real-time decision-making in complex environments.
Guest editors:
Prof. Long Jin (Executive Guest Editor)
Lanzhou University, Lanzhou, China
Email: jinlong@lzu.edu.cn
Areas of Expertise: dynamics, neural networks, robotics, and optimization
Prof. Shuai Li
University of Oulu, Oulu, Finland
Email: shuai.li@oulu.fi
Areas of Expertise: dynamics, neural networks, robotics, and intelligent control
Special issue information:
Topics for this call for survey/review papers include, but are not restricted to:
Theoretical foundations of neural dynamics in intelligent computing (robustness, convergence, generalization, interpretability)
Novel neural dynamics architectures and learning paradigms for intelligent computing
Neural dynamics-based adaptive control, predictive control, and robust optimization
Neural dynamics for multi-agent systems, distributed intelligence, and collaborative decision-making
Neural dynamics for robotics: motion planning, autonomous navigation, and human-robot interaction
Neural dynamics in medical diagnosis, bio-signal analysis, and healthcare decision support
Neural dynamics in financial forecasting, recommender systems, risk management, and smart manufacturing
Integration of neural dynamics with reinforcement learning, evolutionary algorithms, and signal processing techniques
Real-time learning and deployment of neural dynamics in time-varying, uncertain, or safety-critical environments
Emerging applications of neural dynamics-based intelligent computing in brain-computer interfaces, autonomous driving, soft robotics, and industrial intelligence
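One classical family in this area is zeroing (Zhang) neural dynamics, which drives a time-varying error to zero through a prescribed error dynamic de/dt = -γe. The sketch below Euler-integrates such a design for the scalar time-varying equation a(t)·x(t) = 1; the coefficient a(t), gain γ, and step size are invented for illustration, so treat it as a hedged sketch rather than a definitive implementation.

```python
import numpy as np

gamma, dt = 10.0, 1e-3          # invented gain and Euler step size
a = lambda t: 2.0 + np.sin(t)   # invented time-varying coefficient, a(t) > 0
a_dot = lambda t: np.cos(t)     # its analytic derivative

# Error definition e(t) = a(t) x(t) - 1; imposing de/dt = -gamma * e gives
#   a_dot * x + a * x_dot = -gamma * e   =>   x_dot = (-gamma*e - a_dot*x) / a
x = 0.0  # deliberately wrong initial state (true value at t=0 is 0.5)
for k in range(5000):  # integrate to t = 5 s with explicit Euler
    t = k * dt
    e = a(t) * x - 1.0
    x += dt * (-gamma * e - a_dot(t) * x) / a(t)

print(x, 1.0 / a(5.0))  # x tracks the time-varying solution 1/a(t)
```

The exponential error dynamic is what lets the state track a moving solution rather than converge to a fixed point, the key distinction from static optimization that motivates this line of work.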
Manuscript submission information:
Important Dates:
Submission Open Date: December 20, 2025
Submission Deadline: October 30, 2026
Final Acceptance Deadline: December 31, 2026
Prospective authors should follow the standard author instructions for Neurocomputing and submit their manuscripts online via the Editorial Manager system. Authors must select "VSI: Neural Dynamics" when they reach the "Article Type" step.
Please refer to the Guide for Authors to prepare your manuscript.
For any further information, the authors may contact the Guest Editors.
Keywords:
Neural dynamics, intelligent computing
https://www.sciencedirect.com/special-issue/328893/neural-dynamics-in-intelligent-computing-and-applications
Last updated by Dou Sun on 2026-03-11
Related Journals
| CCF | Full Name | Impact Factor | Publisher | ISSN |
|---|---|---|---|---|
| c | Neurocomputing | 6.5 | Elsevier | 0925-2312 |
| a | IEEE Transactions on Services Computing | 5.8 | IEEE | 1939-1374 |
| | Cluster Computing | 4.1 | Springer | 1386-7857 |
| c | Journal of Grid Computing | 2.9 | Springer | 1570-7873 |
| | Computing | 2.8 | Springer | 0010-485X |
| c | The Journal of Supercomputing | 2.7 | Springer | 0920-8542 |
| c | Soft Computing | 2.5 | Springer | 1432-7643 |
| | Memetic Computing | 2.3 | Springer | 1865-9284 |
| b | Neural Computation | 2.1 | MIT Press | 0899-7667 |
| c | Natural Computing | 1.6 | Springer | 1567-7818 |
Related Conferences
| CCF | CORE | QUALIS | Abbreviation | Full Name | Submission | Notification | Conference |
|---|---|---|---|---|---|---|---|
| b | a | a2 | ICS | International Conference on Supercomputing | 2026-02-02 | 2026-04-06 | 2026-07-06 |
| a | a* | a1 | STOC | ACM Symposium on Theory of Computing | 2025-11-04 | 2026-02-01 | 2026-06-22 |
| c | | | ISC' | International Supercomputing Conference | 2024-12-10 | 2025-02-28 | 2025-06-10 |
| c | a | b1 | SCC | International Conference on Services Computing | 2022-03-01 | 2022-04-15 | 2022-07-10 |
| | | b2 | ICSC | International Conference on Semantic Computing | 2020-10-12 | 2020-11-25 | 2021-01-27 |
| | b | a2 | ICAC | International Conference on Autonomic Computing | 2019-02-22 | 2019-04-08 | 2019-06-16 |
| c | | | GrC | International Conference on Granular Computing | 2015-08-25 | 2015-09-04 | 2015-10-29 |
| | | b2 | ICOMP | International Conference on Internet Computing | 2015-04-15 | 2015-04-30 | 2015-07-27 |
| | | b4 | SocialCom | International Conference on Social Computing | 2014-04-30 | 2014-06-05 | 2014-08-04 |
| | a | a2 | GRID | International Conference on Grid Computing | 2012-04-25 | 2012-05-15 | 2012-09-20 |