Conferences related to Neural Networks


2023 Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)

The conference program will consist of plenary lectures, symposia, workshops, and invited sessions on the latest significant findings and developments in all the major fields of biomedical engineering. Submitted full papers will be peer reviewed. Accepted high-quality papers will be presented in oral and poster sessions, will appear in the Conference Proceedings, and will be indexed in PubMed/MEDLINE.


IECON 2020 - 46th Annual Conference of the IEEE Industrial Electronics Society

IECON focuses on industrial and manufacturing theory and applications of electronics, controls, communications, instrumentation, and computational intelligence.


IGARSS 2020 - 2020 IEEE International Geoscience and Remote Sensing Symposium

All fields of satellite, airborne and ground remote sensing.


2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

Robotics, intelligent systems, automation, mechatronics, micro/nano technologies, and AI.


2019 International Joint Conference on Neural Networks (IJCNN)

IJCNN covers a wide range of topics in the field of neural networks, from biological neural network modeling to artificial neural computation.


More Conferences

Periodicals related to Neural Networks


Circuits and Systems for Video Technology, IEEE Transactions on

Video A/D and D/A, display technology, image analysis and processing, video signal characterization and representation, video compression techniques and signal processing, multidimensional filters and transforms, analog video signal processing, neural networks for video applications, nonlinear video signal processing, video storage and retrieval, computer vision, packet video, high-speed real-time circuits, VLSI architecture and implementation for video technology, multiprocessor systems--hardware and software-- ...


Circuits and Systems I: Regular Papers, IEEE Transactions on

Part I publishes regular papers on all matters related to fundamental theory, applications, and analog and digital signal processing; Part II reports the latest significant results across these topic areas.


Circuits and Systems II: Express Briefs, IEEE Transactions on

Part II publishes express briefs reporting the latest significant results across the same topic areas as Part I: fundamental theory, applications, and analog and digital signal processing.


Circuits and Systems Magazine, IEEE


Computational Biology and Bioinformatics, IEEE/ACM Transactions on

Specific topics of interest include, but are not limited to, sequence analysis, comparison and alignment methods; motif, gene and signal recognition; molecular evolution; phylogenetics and phylogenomics; determination or prediction of the structure of RNA and protein in two and three dimensions; DNA twisting and folding; gene expression and gene regulatory networks; deduction of metabolic pathways; micro-array design and analysis; proteomics; ...


More Periodicals


Xplore Articles related to Neural Networks


Knowledge Extraction From Neural Networks Using the All-Permutations Fuzzy Rule Base: The LED Display Recognition Problem

IEEE Transactions on Neural Networks, 2007

A major drawback of artificial neural networks (ANNs) is their black-box character. Even when the trained network performs adequately, it is very difficult to understand its operation. In this letter, we use the mathematical equivalence between ANNs and a specific fuzzy rule base to extract the knowledge embedded in the network. We demonstrate this using a benchmark problem: the recognition ...


Further Results on Delay-Dependent Stability Criteria of Neural Networks With Time-Varying Delays

IEEE Transactions on Neural Networks, 2008

In this brief paper, an augmented Lyapunov functional, which takes an integral term of the state vector into account, is introduced. Owing to this functional, an improved delay-dependent asymptotic stability criterion for delayed neural networks (NNs) is derived in terms of linear matrix inequalities (LMIs). It is shown that the obtained criterion can provide a less conservative result than some existing ones. ...


Wavelet Basis Function Neural Networks for Sequential Learning

IEEE Transactions on Neural Networks, 2008

In this letter, we develop wavelet basis function neural networks (WBFNNs), which are analogous to radial basis function neural networks (RBFNNs) and to wavelet neural networks (WNNs). In WBFNNs, both the scaling function and the wavelet function of a multiresolution approximation (MRA) are adopted as the basis for approximating functions. A sequential learning algorithm for WBFNNs is presented and ...


Hypersausage Neural Networks and its Application in Face Recognition

2005 International Conference on Neural Networks and Brain, 2005

In this paper, we first describe the nature of 'hypersausages' and study the structure and training of the network; we then examine its behavior through experiments on the ORL face database and finally verify its advantages compared with other methods.


Simulating Dynamic Plastic Continuous Neural Networks by Finite Elements

IEEE Transactions on Neural Networks and Learning Systems, 2014

We introduce the dynamic plastic continuous neural network (DPCNN), which is composed of neurons distributed in a nonlinear plastic medium, with the wire-like connections of conventional neural networks replaced by the continuous medium. We use the finite element method to model the dynamic phenomenon of information processing within DPCNNs. During training, instead of weights, the properties of the continuous material at ...


More Xplore Articles

Educational Resources on Neural Networks


IEEE.tv Videos

Towards On-Chip Optical FFTs for Convolutional Neural Networks - IEEE Rebooting Computing 2017
20 Years of Neural Networks: A Promising Start, A Brilliant Future - Video contents
Artificial Neural Networks, Intro
ICASSP 2010 - Advances in Neural Engineering
Spike Timing, Rhythms, and the Effective Use of Neural Hardware
Lizhong Zheng's Globecom 2019 Keynote
Large-scale Neural Systems for Vision and Cognition
Emergent Neural Network in reinforcement learning
Behind Artificial Neural Networks
Deep Learning and the Representation of Natural Data
Complex-Valued Neural Networks
Learning with Memristive Neural Networks: Neuromorphic Computing - Joshua Yang at INC 2019
Complex Valued Neural Networks: Theory and Applications
A Comparator Design Targeted Towards Neural Net - David Mountain - ICRC San Mateo, 2019
Spiking Network Algorithms for Scientific Computing - William Severa: 2016 International Conference on Rebooting Computing
Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware - Emre Neftci: 2016 International Conference on Rebooting Computing
RNSnet: In-Memory Neural Network Acceleration Using Residue Number System - Sahand Salamat - ICRC 2018
High Throughput Neural Network based Embedded Streaming Multicore Processors - Tarek Taha: 2016 International Conference on Rebooting Computing
Active Space-Body Perception and Body Enhancement using Dynamical Neural Systems
Overcoming the Static Learning Bottleneck - the Need for Adaptive Neural Learning - Craig Vineyard: 2016 International Conference on Rebooting Computing

IEEE-USA E-Books

  • Knowledge Extraction From Neural Networks Using the All-Permutations Fuzzy Rule Base: The LED Display Recognition Problem

    A major drawback of artificial neural networks (ANNs) is their black-box character. Even when the trained network performs adequately, it is very difficult to understand its operation. In this letter, we use the mathematical equivalence between ANNs and a specific fuzzy rule base to extract the knowledge embedded in the network. We demonstrate this using a benchmark problem: the recognition of digits produced by a light-emitting diode (LED) device. The method provides a symbolic and comprehensible description of the knowledge learned by the network during its training.
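
    For context on the benchmark: the LED display problem uses the standard seven-segment encoding of the digits 0-9, with segments corrupted by noise. The Python sketch below is illustrative only; the segment ordering, noise model, and nearest-pattern decoder are assumptions, not the letter's setup.

        # Seven-segment LED digit benchmark: data generation and a naive
        # nearest-pattern decoder. Segment order (an assumption): top,
        # top-left, top-right, middle, bottom-left, bottom-right, bottom.
        import numpy as np

        LED = {
            0: [1, 1, 1, 0, 1, 1, 1], 1: [0, 0, 1, 0, 0, 1, 0],
            2: [1, 0, 1, 1, 1, 0, 1], 3: [1, 0, 1, 1, 0, 1, 1],
            4: [0, 1, 1, 1, 0, 1, 0], 5: [1, 1, 0, 1, 0, 1, 1],
            6: [1, 1, 0, 1, 1, 1, 1], 7: [1, 0, 1, 0, 0, 1, 0],
            8: [1, 1, 1, 1, 1, 1, 1], 9: [1, 1, 1, 1, 0, 1, 1],
        }
        patterns = np.array([LED[d] for d in range(10)])

        def noisy_sample(digit, flip_prob, rng):
            """Flip each segment independently with probability flip_prob."""
            x = np.array(LED[digit])
            return np.where(rng.random(7) < flip_prob, 1 - x, x)

        def decode(x):
            """Return the digit whose pattern is nearest in Hamming distance."""
            return int(np.argmin(np.abs(patterns - x).sum(axis=1)))

        rng = np.random.default_rng(0)
        hits = sum(decode(noisy_sample(d, 0.1, rng)) == d
                   for d in range(10) for _ in range(100))
        print(f"accuracy on noisy digits: {hits / 1000:.2%}")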

  • Further Results on Delay-Dependent Stability Criteria of Neural Networks With Time-Varying Delays

    In this brief paper, an augmented Lyapunov functional, which takes an integral term of the state vector into account, is introduced. Owing to this functional, an improved delay-dependent asymptotic stability criterion for delayed neural networks (NNs) is derived in terms of linear matrix inequalities (LMIs). It is shown that the obtained criterion can provide a less conservative result than some existing ones. When linear fractional uncertainties appear in NNs, a new robust delay-dependent stability condition is also given. Numerical examples are given to demonstrate the applicability of the proposed approach.
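
    As a rough sketch of the setting (the paper's exact functional is not reproduced here), results of this kind typically consider delayed dynamics and an augmented Lyapunov-Krasovskii functional in which the integral of the state enters the quadratic block:

        \dot{x}(t) = -C x(t) + A f(x(t)) + B f(x(t - \tau(t))),
        \qquad 0 \le \tau(t) \le \bar{\tau},

        V(x_t) = \begin{bmatrix} x(t) \\ \int_{t-\tau(t)}^{t} x(s)\,ds \end{bmatrix}^{\top}
                 P \begin{bmatrix} x(t) \\ \int_{t-\tau(t)}^{t} x(s)\,ds \end{bmatrix}
                 + \int_{t-\bar{\tau}}^{t} x(s)^{\top} Q\, x(s)\, ds + \cdots

    Requiring \dot{V}(x_t) < 0 along trajectories and applying Schur complements turns the condition into an LMI in P, Q, ..., which semidefinite solvers can check.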

  • Wavelet Basis Function Neural Networks for Sequential Learning

    In this letter, we develop wavelet basis function neural networks (WBFNNs), which are analogous to radial basis function neural networks (RBFNNs) and to wavelet neural networks (WNNs). In WBFNNs, both the scaling function and the wavelet function of a multiresolution approximation (MRA) are adopted as the basis for approximating functions. A sequential learning algorithm for WBFNNs is presented and compared to the sequential learning algorithm of RBFNNs. Experimental results show that WBFNNs have better generalization properties and require shorter training times than RBFNNs.
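
    A minimal Python sketch of the basis side of this construction, assuming the Haar MRA for concreteness and fitting the coefficients by batch least squares rather than by the letter's sequential algorithm:

        # Approximate a 1-D function on [0, 1) with the scaling function and
        # wavelets of the Haar MRA. Batch least squares stands in for the
        # sequential learning algorithm of the letter.
        import numpy as np

        def phi(x):            # Haar scaling function: indicator of [0, 1)
            return ((0 <= x) & (x < 1)).astype(float)

        def psi(x):            # Haar wavelet: +1 on [0, 0.5), -1 on [0.5, 1)
            return phi(2 * x) - phi(2 * x - 1)

        def design_matrix(x, levels):
            """Coarse-scale scaling function plus wavelets at levels 0..levels-1."""
            cols = [phi(x)]
            for j in range(levels):
                cols += [2 ** (j / 2) * psi(2 ** j * x - k) for k in range(2 ** j)]
            return np.column_stack(cols)

        x = np.linspace(0, 1, 400, endpoint=False)
        y = np.sin(2 * np.pi * x) + 0.3 * np.sign(x - 0.5)   # target with a jump
        H = design_matrix(x, levels=5)
        w, *_ = np.linalg.lstsq(H, y, rcond=None)            # basis coefficients
        print("max approximation error:", np.max(np.abs(H @ w - y)))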

  • Hypersausage Neural Networks and its Application in Face Recognition

    In this paper, we first describe the nature of 'hypersausages' and study the structure and training of the network; we then examine its behavior through experiments on the ORL face database and finally verify its advantages compared with other methods.
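
    In the biomimetic pattern recognition literature, a "hypersausage" is usually read as the set of points within some radius r of the line segment joining two training samples. Assuming that reading (the paper's exact construction and training procedure are not reproduced), a minimal membership test in Python:

        # Membership test for a "hypersausage" unit: is point x within
        # radius r of the segment [a, b]? (Assumed reading of the term.)
        import numpy as np

        def dist_to_segment(x, a, b):
            """Euclidean distance from point x to the segment [a, b]."""
            ab, ax = b - a, x - a
            t = np.clip(ab @ ax / (ab @ ab), 0.0, 1.0)   # clamped projection
            return np.linalg.norm(x - (a + t * ab))

        def in_hypersausage(x, a, b, r):
            return dist_to_segment(x, a, b) <= r

        a, b = np.array([0.0, 0.0]), np.array([1.0, 0.0])
        print(in_hypersausage(np.array([0.5, 0.2]), a, b, r=0.3))  # True
        print(in_hypersausage(np.array([1.5, 0.0]), a, b, r=0.3))  # False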

  • Simulating Dynamic Plastic Continuous Neural Networks by Finite Elements

    We introduce the dynamic plastic continuous neural network (DPCNN), which is composed of neurons distributed in a nonlinear plastic medium, with the wire-like connections of conventional neural networks replaced by the continuous medium. We use the finite element method to model the dynamic phenomenon of information processing within DPCNNs. During training, instead of weights, the properties of the continuous material at its different locations, and some properties of the neurons, are modified. Inputs and outputs can be vectors and/or continuous functions over lines and/or areas. Delay and feedback, from neurons to themselves and from the outputs, occur in DPCNNs. We model a simple form of the DPCNN in which the medium is a rectangular plate of bilinear material and the neurons continuously fire a signal that is a function of the horizontal displacement.
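
    A rough Python sketch of the idea of training material properties instead of connection weights, reduced to a 1-D bar with linear finite elements; the paper's plate geometry, bilinear material, delays, and dynamics are not reproduced:

        # 1-D static finite element model: per-element stiffness k[e] plays
        # the role of the trainable parameters. Left end is fixed.
        import numpy as np

        def assemble_stiffness(k):
            """Global stiffness matrix for len(k) two-node bar elements."""
            n = len(k)
            K = np.zeros((n + 1, n + 1))
            for e, ke in enumerate(k):
                K[e:e + 2, e:e + 2] += ke * np.array([[1.0, -1.0], [-1.0, 1.0]])
            return K

        def response(k, load):
            """Nodal displacements under a load vector, with u[0] = 0."""
            K = assemble_stiffness(k)[1:, 1:]     # eliminate the fixed node
            return np.linalg.solve(K, load)

        k = np.ones(4)                            # "weights": element stiffnesses
        load = np.array([0.0, 0.0, 0.0, 1.0])     # unit load at the free tip
        print(response(k, load))                  # displacements [1, 2, 3, 4]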

  • Non-Divergence of Stochastic Discrete Time Algorithms for PCA Neural Networks

    Learning algorithms play an important role in the practical application of neural networks based on principal component analysis, often determining the success, or otherwise, of these applications. These algorithms must not diverge, but it is very difficult to study their convergence properties directly, because they are stochastic discrete time (SDT) algorithms. This brief analyzes the original SDT algorithms directly and derives invariant sets that guarantee the nondivergence of these algorithms in a stochastic environment, provided the learning parameters are selected properly. Our theoretical results are verified by a series of simulation examples.
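
    As a generic stand-in for such an SDT PCA algorithm (the brief's exact algorithms may differ), Oja's rule with a small fixed learning rate keeps the weight vector bounded while it aligns with the principal eigenvector of the data covariance:

        # Oja's rule: a stochastic discrete-time PCA learning algorithm.
        # A small learning rate keeps ||w|| in a bounded region (non-divergence).
        import numpy as np

        rng = np.random.default_rng(0)
        C = np.array([[3.0, 1.0], [1.0, 1.0]])    # data covariance
        L = np.linalg.cholesky(C)

        w = rng.standard_normal(2)
        eta = 0.01                                # learning parameter
        for _ in range(20000):
            x = L @ rng.standard_normal(2)        # sample with covariance C
            y = w @ x
            w += eta * y * (x - y * w)            # Oja's update

        top = np.linalg.eigh(C)[1][:, -1]         # principal eigenvector
        print("||w|| =", np.linalg.norm(w))       # stays near 1
        print("alignment =", abs(w @ top) / np.linalg.norm(w))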

  • An Improved Algebraic Criterion for Global Exponential Stability of Recurrent Neural Networks With Time-Varying Delays

    This brief paper presents an M-matrix-based algebraic criterion for the global exponential stability of a class of recurrent neural networks with decreasing time-varying delays. The criterion improves some previous M-matrix-based criteria and is easy to verify from the connection weights of the network. In addition, the rate of exponential convergence can be estimated via a simple computation based on the criterion.
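
    A sketch of an M-matrix-type test of the general form such criteria take (the paper's precise combination of weights and delays may differ): check whether M = D - |W| diag(l) is a nonsingular M-matrix, where D holds the positive self-feedback rates, W the connection weights, and l the Lipschitz constants of the activations.

        # Verify an M-matrix condition from the connection weights.
        import numpy as np

        def is_nonsingular_m_matrix(M, tol=1e-12):
            """Z-matrix whose eigenvalues all have positive real part."""
            off = M - np.diag(np.diag(M))
            if np.any(off > tol):                 # off-diagonals must be <= 0
                return False
            return bool(np.all(np.linalg.eigvals(M).real > tol))

        D = np.diag([2.0, 3.0])                   # self-feedback rates
        W = np.array([[0.5, -0.4], [0.3, 0.6]])   # connection weights
        l = np.array([1.0, 1.0])                  # Lipschitz constants (tanh)
        M = D - np.abs(W) @ np.diag(l)
        print(is_nonsingular_m_matrix(M))         # True => stability certified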

  • Global Asymptotical Stability of Recurrent Neural Networks With Multiple Discrete Delays and Distributed Delays

    By employing a Lyapunov-Krasovskii functional and the linear matrix inequality (LMI) approach, the problem of global asymptotic stability is studied for recurrent neural networks with both discrete time-varying delays and distributed time-varying delays. Sufficient conditions are given for checking the global asymptotic stability of recurrent neural networks with such mixed time-varying delays. The proposed LMI result is computationally efficient, as it can be solved numerically using standard commercial software. Two examples are given to show the usefulness of the results.
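
    To illustrate "solved numerically using standard software": a minimal LMI feasibility check with the cvxpy package, simplified here to the delay-free Lyapunov inequality (the paper's LMI carries additional delay-dependent blocks):

        # Feasibility of A^T P + P A < 0, P > 0 certifies stability of
        # dx/dt = A x; a simplified stand-in for the paper's delayed LMI.
        import cvxpy as cp
        import numpy as np

        A = np.array([[-1.0, 0.5], [0.0, -2.0]])
        n = A.shape[0]
        P = cp.Variable((n, n), symmetric=True)
        eps = 1e-6
        constraints = [P >> eps * np.eye(n),
                       A.T @ P + P @ A << -eps * np.eye(n)]
        problem = cp.Problem(cp.Minimize(0), constraints)
        problem.solve()
        print(problem.status)                     # "optimal" => LMI feasible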

  • Continuous Attractors of Lotka–Volterra Recurrent Neural Networks With Infinite Neurons

    Continuous attractors of Lotka-Volterra recurrent neural networks (LV RNNs) with infinite neurons are studied in this brief. A continuous attractor is a connected set of equilibria, and it has been recognized as a suitable model for describing the encoding of continuous stimuli in neural networks. The existence of continuous attractors depends on many factors, such as the connectivity and the external inputs of the network. A continuous attractor can be stable or unstable. It is shown in this brief that an LV RNN can possess multiple continuous attractors if the synaptic connections and the external inputs are Gaussian-like in shape. Moreover, both stable and unstable continuous attractors can coexist in a network. Explicit expressions of the continuous attractors are calculated. Simulations are employed to illustrate the theory.
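
    Assuming the standard Lotka-Volterra RNN form used in this line of work (the brief's precise model is not reproduced here), the finite-neuron dynamics are

        \dot{x}_i(t) = x_i(t) \Big( -x_i(t) + h_i + \sum_{j} w_{ij}\, x_j(t) \Big),
        \qquad x_i(0) > 0,

    and with infinitely many neurons indexed by a continuous label \theta the sum becomes an integral \int w(\theta, \theta')\, x(t, \theta')\, d\theta'. A continuous attractor is then a connected set of equilibria of these dynamics.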

  • Global Asymptotic Stability of Delayed Cellular Neural Networks

    A new criterion for the global asymptotic stability of the equilibrium point of cellular neural networks with multiple time delays is presented. The obtained result possesses the structure of a linear matrix inequality and can be solved efficiently using the recently developed interior-point algorithm. A numerical example is used to show the effectiveness of the obtained result.
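
    For reference, a cellular neural network with multiple time delays is commonly written in the following form (a standard model, assumed here; the paper's notation may differ):

        \dot{x}(t) = -x(t) + A f(x(t)) + \sum_{k=1}^{K} A_k f(x(t - \tau_k)) + u,
        \qquad f(s) = \tfrac{1}{2} \big( |s + 1| - |s - 1| \big),

    and a criterion with the structure of a linear matrix inequality asks for a feasible P > 0 making the associated quadratic Lyapunov condition hold, which interior-point solvers check efficiently.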



Standards related to Neural Networks


No standards are currently tagged "Neural Networks"