Biological neural networks
10,647 resources related to Biological neural networks
The conference program will consist of plenary lectures, symposia, workshops and invited sessions on the latest significant findings and developments in all the major fields of biomedical engineering. Submitted papers will be peer reviewed. Accepted high-quality papers will be presented in oral and poster sessions, will appear in the Conference Proceedings and will be indexed in PubMed/MEDLINE.
ISIE focuses on advancements in knowledge, new methods, and technologies relevant to industrial electronics, along with their applications and future developments.
The International Conference on Image Processing (ICIP), sponsored by the IEEE Signal Processing Society, is the premier forum for the presentation of technological advances and research results in the fields of theoretical, experimental, and applied image and video processing. ICIP 2020, the 27th in the series that has been held annually since 1994, brings together leading engineers and scientists in image and video processing from around the world.
The International Conference on Robotics and Automation (ICRA) is the IEEE Robotics and Automation Society’s biggest conference and one of the leading international forums for robotics researchers to present their work.
IECON focuses on industrial and manufacturing theory and applications of electronics, controls, communications, instrumentation and computational intelligence.
The Transactions on Biomedical Circuits and Systems addresses areas at the crossroads of Circuits and Systems and Life Sciences. The main emphasis is on microelectronic issues in a wide range of applications found in life sciences, physical sciences and engineering. The primary goal of the journal is to bridge the unique scientific and technical activities of the Circuits and Systems ...
The IEEE Reviews in Biomedical Engineering will review the state-of-the-art and trends in the emerging field of biomedical engineering. This includes scholarly works, ranging from historic and modern development in biomedical engineering to the life sciences and medicine enabled by technologies covered by the various IEEE societies.
Broad coverage of concepts and methods of the physical and engineering sciences applied in biology and medicine, ranging from formalized mathematical theory through experimental science and technological development to practical clinical applications.
Video A/D and D/A, display technology, image analysis and processing, video signal characterization and representation, video compression techniques and signal processing, multidimensional filters and transforms, analog video signal processing, neural networks for video applications, nonlinear video signal processing, video storage and retrieval, computer vision, packet video, high-speed real-time circuits, VLSI architecture and implementation for video technology, multiprocessor systems--hardware and software-- ...
Part I will now contain regular papers focusing on all matters related to fundamental theory, applications, analog and digital signal processing. Part II will report on the latest significant results across all of these topic areas.
The Deep Learning Revolution
Fundamentals of Computational Intelligence: Neural Networks, Fuzzy Systems, and Evolutionary Computation
International Multi Topic Conference, 2002. Abstracts. INMIC 2002., 2002
2008 Congress on Image and Signal Processing, 2008
2004 International Conference on Intelligent Mechatronics and Automation, 2004. Proceedings., 2004
20 Years of Neural Networks: A Promising Start, A Brilliant Future - Video contents
Artificial Neural Networks, Intro
ICASSP 2010 - Advances in Neural Engineering
Spike Timing, Rhythms, and the Effective Use of Neural Hardware
Life Sciences Grand Challenge Conference - Roger Kamm
Emergent Neural Network in reinforcement learning
Deep Learning and the Representation of Natural Data
Behind Artificial Neural Networks
Complex-Valued Neural Networks
Lizhong Zheng's Globecom 2019 Keynote
Large-scale Neural Systems for Vision and Cognition
Complex Valued Neural Networks: Theory and Applications
A Comparator Design Targeted Towards Neural Net - David Mountain - ICRC San Mateo, 2019
Learning with Memristive Neural Networks: Neuromorphic Computing - Joshua Yang at INC 2019
High Throughput Neural Network based Embedded Streaming Multicore Processors - Tarek Taha: 2016 International Conference on Rebooting Computing
Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware - Emre Neftci: 2016 International Conference on Rebooting Computing
RNSnet: In-Memory Neural Network Acceleration Using Residue Number System - Sahand Salamat - ICRC 2018
Co-Design of Algorithms & Hardware for DNNs - Vivienne Sze - LPIRC 2018
Spiking Network Algorithms for Scientific Computing - William Severa: 2016 International Conference on Rebooting Computing
This chapter considers a class of neural networks that have a recurrent structure, including the Grossberg network, the Hopfield network, and cellular neural networks. The Hopfield network is a form of recurrent artificial neural network invented by John Hopfield in 1982. It consists of a set of neurons and a corresponding set of unit time delays, forming a multiple-loop feedback system. There are three components to the Grossberg network: Layer 1, Layer 2, and the adaptive weights. Layer 1 is a rough model of the operation of the retina, while Layer 2 represents the visual cortex. Cellular neural networks contain linear and nonlinear circuit elements, which typically are linear capacitors, linear resistors, linear and nonlinear controlled sources, and independent sources. The chapter also describes the mathematical model of a nonlinear dynamic system, and discusses some of the important issues involved in neurodynamics.
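The multiple-loop feedback recall described above can be sketched in a few lines. The pattern and network size below are hypothetical, chosen only to illustrate Hebbian storage and asynchronous sign-threshold updates:

```python
# Minimal discrete Hopfield network sketch (illustrative only, not from the
# chapter): Hebbian storage of bipolar patterns, asynchronous recall.

def train(patterns):
    """Hebbian outer-product rule: w[i][j] sums p[i]*p[j], zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, sweeps=5):
    """Asynchronous updates: each unit takes the sign of its net input."""
    state = list(state)
    n = len(state)
    for _ in range(sweeps):
        for i in range(n):
            net = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if net >= 0 else -1
    return state

stored = [1, 1, -1, -1, 1, -1]
w = train([stored])
noisy = [1, -1, -1, -1, 1, -1]      # one bit flipped
print(recall(w, noisy))             # recovers the stored pattern
```

With a single stored pattern, each update drives every unit back toward the stored sign, so a one-bit corruption is repaired within one sweep.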
Feedforward neural networks (FNNs) are most heavily used to identify the relation between a given set of input and desired output patterns. By the universal approximation theorem, a single-hidden-layer FNN is sufficient for the outputs to approximate the corresponding desired outputs arbitrarily closely, so we consider a single-hidden-layer FNN. In practice, we set up an error function to measure the performance of the FNN. As the error function is nonlinear, we define an iterative process, a learning algorithm, to obtain the optimal choice of the connection weights, and thus set up a numerical optimization problem. In this paper, we consider a new error function defined on the hidden layer. We propose a new learning algorithm, based on the least squares method, that converges rapidly. We compare our method with the classic learning algorithms and discuss the convergence of these algorithms.
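The abstract does not give the paper's hidden-layer error function, so the sketch below is only a simplified stand-in for the idea of closed-form least-squares training of a single-hidden-layer FNN: the hidden weights are fixed at random and the linear output weights are solved via the normal equations. All weight ranges and the sine target are illustrative assumptions:

```python
# Least-squares fit of the output layer of a single-hidden-layer FNN
# (simplified illustration; the paper's actual algorithm differs).
import math
import random

random.seed(0)

def hidden(x, win, bias):
    """tanh hidden-layer activations for scalar input x, plus a bias unit."""
    return [math.tanh(w * x + b) for w, b in zip(win, bias)] + [1.0]

def solve(A, b):
    """Gaussian elimination with partial pivoting for the system A a = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    a = [0.0] * n
    for r in range(n - 1, -1, -1):
        a[r] = (M[r][n] - sum(M[r][k] * a[k] for k in range(r + 1, n))) / M[r][r]
    return a

# Target: approximate sin(x) on [0, pi] with 10 random tanh hidden units.
win = [random.uniform(-2, 2) for _ in range(10)]
bias = [random.uniform(-2, 2) for _ in range(10)]
xs = [i * math.pi / 30 for i in range(31)]
H = [hidden(x, win, bias) for x in xs]
ys = [math.sin(x) for x in xs]

# Normal equations for the output weights a: (H^T H) a = H^T y.
m = len(H[0])
HtH = [[sum(row[i] * row[j] for row in H) for j in range(m)] for i in range(m)]
Hty = [sum(H[r][i] * ys[r] for r in range(len(H))) for i in range(m)]
a = solve(HtH, Hty)

err = max(abs(sum(h * w for h, w in zip(hidden(x, win, bias), a)) - math.sin(x))
          for x in xs)
print("max training error:", err)
```

Because the error is linear in the output weights once the hidden layer is fixed, no iterative descent is needed for this final layer; that is the sense in which a least-squares step can converge much faster than backpropagation.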
In this paper, we study the technique of fusion among fuzzy logic, artificial neural networks and chaos. We first review past approaches to the control of an associative-memory chaotic neural network with a fuzzy controller. Then we propose a new kind of fusion approach, i.e. using a chaotic algorithm to train the fuzzy neural network. The future research prospects for the fusion of the three theories are also given.
This paper explains the network structures and methods of the single-layer perceptron and the multi-layer perceptron. It also analyses the linearly separable and linearly inseparable problems in logical operations performed by the single-layer perceptron. XOR is a linearly inseparable operation, which cannot be handled by a single-layer perceptron. Based on this analysis, several solutions to the XOR problem are proposed in the paper: the single-layer perceptron can be improved by a multi-layer perceptron, a functional perceptron or a quadratic function. These solutions are designed and analyzed.
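The multi-layer solution can be sketched directly (the weights below are hand-picked for illustration, not taken from the paper): a two-layer perceptron realizes XOR as (x OR y) AND NOT (x AND y), decomposing the inseparable problem into linearly separable pieces:

```python
# A two-layer perceptron computing XOR with fixed, hand-chosen weights.
# No single-layer perceptron can do this, since XOR is not linearly separable.

def step(v):
    """Hard threshold activation."""
    return 1 if v >= 0 else 0

def xor_mlp(x, y):
    h1 = step(x + y - 0.5)      # hidden unit: x OR y
    h2 = step(x + y - 1.5)      # hidden unit: x AND y
    return step(h1 - h2 - 0.5)  # output: OR but not AND

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", xor_mlp(x, y))   # 0 0->0, 0 1->1, 1 0->1, 1 1->0
```

Each hidden unit draws one separating line in the input plane; the output unit then combines the two half-planes, which is exactly the extra expressive power the hidden layer buys.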
A formal selection and pruning technique based on the concept of a local relative sensitivity index is proposed for feedforward neural networks. The mechanism of the backpropagation training algorithm is revisited and the theoretical foundation of the improved selection and pruning technique is presented. This technique is based on parallel pruning of weights which are relatively redundant in a subgroup of a feedforward neural network. Comparative studies with a similar technique proposed in the literature show that the improved technique provides better pruning results in terms of reduction of model residues, improvement of generalization capability and reduction of network complexity. The effectiveness of the improved technique is demonstrated in developing neural network models of a number of nonlinear systems, including the three-bit parity problem, the Van der Pol equation, a chemical process and two nonlinear discrete-time systems, using the backpropagation training algorithm with an adaptive learning rate.
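The abstract does not give the formula for the local relative sensitivity index, so the sketch below uses a deliberately simplified proxy: each weight is scored by its magnitude relative to the total magnitude of its subgroup (here, the fan-in of one hidden unit), and relatively redundant weights in the subgroup are pruned in parallel:

```python
# Simplified subgroup-relative pruning sketch (a stand-in for the paper's
# local relative sensitivity index, which is not given in the abstract).

def prune_subgroup(weights, threshold=0.05):
    """Zero out weights whose relative share of the subgroup's total
    magnitude falls below the threshold."""
    total = sum(abs(w) for w in weights)
    return [w if abs(w) / total >= threshold else 0.0 for w in weights]

fan_in = [0.8, -1.2, 0.01, 0.4, -0.02]   # hypothetical fan-in of one unit
print(prune_subgroup(fan_in))            # the two tiny weights become 0.0
```

Scoring weights within a subgroup rather than globally matches the paper's stated idea of pruning weights that are redundant relative to their own subgroup, although the real index would be derived from the network's sensitivities rather than raw magnitudes.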
Financial crisis is fatal to all companies, so it is very important to establish a financial crisis prediction system that lets each company resolve latent problems before they emerge. The artificial neural network (ANN) offers an approach to computation that differs from conventional analytic methods. In this paper, we construct a BP neural network model to predict the financial crisis of companies, with samples from the Shanghai and Shenzhen stock exchanges, and show that the model is appropriate.
An associative memory with fractal connections was studied. The performance of a fractal neural network and a randomly connected network was compared on random patterns and fractally localized patterns, respectively. Since fractal patterns have a very low activity level, the encoding scheme is sparse. We show numerically that the fractal neural network has a higher capability than the randomly connected neural network for fractal patterns, but not for random patterns.
In this paper, a novel four-dimensional (4D) autonomous continuous-time Hopfield-type neural network with two parameters is investigated. Computer simulations show that the 4D Hopfield neural network has rich and interesting dynamics: it can display an equilibrium, a periodic attractor, a chaotic attractor and a quasi-periodic attractor for different parameters. Moreover, when the system is chaotic, its positive Lyapunov exponent is much larger than those of the chaotic Hopfield neural networks already reported. The complex dynamical behaviors of the system are further investigated by means of the Lyapunov exponent spectrum, bifurcation analysis and phase portraits.