Conferences related to Multilayer perceptrons


2020 IEEE 29th International Symposium on Industrial Electronics (ISIE)

ISIE focuses on advancements in knowledge, new methods, and technologies relevant to industrial electronics, along with their applications and future developments.


ICC 2020 - 2020 IEEE International Conference on Communications

All topics relating to existing and emerging communications networking technologies.


ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

The ICASSP meeting is the world's largest and most comprehensive technical conference focused on signal processing and its applications. The conference will feature world-class speakers, tutorials, exhibits, and over 50 lecture and poster sessions.


IGARSS 2020 - 2020 IEEE International Geoscience and Remote Sensing Symposium

All fields of satellite, airborne and ground remote sensing.


IECON 2020 - 46th Annual Conference of the IEEE Industrial Electronics Society

IECON focuses on industrial and manufacturing theory and applications of electronics, controls, communications, instrumentation, and computational intelligence.



Periodicals related to Multilayer perceptrons


Aerospace and Electronic Systems Magazine, IEEE

The IEEE Aerospace and Electronic Systems Magazine publishes articles concerned with the various aspects of systems for space, air, ocean, or ground environments.


Antennas and Propagation, IEEE Transactions on

Experimental and theoretical advances in antennas including design and development, and in the propagation of electromagnetic waves including scattering, diffraction and interaction with continuous media; and applications pertinent to antennas and propagation, such as remote sensing, applied optics, and millimeter and submillimeter wave techniques.


Circuits and Systems I: Regular Papers, IEEE Transactions on

Part I will now contain regular papers focusing on all matters related to fundamental theory, applications, and analog and digital signal processing. Part II will report on the latest significant results across all of these topic areas.


Communications Magazine, IEEE

IEEE Communications Magazine was the third most-cited journal in telecommunications and the eighteenth most-cited journal in electrical and electronics engineering in 2004, according to the annual Journal Citation Report (2004 edition) published by the Institute for Scientific Information. Read more at http://www.ieee.org/products/citations.html. This magazine covers all areas of communications such as lightwave telecommunications, high-speed data communications, personal communications ...


Geoscience and Remote Sensing, IEEE Transactions on

Theory, concepts, and techniques of science and engineering as applied to sensing the earth, oceans, atmosphere, and space; and the processing, interpretation, and dissemination of this information.




Xplore Articles related to Multilayer perceptrons


Recurrent Neural Networks

Fundamentals of Computational Intelligence: Neural Networks, Fuzzy Systems, and Evolutionary Computation

This chapter considers a class of neural networks that have a recurrent structure, including the Grossberg network, the Hopfield network, and cellular neural networks. The Hopfield network is a form of recurrent artificial neural network invented by John Hopfield in 1982. It consists of a set of neurons and a corresponding set of unit time delays, forming a multiple-loop feedback system. There ...
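
As a rough illustration of the feedback structure described above, the following Python sketch implements a discrete Hopfield network with Hebbian outer-product weights and a synchronous update loop. The weight rule, the pattern size, and the synchronous (rather than asynchronous) updating are illustrative assumptions, not details taken from the chapter.

import numpy as np

def hopfield_train(patterns):
    # Hebbian outer-product rule; zeroing the diagonal removes self-feedback.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:               # each stored pattern is a +/-1 vector
        W += np.outer(x, x)
    np.fill_diagonal(W, 0.0)
    return W / patterns.shape[0]

def hopfield_recall(W, state, steps=20):
    # Simplified synchronous recall: the delayed outputs are fed back as inputs.
    s = state.copy()
    for _ in range(steps):
        s_next = np.sign(W @ s)
        s_next[s_next == 0] = 1      # break ties toward +1
        if np.array_equal(s_next, s):
            break                    # reached a fixed point of the feedback loop
        s = s_next
    return s

# Store one pattern, then recover it from a corrupted copy.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hopfield_train(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]
print(hopfield_recall(W, noisy))     # prints the stored pattern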


Radial-Basis Function Networks

Fundamentals of Computational Intelligence: Neural Networks, Fuzzy Systems, and Evolutionary Computation

This chapter focuses on the radial-basis function (RBF) network as an alternative to multilayer perceptrons. It will be interesting to find that in a multilayer perceptron, the function approximation is defined by a nested set of weighted summations, while in an RBF network, the approximation is defined by a single weighted sum. The chapter focuses on the use of a ...
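
The contrast described above can be made concrete with a small Python sketch: the multilayer perceptron output is a nested composition of weighted sums, while the RBF output is a single weighted sum of Gaussian basis responses. The layer widths, the Gaussian centers and width, and the least-squares fit of the RBF output weights are assumptions made for illustration; the MLP weights are left random because only the structure is being shown.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)[:, None]         # inputs, shape (200, 1)
y = np.sin(x)                                # target function to approximate

# Multilayer perceptron: a nested set of weighted summations.
W1, b1 = rng.normal(size=(1, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)
mlp_out = np.tanh(x @ W1 + b1) @ W2 + b2     # inner sums feed an outer sum

# RBF network: a single weighted sum of Gaussian basis responses.
centers = np.linspace(-3, 3, 16)[None, :]    # fixed centers (an assumption)
width = 0.5
Phi = np.exp(-((x - centers) ** 2) / (2 * width ** 2))
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # output weights by least squares
rbf_out = Phi @ w                            # one weighted sum over the bases

print("RBF approximation RMSE:", float(np.sqrt(np.mean((rbf_out - y) ** 2))))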


Corrections to "a new error function at hidden layers for fast training of multilayer perceptrons"

IEEE Transactions on Neural Networks, 1999


Multilayer Neural Networks and Backpropagation

Fundamentals of Computational Intelligence: Neural Networks, Fuzzy Systems, and Evolutionary Computation

A computationally effective method for training multilayer perceptrons is the backpropagation algorithm, which is regarded as a landmark in the development of neural networks. This chapter presents two different learning methods, batch learning and online learning, based on how the supervised learning of the multilayer perceptron is actually performed. The essence of backpropagation learning is to encode ...
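
A minimal Python sketch of backpropagation for a one-hidden-layer perceptron follows, showing the online update mode (one example at a time) with a note on how batch learning would differ. The network size, squared-error loss, learning rate, and toy data are assumptions for illustration and are not taken from the chapter.

import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(64, 2))
y = (X[:, :1] * X[:, 1:] > 0).astype(float)   # toy XOR-like target, shape (64, 1)

W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)                           # hidden layer
    return h, 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output

def backprop_step(x, t, lr=0.1):
    # One gradient step on squared error; x can be a single example (online
    # learning) or the whole training set (batch learning).
    global W1, b1, W2, b2
    h, out = forward(x)
    d_out = (out - t) * out * (1.0 - out)      # error signal at the output layer
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)      # error propagated to the hidden layer
    n = x.shape[0]
    W2 -= lr * h.T @ d_out / n
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * x.T @ d_h / n
    b1 -= lr * d_h.mean(axis=0)

# Online learning: the weights are updated after every training example.
for epoch in range(200):
    for i in rng.permutation(len(X)):
        backprop_step(X[i:i + 1], y[i:i + 1])
# Batch learning would instead call backprop_step(X, y) once per epoch.

_, out = forward(X)
print("training accuracy:", float(((out > 0.5) == (y > 0.5)).mean()))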


Neural networks applied for induction motor speed sensorless estimation

1995 Proceedings of the IEEE International Symposium on Industrial Electronics, 1995

In this paper, two schemes using neural networks (NN) as speed estimators in induction motor field oriented controlled (FOC) drives are compared. The first estimator consists of a two-layer neural network (often called a perceptron) while in the second a three-layer neural network is used for motor speed estimation. Both estimators are simulated with an indirect field oriented controller (IFOC) ...



Educational Resources on Multilayer perceptrons


IEEE-USA E-Books

  • Recurrent Neural Networks

    This chapter considers a class of neural networks that have a recurrent structure, including the Grossberg network, the Hopfield network, and cellular neural networks. The Hopfield network is a form of recurrent artificial neural network invented by John Hopfield in 1982. It consists of a set of neurons and a corresponding set of unit time delays, forming a multiple-loop feedback system. There are three components to the Grossberg network: Layer 1, Layer 2, and the adaptive weights. Layer 1 is a rough model of the operation of the retina, while Layer 2 represents the visual cortex. Cellular neural networks contain linear and nonlinear circuit elements, typically linear capacitors, linear resistors, linear and nonlinear controlled sources, and independent sources. The chapter also describes the mathematical model of a nonlinear dynamic system and discusses some of the important issues involved in neurodynamics.

  • Radial-Basis Function Networks

    This chapter focuses on the radial-basis function (RBF) network as an alternative to multilayer perceptrons. It will be interesting to find that in a multilayer perceptron, the function approximation is defined by a nested set of weighted summations, while in an RBF network, the approximation is defined by a single weighted sum. The chapter focuses on the use of a Gaussian function as the radial-basis function. The reason behind the choice of the Gaussian function in building RBF networks is that it has many desirable properties, which will become evident as the discussion progresses. It is important to point out that RBF networks and multilayer perceptrons can be trained in ways other than those presented. For multilayer perceptrons, the backpropagation algorithm is simple to compute locally, and it performs stochastic gradient descent in weight space when implemented in an online learning mode.

  • Corrections to "a new error function at hidden layers for fast training of multilayer perceptrons"

  • Multilayer Neural Networks and Backpropagation

    A computationally effective method for training multilayer perceptrons is the backpropagation algorithm, which is regarded as a landmark in the development of neural networks. This chapter presents two different learning methods, batch learning and online learning, based on how the supervised learning of the multilayer perceptron is actually performed. The essence of backpropagation learning is to encode an input-output mapping into the synaptic weights and thresholds of a multilayer perceptron. It is hoped that the network becomes well trained so that it learns enough about the past to generalize to the future. The chapter concludes with cross-validation and generalization (a minimal cross-validation sketch appears after this list). Cross-validation is particularly appealing when a large neural network has to be designed with good generalization as the goal. Generalization assumes that the test data are drawn from the same population used to generate the training data.

  • Neural networks applied for induction motor speed sensorless estimation

    In this paper, two schemes using neural networks (NN) as speed estimators in induction motor field oriented controlled (FOC) drives are compared. The first estimator consists of a two-layer neural network (often called a perceptron) while in the second a three-layer neural network is used for motor speed estimation. Both estimators are simulated with an indirect field oriented controller (IFOC) for induction motor drives.

  • Channel equalization for ISI channels using Wilcoxon generalized RBF

    This paper presents a solution to the channel equalization problem. Adaptive equalization techniques have been used extensively for channel equalization. Subsequently, adaptive signal processing techniques based on artificial neural networks, including the multilayer perceptron network, radial basis function network, recurrent network, and fuzzy and adaptive neuro-fuzzy systems, have been applied to this class of problems. In this paper we propose an RBF equalizer trained with a new neural network learning paradigm called Wilcoxon learning, termed the Wilcoxon generalized radial basis function network (WGRBFN). This rank-based statistical approach, as used in linear regression, is usually robust against outliers. The performance of the WGRBFN has been evaluated through extensive computer simulations, and the results are compared with those of other equalizers.

  • Classification of sperm cells according to their chromosomic content using a neural network trained with a genetic algorithm

    A priori determination of the sex of a human individual before gestation is a desirable goal in some cases. To achieve this, it is necessary to separate sperm cells containing either X or Y chromosomes. As is well known, male sex depends on the presence of chromosome Y. Once this separation is achieved in principle, we need to determine, with a high degree of accuracy, whether the sperm cells of interest contain the desired X or Y chromosome. If we can obtain certain simple measurements of the sperm cells under consideration, we can control the fertilization process reliably. In this paper we report a method that allows non-invasive verification of the characteristics of the separated sperm. We determined a set of easily measurable characteristics. From a sample drawn from previously cropped sperm we trained a neural network with a genetic algorithm. The trained network was able to perform a posteriori classification with an error much smaller than 1%. This accuracy is better than the figures reported by centers of assisted fertilization.

  • Self-regulated multilayer perceptron neural network for breast cancer classification

    The self-regulated multilayer perceptron neural network for breast cancer classification (ML-NN) is designed for breast cancer classification. Conventionally, medical doctors must manually delineate the suspicious breast cancer region. Many studies have suggested that manual segmentation is not only time consuming but also machine and operator dependent. ML-NN applies a multilayer perceptron neural network to breast cancer classification to aid medical experts in diagnosis. The trained ML-NN can categorise the input medical images into benign, malignant, and normal cases. With the present algorithm, breast medical images can be classified into cancer patients and normal patients without prior knowledge of the presence of a cancer lesion. The method aims to assist medical experts in breast cancer diagnosis through a supervised multilayer perceptron neural network. ML-NN classifies the input medical images as benign, malignant, or normal with accuracy, specificity, sensitivity, and AUC of 90.59%, 90.67%, 90.53%, and 0.906 ± 0.0227, respectively (a classification-metrics sketch appears after this list).

  • Quantizability and learning complexity in multilayer neural networks

    The relationship between quantizability and learning complexity in multilayer neural networks is examined. In a special neural network architecture that calculates node activations according to the certainty factor (CF) model of expert systems, the analysis based upon quantizability leads to lower and also better estimates for generalization dimensionality and sample complexity than those suggested by the multilayer perceptron model. This analysis is further supported by empirical simulation results.

  • Implementation of Universal Neural Network Approximator on a ULP Microcontroller for Wavelet Synthesis in Electroencephalography

    The paper describes the implementation of a universal neural network approximator on the MSP430G2553 microcontroller, which has ultra-low power consumption. A multilayer perceptron is used as the approximating artificial neural network. Approximation is one of the stages of building wavelet neural network models. Approximating electroencephalogram fragments makes it possible to obtain wavelets adapted for their continuous wavelet transform. Implementing the neural network approximator on this microcontroller is one of the stages in creating a portable system for automated electroencephalogram analysis. Two ways of implementing the neural network approximator are presented in the paper. The first is based on experiments with the approximator without training; the second involves experiments in which the approximator interacts with a training algorithm. Studies of the approximation rate as a function of the number of iterations and the microcontroller core clock frequency have been conducted.
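
The cross-validation idea mentioned in the backpropagation chapter summary above can be sketched as follows. The use of scikit-learn's MLPClassifier, the synthetic dataset, and the 5-fold split are illustrative assumptions, not details taken from the chapter.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Synthetic data standing in for a real training set (sizes are arbitrary).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# A small multilayer perceptron; the hidden-layer width is an arbitrary choice.
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)

# 5-fold cross-validation: each fold is held out once, and the held-out score
# estimates how well the trained network generalizes to unseen data.
scores = cross_val_score(mlp, X, y, cv=5)
print("fold accuracies:", np.round(scores, 3))
print("estimated generalization accuracy:", round(float(scores.mean()), 3))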
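
The ML-NN summary above quotes accuracy, sensitivity, specificity, and AUC. The sketch below shows how those metrics are computed for a generic multilayer perceptron classifier; it is not the ML-NN algorithm. It uses scikit-learn's MLPClassifier on the two-class scikit-learn breast cancer dataset rather than three-class medical images, and the network size and other hyperparameters are arbitrary assumptions.

from sklearn.datasets import load_breast_cancer
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Two-class tabular stand-in for the paper's three-class image data.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_tr)
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
mlp.fit(scaler.transform(X_tr), y_tr)

pred = mlp.predict(scaler.transform(X_te))
prob = mlp.predict_proba(scaler.transform(X_te))[:, 1]
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()  # class 1 is "positive"

print("accuracy:   ", (tp + tn) / (tp + tn + fp + fn))
print("sensitivity:", tp / (tp + fn))    # true-positive rate
print("specificity:", tn / (tn + fp))    # true-negative rate
print("AUC:        ", roc_auc_score(y_te, prob))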



Standards related to Multilayer perceptrons


No standards are currently tagged "Multilayer perceptrons"