Conferences related to Neurons


2019 IEEE 17th International Conference on Industrial Informatics (INDIN)

Industrial information technologies


2019 IEEE 28th International Symposium on Industrial Electronics (ISIE)

The conference will provide a forum for discussions and presentations of advances in knowledge, new methods, and technologies relevant to industrial electronics, along with their applications and future developments.


2019 IEEE Industry Applications Society Annual Meeting

The Annual Meeting is a gathering of experts who work and conduct research in the industrial applications of electrical systems.


2019 IEEE International Conference on Systems, Man and Cybernetics (SMC)

2019 IEEE International Conference on Systems, Man, and Cybernetics (SMC2019) will be held in the south of Europe in Bari, one of the most beautiful and historic cities in Italy. The Bari region is nicknamed "Little California" for its pleasant weather, and its cuisine is among Italy's most traditional, based on local seafood and olive oil. SMC2019 is the flagship conference of the IEEE Systems, Man, and Cybernetics Society. It provides an international forum for researchers and practitioners to report up-to-the-minute innovations and developments, summarize the state of the art, and exchange ideas and advances in all aspects of systems science and engineering, human-machine systems, and cybernetics. Such advances are important for creating intelligent environments in which technologies interact with humans to provide an enriching experience and thereby improve quality of life.


2019 IEEE Photonics Conference (IPC)

The IEEE Photonics Conference, previously known as the IEEE LEOS Annual Meeting, offers technical presentations by the world’s leading scientists and engineers in the areas of lasers, optoelectronics, optical fiber networks, and associated lightwave technologies and applications. It also features compelling plenary talks on the industry’s most important issues, weekend events aimed at students and young photonics professionals, and a manufacturer’s exhibition.


More Conferences

Periodicals related to Neurons


Antennas and Propagation, IEEE Transactions on

Experimental and theoretical advances in antennas including design and development, and in the propagation of electromagnetic waves including scattering, diffraction and interaction with continuous media; and applications pertinent to antennas and propagation, such as remote sensing, applied optics, and millimeter and submillimeter wave techniques.


Biomedical Circuits and Systems, IEEE Transactions on

The Transactions on Biomedical Circuits and Systems addresses areas at the crossroads of Circuits and Systems and Life Sciences. The main emphasis is on microelectronic issues in a wide range of applications found in life sciences, physical sciences and engineering. The primary goal of the journal is to bridge the unique scientific and technical activities of the Circuits and Systems ...


Biomedical Engineering, IEEE Transactions on

Broad coverage of concepts and methods of the physical and engineering sciences applied in biology and medicine, ranging from formalized mathematical theory through experimental science and technological development to practical clinical applications.


Circuits and Systems I: Regular Papers, IEEE Transactions on

Part I will now contain regular papers focusing on all matters related to fundamental theory, applications, analog and digital signal processing. Part II will report on the latest significant results across all of these topic areas.


Circuits and Systems II: Express Briefs, IEEE Transactions on

Part I will now contain regular papers focusing on all matters related to fundamental theory, applications, analog and digital signal processing. Part II will report on the latest significant results across all of these topic areas.


More Periodicals

Most published Xplore authors for Neurons


Xplore Articles related to Neurons


RBF networks for density estimation

Lucia Sardo and Josef Kittler (Department of Electronic & Electrical Engineering, University of Surrey, Guildford, Surrey GU2 5XH, United Kingdom). 1996 8th European Signal Processing Conference (EUSIPCO 1996), 1996

A non-parametric probability density function (pdf) estimation technique is presented. The estimation consists of approximating the unknown pdf by a network of Gaussian Radial Basis Functions (GRBFs). Complexity analysis is introduced in order to select the optimal number of GRBFs. Results obtained on real data show the potential of this technique.
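
As a rough illustration of the general idea only (not the authors' method, which additionally selects the number of basis functions via a complexity criterion), a density estimate built from equally weighted Gaussian RBFs centred on a subsample of the data can be sketched in a few lines of Python; the subsample size and kernel width below are arbitrary illustrative choices:

    import numpy as np

    def grbf_density(x_query, centers, sigma):
        # p(x) ~ (1/M) * sum_m N(x; c_m, sigma^2 I): an equally weighted
        # network of Gaussian radial basis functions.
        d = centers.shape[1]
        norm = (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
        sq_dist = ((x_query[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-sq_dist / (2.0 * sigma ** 2)).mean(axis=1) / norm

    rng = np.random.default_rng(1)
    data = rng.normal(size=(500, 2))                           # samples from the unknown pdf
    centers = data[rng.choice(len(data), 30, replace=False)]   # 30 GRBF centres (arbitrary)
    print(grbf_density(np.zeros((1, 2)), centers, sigma=0.5))  # estimate at the origin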


Design of a solid-state neuron circuit for use in self-organizing systems

C. Coates (General Electric Research Laboratory, Schenectady, NY, USA) and E. Fisch. 1960 IEEE International Solid-State Circuits Conference. Digest of Technical Papers, 1960

The artificial neuron or information-processing cell was designed primarily as a component for experimental neuron-network studies. With minor additions, it could be used for perceptron-type experiments. Each cell has ten exciting and ten inhibiting inputs, and each input has a separate weight associated with it. The operation of the cell depends upon the difference between the weighted sums of the ...
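
A minimal software sketch of such a cell, assuming a simple threshold on the difference between the two weighted sums (the original is an analogue circuit, and its exact transfer characteristic is not reproduced here):

    import numpy as np

    def neuron_cell(excite, inhibit, w_excite, w_inhibit, threshold=0.5):
        # Fires (returns 1) when the excitatory weighted sum exceeds the
        # inhibitory weighted sum by more than the threshold.
        drive = np.dot(w_excite, excite) - np.dot(w_inhibit, inhibit)
        return int(drive > threshold)

    rng = np.random.default_rng(0)
    x_e = rng.integers(0, 2, 10)       # ten binary exciting inputs
    x_i = rng.integers(0, 2, 10)       # ten binary inhibiting inputs
    w_e = rng.uniform(0.0, 1.0, 10)    # a separate weight per input
    w_i = rng.uniform(0.0, 1.0, 10)
    print(neuron_cell(x_e, x_i, w_e, w_i))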


Considerations underlying the study of sensory elements

J. Lettvin (Massachusetts Institute of Technology, Cambridge, MA, USA). 1963 IEEE International Solid-State Circuits Conference. Digest of Technical Papers, 1963



Nonlinear formant-pitch prediction using Recurrent Neural Networks

Ekrem Varoglu and Kadri Hacioglu (Department of Electrical and Electronics Engineering, Eastern Mediterranean University, Gazi Magosa, Mersin-10, Turkey). 1996 8th European Signal Processing Conference (EUSIPCO 1996), 1996

In this study, a parallel structure is proposed for the nonlinear formant and pitch prediction of speech signals using Recurrent Neural Networks (RNN). The well-known Real-Time Recurrent Learning (RTRL) algorithm is used as the learning algorithm. Its performance is evaluated in terms of the mean-square error and sensitivity to pitch errors through extensive computer simulations and compared to ...
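
For orientation only, a minimal one-step predictor built from a small fully recurrent network and trained with the RTRL sensitivity recursion can be sketched as follows; a toy sinusoid stands in for the speech signal, and the parallel formant/pitch structure of the paper is not reproduced:

    import numpy as np

    rng = np.random.default_rng(0)
    n, eta, T = 5, 0.05, 2000                      # units, learning rate, steps
    W = rng.normal(scale=0.1, size=(n, n + 2))     # columns: n recurrent + input + bias
    y = np.zeros(n)                                # unit outputs y(t-1)
    P = np.zeros((n, n, n + 2))                    # P[k, i, j] = dy_k / dW[i, j]
    signal = np.sin(0.3 * np.arange(T + 1))        # toy signal to predict

    for t in range(T):
        z = np.concatenate([y, [signal[t], 1.0]])  # recurrent state, current sample, bias
        y_new = np.tanh(W @ z)
        e = signal[t + 1] - y_new[0]               # one-step prediction error (unit 0)

        # RTRL: propagate sensitivities through the recurrent columns of W only
        P_new = np.einsum('kl,lij->kij', W[:, :n], P)
        P_new[np.arange(n), np.arange(n), :] += z  # delta_{ki} * z_j term
        P_new *= (1.0 - y_new ** 2)[:, None, None] # tanh'(s_k)

        W += eta * e * P_new[0]                    # gradient step via unit 0's sensitivities
        y, P = y_new, P_new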


Synchronous Machine Steady-State Stability Analysis Using an Artificial Neural Network

Chao-Rong Chen and Yuan-Yih Hsu. IEEE Power Engineering Review, 1991



More Xplore Articles

Educational Resources on Neurons


eLearning

No eLearning Articles are currently tagged "Neurons"

IEEE-USA E-Books

  • Recurrent Neural Networks

    This chapter considers a class of neural networks that have a recurrent structure, including the Grossberg network, the Hopfield network, and cellular neural networks. The Hopfield network is a form of recurrent artificial neural network invented by John Hopfield in 1982. It consists of a set of neurons and a corresponding set of unit time delays, forming a multiple-loop feedback system (a minimal Hopfield sketch appears after this list). There are three components to the Grossberg network: Layer 1, Layer 2, and the adaptive weights. Layer 1 is a rough model of the operation of the retina, while Layer 2 represents the visual cortex. Cellular neural networks contain linear and nonlinear circuit elements, which typically are linear capacitors, linear resistors, linear and nonlinear controlled sources, and independent sources. The chapter also describes the mathematical model of a nonlinear dynamic system, and discusses some of the important issues involved in neurodynamics.

  • Surface EMG Decomposition

    This chapter provides an overview of surface EMG decomposition techniques, along with their basic assumptions, properties, and limitations. Surface electrodes measure the electrical activity of several nearby muscle fibers that are active during a muscle contraction. The electrical activity of each fiber can be described by a single fiber action potential (SFAP) that propagates from the neuromuscular junction towards the tendons. There is a large diversity of decomposition techniques, which can roughly be categorized as either template-matching or latent component analysis (blind source separation) approaches. Decomposition of surface EMG is a powerful tool enabling noninvasive insight not only into muscle control strategies, but also into peripheral muscle properties. It provides unambiguous information on physiological parameters of individual motor units that can easily be interpreted. The identification of motor unit (MU) discharge patterns from surface EMG signals acquired during dynamic muscle contractions still needs to be addressed.

  • I Intelligence Reimagined


  • Multilayer Neural Networks and Backpropagation

    A computationally effective method for training multilayer perceptrons is the backpropagation algorithm, which is regarded as a landmark in the development of neural networks. This chapter presents two different learning methods, batch learning and online learning, on the basis of how the supervised learning of the multilayer perceptron is actually performed. The essence of backpropagation learning is to encode an input-output mapping into the synaptic weights and thresholds of a multilayer perceptron. It is hoped that the network becomes well trained so that it learns enough about the past to generalize to the future. The chapter concludes with cross-validation and generalization. Cross-validation is particularly appealing when a large neural network has to be designed with good generalization as the goal. Generalization assumes that the test data are drawn from the same population used to generate the training data.

  • 16 Consciousness

    When his mother asked young Francis Crick what scientific problems he wanted to pursue in life, he told her there were only two that interested him: the mystery of life and the mystery of consciousness. Crick clearly had a keen sense for what is important, but he may not have appreciated the difficulty of these problems. Little did his mother know that, decades later in 1953, her son and James Watson would discover the structure of DNA—the loose thread that would eventually unravel one of life's great mysteries. But Crick (figure 16.1) was not content with this achievement.

  • Ear Physiology

    This chapter contains sections titled: Introduction; Anatomical Pathways From the Ear to the Perception of Sound; The Peripheral Auditory System; Hair Cell and Auditory Nerve Functions; Properties of the Auditory Nerve; Summary and Block Diagram of the Peripheral Auditory System; Exercises.

  • 11 NEURAL NETWORKS THAT CAN HEAR, SPEAK, AND REMEMBER

    We've spent most of the past few chapters looking at how deep neural networks are able to recognize objects in images. I've focused on these networks largely because many of the machines in this book use vision in some form to perceive the world around them. But what if we wanted our machines to have other ways to interact with the world—to generate English sentences, or to understand human speech, for example? Would convolutional networks prove useful for this as well? Are there other neural network “primitives” that would be helpful? Popping up a level, does it even make sense to use neural networks for tasks like understanding speech?

  • N-Dimensional Vector Neuron and Its Application to the N-Bit Parity Problem

    This chapter contains sections titled: Introduction; Neuron Models with High-Dimensional Parameters; N-Dimensional Vector Neuron; Discussion; Conclusion.

  • 8 HOW TO BEAT ATARI GAMES BY USING NEURAL NETWORKS

    Even before Google acquired DeepMind in 2014, word about this new research company was spreading quietly. At a machine learning conference in late 2012, for instance, DeepMind had been competing aggressively with companies like Facebook and Google to recruit members of the machine learning community. And conference attendees learned that the founder of this mysterious company was Demis Hassabis, a quiet, brilliant, and ambitious neuroscientist.

  • GLOSSARY

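The "Recurrent Neural Networks" entry above mentions the Hopfield network; the following is a minimal sketch of Hebbian storage and asynchronous recall for a binary (+1/-1) Hopfield network. The pattern sizes and the one-bit corruption are arbitrary illustrative choices:

    import numpy as np

    def hopfield_train(patterns):
        # Hebbian outer-product rule; no self-connections.
        W = sum(np.outer(p, p) for p in patterns).astype(float)
        np.fill_diagonal(W, 0.0)
        return W / len(patterns)

    def hopfield_recall(W, state, steps=50, seed=0):
        # Asynchronous updates: one randomly chosen unit per step moves toward sign(W s).
        rng = np.random.default_rng(seed)
        s = state.copy()
        for _ in range(steps):
            i = rng.integers(len(s))
            s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    patterns = np.array([[1, -1, 1, -1, 1, -1],
                         [1, 1, 1, -1, -1, -1]])
    W = hopfield_train(patterns)
    noisy = patterns[0].copy()
    noisy[0] *= -1                          # corrupt one bit
    print(hopfield_recall(W, noisy))        # typically converges back to patterns[0]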



Standards related to Neurons


No standards are currently tagged "Neurons"


Jobs related to Neurons
