Conferences related to Expectation-maximization algorithms


ICC 2021 - IEEE International Conference on Communications

IEEE ICC is one of the two flagship IEEE conferences in the field of communications; Montreal hosts the conference in 2021. Each annual IEEE ICC typically attracts 1,500-2,000 attendees and presents more than 1,000 research papers. Beyond being an opportunity to share pioneering research ideas and developments, the conference is an excellent networking and publicity event, linking businesses with clients and giving companies the chance to publicize themselves and their products to leaders of the communications industry from around the world.


2020 IEEE International Symposium on Information Theory (ISIT)

Information theory, coding theory, communication theory, signal processing, and foundations of machine learning


2020 IEEE International Conference on Robotics and Automation (ICRA)

The International Conference on Robotics and Automation (ICRA) is the IEEE Robotics and Automation Society’s biggest conference and one of the leading international forums for robotics researchers to present their work.


2020 IEEE International Symposium on Antennas and Propagation and North American Radio Science Meeting

The joint meeting is intended to provide an international forum for the exchange of information on state-of-the-art research in the areas of antennas and propagation, electromagnetic engineering, and radio science.


2020 IEEE 23rd International Conference on Information Fusion (FUSION)

The International Conference on Information Fusion is the premier forum for the exchange of the latest research in data and information fusion and its impact on society. The conference brings together researchers and practitioners from academia and industry to report on the latest scientific and technical advances.



Periodicals related to Expectation-maximization algorithms


Antennas and Propagation, IEEE Transactions on

Experimental and theoretical advances in antennas including design and development, and in the propagation of electromagnetic waves including scattering, diffraction and interaction with continuous media; and applications pertinent to antennas and propagation, such as remote sensing, applied optics, and millimeter and submillimeter wave techniques.


Audio, Speech, and Language Processing, IEEE Transactions on

Speech analysis, synthesis, coding, speech recognition, speaker recognition, language modeling, speech production and perception, and speech enhancement. In audio: transducers, room acoustics, active sound control, human audition, analysis/synthesis/coding of music, and consumer audio.


Automatic Control, IEEE Transactions on

The theory, design, and application of control systems, encompassing the components, and the integration of those components, necessary for the construction of such systems. The word 'systems' as used here includes physical, biological, organizational, and other entities, and combinations thereof, that can be represented through a mathematical symbolism.


Biomedical Engineering, IEEE Transactions on

Broad coverage of concepts and methods of the physical and engineering sciences applied in biology and medicine, ranging from formalized mathematical theory through experimental science and technological development to practical clinical applications.


Broadcasting, IEEE Transactions on

Broadcast technology, including devices, equipment, techniques, and systems, covering the production, distribution, transmission, and propagation aspects.




Xplore Articles related to Expectation-maximization algorithms


Offline handwritten character recognition based on discriminative training of orthogonal Gaussian mixture model

Proceedings of Sixth International Conference on Document Analysis and Recognition, 2001

The statistical approach to offline handwritten character recognition, in which classifier design is very important, has been used widely. To approximate the class conditional density more precisely, it can be represented by an orthogonal Gaussian mixture model (OGMM). The parameters of the OGMM are commonly estimated by an expectation-maximization algorithm (EM), which converges to the maximum likelihood estimation (MLE). Since ...
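As background for the abstract above, the core expectation-maximization update for an ordinary (non-orthogonal) Gaussian mixture can be sketched as follows. This is a minimal one-dimensional sketch; the data, component count, and initialization are illustrative and are not the paper's OGMM or discriminative-training procedure.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50, seed=0):
    """Fit a 1-D Gaussian mixture by expectation-maximization (maximum likelihood)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialize mixing weights, means, and variances.
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, k, replace=False)
    var = np.full(k, np.var(x))
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Demo: two well-separated clusters, so EM should recover means near 0 and 5.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(5, 1, 500)])
pi, mu, var = em_gmm_1d(x, k=2)
```

Each iteration provably does not decrease the data log-likelihood, which is why EM converges to a (local) maximum-likelihood estimate.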


A generalized Volterra series method for reconstructing deterministic dynamics from noisy chaotic time series

Asia-Pacific Conference on Circuits and Systems, 2002

The standard Volterra model (SVM) suffers from problems such as over-fitting and subset selection when used to reconstruct deterministic dynamics from noisy chaotic time series. An optimal transformed Volterra filter (OTVF) that reconstructs the underlying determinism with only a small number of Volterra terms is presented.


Information-theoretic matching of two point sets

IEEE Transactions on Image Processing, 2002

This paper describes the theoretic roadmap of least relative entropy matching of two point sets. The novel feature is to align two point sets without needing to establish explicit point correspondences. The recovery of transformational geometry is achieved using a mixture of principal axes registrations, whose parameters are estimated by minimizing the relative entropy between the two point distributions and ...


Language modeling by variable length sequences: theoretical formulation and evaluation of multigrams

1995 International Conference on Acoustics, Speech, and Signal Processing, 1995

The multigram model assumes that language can be described as the output of a memoryless source that emits variable-length sequences of words. The estimation of the model parameters can be formulated as a maximum likelihood estimation problem from incomplete data. We show that estimates of the model parameters can be computed through an iterative expectation-maximization algorithm and we describe a ...


Minimum classification error factor analysis (MCE-FA) for automatic speech recognition

1997 IEEE Workshop on Automatic Speech Recognition and Understanding Proceedings, 1997

Modeling acoustic correlation in automatic speech recognition systems is essential when the speech signal is nonstationary or corrupted by noise. We present a statistical method for improved acoustic modeling in continuous density hidden Markov models (HMMs). Factor analysis uses a small number of parameters to model the covariance structure of the speech signal. These parameters are estimated by an ...
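The EM iteration for maximum-likelihood factor analysis (the starting point before any discriminative MCE adjustment) can be sketched as follows. This is a sketch under illustrative assumptions: plain i.i.d. data rather than the paper's HMM setting, with synthetic dimensions and a one-factor model.

```python
import numpy as np

def em_factor_analysis(X, k=1, iters=100):
    """Maximum-likelihood factor analysis via EM.

    Model: x = L z + eps, with z ~ N(0, I_k) and eps ~ N(0, diag(psi)).
    """
    n, d = X.shape
    X = X - X.mean(axis=0)
    S = X.T @ X / n                                      # sample covariance
    L = np.linalg.cholesky(S + 1e-6 * np.eye(d))[:, :k]  # crude initialization
    psi = np.diag(S).copy()
    for _ in range(iters):
        # E-step: posterior moments of the latent factors z given each x.
        Pi = L / psi[:, None]                      # Psi^{-1} L, shape (d, k)
        G = np.linalg.inv(np.eye(k) + L.T @ Pi)    # posterior covariance of z
        Ez = X @ Pi @ G                            # E[z | x], one row per sample
        Ezz = n * G + Ez.T @ Ez                    # sum over data of E[z z^T | x]
        # M-step: re-estimate loadings and diagonal noise variances.
        XtEz = X.T @ Ez
        L = XtEz @ np.linalg.inv(Ezz)
        psi = np.diag(S) - np.sum(L * (XtEz / n), axis=1)
    return L, psi

# Demo: data from a one-factor model; the fit should reproduce its covariance.
rng = np.random.default_rng(0)
L_true = np.array([[1.0], [2.0], [-1.5], [0.5], [1.0]])
Z = rng.normal(size=(2000, 1))
X = Z @ L_true.T + rng.normal(scale=0.7, size=(2000, 5))
L_hat, psi_hat = em_factor_analysis(X, k=1, iters=200)
```

The payoff is parsimony: the covariance is modeled with d*k + d parameters instead of the d*(d+1)/2 of a full covariance matrix.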



Educational Resources on Expectation-maximization algorithms


IEEE-USA E-Books

  • Offline handwritten character recognition based on discriminative training of orthogonal Gaussian mixture model

    The statistical approach to offline handwritten character recognition, in which classifier design is very important, has been used widely. To approximate the class conditional density more precisely, it can be represented by an orthogonal Gaussian mixture model (OGMM). The parameters of the OGMM are commonly estimated by an expectation-maximization algorithm (EM), which converges to the maximum likelihood estimation (MLE). Since the MLE cannot directly minimize the classification errors as the goal of classifier design, a discriminative training method based on the minimum classification error (MCE) criterion is used to adjust the parameters of the OGMM. In order to achieve good generalization, the complexity of the classifier, namely the number of components in the OGMM, is determined by following the structure risk minimization (SRM) principle. Finally, the recognition performance is demonstrated by applying it to the handwritten numerals in the NIST database.

  • A generalized Volterra series method for reconstructing deterministic dynamics from noisy chaotic time series

    The standard Volterra model (SVM) suffers from problems such as over-fitting and subset selection when used to reconstruct deterministic dynamics from noisy chaotic time series. An optimal transformed Volterra filter (OTVF) that reconstructs the underlying determinism with only a small number of Volterra terms is presented.

  • Information-theoretic matching of two point sets

    This paper describes the theoretic roadmap of least relative entropy matching of two point sets. The novel feature is to align two point sets without needing to establish explicit point correspondences. The recovery of transformational geometry is achieved using a mixture of principal axes registrations, whose parameters are estimated by minimizing the relative entropy between the two point distributions and using the expectation-maximization algorithm. We give evidence of the optimality of the method and we then evaluate the algorithm's performance in both rigid and nonrigid image registration cases.

  • Language modeling by variable length sequences: theoretical formulation and evaluation of multigrams

    The multigram model assumes that language can be described as the output of a memoryless source that emits variable-length sequences of words. The estimation of the model parameters can be formulated as a maximum likelihood estimation problem from incomplete data. We show that estimates of the model parameters can be computed through an iterative expectation-maximization algorithm and we describe a forward-backward procedure for its implementation. We report the results of a systematic evaluation of multigrams for language modeling on the ATIS database. The objective performance measure is the test set perplexity. Our results show that multigrams outperform conventional n-grams for this task.

  • Minimum classification error factor analysis (MCE-FA) for automatic speech recognition

    Modeling acoustic correlation in automatic speech recognition systems is essential when the speech signal is nonstationary or corrupted by noise. We present a statistical method for improved acoustic modeling in continuous density hidden Markov models (HMMs). Factor analysis uses a small number of parameters to model the covariance structure of the speech signal. These parameters are estimated by an expectation-maximization algorithm, then further adjusted using discriminative minimum classification error training. Experimental results on 1219 New Jersey town names demonstrate that the proposed method produces faster, smaller, and more accurate recognition models.

  • Partitioning the variables for alternating optimization of real-valued scalar fields

    Summary form only given, as follows. Let x be a real-valued scalar field, partitioned into t subsets of non-overlapping variables X_i (i = 1, ..., t). Alternating optimization (AO) is an iterative procedure for minimizing (or maximizing) the function f(x) = f(X_1, X_2, ..., X_t) jointly over all variables by alternating restricted minimizations (or maximizations) over the individual subsets of variables X_1, ..., X_t. AO is the basis for the c-means clustering algorithm (t = 2), many forms of vector quantization (t = 2, 3, and 4), and the expectation-maximization algorithm (t = 4) for normal mixture decomposition. First we review where and how AO fits into the overall optimization landscape. Then we state (without proofs) two new theorems that give very general local and global convergence and rate-of-convergence results which hold for all partitionings of x. Finally, we discuss the important problem of choosing a "best" partitioning of the input variables for the AO approach. We show that the number of possible AO iteration schemes is larger than the number of standard partitions of the input variables. Two numerical examples are given to illustrate that the selection of the t subsets of x has an important effect on the rate of convergence of the corresponding AO algorithm to a solution.

  • A Probabilistic Algorithm for MEG Source Reconstruction

    We present a novel algorithm for source localization based on probabilistic modeling of stimulus-evoked MEG/EEG data. This algorithm localizes multiple dipoles with the computational complexity equivalent to a single dipole scan, and is therefore more efficient than traditional multidipole fitting procedures. The algorithm assumes that the activity of multiple dipolar sources can be characterized by a linear combination of known temporal basis functions with unknown coefficients. We model the sensor data as arising from activity in each voxel of interest, plus background activity. We estimate temporal basis functions from the data using a probabilistic algorithm called partitioned-factor analysis, previously developed in our lab. We model background activity outside the voxel of interest as an unknown linear mixture of unobserved background factors plus diagonal sensor noise. We use an expectation-maximization algorithm to calculate MAP estimates of unknown basis function coefficients, background mixing matrix, sensor noise covariance, and the likelihood of a dipole in each voxel of interest. In simulations, the algorithm is able to accurately localize several simultaneously active dipoles, at SNRs typical for averaged MEG data. The algorithm performs well even in configurations that include deep sources and highly correlated sources, and thus is superior to MUSIC and beamforming techniques, which are sensitive to correlated sources. The algorithm also correctly localizes real somatosensory and auditory evoked fields to the postcentral sulcus and lower bank of the lateral sulcus, respectively.

  • Unsupervised Image Layout Extraction

    We propose a novel unsupervised learning algorithm to extract the layout of an image by learning latent object-related aspects. Unlike traditional image segmentation algorithms that segment an image using feature similarity, our method is able to learn high-level object characteristics (aspects) from a large number of unlabeled images containing similar objects to facilitate image segmentation. Our method does not require humans to annotate the training set and works without supervision. We use a graphical model to address the learning of aspects and layout extraction together. In particular, aspect-feature dependency from multiple images is learned via the expectation-maximization algorithm. We demonstrate that, by associating latent aspects to spatial structure, the proposed method achieves much better layout extraction results than using probabilistic latent semantic analysis.

  • Vehicle image classification via expectation-maximization algorithm

    In this paper, we present a statistical method to extract images of passenger cars from highway traffic scenes. The expectation-maximization (EM) algorithm is used to classify the vehicles in the images as either passenger cars or larger vehicles (cars versus non-cars). The vehicle classification algorithm uses training sets of 100-frame video sequences. The car group comprises passenger cars and light trucks. The non-car group comprises heavy single trucks as well as combination trucks with three or more axles. We use the properties of their dimensional distribution and the probability of their appearance to identify the vehicle group. We present a validation of our algorithm using real-world traffic scenes.

  • Reliability assessment with amalgamated data via the expectation-maximization algorithm

    A solution is given to the problem of estimating reliability indicators in a context of crude data arising in an industrial study devoted to the reliability assessment of electronic calculators used in modern airplanes. The authors introduce the concept of amalgamated data and develop an expectation-maximization algorithm to obtain a maximum likelihood estimator of the reliability function and the cumulative failure intensity associated with the lifetime of calculators.
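The alternating-optimization scheme described in the e-book entry above, cycling exact restricted minimizations over subsets of the variables, can be sketched as follows. The quadratic objective, its closed-form restricted minimizers, and the two-way partition (t = 2) are illustrative choices, not taken from the paper.

```python
def alternating_optimization(updates, x0, iters=100, tol=1e-10):
    """Minimize f over partitioned variables by cycling through
    exact restricted minimizers, one subset at a time."""
    x = list(x0)
    for _ in range(iters):
        prev = list(x)
        for i, update in enumerate(updates):
            x[i] = update(x)  # minimize f over subset i, others held fixed
        if max(abs(a - b) for a, b in zip(x, prev)) < tol:
            break
    return x

# Example objective: f(x, y) = (x - 1)**2 + (y - 2)**2 + x*y, a convex quadratic.
# Restricted minimizers come from setting each partial derivative to zero.
updates = [
    lambda v: 1 - v[1] / 2,  # argmin over x with y fixed
    lambda v: 2 - v[0] / 2,  # argmin over y with x fixed
]
x, y = alternating_optimization(updates, [0.0, 0.0])
# Converges to the joint minimizer (0, 2).
```

For this objective the iteration contracts the error by a factor of four per full cycle, illustrating the linear rate-of-convergence behavior the abstract discusses.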



Standards related to Expectation-maximization algorithms


No standards are currently tagged "Expectation-maximization algorithms"