IEEE Organizations related to Mutual Information


No organizations are currently tagged "Mutual Information"



Conferences related to Mutual Information


2020 IEEE International Conference on Image Processing (ICIP)

The International Conference on Image Processing (ICIP), sponsored by the IEEE Signal Processing Society, is the premier forum for the presentation of technological advances and research results in the fields of theoretical, experimental, and applied image and video processing. ICIP 2020, the 27th in the series that has been held annually since 1994, brings together leading engineers and scientists in image and video processing from around the world.


2020 IEEE International Instrumentation and Measurement Technology Conference (I2MTC)

The Conference focuses on all aspects of instrumentation and measurement science and technology research, development, and applications. The list of program topics includes, but is not limited to: Measurement Science & Education, Measurement Systems, Measurement Data Acquisition, Measurements of Physical Quantities, and Measurement Applications.


2020 IEEE International Symposium on Information Theory (ISIT)

Information theory, coding theory, communication theory, signal processing, and foundations of machine learning


ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

The ICASSP meeting is the world's largest and most comprehensive technical conference focused on signal processing and its applications. The conference will feature world-class speakers, tutorials, exhibits, and over 50 lecture and poster sessions.


ICC 2020 - 2020 IEEE International Conference on Communications

All topics relating to existing and emerging communications networking technologies.



Periodicals related to Mutual Information


No periodicals are currently tagged "Mutual Information"


Most published Xplore authors for Mutual Information


Xplore Articles related to Mutual Information


Distortion bounds and a Protocol for One-Shot Transmission of Correlated Random Variables on a Non-Coherent Multiple-Access Channel

SCC 2013; 9th International ITG Conference on Systems, Communication and Coding, 2013

Bounds on the distortion derived in [1] are adapted to the case where two continuous random variables are sent over a multiple access channel with phase shifts, with the help of two feedback channels. The first source is defined to be uniformly distributed and the second source is defined as the sum of the first source and an auxiliary random ...


Discrete Input Gaussian Channel

Information and Communication Theory

This chapter derives a constrained capacity, in the form of the mutual information, for an M‐ary pulse-amplitude-modulated signal in the case when the (finitely many) signal alternatives are equally likely. The loss incurred by using uniformly distributed inputs is derived and referred to as the shaping gain in a Gaussian channel. In most communication systems, the choice of signal is ...
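
As a rough numerical illustration of the constrained capacity described above, the sketch below estimates the mutual information of equiprobable M-ary PAM over a real AWGN channel by Monte Carlo averaging. It is not taken from the chapter; the unit-energy normalization, function names, and parameters are illustrative assumptions.

    # Monte Carlo estimate of I(X;Y) for equiprobable M-PAM over real AWGN
    # (illustrative sketch; not the chapter's derivation).
    import numpy as np

    def pam_constellation(M):
        """Unit-average-energy M-ary PAM levels."""
        levels = np.arange(-(M - 1), M, 2, dtype=float)   # -(M-1), ..., +(M-1)
        return levels / np.sqrt(np.mean(levels ** 2))

    def mi_pam_awgn(M=4, snr_db=10.0, n=200_000, seed=0):
        rng = np.random.default_rng(seed)
        x_set = pam_constellation(M)
        sigma2 = 10 ** (-snr_db / 10)                      # noise variance for unit signal power
        x = rng.choice(x_set, size=n)                      # equiprobable inputs
        y = x + rng.normal(scale=np.sqrt(sigma2), size=n)  # AWGN observation
        # Gaussian normalization constants cancel in the ratio below
        num = np.exp(-(y - x) ** 2 / (2 * sigma2))         # proportional to p(y | transmitted x)
        den = np.exp(-(y[:, None] - x_set[None, :]) ** 2 / (2 * sigma2)).mean(axis=1)  # proportional to p(y)
        return np.mean(np.log2(num / den))                 # bits per channel use

    if __name__ == "__main__":
        for snr in (0, 5, 10, 15):
            print(f"4-PAM, SNR {snr:2d} dB: I(X;Y) ~ {mi_pam_awgn(4, snr):.3f} bit")

At high SNR the estimate saturates at log2(M) bits; the additional penalty of uniform inputs relative to Gaussian-like (shaped) inputs is essentially the shaping loss the chapter discusses.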


An Investigation on Mutual Information for the Linear Predictive System and the Extrapolation of Speech Signals

Speech Communication; 10. ITG Symposium, 2012

Mutual information (MI) is an important information theoretic concept which has many applications in telecommunications, in blind source separation, and in machine learning. More recently, it has also been employed for the instrumental assessment of speech intelligibility, where traditionally correlation-based measures are used. In this paper, we address the difference between MI and correlation from the viewpoint of discovering ...
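
As a toy illustration of the MI-versus-correlation distinction discussed above, the sketch below compares the two quantities on synthetic data, using scikit-learn's k-nearest-neighbor MI estimator as a stand-in for the estimators compared in the paper. The data model and all names are illustrative assumptions.

    # Correlation vs. mutual information on two toy dependencies
    # (illustrative sketch; not the paper's GMM/KNN comparison on speech data).
    import numpy as np
    from sklearn.feature_selection import mutual_info_regression  # k-NN based MI estimate (nats)

    rng = np.random.default_rng(1)
    n = 5_000
    x = rng.normal(size=n)

    rho = 0.8
    y_lin = rho * x + np.sqrt(1 - rho ** 2) * rng.normal(size=n)   # jointly Gaussian, correlated
    y_sq = x ** 2 + 0.1 * rng.normal(size=n)                       # quadratic: correlation ~ 0, MI > 0

    for name, y in (("linear", y_lin), ("quadratic", y_sq)):
        corr = np.corrcoef(x, y)[0, 1]
        mi = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
        print(f"{name:9s} corr = {corr:+.2f}   MI ~ {mi:.2f} nat")

    # Closed form for the jointly Gaussian case: I(X;Y) = -0.5 * ln(1 - rho^2)
    print("Gaussian closed form:", -0.5 * np.log(1 - rho ** 2), "nat")

For jointly Gaussian variables the MI is a monotone function of the correlation coefficient, whereas the quadratic example shows a dependence that correlation misses but MI detects.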


Canalizing Boolean Functions Maximize the Mutual Information

SCC 2013; 9th International ITG Conference on Systems, Communication and Coding, 2013

The ability of information processing in biologically motivated Boolean networks is of interest in recent information theoretic research. One measure to quantify this ability is the well-known mutual information. Using Fourier analysis, we show that canalizing functions maximize the mutual information between an input variable and the outcome of the function. We prove our result for Boolean functions with ...
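
As a minimal check of the claim above (by brute-force enumeration rather than the paper's Fourier-analytic proof), the sketch below computes I(X1; f(X)) exactly for i.i.d. uniform Boolean inputs and contrasts a canalizing function with the non-canalizing parity function. The chosen functions are illustrative.

    # Exact I(X1; f(X1,...,Xn)) for uniform Boolean inputs (illustrative sketch).
    from itertools import product
    from math import log2

    def mutual_info_first_input(f, n):
        """I(X1; f(X)) in bits for i.i.d. uniform inputs on {0,1}^n."""
        joint = {}                                   # P(X1 = a, f(X) = b)
        for x in product((0, 1), repeat=n):
            key = (x[0], f(x))
            joint[key] = joint.get(key, 0.0) + 1.0 / 2 ** n
        px = {a: sum(p for (a2, _), p in joint.items() if a2 == a) for a in (0, 1)}
        pf = {b: sum(p for (_, b2), p in joint.items() if b2 == b) for b in (0, 1)}
        return sum(p * log2(p / (px[a] * pf[b])) for (a, b), p in joint.items() if p > 0)

    canalizing = lambda x: x[0] or (x[1] and x[2])   # x1 = 1 forces the output to 1
    parity = lambda x: x[0] ^ x[1] ^ x[2]            # not canalizing

    print("canalizing:", mutual_info_first_input(canalizing, 3))  # positive
    print("parity    :", mutual_info_first_input(parity, 3))      # exactly 0

Parity makes the output independent of any single input under uniform inputs, so the per-input mutual information vanishes, while the canalizing input retains a positive amount of information about the outcome.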


Multi-edge Optimization of Low-Density Parity-Check Codes for Joint Source-Channel Coding

SCC 2013; 9th International ITG Conference on Systems, Communication and Coding, 2013

We present a novel joint source-channel coding system based on low-density parity-check codes where the amount of information about the source bits available at the decoder is increased by improving the connection profile between the factor graphs that compose the joint system. Furthermore, we propose an optimization strategy for the component codes based on a multi-edge-type joint optimization. Simulation ...



Educational Resources on Mutual Information


IEEE.tv Videos

MicroApps: Making the Transition from AWR to Cadence Tools - A More Seamless & Efficient Flow for Mutual IC & PCB Customers (AWR)
IEEE Strategic Plan
IEEE Future Directions: Green Information and Communications Technology: An Overview
Information Technology: Careers for the information age
Self-Organization with Information Theoretic Learning
AuthorLab: Featuring the IEEE Author Posting Policy
Special Evening Panel Discussion: AI, Cognitive Information Processing, and Rebooting Computing - IEEE Rebooting Computing 2017
Data Modeling using Kernels and Information Theoretic Learning
EMBC 2011 - Keynote - The Impact of Information Technology on Health Care Delivery - John Glaser, PhD
David Ngar Ching Tse - IEEE Richard W. Hamming Medal, 2019 IEEE Honors Ceremony
Noise Enhanced Information Systems: Denoising Noisy Signals with Noise
Seven (Small) Steps to Protecting Big Trade Secrets and Confidential Information, Part 2 - IEEE USA
Augmented Reality in Operating Rooms
CES 2008: Ford and Sirius Team Up for In-Car Navigation
Seeing the Invisibles: A Backstage Tour of Information Forensics
Seven (Small) Steps to Protecting Big Trade Secrets and Confidential Information, Part 1 - IEEE USA
ICASSP 2012 Plenary-Dr. Chin-Hui Lee
8-Element, 1-3GHz Direct Space-to-Information Converter - Matthew Bajor - RFIC Showcase 2018
Karen Bartleson interviewed at World Summit on the Information Society
The role of aggregation guided by fuzzy quantifiers in Information Retrieval and in Social Media Analytics

IEEE-USA E-Books

  • Distortion bounds and a Protocol for One-Shot Transmission of Correlated Random Variables on a Non-Coherent Multiple-Access Channel

    Bounds on the distortion derived in [1] are adapted to the case where two continuous random variables are sent over a multiple access channel with phase shifts, with the help of two feedback channels. The first source is defined to be uniformly distributed and the second source is defined as the sum of the first source and an auxiliary random variable which is also uniform. Additionally, using the same definition of the two sources, the two-round protocol introduced in [2] is studied in detail and a comparison is made in order to discuss the tightness of the information-theoretic bounds.

  • Discrete Input Gaussian Channel

    This chapter derives a constrained capacity, in the form of the mutual information, for an M‐ary pulse-amplitude-modulated signal in the case when the (finitely many) signal alternatives are equally likely. The loss incurred by using uniformly distributed inputs is derived and referred to as the shaping gain in a Gaussian channel. In most communication systems, the choice of signal is made according to a uniform distribution over a discrete set of signals. It is often easy to achieve a coding gain of a couple of dB by using standard channel coding, but to achieve higher gains, more complex codes must be used together with shaping of the constellation. Finally, a parameter called the signal‐to‐noise ratio (SNR) gap is derived to show how far from capacity an uncoded system operates for a specific achieved probability of error.

  • An Investigation on Mutual Information for the Linear Predictive System and the Extrapolation of Speech Signals

    Mutual information (MI) is an important information theoretic concept which has many applications in telecommunications, in blind source separation, and in machine learning. More recently, it has also been employed for the instrumental assessment of speech intelligibility, where traditionally correlation-based measures are used. In this paper, we address the difference between MI and correlation from the viewpoint of discovering dependencies between variables in the context of speech signals. We perform our investigation by considering the linear predictive approximation and the extrapolation of speech signals as examples. We compare a parametric MI estimation approach based on a Gaussian mixture model (GMM) with the k-nearest neighbor (KNN) approach, which is a well-known non-parametric method available to estimate the MI. We show that the GMM-based MI estimator leads to more consistent results.

  • Canalizing Boolean Functions Maximize the Mutual Information

    The ability of information processing in biologically motivated Boolean networks is of interest in recent information theoretic research. One measure to quantify this ability is the well-known mutual information. Using Fourier analysis, we show that canalizing functions maximize the mutual information between an input variable and the outcome of the function. We prove our result for Boolean functions with uniformly distributed as well as product-distributed input variables.

  • Multi-edge Optimization of Low-Density Parity-Check Codes for Joint Source-Channel Coding

    We present a novel joint source-channel coding system based on low-density parity-check codes where the amount of information about the source bits available at the decoder is increased by improving the connection profile between the factor graphs that compose the joint system. Furthermore, we propose an optimization strategy for the component codes based on a multi-edge-type joint optimization. Simulation results show a significant improvement in performance compared to existing joint systems based on low-density parity-check codes.

  • Signal Enhancement as Minimization of Relevant Information Loss

    We introduce the notion of relevant information loss for the purpose of casting the signal enhancement problem in information-theoretic terms. We show that many algorithms from machine learning can be reformulated using relevant information loss, which allows their application to the aforementioned problem. As a particular example we analyze principal component analysis for dimensionality reduction, discuss its optimality, and show that the relevant information loss can indeed vanish if the relevant information is concentrated on a lower-dimensional subspace of the input space.

  • Mutual Information Based Analysis for Physical-Layer Network Coding with Optimal Phase Control

    In this paper, two-way relaying networks are considered using physical-layer network coding in a two-phase protocol. In the multiple access phase, both sources transmit their messages to the relay simultaneously. Subsequently, the relay estimates and broadcasts the XOR-based network coded message back to the sources in the broadcast phase. We concentrate on the critical multiple access phase, where several detection and decoding schemes at the relay are studied and compared with respect to mutual information and system throughput. It is shown that the system performance of the detection and decoding schemes under investigation is highly dependent on the phase difference of the two incoming messages at the relay, motivating a phase control strategy at the sources before transmission. Simulation results confirm our analysis and the superior performance achieved by phase control in practical systems.

  • Optimal Non-Uniform Mapping for Probabilistic Shaping

    The construction of optimal non-uniform mappings for discrete input memoryless channels (DIMCs) is investigated. An efficient algorithm to find optimal mappings is proposed, and the rate at which a target distribution is approached is analyzed. The results are applied to non-uniform mappings for additive white Gaussian noise (AWGN) channels with finite signal constellations. The mappings found by the proposed methods outperform those obtained via a central limit theorem approach as suggested in the literature. (A small numerical illustration of shaping a PAM input distribution is sketched after this list.)

  • Information Measures for Continuous Variables

    This chapter considers real‐valued random variables and the information measures adopted for this case. It shows that the definition of the differential entropy is not consistent with the interpretation of the entropy as the uncertainty of the random variable. One way to understand this is to discretize a continuous density function to obtain a discrete variable. The chapter shows that the properties of the relative entropy for discrete distributions are still valid for continuous distributions. In many applications the Gaussian distribution plays an important role, and information theory is no exception. The chapter shows that, over all distributions with a given mean and variance, the entropy is maximized by the Gaussian distribution. The Gaussian distribution is defined for an n‐dimensional random vector and the differential entropy function is derived from the density function. (The key formulas are restated after this list.)

  • Information Measures

    Information theory is a mathematical theory of communication, based on probability theory. It was introduced by Claude Shannon in his landmark paper A Mathematical Theory of Communication in 1948. He gave a quantitative measure of the amount of information stored in a variable and gave limits on how much information can be transmitted from one place to another over a given communication channel. Both the entropy and the mutual information are important measures of information. The entropy states how much information is needed to determine the outcome of a random variable. The chapter derives a famous result on data processing, called the data‐processing lemma, and defines the entropy rate, which is the corresponding entropy measure for random processes. A natural way to define the entropy per symbol for a sequence is by treating the sequence as a multidimensional random variable and averaging over the number of symbols. (The basic definitions are restated after this list.)
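
For the "Information Measures" entry above, the quantities referred to are the standard ones; in the usual notation,

    H(X) = -\sum_{x} p(x) \log_2 p(x)
    I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y)
    X \to Y \to Z \;\Rightarrow\; I(X;Z) \le I(X;Y) \quad \text{(data-processing lemma)}
    H(\mathcal{X}) = \lim_{n \to \infty} \tfrac{1}{n} H(X_1, X_2, \ldots, X_n) \quad \text{(entropy rate)}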
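
For the "Information Measures for Continuous Variables" entry, the differential entropy and the Gaussian maximum-entropy property summarized there are, in the usual notation,

    h(X) = -\int_{-\infty}^{\infty} f(x) \ln f(x)\, \mathrm{d}x
    h(X) = \tfrac{1}{2} \ln\!\left(2 \pi e \sigma^2\right) \quad \text{for } X \sim \mathcal{N}(\mu, \sigma^2)
    h(X) \le \tfrac{1}{2} \ln\!\left(2 \pi e \sigma^2\right) \quad \text{for any } X \text{ with } \operatorname{Var}(X) = \sigma^2

with equality in the last line if and only if X is Gaussian.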
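
For the "Optimal Non-Uniform Mapping for Probabilistic Shaping" entry, the sketch below evaluates the mutual information of 8-PAM over AWGN under a Maxwell-Boltzmann (shaped) input distribution and under a uniform one, both normalized to unit average symbol energy. This is only a numerical illustration of shaped versus uniform inputs; the shaping parameter and all names are illustrative assumptions, and it is not the optimization algorithm proposed in the paper.

    # Mutual information of 8-PAM over AWGN: uniform vs. Maxwell-Boltzmann input law
    # (illustrative sketch; not the paper's mapping construction).
    import numpy as np

    LEVELS = np.arange(-7.0, 8.0, 2.0)                      # raw 8-PAM levels

    def mi_pam(probs, snr_db, n=200_000, seed=2):
        rng = np.random.default_rng(seed)
        lv = LEVELS / np.sqrt(np.sum(probs * LEVELS ** 2))   # unit average energy under this law
        sigma2 = 10 ** (-snr_db / 10)
        x = rng.choice(lv, size=n, p=probs)
        y = x + rng.normal(scale=np.sqrt(sigma2), size=n)
        # Gaussian normalization constants cancel in the ratio below
        num = np.exp(-(y - x) ** 2 / (2 * sigma2))            # proportional to p(y | x)
        den = np.exp(-(y[:, None] - lv[None, :]) ** 2 / (2 * sigma2)) @ probs  # proportional to p(y)
        return np.mean(np.log2(num / den))                    # bits per channel use

    uniform = np.full(8, 1 / 8)
    nu = 0.06                                                 # assumed shaping parameter
    mb = np.exp(-nu * LEVELS ** 2); mb /= mb.sum()            # Maxwell-Boltzmann distribution

    for snr in (5, 10, 15):
        print(f"SNR {snr:2d} dB  uniform: {mi_pam(uniform, snr):.3f}  shaped: {mi_pam(mb, snr):.3f}")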



Standards related to Mutual Information


No standards are currently tagged "Mutual Information"


Jobs related to Mutual Information
