33,117 resources related to Data Compression
The conference program will consist of plenary lectures, symposia, workshops and invited sessions on the latest significant findings and developments in all the major fields of biomedical engineering. Submitted full papers will be peer reviewed. Accepted high-quality papers will be presented in oral and poster sessions, will appear in the Conference Proceedings and will be indexed in PubMed/MEDLINE.
ICC 2021 - IEEE International Conference on Communications
IEEE ICC is one of the two flagship IEEE conferences in the field of communications; Montreal hosts the conference in 2021. Each annual IEEE ICC typically attracts approximately 1,500-2,000 attendees and presents over 1,000 research works over its duration. Beyond being an opportunity to share pioneering research ideas and developments, the conference is an excellent networking and publicity event, allowing businesses and clients to connect and giving companies the scope to publicize themselves and their products among leaders of the communications industry from all over the world.
The International Conference on Image Processing (ICIP), sponsored by the IEEE Signal Processing Society, is the premier forum for the presentation of technological advances and research results in the fields of theoretical, experimental, and applied image and video processing. ICIP 2020, the 27th in the series that has been held annually since 1994, brings together leading engineers and scientists in image and video processing from around the world.
The ICASSP meeting is the world's largest and most comprehensive technical conference focused on signal processing and its applications. The conference will feature world-class speakers, tutorials, exhibits, and over 50 lecture and poster sessions.
IEEE INFOCOM solicits research papers describing significant and innovative research contributions to the field of computer and data communication networks. We invite submissions on a wide range of research topics, spanning both theoretical and systems research.
The IEEE Aerospace and Electronic Systems Magazine publishes articles concerned with the various aspects of systems for space, air, ocean, or ground environments.
The theory, design and application of Control Systems. It shall encompass components, and the integration of these components, as are necessary for the construction of such systems. The word 'systems' as used herein shall be interpreted to include physical, biological, organizational and other entities, and combinations thereof, which can be represented through a mathematical symbolism. The Field of Interest shall ...
The Transactions on Biomedical Circuits and Systems addresses areas at the crossroads of Circuits and Systems and Life Sciences. The main emphasis is on microelectronic issues in a wide range of applications found in life sciences, physical sciences and engineering. The primary goal of the journal is to bridge the unique scientific and technical activities of the Circuits and Systems ...
Broad coverage of concepts and methods of the physical and engineering sciences applied in biology and medicine, ranging from formalized mathematical theory through experimental science and technological development to practical clinical applications.
Broadcast technology, including devices, equipment, techniques, and systems related to broadcasting, covering the production, distribution, transmission, and propagation aspects.
2016 Data Compression Conference (DCC), 2016
2018 Data Compression Conference, 2018
2012 Data Compression Conference, 2012
2015 Data Compression Conference, 2015
Proceedings DCC '95 Data Compression Conference, 1995
BTRFS (B-tree filesystem) is a new copy-on-write (CoW) filesystem for Linux aimed at implementing advanced features while focusing on fault tolerance, repair and easy administration. One important feature of BTRFS is transparent file compression. Two compression algorithms are available in BTRFS: ZLIB and LZO. Because the compression workload is very CPU intensive, we normally have to pay additional CPU power to enable data compression in the filesystem. In this work we study the nature of data compression in BTRFS, set up a test bench to measure the benefit and the cost of enabling data compression in BTRFS, and develop a new hardware acceleration method to offload the compression and decompression workloads from the BTRFS software stack. Our experiments show that the hardware offloading solution provides BTRFS data compression with the least CPU overhead compared with the software implementations of ZLIB and LZO, while also achieving a very good compression ratio and disk write throughput.
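As an aside, the trade-off the abstract describes — CPU time spent versus bytes saved — can be sketched with Python's standard `zlib` module (a DEFLATE implementation, not the BTRFS kernel code; the payload and compression levels below are purely illustrative):

```python
import time
import zlib

# Illustrative payload: repetitive text compresses well, which is the
# regime where filesystem compression pays off.
data = b"BTRFS transparent compression trades CPU time for disk space. " * 2000

for level in (1, 6, 9):  # zlib levels: fast ... thorough
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    ratio = len(data) / len(compressed)
    print(f"level={level} ratio={ratio:.1f}x time={elapsed * 1000:.2f} ms")

# Decompression round-trips losslessly regardless of level.
assert zlib.decompress(compressed) == data
```

Higher levels typically buy a slightly better ratio at a visibly higher CPU cost, which is the overhead the paper's hardware offload aims to remove.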
A novel hybrid data compression framework is proposed to compress wind tunnel data. Both lossless and lossy compression can be performed.
Polarimetric SAR (POLSAR) can offer more information than single-polarized SAR, which has the capability to image in all weather and day-night conditions. The multi-polarization mode and wide-swath requirements result in a huge amount of data, which may exceed the on-board storage and downlink bandwidth. Effective compression of POLSAR raw data is clearly one of the best solutions. This paper presents a compression method for POLSAR raw data in which the relative phase between the co-polarized and cross-polarized channels is preserved. In this method, the amplitude is quantized assuming a Rayleigh distribution, and the phase of the HH and VV channels is quantized assuming a uniform distribution. To preserve the relative phase information, the phase difference between the co-polarized (HH and VV) and cross-polarized (HV and VH) channels is optimally quantized assuming a triangular distribution. Results show that by quantizing the phase difference, the relative phase information between the HH (VV) and HV (VH) channels is well preserved.
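For illustration only (this is not the paper's quantizer), a uniform mid-rise quantizer for a phase in [-π, π) — the kind of quantizer appropriate for the uniformly distributed HH/VV phase — can be sketched in Python; the function name and bit widths are assumptions:

```python
import math

def uniform_phase_quantizer(phase, bits):
    """Quantize a phase in [-pi, pi) with a uniform mid-rise quantizer.
    Returns (index, reconstructed_phase)."""
    levels = 2 ** bits
    step = 2 * math.pi / levels
    index = int((phase + math.pi) / step)
    index = min(index, levels - 1)            # clamp the +pi endpoint
    recon = -math.pi + (index + 0.5) * step   # mid-point reconstruction
    return index, recon

# A uniform quantizer matches a uniform phase distribution; the paper's
# triangular phase-difference quantizer would instead place levels more
# densely near zero (a non-uniform, Lloyd-Max-style design).
for bits in (2, 3, 4):
    worst = max(abs(uniform_phase_quantizer(p, bits)[1] - p)
                for p in [i * 0.01 - math.pi for i in range(628)])
    print(f"{bits} bits: worst-case phase error {worst:.3f} rad")
```

With mid-point reconstruction, the worst-case error is half the step size, i.e. π / 2^bits radians.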
Dubé and Beaudoin have proposed a lossless data compression technique called compression via substring enumeration (CSE) for a binary source alphabet. Dubé and Yokoo proved that CSE has linear worst-case time and space complexity in the length of the string to be encoded. They also specified appropriate predictors for the uniform and combinatorial prediction models of CSE, and proved that CSE is asymptotically optimal for stationary binary ergodic sources. Our previous study evaluated the worst-case maximum redundancy of the modified CSE for an arbitrary binary string from the class of k-th order Markov sources. In this study, we propose a generalization of CSE to k-th order Markov sources with a finite alphabet X, based on Ota and Morita.
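As a toy illustration of the statistics CSE works with (not the actual encoder, which transmits these counts incrementally and losslessly), the following Python sketch enumerates the occurrence counts of every substring of a binary string up to a given length; the helper name and `max_len` cutoff are assumptions:

```python
from collections import Counter

def substring_counts(s, max_len):
    """Count every substring of s up to max_len characters long —
    the kind of substring statistics that compression via substring
    enumeration (CSE) encodes."""
    counts = Counter()
    for length in range(1, max_len + 1):
        for i in range(len(s) - length + 1):
            counts[s[i:i + length]] += 1
    return counts

c = substring_counts("0110110", 2)
print(dict(c))  # e.g. '1' occurs 4 times, '11' occurs twice
```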
Summary form only given. Reports a parallel scheme for text data compression. The scheme utilizes the simple, regular, modular and cascadable structure of cellular automata (CA) with a local interconnection structure that ideally suits VLSI technology. The state transition behaviour of a particular class of non-group CA, referred to as TPSA (two-predecessor single-attractor) CA, has been studied extensively, and the results are utilized to develop a parallel scheme for data compression. The state transition diagram of a TPSA CA generates a unique inverted binary tree. This inverted binary tree is a labeled tree whose leaves and internal nodes have a unique pattern generated by the CA in successive cycles. This unique structure can be viewed as a dictionary for text compression. In effect, the storage and retrieval of the dictionary in conventional data compression techniques are replaced by the autonomous-mode operation of the CA, which generates the dictionary dynamically with appropriate mapping of dictionary data to CA states wherever necessary.
In this paper, we propose a new compression/decompression algorithm called LZB, which belongs to a class of algorithms related to Lempel-Ziv (LZ). The distinguishing characteristic of LZB is that it allows decompression from arbitrary points in the compressed data: decompressing a portion of the data does not require decompressing from the beginning. This is accomplished by setting a limit on how far back a reference in the compressed data can directly or indirectly point; this limit is defined as a sliding gate. In the well-known LZ77 algorithm, every match may be (directly or indirectly) referenced back to its first detected occurrence in the input, so decompressing any part of the data requires decompressing from the beginning of the compressed data. In LZB, however, each match is only traced within the confinement of the designated gate size, which eliminates the need to decompress from the beginning. An example is illustrated in Figure 1. To take advantage of LZB and decompress data from an arbitrary location in the stream, the decoder needs to know where to start. During compression, LZB stores a list of markers that specify distinct positions in the compressed version and their corresponding positions in the uncompressed version. LZB starts decompression from the closest marker preceding the gate of the beginning of the data of interest.
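The gate idea can be sketched with a toy LZ77-style codec in Python whose back-references are confined to a fixed window; this illustrates the general principle of bounded-reach references, not the paper's actual LZB implementation (the names, token format, and length-3 threshold are assumptions):

```python
def lz_compress(data, window):
    """Greedy LZ77-style compressor whose back-references cannot reach
    further back than `window` bytes (a bounded reach, as in the
    sliding-gate idea)."""
    tokens = []
    i = 0
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            length = 0
            while (i + length < len(data) and
                   data[j + length] == data[i + length] and
                   length < 255):
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        if best_len >= 3:                 # only encode worthwhile matches
            tokens.append(("match", best_off, best_len))
            i += best_len
        else:
            tokens.append(("lit", data[i]))
            i += 1
    return tokens

def lz_decompress(tokens):
    out = bytearray()
    for t in tokens:
        if t[0] == "lit":
            out.append(t[1])
        else:
            _, off, length = t
            for _ in range(length):       # byte-by-byte copy allows overlap
                out.append(out[-off])
    return bytes(out)

text = b"abracadabra abracadabra abracadabra"
tokens = lz_compress(text, window=16)
assert lz_decompress(tokens) == text
print(f"{len(text)} bytes -> {len(tokens)} tokens")
```

Because no reference reaches back more than `window` bytes, a decoder could in principle restart at window-aligned markers instead of the stream's beginning, which is the property LZB exploits.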
List update algorithms have been widely used as subroutines in compression schemes, most notably as part of Burrows-Wheeler compression. We performed an experimental comparison of various list update algorithms, both as stand-alone compression mechanisms and as a second stage of BWT-based compression. We considered the following list update algorithms: move-to-front (MTF), sort-by-rank (SBR), frequency-count (FC), timestamp (TS), and transpose (TR), on text files of the Calgary Corpus. Our results show that TR and FC usually outperform MTF and TS if we do not use the BWT. This is in contrast with competitive analysis, in which MTF and TS are superior to TR and FC. After the BWT, MTF and TS have comparable performance and always outperform FC and TR. Our experiments are consistent with the intuition that the BWT increases locality of reference, and with the result predicted by the locality-of-reference model in the work of Angelopoulos et al. (2008).
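A minimal move-to-front codec makes the intuition concrete: after a BWT, equal symbols cluster, so MTF emits mostly small indices, which a later entropy coder exploits. The sketch below is illustrative Python, not the code used in the experiments:

```python
def mtf_encode(data, alphabet=None):
    """Move-to-front: recently seen symbols get small indices."""
    table = list(alphabet if alphabet is not None else range(256))
    out = []
    for symbol in data:
        index = table.index(symbol)
        out.append(index)
        table.pop(index)
        table.insert(0, symbol)   # move the accessed symbol to the front
    return out

def mtf_decode(indices, alphabet=None):
    table = list(alphabet if alphabet is not None else range(256))
    out = []
    for index in indices:
        symbol = table.pop(index)
        out.append(symbol)
        table.insert(0, symbol)
    return bytes(out)

# Clustered input (as produced by a BWT) yields runs of zeros.
clustered = b"aaaabbbbcccc"
codes = mtf_encode(clustered)
print(codes)   # [97, 0, 0, 0, 98, 0, 0, 0, 99, 0, 0, 0]
assert mtf_decode(codes) == clustered
```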
We propose a high performance compression system with error resilience for sensor networks.
Many methodologies and similarity measures based on data compression have recently been introduced to compute similarities between general kinds of data. Two important similarity indices are the normalized information distance (NID), with its approximation the normalized compression distance (NCD), and pattern recognition based on data compression (PRDC). At first sight NCD and PRDC are quite different: the former is a direct metric, while the latter is a methodology that computes a compression distance with an intermediate step of encoding files into texts. In spite of this, it is possible to demonstrate that both are based on estimates of Kolmogorov complexity (this was already known for the former but not for the latter). This leads to the definition of a new measure: the model-conditioned data compression based similarity measure (McDCSM), a modified version of PRDC, which is the topic of this paper.
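The NCD mentioned above has a standard closed form, NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(·) is the length of a compressed representation. A sketch using `zlib` as the stand-in compressor (the sample strings below are illustrative, not from the paper):

```python
import zlib

def ncd(x, y):
    """Normalized compression distance: approximates the uncomputable
    normalized information distance by using a real compressor's output
    length as a proxy for Kolmogorov complexity."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 50
b = b"pack my box with five dozen liquor jugs " * 50
print(f"ncd(a, a) = {ncd(a, a):.3f}")   # small: identical data
print(f"ncd(a, b) = {ncd(a, b):.3f}")   # larger: unrelated data
assert ncd(a, a) < ncd(a, b)
```

A better compressor gives a tighter Kolmogorov-complexity estimate and hence a more faithful distance, which is why the choice of compressor (or, in PRDC, of encoding dictionary) matters.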
The Tiangong-2 Interferometric Imaging Radar Altimeter (InIRA) applies small-incidence-angle, high-precision interferometric phase measurement, characterized by wide swath, imaging capability and high-accuracy height measurement. Compared with a traditional altimeter, InIRA achieves a much wider swath and can obtain SAR images of the observed area, which places high demands on onboard memory and downlink speed. As a result, data compression techniques for InIRA are needed to reduce the data rate. As an efficient algorithm for SAR echo data compression, Block Adaptive Quantization (BAQ) is widely used in spaceborne SAR systems for raw data compression. The motivation of this work is to evaluate the influence of the BAQ algorithm on interferometric coherence and sea surface height error for InIRA.
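To make the idea of BAQ concrete, here is an illustrative Python sketch (not the InIRA implementation): the sample stream is split into blocks, each block is normalized by its own estimated standard deviation, and a fixed low-bit quantizer is applied. Real BAQ uses Lloyd-Max levels matched to Gaussian echo statistics; the uniform quantizer, block size, and bit width here are simplifying assumptions:

```python
import math
import random

def baq(samples, block_size=64, bits=3):
    """Block Adaptive Quantization sketch: normalize each block by its
    own RMS amplitude (the standard deviation for zero-mean echoes),
    then apply a uniform mid-rise quantizer covering roughly +/-3 sigma."""
    levels = 2 ** bits
    step = 6.0 / levels
    recon = []
    for b in range(0, len(samples), block_size):
        block = samples[b:b + block_size]
        sigma = math.sqrt(sum(s * s for s in block) / len(block)) or 1.0
        for s in block:
            q = max(0, min(levels - 1, int((s / sigma + 3.0) / step)))
            recon.append((-3.0 + (q + 0.5) * step) * sigma)
    return recon

random.seed(1)
raw = [random.gauss(0, 5) for _ in range(1024)]   # synthetic echo samples
rec = baq(raw)
mse = sum((a - b) ** 2 for a, b in zip(raw, rec)) / len(raw)
print(f"signal power {sum(s * s for s in raw) / len(raw):.2f}, MSE {mse:.2f}")
assert mse < sum(s * s for s in raw) / len(raw)   # quantizer beats silence
```

The per-block normalization is what makes the scheme adaptive: blocks with strong echoes and blocks with weak echoes are each quantized at full resolution relative to their own amplitude.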