Conferences related to Data Management

OCEANS 2016

The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy. The OCEANS conference spans four days: one day of tutorials and three days of approximately 500 technical papers and 150-200 exhibits.

  • OCEANS 2015 - MTS/IEEE Washington

    The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy. The OCEANS conference spans four days: one day of tutorials and three days of approximately 450 technical papers and 150-200 exhibits.

  • OCEANS 2014

    The OCEANS conference spans four days: one day of tutorials and three days of approximately 450 technical papers and 150-200 exhibits.

  • OCEANS 2013

    Three days of 8-10 tracks of technical sessions (400-450 papers) and a concurrent exhibition (150-250 exhibitors).

  • OCEANS 2012

    Ocean-related technology. Tutorials followed by three days of technical sessions and exhibits, with 8-12 parallel technical tracks.

  • OCEANS 2011

    The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy.

  • OCEANS 2010

    The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy.

  • OCEANS 2009

  • OCEANS 2008

    The Marine Technology Society (MTS) and the Oceanic Engineering Society (OES) of the Institute of Electrical and Electronics Engineers (IEEE) cosponsor a joint conference and exposition on ocean science, engineering, education, and policy. Held annually in the fall, it has become a focal point for the ocean and marine community to meet, learn, and exhibit products and services. The conference includes technical sessions, workshops, student poster sessions, job fairs, tutorials, and a large exhibit.

  • OCEANS 2007

  • OCEANS 2006

  • OCEANS 2005


2014 IEEE Semiconductor Wafer Test Workshop (SWTW 2014)

The IEEE SW Test Workshop is the only workshop specializing in semiconductor wafer level testing. It has a comprehensive technical program that is complemented by social activities which promote networking and sharing among the attendees. Booth displays at SWTW provide attendees with a one-stop opportunity to meet in person with all the key suppliers and learn about their new products and services.

  • 2013 IEEE Semiconductor Wafer Test Workshop (SWTW 2013)

    The IEEE SW Test Workshop is the only workshop specializing in semiconductor wafer level testing. It has a comprehensive technical program that is complemented by social activities which promote networking and sharing among the attendees. Booth displays at SWTW provide attendees with a one-stop opportunity to meet firsthand with all the key suppliers and learn about their new products and services.

  • 2012 IEEE Semiconductor Wafer Test Workshop (SWTW 2012)

    The IEEE SW Test Workshop is the only workshop specializing in semiconductor wafer level testing. It has a comprehensive technical program that is complemented by social activities which promote networking and sharing among the attendees. Booth displays at SWTW provide attendees with a one-stop opportunity to meet firsthand with all the key suppliers and learn about their new products and services.

  • 2011 IEEE Semiconductor Wafer Test Workshop (SWTW 2011)

    The IEEE SW Test Workshop is the only workshop specializing in semiconductor wafer level testing. It has a comprehensive technical program that is complemented by social activities which promote networking and sharing among attendees. Exhibit booth displays at SWTW provide attendees with a one-stop opportunity to meet firsthand with all key suppliers and learn about their new products and services.

  • 2010 IEEE Semiconductor Wafer Test Workshop (SWTW 2010)

    The IEEE SW Test Workshop is the only workshop specializing in semiconductor wafer level testing. It has a comprehensive technical program that is complemented by social activities which promote networking and sharing among the attendees. Booth displays at SWTW provide attendees with a one-stop opportunity to meet firsthand with all the key suppliers and learn about their new products and services.

  • 2009 IEEE Semiconductor Wafer Test Workshop (SWTW 2009)

    The IEEE SW Test Workshop is the only workshop specializing in semiconductor wafer level testing. It has a comprehensive technical program that is complemented by social activities which promote networking and sharing among the attendees. Booth displays at SWTW provide attendees with a one-stop opportunity to meet firsthand with all the key suppliers and learn about their new products and services.

  • 2008 IEEE Semiconductor Wafer Test Workshop (SWTW 2008)

    The SW Test Workshop is the only IEEE sponsored conference dealing with all aspects of semiconductor wafer and die level probe testing.


2013 ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN)

The conference features two interleaved tracks, the Information Processing (IP) track, and the Sensor Platforms, Tools and Design Methods (SPOTS) track.

  • 2012 ACM/IEEE 11th International Conference on Information Processing in Sensor Networks (IPSN)

    IPSN includes signal and image processing, information and coding theory, databases and information management, distributed algorithms, networks and protocols, wireless communications, collaborative objects and the Internet of Things, machine learning, and embedded systems design.

  • 2011 10th International Conference on Information Processing in Sensor Networks (IPSN)

    The International Conference on Information Processing in Sensor Networks (IPSN) is a leading, single-track, annual forum on sensor network research. IPSN brings together researchers from academia, industry, and government to present and discuss recent advances in both theoretical and experimental research.

  • 2010 9th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN)

    The International Conference on Information Processing in Sensor Networks (IPSN) is a leading, single-track, annual forum on sensor network research. IPSN brings together researchers from academia, industry, and government to present and discuss recent advances in both theoretical and experimental research.

  • 2009 8th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN)

    The International Conference on Information Processing in Sensor Networks (IPSN) is a leading, single-track, annual forum that brings together researchers from academia, industry, and government to present and discuss recent advances in sensor network research and applications. The conference covers both theoretical and experimental research, as it pertains to sensor networks, in a broad range of disciplines including signal and image processing, information and coding theory, databases and information management ...


2013 Eighth International Conference on Digital Information Management (ICDIM)

The principal aim of this conference is to bring people in academia, research laboratories and industry together, and offer a collaborative platform to address the emerging issues and solutions in digital information science and technology. The ICDIM intends to bridge the gap between different areas of digital information management, science and technology.

  • 2012 Seventh International Conference on Digital Information Management (ICDIM)

    The principal aim of this conference is to bring people in academia, research laboratories and industry together, and offer a collaborative platform to address the emerging issues and solutions in digital information science and technology.

  • 2011 Sixth International Conference on Digital Information Management (ICDIM)

    ICDIM 2011 is a forum of academic and industrial researchers and scientists in digital information management and technology. It addresses research in significant areas of information management, database management, and process management.

  • 2010 Fifth International Conference on Digital Information Management (ICDIM)

    The International Conference on Digital Information Management is a multidisciplinary conference on digital information management, science and technology.

  • 2009 Fourth International Conference on Digital Information Management (ICDIM)

    The principal aim of this conference is to bring people in academia, research laboratories, and industry together and offer a collaborative platform to address the emerging issues and solutions in digital information science and technology. The ICDIM intends to bridge the gap between different areas of digital information management, science and technology. This forum will address a large number of themes and issues. The conference will have original research and industrial papers on the theory, design, and implementation ...

  • 2008 Third International Conference on Digital Information Management (ICDIM)

    The International Conference on Digital Information Management is a multidisciplinary conference on digital information management, science and technology. The principal aim of this conference is to bring people in academia, research laboratories, and industry together and offer a collaborative platform to address the emerging issues and solutions in digital information science and technology. The ICDIM intends to bridge the gap between different areas of digital information management, science and technology. This forum ...


2013 IEEE International Conference on Data Engineering (ICDE 2013)

The annual IEEE International Conference on Data Engineering (ICDE) addresses research issues in designing, building, managing, and evaluating advanced data-intensive systems and applications. It is a leading forum for researchers, practitioners, developers, and users to explore cutting-edge ideas and to exchange techniques, tools, and experiences.


More Conferences

Periodicals related to Data Management

Engineering Management Review, IEEE

Reprints articles from other publications of significant interest to members. The papers are aimed at those engaged in managing research, development, or engineering activities. Reprints make it possible for the readers to receive the best of today's literature without having to subscribe to and read other periodicals.


Engineering Management, IEEE Transactions on

Management of technical functions such as research, development, and engineering in industry, government, university, and other settings. Emphasis is on studies carried on within an organization to help in decision making or policy formation for RD&E.


Knowledge and Data Engineering, IEEE Transactions on

Artificial intelligence techniques, including speech, voice, graphics, images, and documents; knowledge and data engineering tools and techniques; parallel and distributed processing; real-time distributed processing; system architectures, integration, and modeling; database design, modeling, and management; query design, and implementation languages; distributed database control; statistical databases; algorithms for data and knowledge management; performance evaluation of algorithms and systems; data communications aspects; system ...


Potentials, IEEE

This award-winning magazine for technology professionals explores career strategies, the latest research and important technical developments. IEEE Potentials covers theories to practical applications and highlights technology's global impact.


Proceedings of the IEEE

The most highly cited general-interest journal in electrical engineering and computer science, the Proceedings is the best way to stay informed on an exemplary range of topics. This journal also holds the distinction of having the longest useful archival life of any EE or computer-related journal in the world! Since 1913, the Proceedings of the IEEE has been the ...


More Periodicals

Xplore Articles related to Data Management

To Facilitate File Sync and File Share with Ezilla-Swift

Hui-Shan Chen; Yi-Lun Pan; Chang-Hsing Wu; Weicheng Huang 2014 IEEE 6th International Conference on Cloud Computing Technology and Science, 2014

Due to the rapid growth in demand for Cloud Computing technologies and services, fast and easy deployment of the Cloud environment and software has become mandatory for providers. To help tackle this issue, Ezilla, a private Cloud toolkit, has been developed by the Per Comp Team in the National Center for ...


An analysis of transmission line grounding techniques using a digitized data management solution

James Aylor; Will Pike; James Shaheen; Kristopher Van Norman; Callum Weinberg 2016 IEEE Systems and Information Engineering Design Symposium (SIEDS), 2016

In 2014, the Occupational Safety and Health Administration (OSHA) created new regulations regarding safe work practices for transmission line construction. These new regulations primarily focus upon improving the practices of grounding equipment during construction to create a safer work zone. Workers are at risk of touch and step potentials from high voltage lines and current propagation through the surrounding soil. ...


Compiling HPC Kernels for the REDEFINE CGRA

Kavitha T. Madhu; Saptarsi Das; Nalesh S.; S. K. Nandy; Ranjani Narayan 2015 IEEE 17th International Conference on High Performance Computing and Communications, 2015 IEEE 7th International Symposium on Cyberspace Safety and Security, and 2015 IEEE 12th International Conference on Embedded Software and Systems, 2015

In this paper, we present a compilation flow for HPC kernels on the REDEFINE coarse-grain reconfigurable architecture (CGRA). REDEFINE is a scalable macro-dataflow machine in which the compute elements (CEs) communicate through messages. REDEFINE offers the ability to exploit a high degree of coarse-grain and pipeline parallelism. The CEs in REDEFINE are enhanced with reconfigurable macro data-paths called HyperCells that ...


Automating Management of Resources on Desktop Machines

Mian Guan Lim; Frank Wang; Nan Jiang 2015 17th UKSim-AMSS International Conference on Modelling and Simulation (UKSim), 2015

Two challenges that affect retrieval tasks in a typical computer desktop user environment are the speed and accuracy of accessing information such as documents and media files. Generating a common model to fit patterns of use with a reasonable level of prediction is challenging, because individual usage patterns on the desktop are assumed to vary widely depending ...


Establishing a postgraduate research programme in spatial data management

S. K. Cockcroft; P. G. Firns Software Education Conference, 1994. Proceedings., 1994

To cope with the supervision of increasing numbers of postgraduate students, a useful strategy is to establish research programmes comprising a number of interrelated projects. The success of such a programme is dependent upon the existence of clearly stated long-term and medium-term objectives. The paper describes a research programme, being developed within the University of Otago's Department of Information Science, ...


More Xplore Articles

Educational Resources on Data Management

eLearning


More eLearning Resources

IEEE-USA E-Books

  • An Integration Protocol for Extensible Namespaces (PEN)

    This paper describes a toolkit approach for building complex hierarchical namespaces based upon PEN (Protocol for Extensible Namespaces). The namespace paradigm provides a highly extensible and easily understood means for specifying how things are to be integrated. PEN's importance lies in its ability to simultaneously facilitate both hierarchical and contextual organization and access to information resources equally well by people and applications. The PEN concept has been successfully prototyped in a Design Data Management System (DDMS) for electronic CAD-based design.

  • No title

    Interactive display and visualization of large geometric and textured models is becoming a fundamental capability. There are numerous application areas, including games, movies, CAD, virtual prototyping, and scientific visualization. One observation about geometric models used in interactive applications is that their model complexity continues to increase because of fundamental advances in 3D modeling, simulation, and data capture technologies. As computing power increases, users take advantage of the algorithmic advances and generate even more complex models and data sets. Therefore, there are many cases where we are required to visualize massive models that consist of hundreds of millions of triangles and even billions of triangles. However, interactive visualization and handling of such massive models still remains a challenge in computer graphics and visualization. In this monograph we discuss various techniques that enable interactive visualization of massive models. These techniques include visibility computation, simplification, levels-of-detail, and cache-coherent data management. We believe that the combination of these techniques can make it possible to interactively visualize massive models on commodity hardware. Table of Contents: Introduction / Visibility / Simplification and Levels of Detail / Alternative Representations / Cache-Coherent Data Management / Conclusions / Bibliography

  • Restoration Techniques

    "At a time when bulk power systems operate close to their design limits, the restructuring of the electric power industry has created vulnerability to potential blackouts. Prompt and effective power system restoration is essential for the minimization of downtime and costs to the utility and its customers, which mount rapidly after a system blackout. Power System Restoration meets the complex challenges that arise from the dynamic capabilities of new technology in areas such as large-scale system analysis, communication and control, data management, artificial intelligence, and allied disciplines. It provides an up-to-date description of the restoration methodologies and implementation strategies practiced internationally. The book opens with a general overview of the restoration process and then covers: * Techniques used in restoration planning and training * Knowledge-based systems as operational aids in restoration * Issues associated with hydro and thermal power plants * High and extra-high voltage transmission systems * Restoration of distribution systems Power System Restoration is essential reading for all power system planners and operating engineers in the power industry. It is also a valuable reference for researchers, practicing power engineers, and engineering students." Sponsored by: IEEE Power Engineering Society

  • Ontology-Based Analysis of Protein Interaction Networks

    This chapter contains sections titled: Definition of Ontology Languages for Modeling Ontologies Biomedical Ontologies Ontology-Based Analysis of Protein Interaction Data Semantic Similarity Measures of Proteins The Gene Ontology Annotation Database (GOA) FussiMeg and ProteinOn Summary

  • No title

    Access control is one of the fundamental services that any Data Management System should provide. Its main goal is to protect data from unauthorized read and write operations. This is particularly crucial in today's open and interconnected world, where each kind of information can be easily made available to a huge user population, and where damage or misuse of data may have unpredictable consequences that go beyond the boundaries where data reside or have been generated. This book provides an overview of the various developments in access control for data management systems. Discretionary, mandatory, and role-based access control will be discussed, by surveying the most relevant proposals and analyzing the benefits and drawbacks of each paradigm in view of the requirements of different application domains. Access control mechanisms provided by commercial Data Management Systems are presented and discussed. Finally, the last part of the book is devoted to discussion of some of the most challenging and innovative research trends in the area of access control, such as those related to the Web 2.0 revolution or to the Database as a Service paradigm. This book is a valuable reference for a heterogeneous audience. It can be used as either an extended survey for people who are interested in access control or as a reference book for senior undergraduate or graduate courses in data security with a special focus on access control. It is also useful for technologists, researchers, managers, and developers who want to know more about access control and related emerging trends. Table of Contents: Access Control: Basic Concepts / Discretionary Access Control for Relational Data Management Systems / Discretionary Access Control for Advanced Data Models / Mandatory Access Control / Role-based Access Control / Emerging Trends in Access Control

  • Distribution System Restoration

    "At a time when bulk power systems operate close to their design limits, the restructuring of the electric power industry has created vulnerability to potential blackouts. Prompt and effective power system restoration is essential for the minimization of downtime and costs to the utility and its customers, which mount rapidly after a system blackout. Power System Restoration meets the complex challenges that arise from the dynamic capabilities of new technology in areas such as large-scale system analysis, communication and control, data management, artificial intelligence, and allied disciplines. It provides an up-to-date description of the restoration methodologies and implementation strategies practiced internationally. The book opens with a general overview of the restoration process and then covers: * Techniques used in restoration planning and training * Knowledge-based systems as operational aids in restoration * Issues associated with hydro and thermal power plants * High and extra-high voltage transmission systems * Restoration of distribution systems Power System Restoration is essential reading for all power system planners and operating engineers in the power industry. It is also a valuable reference for researchers, practicing power engineers, and engineering students." Sponsored by: IEEE Power Engineering Society

  • Cryptography

    The field of cryptography has experienced an unprecedented development in the past decade and the contributors to this book have been in the forefront of these developments. In an information-intensive society, it is essential to devise means to accomplish, with information alone, every function that it has been possible to achieve in the past with documents, personal control, and legal protocols (secrecy, signatures, witnessing, dating, certification of receipt and/or origination). This volume focuses on all these needs, covering all aspects of the science of information integrity, with an emphasis on the cryptographic elements of the subject. In addition to being an introductory guide and survey of all the latest developments, this book provides the engineer and scientist with algorithms, protocols, and applications. Of interest to computer scientists, communications engineers, data management specialists, cryptographers, mathematicians, security specialists, network engineers.

  • Data Undermining: Data Relationality and Networked Experience

    This chapter contains sections titled: Databases and the cognitive labor of data management, Data as relation of relations, The perceptible and imperceptibility, A selective genealogy of data mining, Relational data aesthetics after the relational networking of data, Becoming imperceptible in an age of perceptibility, Becoming imperceptible in an age of imperceptibility

  • Visualization of Protein Interaction Networks

    This chapter contains sections titled: Introduction Cytoscape CytoMCL NAViGaTOR BioLayout Express3D Medusa ProViz Ondex PIVOT Pajek Graphviz GraphCrunch VisANT PIANA Osprey cPATH PATIKA Summary

  • No title

    While many Web 2.0-inspired approaches to semantic content authoring do acknowledge motivation and incentives as the main drivers of user involvement, the amount of useful human contributions actually available will always remain a scarce resource. Complementarily, there are aspects of semantic content authoring in which automatic techniques have proven to perform reliably, and the added value of human (and collective) intelligence is often a question of cost and timing. The challenge that this book attempts to tackle is how these two approaches (machine- and human-driven computation) could be combined in order to improve the cost-performance ratio of creating, managing, and meaningfully using semantic content. To do so, we need to first understand how theories and practices from social sciences and economics about user behavior and incentives could be applied to semantic content authoring. We will introduce a methodology to help software designers to embed incentives-minded functionalities into semantic applications, as well as best practices and guidelines. We will present several examples of such applications, addressing tasks such as ontology management, media annotation, and information extraction, which have been built with these considerations in mind. These examples illustrate key design issues of incentivized Semantic Web applications that might have a significant effect on the success and sustainable development of the applications: the suitability of the task and knowledge domain to the intended audience, and the mechanisms set up to ensure high-quality contributions, and extensive user involvement. Table of Contents: Semantic Data Management: A Human-driven Process / Fundamentals of Motivation and Incentives / Case Study: Motivating Employees to Annotate Content / Case Study: Building a Community of Practice Around Web Service Management and Annotation / Case Study: Games with a Purpose for Semantic Content Creation / Conclusions
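
The discretionary, mandatory, and role-based paradigms surveyed in the access-control volume above lend themselves to compact illustration. Below is a minimal sketch of role-based access control (RBAC), where users acquire permissions only through roles; all role, permission, and user names are invented for illustration and are not taken from any particular system or book.

```python
# Minimal role-based access control (RBAC) sketch.
# Roles map to permission sets; users map to role sets.
# All names here are illustrative, not from a specific product.

ROLE_PERMISSIONS = {
    "reader": {"read"},
    "editor": {"read", "write"},
    "admin": {"read", "write", "grant"},
}

USER_ROLES = {
    "alice": {"admin"},
    "bob": {"reader"},
}

def is_allowed(user: str, permission: str) -> bool:
    """A user holds a permission if any of their roles grants it."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )

print(is_allowed("alice", "write"))  # True: admin role includes write
print(is_allowed("bob", "write"))    # False: reader role is read-only
```

The indirection through roles is what distinguishes RBAC from discretionary access control, where permissions would be attached to users directly.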



Standards related to Data Management

Guide for Information Technology - Software Life Cycle Processes - Life Cycle Data

The base document, ISO/IEC 12207, establishes a common framework for software life cycle processes, with well-defined terminology, that can be referenced by the software industry. It contains activities and tasks that are to be applied during the acquisition of a system that contains software, a stand-alone software product, or a software service, and during the supply, development, operation, and maintenance of ...


IEEE Recommended Practice for the Transfer of Power Quality Data

This recommended practice develops criteria for the transfer of power quality data between instruments and computers. This data includes raw, processed, simulated, proposed, specified, and calculated data. The transfer criteria include the data as well as appropriate data characterization parameters, such as sampling rate, resolution, calibration status, instrument identification, and other pertinent or desired characteristics or data. ...
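
The characterization parameters listed above (sampling rate, resolution, calibration status, instrument identification) can be pictured as metadata traveling with each measurement set. The sketch below is purely illustrative: the recommended practice defines its own interchange format, and none of these field names are taken from it.

```python
from dataclasses import dataclass, field

@dataclass
class PQRecord:
    """Illustrative power-quality data record: raw samples plus the
    characterization metadata the text mentions. Field names are
    hypothetical, not from the IEEE recommended practice."""
    instrument_id: str
    sampling_rate_hz: float
    resolution_bits: int
    calibrated: bool
    samples: list = field(default_factory=list)

    def duration_s(self) -> float:
        # Capture duration follows from sample count and rate.
        return len(self.samples) / self.sampling_rate_hz

rec = PQRecord("meter-01", sampling_rate_hz=256.0,
               resolution_bits=16, calibrated=True,
               samples=[0.0] * 512)
print(rec.duration_s())  # 512 samples at 256 Hz -> 2.0 seconds
```

Bundling the metadata with the samples is the point: a receiving computer cannot interpret raw waveform values without knowing the rate, resolution, and calibration status under which they were captured.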


IEEE Standard for a Smart Transducer Interface for Sensors and Actuators -- Wireless Communication Protocols and Transducer Electronic Data Sheet (TEDS) Formats

This project establishes a standard for wireless communication methods and data format for transducers (sensors and actuators). The standard defines a TEDS based on the IEEE 1451 concept and protocols to access TEDS and transducer data. It adopts necessary wireless interfaces and protocols to facilitate the use of technically differentiated, existing wireless technology solutions. It does not specify transducer design, ...


IEEE Standard for a Smart Transducer Interface for Sensors and Actuators - Common Functions, Communication Protocols, and Transducer Electronic Data Sheet (TEDS) Formats

This project develops a set of common functionality for the family of IEEE 1451 smart transducer interface standards. This functionality is independent of the physical communications media. It includes the basic functions required to control and manage smart transducers, common communications protocols, and media-independent TEDS formats. It defines a set of implementation-independent APIs. This project does not specify signal conditioning ...
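
The media-independent TEDS idea described above, where the same electronic data sheet can be read regardless of the physical link, can be sketched as a small API in which the TEDS parser is written against an abstract transport. Everything here (field names, the transport interface, the fake link) is a hypothetical illustration, not the actual IEEE 1451 API.

```python
# Hypothetical sketch of media-independent TEDS access: parsing is
# decoupled from the physical link, so the same code works over any
# transport. Not the real IEEE 1451 interface.
from abc import ABC, abstractmethod

class Transport(ABC):
    @abstractmethod
    def read_teds_bytes(self) -> bytes: ...

class FakeWireLink(Transport):
    """Stand-in transport returning a fixed TEDS payload."""
    def read_teds_bytes(self) -> bytes:
        return b"manufacturer=Acme;model=T100;serial=42"

def read_teds(link: Transport) -> dict:
    """Parse key=value pairs from whatever the link delivered."""
    raw = link.read_teds_bytes().decode()
    return dict(pair.split("=", 1) for pair in raw.split(";"))

teds = read_teds(FakeWireLink())
print(teds["model"])  # T100
```

A wireless or multidrop transport would simply provide another `Transport` subclass; the TEDS-reading logic above would not change, which is the media independence the standard family is after.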


IEEE Standard for a Smart Transducer Interface for Sensors and Actuators - Digital Communication and Transducer Electronic Data Sheet (TEDS) Formats for Distributed Multidrop Systems

This project will develop a standard that defines a digital interface for connecting multiple physically separated transducers. It will build on the IEEE P1451.1 and IEEE P1451.2 standards. The standard will define the TEDS format, the electrical interface, channel identification protocols, hot swap protocols, time synchronization protocols, and the read and write logic functions used to access the TEDS and ...


More Standards