Conferences related to Biographies


2008 IEEE International Engineering Management Conference - EM 2008 (IEMC) (CONFERENCE CANCELLED)

All areas of engineering and technology management. This is the Engineering Management Society flagship conference series.



Periodicals related to Biographies


Spectrum, IEEE

IEEE Spectrum Magazine, the flagship publication of the IEEE, explores the development, applications and implications of new technologies. It anticipates trends in engineering, science, and technology, and provides a forum for understanding, discussion and leadership in these areas. IEEE Spectrum is the world's leading engineering and scientific magazine. Read by over 300,000 engineers worldwide, Spectrum provides international coverage of all ...




Xplore Articles related to Biographies


The Atmospheric Characterization for Exploration and Science (ACES) instrument suite for Mars

Scot Rafkin, 2015 IEEE Aerospace Conference, 2015

The Atmospheric Characterization for Exploration and Science (ACES) instrument suite was designed to address the highest priority lower atmosphere goals and investigations identified by the Mars Exploration Program[1], and to address both exploration technology Strategic Knowledge Gaps (SKGs) and science goals identified for the Mars 2020 mission[2][3]. The ACES Mars surface in situ instrument suite measures atmospheric dust properties, fundamental ...


Corrections to “SF/RESURF LDMOST”

IEEE Transactions on Electron Devices, 2004

None


Impact of co-site interference on L/C-band spectrum for UAS control and non-payload communications

Robert J. Kerczewski; William D. Bishop; Douglas J. Hoder; Kurt A. Shalkhauser; Jeffrey D. Wilson, 2015 IEEE Aerospace Conference, 2015

In order to provide for the safe integration of unmanned aircraft systems into the National Airspace System, the control and non-payload communications (CNPC) link connecting the ground-based pilot with the unmanned aircraft must be highly reliable. A specific requirement is that it must operate using aviation safety radiofrequency spectrum. The 2012 World Radiocommunication Conference (WRC-12) provided a potentially suitable allocation ...


Biographies

2005 IEEE International Reliability Physics Symposium Proceedings, 43rd Annual, 2005

First Page of the Article


Claude Shannon: Mastermind of Information Theory

George Strawn, IT Professional, 2014

Claude Shannon helped create the digital IT revolution by contributing to both digital computing and digital communications. Learn about Shannon's contributions to digital circuit theory and information theory and the connection he initiated between information and physics.


More Xplore Articles

Educational Resources on Biographies


eLearning


More eLearning Resources

IEEE.tv Videos

No IEEE.tv Videos are currently tagged "Biographies"

IEEE-USA E-Books

  • Authors' Biographies

    None

  • No title

    Today, embedded systems are used in many security-critical applications, from access control, electronic tickets, sensors, and smart devices (e.g., wearables) to automotive applications and critical infrastructures. These systems are increasingly used to produce and process both security-critical and privacy-sensitive data, which bear many security and privacy risks. Establishing trust in the underlying devices and making them resistant to software and hardware attacks is a fundamental requirement in many applications and a challenging, yet unsolved, task. Solutions solely based on software can never ensure their own integrity and trustworthiness, while resource constraints and economic factors often prevent the integration of sophisticated security hardware and cryptographic co-processors. In this context, Physically Unclonable Functions (PUFs) are an emerging and promising technology to establish trust in embedded systems with minimal hardware requirements. This book explores the design of trusted embedded systems based on PUFs. Specifically, it focuses on the integration of PUFs into secure and efficient cryptographic protocols that are suitable for a variety of embedded systems. It exemplarily discusses how PUFs can be integrated into lightweight device authentication and attestation schemes, which are popular and highly relevant applications of PUFs in practice. For the integration of PUFs into secure cryptographic systems, it is essential to have a clear view of their properties. This book gives an overview of different approaches to evaluate the properties of PUF implementations and presents the results of a large-scale security analysis of different PUF types implemented in application-specific integrated circuits (ASICs). To analyze the security of PUF-based schemes as is common in modern cryptography, it is necessary to have a security framework for PUFs and PUF-based systems. In this book, we give a flavor of the formal modeling of PUFs that is in its beginning and that is still undergoing further refinement in current research. The objective of this book is to provide a comprehensive overview of the current state of secure PUF-based cryptographic system design and the related challenges and limitations. Table of Contents: Preface / Introduction / Basics of Physically Unclonable Functions / Attacks on PUFs and PUF-based Systems / Advanced PUF Concepts / PUF Implementations and Evaluation / PUF-based Cryptographic Protocols / Security Model for PUF-based Systems / Conclusion / Terms and Abbreviations / Bibliography / Authors' Biographies [A toy challenge-response sketch follows after this list.]

  • Editors' Biographies

    "This anthology combines 15 years of microstrip antenna technology research into one significant volume and includes a special introductory tutorial by the co-editors. Covering theory, design and modeling techniques and methods, this source book is an excellent reference tool for engineers who want to become more familiar with microstrip antennas and microwave systems. Proven antenna designs, novel solutions to practical design problemsand relevant papers describing the theory of operation and analysis of microstrip antennas are contained within this convenient reference."

  • No title

    This book is about how people (we refer to them as practitioners) can help guide participants in creating representations of issues or ideas, such as collaborative diagrams, especially in the context of Participatory Design (PD). At their best, such representations can reach a very high level of expressiveness and usefulness, an ideal we refer to as Knowledge Art. Achieving that level requires effective engagement, often aided by facilitators or other practitioners. Most PD research focuses on tools and methods, or on participant experience. The next source of advantage is to better illuminate the role of practitioners: the people working with participants, tools, and methods in service of a project's larger goals. Just like participants, practitioners experience challenges, interactions, and setbacks, and come up with creative ways to address them while maintaining their stance of service to participants and stakeholders. Our research interest is in understanding what moves and choices practitioners make that either help or hinder participants' engagement with representations. We present a theoretical framework that looks at these choices from the experiential perspectives of narrative, aesthetics, ethics, sensemaking and improvisation and apply it to five diverse case studies of actual practice. Table of Contents: Acknowledgments / Introduction / Participatory Design and Representational Practice / Dimensions of Knowledge Art / Case Studies / Discussion and Conclusions / Appendix: Knowledge Art Analytics / Bibliography / Author Biographies

  • No title

    Cooperative network supercomputing is becoming increasingly popular for harnessing the power of the global Internet computing platform. A typical Internet supercomputer consists of a master computer or server and a large number of computers called workers, performing computation on behalf of the master. Despite the simplicity and benefits of a single-master approach, as the scale of such computing environments grows, it becomes unrealistic to assume the existence of an infallible master that is able to coordinate the activities of multitudes of workers. Large-scale distributed systems are inherently dynamic and are subject to perturbations, such as failures of computers and network links; thus it is also necessary to consider fully distributed peer-to-peer solutions. We present a study of cooperative computing with the focus on modeling distributed computing settings, algorithmic techniques enabling one to combine efficiency and fault-tolerance in distributed systems, and the exposition of trade-offs between efficiency and fault-tolerance for robust cooperative computing. The focus of the exposition is on the abstract problem, called Do-All, formulated in terms of a system of cooperating processors that together need to perform a collection of tasks in the presence of adversity. Our presentation deals with models, algorithmic techniques, and analysis. Our goal is to present the most interesting approaches to algorithm design and analysis leading to many fundamental results in cooperative distributed computing. The algorithms selected for inclusion are among the most efficient that additionally serve as good pedagogical examples. Each chapter concludes with exercises and bibliographic notes that include a wealth of references to related work and relevant advanced results. Table of Contents: Introduction / Distributed Cooperation and Adversity / Paradigms and Techniques / Shared-Memory Algorithms / Message-Passing Algorithms / The Do-All Problem in Other Settings / Bibliography / Authors' Biographies

  • No title

    This synthesis lecture presents an intuitive introduction to the mathematics of motion and deformation in computer graphics. Starting with familiar concepts in graphics, such as Euler angles, quaternions, and affine transformations, we illustrate that a mathematical theory behind these concepts enables us to develop techniques for the efficient and effective creation of computer animation. This book, therefore, serves as a good guidepost to mathematics (differential geometry and Lie theory) for students of geometric modeling and animation in computer graphics. Experienced developers and researchers will also benefit from this book, since it gives a comprehensive overview of mathematical approaches that are particularly useful in character modeling, deformation, and animation. Table of Contents: Preface / Symbols and Notations / Introduction / Rigid Transformation / Affine Transformation / Exponential and Logarithm of Matrices / 2D Affine Transformation between Two Triangles / Global 2D Shape Interpolation / Parametrizing 3D Positive Affine Transformations / Further Readings / Bibliography / Authors' Biographies [A short rotation-interpolation sketch follows after this list.]

  • No title

    Every year lives and property are lost in road accidents. About one-fourth of these accidents are due to low visibility in foggy weather. At present, there is no algorithm that is specifically designed for the removal of fog from videos. Application of a single-image fog removal algorithm over each video frame is a time-consuming and costly affair. It is demonstrated that with the intelligent use of temporal redundancy, fog removal algorithms designed for a single image can be extended to real-time video applications. Results confirm that the presented framework for extending image fog removal algorithms to videos can reduce the complexity to a great extent with no loss of perceptual quality. This paves the way for the real-life application of the video fog removal algorithm. In order to remove fog, an efficient fog removal algorithm using anisotropic diffusion is developed. The presented fog removal algorithm uses a new dark channel assumption and anisotropic diffusion for the initialization and refinement of the airlight map, respectively. Use of anisotropic diffusion helps to obtain a better airlight map estimate. The said fog removal algorithm requires a single image captured by an uncalibrated camera system. The anisotropic diffusion-based fog removal algorithm can be applied in both RGB and HSI color spaces. This book shows that the use of HSI color space reduces the complexity further. The said fog removal algorithm requires pre- and post-processing steps for better restoration of the foggy image. These pre- and post-processing steps have either data-driven or constant parameters that avoid user intervention. The presented fog removal algorithm is independent of the fog intensity; thus, even in the case of heavy fog, it performs well. Qualitative and quantitative results confirm that the presented fog removal algorithm outperformed previous algorithms in terms of perceptual quality, color fidelity, and execution time. The work presented in this book can find wide application in entertainment industries, transportation, tracking and consumer electronics. Table of Contents: Acknowledgments / Introduction / Analysis of Fog / Dataset and Performance Metrics / Important Fog Removal Algorithms / Single-Image Fog Removal Using an Anisotropic Diffusion / Video Fog Removal Framework Using an Uncalibrated Single Camera System / Conclusions and Future Directions / Bibliography / Authors' Biographies [A dark-channel sketch follows after this list.]

  • Front Matter

    The prelims comprise: Half Title, Title, Copyright, Contents, Preface, Editorial Notes, Notes on Vocabulary, List of Abbreviations, Cryptanalytic Significance of the Analysis of Tunny, Editors' Introduction, Statistics at Bletchley Park, Biographies of Authors, Notes on the Editors of the Present Volume, and List of Figures.

  • Front Matter

    The prelims comprise: Half-Title Page, Title Page, Copyright Page, Table of Contents, Foreword, Preface, Acknowledgments, Author Biographies, and List of Symbols.

  • No title

    Roughly a decade ago, power consumption and heat dissipation concerns forced the semiconductor industry to radically change its course, shifting from sequential to parallel computing. Unfortunately, improving performance of applications has now become much more difficult than in the good old days of frequency scaling. This is also affecting databases and data processing applications in general, and has led to the popularity of so-called data appliances--specialized data processing engines, where software and hardware are sold together in a closed box. Field-programmable gate arrays (FPGAs) increasingly play an important role in such systems. FPGAs are attractive because the performance gains of specialized hardware can be significant, while power consumption is much less than that of commodity processors. On the other hand, FPGAs are way more flexible than hard-wired circuits (ASICs) and can be integrated into complex systems in many different ways, e.g., directly in the network for a high-frequency trading application. This book gives an introduction to FPGA technology targeted at a database audience. In the first few chapters, we explain in detail the inner workings of FPGAs. Then we discuss techniques and design patterns that help mapping algorithms to FPGA hardware so that the inherent parallelism of these devices can be leveraged in an optimal way. Finally, the book will illustrate a number of concrete examples that exploit different advantages of FPGAs for data processing. Table of Contents: Preface / Introduction / A Primer in Hardware Design / FPGAs / FPGA Programming Models / Data Stream Processing / Accelerated DB Operators / Secure Data Processing / Conclusions / Bibliography / Authors' Biographies / Index

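The PUF lecture above centers on integrating PUFs into lightweight device authentication. As noted in that entry, here is a toy Python stand-in, purely illustrative and not the book's construction: a real PUF derives its responses from uncontrollable manufacturing variation rather than a stored key, and the class and variable names below are invented for this sketch.

```python
import hmac, hashlib, os

class SimulatedPUF:
    """Software stand-in for a device whose challenge responses are device-unique."""
    def __init__(self):
        # Stands in for physical manufacturing variation; a real PUF stores no key.
        self._variation = os.urandom(32)

    def response(self, challenge: bytes) -> bytes:
        return hmac.new(self._variation, challenge, hashlib.sha256).digest()

# Enrollment: in a trusted phase the verifier records challenge-response pairs (CRPs).
device = SimulatedPUF()
crp_db = {}
for _ in range(4):
    challenge = os.urandom(16)
    crp_db[challenge] = device.response(challenge)

# Authentication: replay a stored challenge and compare with the recorded response.
challenge, expected = next(iter(crp_db.items()))
print(hmac.compare_digest(device.response(challenge), expected))  # True for the genuine device
```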

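The motion-and-deformation lecture above covers the exponential and logarithm of matrices as the machinery behind interpolating transformations. The following minimal sketch shows that general idea (geodesic interpolation between two rotations), assuming numpy and scipy are available; it illustrates the technique and is not code from the book.

```python
import numpy as np
from scipy.linalg import expm, logm

def rot_z(theta):
    """Rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def interpolate_rotation(R0, R1, t):
    """Geodesic interpolation on SO(3): R(t) = R0 * exp(t * log(R0^T R1))."""
    L = logm(R0.T @ R1)             # relative rotation, mapped into the Lie algebra
    return (R0 @ expm(t * L)).real  # map back to the group; drop round-off imaginary parts

R0, R1 = rot_z(0.0), rot_z(np.pi / 2)
half = interpolate_rotation(R0, R1, 0.5)
print(np.allclose(half, rot_z(np.pi / 4)))  # True: halfway is a 45-degree rotation
```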

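The fog-removal entry above describes initializing an airlight map from a dark channel assumption and refining it with anisotropic diffusion. Below is a minimal sketch of only the first step, the dark-channel and rough transmission computation, assuming numpy and scipy; the diffusion-based refinement is not shown, and the function names are invented for this sketch.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(image, patch_size=15):
    """Per-pixel minimum over the RGB channels and a local patch.

    image: float array of shape (H, W, 3) with values in [0, 1].
    """
    per_pixel_min = image.min(axis=2)                      # darkest color channel at each pixel
    return minimum_filter(per_pixel_min, size=patch_size)  # darkest value within the local patch

def rough_transmission(image, omega=0.95, patch_size=15):
    """Crude transmission estimate t = 1 - omega * dark_channel, assuming white airlight."""
    return 1.0 - omega * dark_channel(image, patch_size)

foggy = np.random.rand(64, 64, 3)   # stand-in for one foggy video frame
t = rough_transmission(foggy)
print(t.shape, float(t.min()), float(t.max()))
```
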
Standards related to Biographies


No standards are currently tagged "Biographies"


Jobs related to Biographies
