IEEE standard glossaries
54 resources related to IEEE standard glossaries
ICST 2019 is intended to provide a common forum for researchers, scientists, engineers, and practitioners throughout the world to present their latest research findings, ideas, developments, and applications in the area of software testing, verification, and validation. Topics of interest include, but are not limited to: testing theory and practice, testing in globally distributed organizations, model-based testing, model-driven engineering and testing, domain-specific testing, quality assurance, model checking, formal verification, fuzzing, inspections, testing and analysis tools, design for testability, testing education, technology transfer in testing, and testing of open source. Besides research-track papers, the conference also includes a doctoral forum, a software testing contest, and various workshops.
Addresses the discipline of systems engineering, including theory, technology, methodology, and applications of complex systems, system-of-systems, and integrated systems of national and global significance.
The 29th annual International Symposium on Software Reliability Engineering (ISSRE 2018) is focused on innovative techniques and tools for assessing, predicting, and improving the reliability, safety, and security of software products. ISSRE also emphasizes industrial relevance, rigorous empirical studies and experience reports on existing software systems.
2013 IEEE Symposium on Humanities, Science and Engineering Research (SHUSER)
This symposium discusses the interaction of humanities, science, and engineering research, bringing together engineers, scientists, and social scientists to examine how technology affects society.
2010 International Conference on Computer Science and Software Engineering (CSSE 2010)
The 3rd International Conference on Computer Science and Software Engineering (CSSE 2010) will be held on December 12-14, 2010 in Changsha, China. This conference will bring together top researchers from Asia-Pacific nations, North America, Europe, and around the world to exchange their research results and address open issues in information security, multimedia and graphics technologies, computational intelligence, and software engineering.
Computer, the flagship publication of the IEEE Computer Society, publishes peer-reviewed technical content that covers all aspects of computer science, computer engineering, technology, and applications. Computer is a resource that practitioners, researchers, and managers can rely on to provide timely information about current research developments, trends, best practices, and changes in the profession.
IEEE Computer Graphics and Applications (CG&A) bridges the theory and practice of computer graphics. From specific algorithms to full system implementations, CG&A offers a strong combination of peer-reviewed feature articles and refereed departments, including news and product announcements. Special Applications sidebars relate research stories to commercial development. Cover stories focus on creative applications of the technology by an artist or ...
All aspects of the theory and applications of nuclear science and engineering, including instrumentation for the detection and measurement of ionizing radiation; particle accelerators and their controls; nuclear medicine and its application; effects of radiation on materials, components, and systems; reactor instrumentation and controls; and measurement of radiation in space.
Statistical and structural pattern recognition; image analysis; computational models of vision; computer vision systems; enhancement, restoration, segmentation, feature extraction, shape and texture analysis; applications of pattern analysis in medicine, industry, government, and the arts and sciences; artificial intelligence, knowledge representation, logical and probabilistic inference, learning, speech recognition, character and text recognition, syntactic and semantic processing, understanding natural language, expert systems, ...
Research, development, design, application, construction, installation, and operation of apparatus, equipment, structures, materials, and systems for the safe, reliable, and economic delivery and control of electric energy for general industrial, commercial, public, and domestic consumption.
2010 IEEE International Systems Conference, 2010
To cope with current inconsistencies and incompleteness of technical documents, we propose a combined, model-based structured graphical and textual meta-standard approach for specification, verification, and validation of complex systems in general and ISO enterprise standards in particular. This methodology, developed under the auspices of the ISO TC 184/SC 5 OPM Study Group, is presented along with MBASE, the Model-Based Authoring of Specifications Environment, which is designed to support authors of technical specifications while creating and editing model-based technical documents.
2008 International Conference on Computer Science and Software Engineering, 2008
Requirements are a critical part of the software process. To truly meet customer requirements, it is necessary to establish an appropriate requirements management process, and reorganizing the software process is particularly important. This paper first presents a description of the requirements process, analyzed from the perspective of requirements engineering. It then presents a process-algebra-based approach to requirements process reorganization, obtained by analyzing and simplifying the process-algebraic expressions. Finally, the results show that development time is shortened, resources are saved, and development efficiency is improved.
Proceedings of the IRE, 1951
IEEE Computer Graphics and Applications, 1991
The author discusses what has been accomplished in the graphics standards area in the last decade and what lies in store in the decade ahead. He discusses issues that affect the development, implementation, and end users of graphics and data exchange standards in general. They concern scope, target, size and complexity, technology, understanding of interchange formats, automation, and the nature of the end users.
Third International Conference on Autonomic and Autonomous Systems (ICAS'07), 2007
In his landmark work introducing layered learning, Stone presented a new way of handling complex application domains, suitable especially for mobile robots. We extend his framework by introducing robust layered learning, a framework that is able to handle system and environmental changes at every layer. We present first results of a lower-level implementation of such a framework for mobile robots and discuss how all available sources of information regarding unforeseen changes can be integrated in such a framework in order to reach maximal robustness.
Marks and Stanwood: WirelessMAN Inside the IEEE 802.16 Standard
Standards In Fog Computing - Tao Zhang and John Zao, Fog World Congress 2018
5G Cellular: It Will Work!
IEEE Standard 1680: An Incentive to Design Greener Computers
The Vienna LTE-A Downlink Link-Level Simulator
Panel Presentation: Yaniv Giat - ETAP Tel Aviv 2015
High Efficiency Supply-Modulated RF Power Amplifier for Handset Applications
Standards Education: An Introduction (Chinese subtitles)
CPIQ Update and the Case for Image Quality Standards in Automotive
IEEE Standards Presents: Case Study 515 (English)
IEEE Day Milestone: Compact Disc Technology
IEEE Standards Presents: Case Study 515 (Chinese)
Richard Nute, D. Ray Corson, Jim Barrick - IEEE Medal for Environmental and Safety Technologies, 2019 IEEE Honors Ceremony
Standards Education: An Introduction (English)
William J. Miller on Sensei-IoT: 2016 End to End Trust and Security Workshop for the Internet of Things
2011 IEEE Awards Richard M. Emberson Award - Donald C. Loughry
Transportation Electrification: The State of IEEE 802 LAN/MAN Standards in Vehicular Networks
The eXtensible Event Stream (XES) standard
The paper describes a study that explores the relationship between program slicing and code understanding gained while debugging. The study consisted of an experiment that compared the program understanding abilities of two classes of debuggers: those who slice while debugging and those who do not. For debugging purposes, a slice can be thought of as a minimal subprogram of the original code that contains the program faults. Those who only examine statements within a slice for correctness are considered slicers; all others are considered non-slicers. Using accuracy of reconstruction as a measure of understanding, it was determined that slicers have a better understanding of the code after debugging.
A software system interacts with its environment through interfaces. Improper handling of exceptional returns from system interfaces can cause robustness problems. Robustness of software systems is governed by various temporal properties related to interfaces. Static verification has been shown to be effective in checking these temporal properties, but manually specifying them is cumbersome and requires knowledge of interface specifications, which are often either unavailable or undocumented. In this paper, we propose a novel framework to automatically infer system-specific interface specifications from program source code. We use a model checker to generate traces related to the interfaces. From these model-checking traces, we infer interface specification details such as the return value on success or failure. Based on these inferred specifications, we translate generically specified interface robustness rules into concrete robustness properties verifiable by static checking. Hence the generic rules can be specified at an abstract level that requires no knowledge of the source code, system, or interfaces. We implement our framework for an existing static analyzer that employs pushdown model checking and apply the analyzer to the well-known POSIX API system interfaces. We found 28 robustness violations in 10 open-source packages using our framework.
Metrics are essential for the advancement of research and practice in an area. In knowledge management (KM), the process of measurement and development of metrics is made complex by the intangible nature of the knowledge asset. Further, the lack of standards for KM business metrics and the relative infancy of research on KM metrics point to a need for research in this area. This paper reviews KM metrics for research and practice and identifies areas where there is a gap in our understanding. It classifies existing research based on the units of evaluation, such as the user of the KMS, the KMS itself, the project, the KM process, the KM initiative, and the organization as a whole. The paper concludes by suggesting avenues for future research on KM and KMS metrics based on the gaps identified.
Real-time systems must respond to events in a timely fashion; in hard real-time systems the penalty for a missed deadline is high. It is therefore necessary to design hard real-time systems so that the timing behavior of the tasks can be predicted. Static real-time systems have prior knowledge of the worst-case arrival patterns and resource usage; therefore, a schedule can be calculated off-line and tasks can be guaranteed to have sufficient resources to complete (resource adequacy). Dynamic real-time systems, on the other hand, do not have such prior knowledge and therefore must react to events when they occur. They must also adapt to changes in the urgencies of various tasks and fairly allocate resources among them. A disadvantage of static real-time systems is that the requirement of resource adequacy makes them expensive and often impractical. Dynamic real-time systems, on the other hand, have the disadvantage of being less predictable and therefore difficult to test. Hence, in dynamic systems, timeliness is hard to guarantee and reliability is often low. Using a constrained execution environment, we attempt to increase the testability of such systems. An initial step is to identify factors that affect testability. We present empirical results on how various factors in the execution environment impact the testability of real-time systems. The results show that some of the factors previously identified as possibly impacting testability do not have an impact, while others do.