210 resources related to Computation theory
The conference program will consist of plenary lectures, symposia, workshops and invited sessions on the latest significant findings and developments in all the major fields of biomedical engineering. Submitted full papers will be peer reviewed. Accepted high-quality papers will be presented in oral and poster sessions, will appear in the Conference Proceedings and will be indexed in PubMed/MEDLINE.
OCEANS 2020 - SINGAPORE
An OCEANS conference is a major forum for scientists, engineers, and end-users throughout the world to present and discuss the latest research results, ideas, developments, and applications in all areas of oceanic science and engineering. Each conference has a specific theme chosen by the conference technical program committee. All papers presented at the conference are subsequently archived in the IEEE Xplore online database. The OCEANS conference comprises a scientific program with oral and poster presentations, and a state-of-the-art exhibition in the field of ocean engineering and marine technology. In addition, each conference can have tutorials, workshops, panel discussions, technical tours, awards ceremonies, receptions, and other professional and social activities.
The International Conference on Image Processing (ICIP), sponsored by the IEEE Signal Processing Society, is the premier forum for the presentation of technological advances and research results in the fields of theoretical, experimental, and applied image and video processing. ICIP 2020, the 27th in the series that has been held annually since 1994, brings together leading engineers and scientists in image and video processing from around the world.
2020 IEEE International Symposium on Information Theory (ISIT)
Information theory, coding theory, communication theory, signal processing, and foundations of machine learning
2020 IEEE International Symposium on Circuits and Systems (ISCAS)
The International Symposium on Circuits and Systems (ISCAS) is the flagship conference of the IEEE Circuits and Systems (CAS) Society and the world’s premier networking and exchange forum for researchers in the highly active fields of theory, design and implementation of circuits and systems. ISCAS 2020 focuses on the deployment of CASS knowledge towards Society Grand Challenges and highlights the strong foundation in methodology and the integration of multidisciplinary approaches which are the distinctive features of CAS contributions. The worldwide CAS community is exploiting such CASS knowledge to change the way in which devices and circuits are understood, optimized, and leveraged in a variety of systems and applications.
Experimental and theoretical advances in antennas including design and development, and in the propagation of electromagnetic waves including scattering, diffraction and interaction with continuous media; and applications pertinent to antennas and propagation, such as remote sensing, applied optics, and millimeter and submillimeter wave techniques.
The theory, design and application of Control Systems. It shall encompass components, and the integration of these components, as are necessary for the construction of such systems. The word 'systems' as used herein shall be interpreted to include physical, biological, organizational and other entities, and combinations thereof, which can be represented through a mathematical symbolism. The Field of Interest: shall ...
Part I will now contain regular papers focusing on all matters related to fundamental theory, applications, analog and digital signal processing. Part II will report on the latest significant results across all of these topic areas.
Telephone, telegraphy, facsimile, and point-to-point television, by electromagnetic propagation, including radio; wire; aerial, underground, coaxial, and submarine cables; waveguides, communication satellites, and lasers; in marine, aeronautical, space and fixed station services; repeaters, radio relaying, signal storage, and regeneration; telecommunication error detection and correction; multiplexing and carrier techniques; communication switching systems; data communications; and communication theory. In addition to the above, ...
Specific topics of interest include, but are not limited to, sequence analysis, comparison and alignment methods; motif, gene and signal recognition; molecular evolution; phylogenetics and phylogenomics; determination or prediction of the structure of RNA and Protein in two and three dimensions; DNA twisting and folding; gene expression and gene regulatory networks; deduction of metabolic pathways; micro-array design and analysis; proteomics; ...
IEE Colloquium on Quantum Computing: Theory, Applications and Implications (Digest No: 1997/145), 1997
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
IEEE Potentials, 2003
2008 Congress on Image and Signal Processing, 2008
International 1989 Joint Conference on Neural Networks, 1989
Magneto-electric Approximate Computation for Bayesian Inference - IEEE Rebooting Computing 2017
Creative AI through Evolutionary Computation
Hideyuki Takagi - Interactive Evolutionary Computation
Bayesian Perception & Decision from Theory to Real World Applications
On the Physical Underpinnings of the Unusual Effectiveness of Probabilistic and Neural Computation - IEEE Rebooting Computing 2017
A Survey of Representations for Evolutionary Computation 2
IMS 2012 Special Sessions: A Retrospective of Field Theory in Microwave Engineering - David M. Pozar
A Survey of Representations for Evolutionary Computation 1
A Model of Embodied Computation for Artificial Morphogenesis
IMS 2012 Special Sessions: A Retrospective of Field Theory in Microwave Engineering - Magdalena Salazar Palma
IMS 2012 Special Sessions: A Retrospective of Field Theory in Microwave Engineering - Constantine A. Balanis
Opportunities in Physical Computing driven by Analog Realization - Jennifer Hasler: 2016 International Conference on Rebooting Computing
Inspiring Brilliance: The impact on control theory and cybernetics of Maxwell's paper: On governors
SIMD Programming in VOLK, the Vector-Optimized Library of Kernels
Metaheuristics for Multiobjective Optimization
Evolutionary Computation - A Technology Inspired by Nature
Merge Network for a Non-Von Neumann Accumulate Accelerator in a 3D Chip - Anirudh Jain - ICRC 2018
Electronic Systems for Quantum Computation - David DiVincenzo: 2016 International Conference on Rebooting Computing
Neuromorphic Chips - Kwabena Boahen: 2016 International Conference on Rebooting Computing
Summary form only given, as follows. ANNECS is a software tool that compiles a high-level, object-oriented specification to a functionally equivalent neural network. It does this by realizing each object in the specification as a functionally equivalent cluster of neurons and synapses. All objects are defined ultimately in terms of just two primitives, the neuron and the synapse. Thus, the compilation method consists of recursively expanding each object into its constituent objects, until the definition consists of neurons and synapses only. Since clusters of neurons are embodiments of objects whose function is fully described within the specification, the functioning of the network may be completely understood. Moreover, since networks compiled in this way are functionally equivalent to their algorithmic specification, computation theory may be applied to these neural networks. An application which demonstrates these principles is discussed, i.e. a simple robot controller which picks up objects and drops them into holes as it moves around in a world containing stairs.
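The recursive expansion the abstract describes can be sketched in a few lines. This is an illustrative toy only, not the actual ANNECS tool: the object names, the `DEFINITIONS` table, and `compile_to_primitives` are all hypothetical stand-ins for the idea that every object bottoms out in just two primitives.

```python
# Toy sketch of ANNECS-style compilation: every object is either a primitive
# (neuron or synapse) or is defined in terms of sub-objects, and compilation
# recursively expands definitions until only primitives remain.
# All names here are illustrative, not the actual ANNECS API.

PRIMITIVES = {"neuron", "synapse"}

# Hypothetical object library: each non-primitive lists its constituent objects.
DEFINITIONS = {
    "comparator": ["neuron", "synapse", "neuron"],
    "controller": ["comparator", "synapse", "comparator"],
}

def compile_to_primitives(obj):
    """Recursively expand an object into a flat list of neurons and synapses."""
    if obj in PRIMITIVES:
        return [obj]
    parts = []
    for sub in DEFINITIONS[obj]:
        parts.extend(compile_to_primitives(sub))
    return parts

network = compile_to_primitives("controller")
```

Because each cluster in the result corresponds to an object whose function the specification fully describes, the flat network remains interpretable, which is the abstract's central point.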
Using atoms as digital bits will start a completely new era in computer design. Atoms cannot simply be manipulated and used like the bits built with transistors. The behavior of matter on the atomic scale follows the rules of modern physics. This behavior cannot be understood in terms of our classical description of the world (i.e., Newtonian mechanics or Maxwell's equations in electromagnetics). The physical theory dealing with such behavior is called quantum mechanics. Its use in the computer industry will most probably cause a revolution in the way we use and understand computers. The author describes how such a quantum computer (a computer based on the rules of quantum mechanics) may work, and how it is going to give incredible speed and problem-solving power.
This paper presents a method for reconstruction of entire three-dimensional objects based on principles of rotational stereo. The experimental system, which has three degrees of freedom (two for translation and one for rotation), uses an accurate stage to make all areas of the object surface visible. It needs only one camera to capture images and thus reduces the implementation cost. The object to be reconstructed is placed on a rotational device which can be precisely controlled by a computer. A series of images can then easily be obtained for recovering the complete model of the object. Several examples are carried out in the experiments. The results are very satisfactory in terms of both feasibility and accuracy.
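One piece of this setup is easy to make concrete: with a computer-controlled turntable rotating by a known step between captures, the pose of the object in view k is just a rotation of k times the step about the turntable axis. The sketch below (my illustration, not the paper's code; it assumes a vertical y-axis) shows how points observed in a given view could be mapped back into a common object frame.

```python
# Sketch of turntable geometry (assumed, not from the paper): the object frame
# for view k is rotated by k * step_deg about the vertical (y) axis.
import math

def turntable_rotation(step_deg, k):
    """3x3 rotation matrix about the y axis for the k-th capture."""
    a = math.radians(step_deg * k)
    c, s = math.cos(a), math.sin(a)
    return [[c,   0.0, s],
            [0.0, 1.0, 0.0],
            [-s,  0.0, c]]

def rotate(R, p):
    """Apply a 3x3 rotation matrix R to a 3-vector p."""
    return [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]

# A quarter-turn step: four captures cover the full object.
R = turntable_rotation(90, 1)
p = rotate(R, [1.0, 0.0, 0.0])
```

Because the stage is precisely controlled, these rotations are known exactly, which is what lets a single fixed camera stand in for a multi-camera rig.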
Summary form only given. The author proposes the following definition of massive parallelism. A computational system is massively parallel if the number of processing elements is so large that it may conveniently be considered a continuous quantity. The author proposes this definition of massive parallelism for a number of reasons. First, skillful behavior seems to require significant neural mass. Second, he is interested in computers, such as optical computers and molecular computers, for which the number of processing elements is effectively continuous. Third, continuous mathematics is generally easier than discrete mathematics. The author develops a theoretical framework for understanding massively parallel analog computers.
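The "conveniently considered a continuous quantity" claim can be illustrated numerically: the mean activation of N discrete elements indexed by position converges to an integral over that position, so for very large N the discrete system and its continuum model are interchangeable. The response profile below is my own illustrative choice, not from the paper.

```python
# Discrete-vs-continuum illustration: the mean response of n elements at
# positions i/n approaches the integral of the response over [0, 1].
import math

def response(x):
    return math.sin(math.pi * x)  # illustrative activation profile

def discrete_mean(n):
    return sum(response(i / n) for i in range(n)) / n

exact = 2 / math.pi  # integral of sin(pi*x) dx over [0, 1]
err = abs(discrete_mean(1_000_000) - exact)
```

With a million elements the discrepancy is negligible, which is exactly why continuous mathematics becomes the convenient tool at that scale.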
The problem of knowledge computation is introduced by examining the characteristics of industrial process control. A theory of knowledge computation for intelligent process control is proposed to overcome the difficulties of intelligent control for industrial processes. The theory rests on the observation that knowledge can be quantified and numerically calculated to represent the space-time causality in an intelligent process control system. The intent and methodology of the knowledge computation theory for intelligent process control are presented. The computational properties of knowledge in intelligent process control are described in three aspects: computation theory, algorithms and realization methods. The problems of knowledge representation, reasoning and knowledge acquisition are also studied.
Summary form only given. In text compression applications, it is important to be able to process compressed data without requiring (complete) decompression. In this context it is crucial to study compression methods that allow time/space-efficient access to any fragment of a compressed file without being forced to perform complete decompression. We study here the real-time recovery of consecutive symbols from compressed files, in the context of grammar-based compression. In this setting, a compressed text is represented as a small (a few KB) dictionary D (containing a set of code words) and a very long (a few MB) string based on symbols drawn from the dictionary D. The space efficiency of this kind of compression is comparable with standard compression methods based on the Lempel-Ziv approach. We show that one can visit consecutive symbols of the original text, moving from one symbol to another in constant time and extra O(|D|) space. This algorithm is an improvement on the on-line linear (amortised) time algorithm presented in (L. Gasieniec et al., Proc. 13th Int. Symp. on Fund. of Comp. Theo., LNCS, vol. 2138, p. 138-152, 2001).
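The setting is easy to sketch: a dictionary of rules mapping nonterminals to sequences of symbols, plus a compressed string over those symbols. The generator below recovers original symbols one at a time with an explicit expansion stack. Note this is the straightforward amortised-time traversal (the kind of baseline the paper improves on), not the paper's constant-time, O(|D|)-space algorithm; the toy dictionary is my own example.

```python
# Minimal sketch of grammar-based compression: D maps nonterminals to rules,
# and the compressed text is a sequence of D's symbols. Symbols of the
# original text are recovered lazily with an explicit stack (amortised time,
# not the paper's constant-time method).

def symbols(compressed, D):
    stack = list(reversed(compressed))
    while stack:
        s = stack.pop()
        if s in D:                        # nonterminal: expand its rule
            stack.extend(reversed(D[s]))
        else:                             # terminal: emit an original symbol
            yield s

D = {"A": ["a", "b"], "B": ["A", "A", "c"]}   # toy dictionary
text = "".join(symbols(["B", "A"], D))        # expands to "ababcab"
```

The point of the paper is that, with suitable preprocessing of D, the move from one symbol to the next can be made worst-case constant time rather than merely constant on average.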
The theory of computability, often called basic recursive function theory, is usually motivated and developed using Church's thesis. It is shown that there is an alternative computability theory in which some of the basic results on unsolvability become more absolute, results on completeness become simpler, and many of the central concepts become more abstract. In this approach computations are viewed as mathematical objects, and the major theorems in recursion theory may be classified according to which axioms about computation are needed to prove them. The theory is a typed theory of functions over the natural numbers, and there are unsolvable problems in this setting independent of the existence of indexings. The unsolvability results are interpreted to show that the partial function concept serves to distinguish between classical and constructive type theories.
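For readers unfamiliar with the unsolvability results being axiomatized, the classic construction runs as follows: if a total decider `halts(f, x)` existed, the diagonal program below would halt exactly when the decider says it does not. The sketch is a standard illustration, not the paper's formalism, which derives such results from axioms about computation rather than a specific machine model.

```python
# Classic diagonalization sketch: given a claimed halting decider, build a
# program the decider must answer wrongly about.
def diagonal(halts):
    def d(f):
        if halts(f, f):
            while True:       # loop forever exactly when the decider says f(f) halts
                pass
        return None           # halt exactly when the decider says f(f) loops
    return d

# Feed it a (necessarily wrong) decider that claims nothing ever halts:
claims_never_halts = lambda f, x: False
d = diagonal(claims_never_halts)
result = d(d)                 # d(d) halts, contradicting the decider's claim
```

Any candidate decider is refuted the same way, which is why no total `halts` can exist.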
In the past decade, many papers about granular computing (GrC) have been published, but the key points of GrC remain unclear. In this paper, we try to identify the key points of GrC in the information transformation of pattern recognition. Information similarity is the main point in the original insight of granular computing proposed by Zadeh (1997). Many GrC studies are based on an equivalence relation or, more generally, a tolerance relation; such relations can be described by distance functions, so GrC can be defined geometrically in a framework of multiscale covering. On the other hand, the information transformation in pattern recognition can be abstracted as a topological transformation in a feature information space, so topological theory can be used to study GrC. The key points of GrC are (1) there are two granular computing approaches to change a high-dimensional, complex distribution domain into a low-dimensional, simple domain, and (2) these two kinds of approaches can be used in turn if the feature vector itself can be arranged in a granular way.
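The "tolerance relation described by a distance function" idea can be made concrete with a tiny one-dimensional example: points within a threshold `eps` of a granule's representative are grouped together, and shrinking `eps` refines the covering, giving the multiscale structure the abstract mentions. The greedy scheme and the data below are my own illustration, not the paper's method.

```python
# Sketch of distance-based granulation: a single-scale greedy covering where
# each point joins the first granule whose representative is within eps,
# otherwise it founds a new granule. Smaller eps => finer granulation.
def granulate(points, eps):
    granules = []                         # list of (representative, members)
    for p in points:
        for rep, members in granules:
            if abs(p - rep) <= eps:       # tolerance relation via distance
                members.append(p)
                break
        else:
            granules.append((p, [p]))
    return granules

data = [0.0, 0.1, 0.2, 5.0, 5.1]
coarse = granulate(data, eps=0.5)         # two granules: near 0 and near 5
fine = granulate(data, eps=0.05)          # every point is its own granule
```

Note that a tolerance relation is reflexive and symmetric but not necessarily transitive, which is why the result is in general a covering rather than a partition.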
A novel cellular genetic algorithm is developed to address the issues of good mate selection. This is accomplished through reinforcement learning where good mating individuals attract and poor mating individuals repel. Adaptation of good mate choice occurs, thus leading to more efficient search. Results are presented for various test cases.
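The structure of a cellular GA, where each individual mates only within a local neighborhood, can be sketched compactly. The version below is a deliberately simplified illustration: it uses a ring topology and, as a stand-in for the paper's learned attraction/repulsion, simply prefers the fitter neighbor as a mate; the OneMax fitness and all parameters are my own choices, not the paper's.

```python
# Minimal cellular GA on a ring (illustrative, not the paper's algorithm):
# each cell mates with its fitter neighbor and keeps the child only if it is
# no worse, so the best fitness in the population never decreases.
import random

def fitness(bits):
    return sum(bits)  # OneMax: count of 1-bits

def step(pop, rng):
    n = len(pop)
    new = []
    for i in range(n):
        left, right = pop[(i - 1) % n], pop[(i + 1) % n]
        mate = left if fitness(left) >= fitness(right) else right
        cut = rng.randrange(len(pop[i]))
        child = pop[i][:cut] + mate[cut:]          # one-point crossover
        new.append(child if fitness(child) >= fitness(pop[i]) else pop[i])
    return new

rng = random.Random(0)
pop = [[rng.randint(0, 1) for _ in range(16)] for _ in range(20)]
for _ in range(60):
    pop = step(pop, rng)
best = max(fitness(ind) for ind in pop)
```

In the paper, the fixed "prefer the fitter neighbor" rule is replaced by reinforcement-learned mate preferences, which is what adapts mate choice over time and yields the more efficient search reported.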
This recommended practice describes 1) methods for measuring external electric and magnetic fields and contact currents to which persons may be exposed, 2) instrument characteristics and the methods for calibrating such instruments, and 3) methods for computation and the measurement of the resulting fields and currents that are induced in bodies of humans exposed to these fields. This recommended practice ...