Conferences related to Mediated Reality


2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)

The conference program will consist of plenary lectures, symposia, workshops and invited sessions on the latest significant findings and developments in all the major fields of biomedical engineering. Submitted papers will be peer reviewed. Accepted high-quality papers will be presented in oral and poster sessions, will appear in the Conference Proceedings, and will be indexed in PubMed/MEDLINE.


Oceans 2020 MTS/IEEE GULF COAST

To promote awareness, understanding, advancement and application of ocean engineering and marine technology. This includes all aspects of science, engineering, and technology that address research, development, and operations pertaining to all bodies of water, as well as the creation of new capabilities and technologies, from concept design through prototypes, testing, and operational systems, to sense, explore, understand, develop, use, and responsibly manage natural resources.

  • OCEANS '96

  • OCEANS '97

  • OCEANS '98

  • OCEANS '99

  • OCEANS 2000

  • OCEANS 2001

  • OCEANS 2002

  • OCEANS 2003

  • OCEANS 2004

  • OCEANS 2005

  • OCEANS 2006

  • OCEANS 2007

  • OCEANS 2008

    The Marine Technology Society (MTS) and the Oceanic Engineering Society (OES) of the Institute of Electrical and Electronics Engineers (IEEE) cosponsor a joint conference and exposition on ocean science, engineering, education, and policy. Held annually in the fall, it has become a focal point for the ocean and marine community to meet, learn, and exhibit products and services. The conference includes technical sessions, workshops, student poster sessions, job fairs, tutorials and a large exhibit.

  • OCEANS 2009

  • OCEANS 2010

    The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy.

  • OCEANS 2011

    The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy.

  • OCEANS 2012

    Ocean-related technology: tutorials plus three days of technical sessions and exhibits, with 8-12 parallel technical tracks.

  • OCEANS 2013

    Three days of 8-10 tracks of technical sessions (400-450 papers) and a concurrent exhibition (150-250 exhibitors).

  • OCEANS 2014

    The OCEANS conference covers four days: one day of tutorials and three days of approximately 450 technical papers and 150-200 exhibits.

  • OCEANS 2015

    The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy. The OCEANS conference covers four days: one day of tutorials and three days of approximately 450 technical papers and 150-200 exhibits.

  • OCEANS 2016

    The Marine Technology Society and the Oceanic Engineering Society of the IEEE cosponsor a joint annual conference and exposition on ocean science, engineering, and policy. The OCEANS conference covers four days: one day of tutorials and three days of approximately 500 technical papers and 150-200 exhibits.

  • OCEANS 2017 - Anchorage

    Papers on ocean technology, exhibits from ocean equipment and service suppliers, student posters and student poster competition, tutorials on ocean technology, workshops and town meetings on policy and governmental process.

  • OCEANS 2018 MTS/IEEE Charleston

    Ocean, coastal, and atmospheric science and technology advances and applications


2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC)

The 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC 2020) will be held at the Metro Toronto Convention Centre (MTCC), Toronto, Ontario, Canada. SMC 2020 is the flagship conference of the IEEE Systems, Man, and Cybernetics Society. It provides an international forum for researchers and practitioners to report the most recent innovations and developments, summarize the state of the art, and exchange ideas and advances in all aspects of systems science and engineering, human-machine systems, and cybernetics. Advances in these fields have increasing importance in the creation of intelligent environments involving technologies interacting with humans to provide an enriching experience and thereby improve quality of life. Papers related to the conference theme are solicited, including theories, methodologies, and emerging applications. Contributions to theory and practice, including but not limited to the following technical areas, are invited.


2020 IEEE Haptics Symposium (HAPTICS)

Held since 1992, the IEEE Haptics Symposium (HAPTICS) is a vibrant interdisciplinary forum where psychophysicists, engineers, and designers come together to share advances, spark new collaborations, and envision a future that benefits from rich physical interactions between humans and computers, generated through haptic (force and tactile) devices.

  • 2006 14th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems

  • 2008 16th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (Haptics 2008)

    The Haptics Symposium is an annual, single-track conference that brings together researchers in diverse engineering and human science disciplines who are interested in the design, analysis, and evaluation of systems that display haptic (force and touch) information to human operators.

  • 2010 IEEE Haptics Symposium (Formerly known as Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems)

    The Haptics Symposium is a biennial, single-track conference that brings together researchers who are advancing the human science, technology and design processes underlying haptic (force and tactile) interaction systems. Our community spans the disciplines of biomechanics, psychology, neurophysiology, engineering, human-computer interaction and computer science.

  • 2012 IEEE Haptics Symposium (HAPTICS)

    This conference brings together researchers in diverse engineering and human science disciplines who are interested in the design, analysis, and evaluation of systems that display haptic (force and touch) information to human operators, and the study of the human systems involved in haptic interaction.

  • 2014 IEEE Haptics Symposium (HAPTICS)

    This conference brings together researchers in diverse engineering and human science disciplines who are interested in the design, analysis, and evaluation of systems that display haptic (force and touch) information to human operators, and the study of the human systems involved in haptic interaction.

  • 2016 IEEE Haptics Symposium (HAPTICS)

    Held since 1992, the IEEE Haptics Symposium (HAPTICS) is a vibrant interdisciplinary forum where psychophysicists, engineers, and designers come together to share advances, spark new collaborations, and envision a future that benefits from rich physical interactions between humans and computers, generated through haptic (force and tactile) devices. In 2016, this conference will be held in central Philadelphia, one of the most historic and beautiful cities in North America. HAPTICS 2016 will be a four-day conference with a full day of tutorials and workshops and three days of conference activities including technical paper presentations and hands-on demonstrations. Features: exhibits, workshops and tutorials, and hands-on demonstrations.

  • 2018 IEEE Haptics Symposium (HAPTICS)

    Held since 1992, the IEEE Haptics Symposium (HAPTICS) is a vibrant interdisciplinary forum where psychophysicists, engineers, and designers come together to share advances, spark new collaborations, and envision a future that benefits from rich physical interactions between humans and computers, generated through haptic (force and tactile) devices.


2020 IEEE Frontiers in Education Conference (FIE)

The Frontiers in Education (FIE) Conference is a major international conference focusing on educational innovations and research in engineering and computing education. It continues a long tradition of disseminating results in engineering and computing education and is an ideal forum for sharing ideas, learning about developments, and interacting with colleagues in these fields.



Periodicals related to Mediated Reality


Computer

Computer, the flagship publication of the IEEE Computer Society, publishes peer-reviewed technical content that covers all aspects of computer science, computer engineering, technology, and applications. Computer is a resource that practitioners, researchers, and managers can rely on to provide timely information about current research developments, trends, best practices, and changes in the profession.


Computer Graphics and Applications, IEEE

IEEE Computer Graphics and Applications (CG&A) bridges the theory and practice of computer graphics. From specific algorithms to full system implementations, CG&A offers a strong combination of peer-reviewed feature articles and refereed departments, including news and product announcements. Special Applications sidebars relate research stories to commercial development. Cover stories focus on creative applications of the technology by an artist or ...


Engineering Management, IEEE Transactions on

Management of technical functions such as research, development, and engineering in industry, government, university, and other settings. Emphasis is on studies carried on within an organization to help in decision making or policy formation for RD&E.


Instrumentation and Measurement, IEEE Transactions on

Measurements and instrumentation utilizing electrical and electronic techniques.


Intelligent Systems, IEEE

IEEE Intelligent Systems, a bimonthly publication of the IEEE Computer Society, provides peer-reviewed, cutting-edge articles on the theory and applications of systems that perceive, reason, learn, and act intelligently. The editorial staff collaborates with authors to produce technically accurate, timely, useful, and readable articles as part of a consistent and consistently valuable editorial product. The magazine serves software engineers, systems ...




Xplore Articles related to Mediated Reality


'WearCam' (The wearable camera): personal imaging systems for long-term use in wearable tetherless computer-mediated reality and personal photo/videographic memory prosthesis

Digest of Papers. Second International Symposium on Wearable Computers (Cat. No.98EX215), 1998

The purpose of this paper is to disclose the operational principles of 'WearCam', the basis for wearable tetherless computer-mediated reality, both in its idealized form and in some practical embodiments of the invention, including one of its latest embodiments. The specific inner workings of WearCam, in particular details of its optical arrangement, have not previously been disclosed, other ...


A parallel mediated reality platform

2004 International Conference on Image Processing, 2004. ICIP '04., 2004

Realtime image processing provides a general framework for robust mediated reality problems. This paper presents a realtime mediated reality system that is built upon realtime image processing algorithms. It has been shown that the graphics processing unit (GPU) is capable of efficiently performing image processing tasks. The system presented uses a parallel GPU architecture for image processing that enables realtime ...


Exploring humanistic intelligence through physiologically mediated reality

Proceedings. International Symposium on Mixed and Augmented Reality, 2002

We present a way of making the wearing of a lifelong electrocardiographic health monitor fun for a user. The health monitor is coupled with a reality mediator device to create physiologically mediated reality, i.e. mediated reality which alters a user's audiovisual perception of the world based upon their own electrocardiographic waveform. This creates an interesting audiovisual experience for the user, ...


Interactive mediated reality

The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings., 2003

Mediated reality describes the concept of filtering our vision of reality, typically using a head-worn video mixing display. In this paper, we propose a generalized concept and new tools for interactively mediated reality. We also present our first prototype system for painting, grabbing and gluing together real and virtual elements.


Mediated reality photography with CCTV-assisted object elimination using graph cut

2017 International Conference on Information and Communication Technology Convergence (ICTC), 2017

For augmented reality (AR), computer-generated virtual objects are overlaid on an image of the real world. Since a disturbing real object (e.g., a moving human) on the display of an AR device reduces immersion in AR experiences, the virtual objects need to be augmented after eliminating the disturbing object. From this motivation, this paper presents mediated reality (MR) photography, ...



Educational Resources on Mediated Reality


IEEE-USA E-Books

  • 'WearCam' (The wearable camera): personal imaging systems for long-term use in wearable tetherless computer-mediated reality and personal photo/videographic memory prosthesis

    The purpose of this paper is to disclose the operational principles of 'WearCam', the basis for wearable tetherless computer-mediated reality, both in its idealized form and in some practical embodiments of the invention, including one of its latest embodiments. The specific inner workings of WearCam, in particular details of its optical arrangement, have not previously been disclosed, other than by allowing a small number of individuals to look inside the glasses. General considerations, background, and relevant findings in the area of long-term use of wearable, tetherless computer-mediated reality are also presented. Some general insight (arising from having designed and built more than 100 different kinds of personal imaging systems over the last 20 years) is also provided. Unlike the artificiality of many controlled laboratory experiments, much of the insight gained from these experiences relates to the natural complexity of real-life situations.

  • A parallel mediated reality platform

    Realtime image processing provides a general framework for robust mediated reality problems. This paper presents a realtime mediated reality system that is built upon realtime image processing algorithms. It has been shown that the graphics processing unit (GPU) is capable of efficiently performing image processing tasks. The system presented uses a parallel GPU architecture for image processing that enables realtime mediated reality. Our implementation has many benefits: the graphics hardware has high throughput and low latency, and the GPUs are not prone to jitter. Additionally, the CPU is kept available for user applications. The system is easily constructed, consisting of readily available commodity hardware. (A minimal GPU-offload sketch in this spirit appears after this list.)

  • Exploring humanistic intelligence through physiologically mediated reality

    We present a way of making the wearing of a lifelong electrocardiographic health monitor fun for a user. The health monitor is coupled with a reality mediator device to create physiologically mediated reality, i.e. mediated reality which alters a user's audiovisual perception of the world based upon their own electrocardiographic waveform. This creates an interesting audiovisual experience for the user, playing upon the poetic narrative of combining cardio-centric metaphors pervasive in everyday life (the heart as a symbol of love and centrality, e.g. "get to the heart of the matter") with ubiquitous ocular-centric metaphors such as "see the world from my point of view". This audiovisual experience is further enhanced by combining music which alters the visual perception and also heightens the user's emotional response to their experience and, in doing so, further affects their heart(beat). (A toy sketch of such a heart-rate-driven visual filter appears after this list.)

  • Interactive mediated reality

    Mediated reality describes the concept of filtering our vision of reality, typically using a head-worn video mixing display. In this paper, we propose a generalized concept and new tools for interactively mediated reality. We also present our first prototype system for painting, grabbing and gluing together real and virtual elements.

  • Mediated reality photography with CCTV-assisted object elimination using graph cut

    For augmented reality (AR), computer-generated virtual objects are overlaid on an image of the real world. Since a disturbing real object (e.g., a moving human) on the display of an AR device reduces immersion in AR experiences, the virtual objects need to be augmented after eliminating the disturbing object. From this motivation, this paper presents mediated reality (MR) photography, where the MR includes the AR technology for object augmentation and the diminished reality (DR) technology for object elimination. In this photography, the region occluded by the disturbing object is selected by using a graph cut method, where the graph cut separates a given image into the object and the background. Then, the region is inpainted by using reference background images which have been captured by high-resolution CCTVs. The reference background images, which supply the information for the region of the disturbing object, are used after perspectively transforming them to fit the camera view of a user's AR device. This MR photography method is implemented and its inpainting performance is evaluated by a histogram of color differences at the boundary of the disturbing object. (A rough sketch of this pipeline appears after this list.)

  • Seeing eye to eye: a shared mediated reality using EyeTap devices and the VideoOrbits gyroscopic head tracker

    We present a system which allows wearable computer users to share their views of their current environments with each other. Our system uses an EyeTap: a device which allows the eye of the wearer to function both as a camera and a display. A wearer, by looking around his/her environment, "paints" or "builds" an environment map composed of images from the EyeTap device, along with head-tracking information recording the orientation of each image. The head-tracking algorithm uses a featureless image motion estimation algorithm coupled with a head mounted gyroscope. The environment map is then transmitted to another user, who, through their own head-tracking EyeTap system, browses the first user's environment solely by head motion, seeing the environment as though it were their own. As a result of browsing the transmitted environment map, the viewer builds and extends his/her own environment map, and thus this is a data-producing head-tracking system. These environment maps can then be shared reciprocally between wearers.

  • A mediated reality environment using a loose and sketchy rendering technique

    In this poster, we present sketchy-ar-us, a modified, realtime version of the Loose and Sketchy algorithm used to render graphics in an AR environment. The primary challenge was to modify the original algorithm to produce an NPR effect at an interactive frame rate. Our algorithm renders moderately complex scenes at multiple frames per second. Equipped with a handheld visor, visitors can see the real environment overlaid with virtual objects, with both the real and virtual content rendered in a non-photorealistic style.

  • Object composition identification via mediated-reality supplemented radiographs

    This exploratory work investigates the feasibility of extracting linear attenuation functions with respect to energy from a multi-channel radiograph of an object of interest composed of a homogeneous material. It does so by simulating the entire imaging system combined with a digital phantom of the object of interest and leveraging this information along with the acquired multi-channel image. This synergistic combination of information allows for improved estimates not only of the attenuation at an effective energy, but across the entire spectrum of energy that is coincident with the detector elements. Material composition identification from radiographs would have wide applications in both medicine and industry. This work focuses on industrial radiography applications and analyses a range of materials that vary in attenuative properties. It shows that iterative solvers hold encouraging potential to fully solve for the linear attenuation profile of the object and material of interest when the imaging system is characterized with respect to the initial source x-ray energy spectrum, scan geometry, and an accurate digital phantom. (A simplified numerical sketch of the per-channel estimation appears after this list.)

  • Mediated reality using computer graphics hardware for computer vision

    Wearable, camera-based, head-tracking systems use spatial image registration algorithms to align images taken as the wearer gazes around their environment. This allows computer-generated information to appear to the user as though it were anchored in the real world. Often, these algorithms require creation of a multiscale Gaussian pyramid or repetitive re-projection of the images. Such operations, however, can be computationally expensive, and such head-tracking algorithms are desired to run in real time on a body-borne computer. In this paper we present a method of using the 3D computer graphics hardware that is available in a typical wearable computer to accelerate the repetitive image projections required in many computer vision algorithms. We apply this "graphics for vision" technique to a wearable camera-based head-tracking algorithm, implemented on a wearable computer with 3D graphics hardware. We perform an analysis of the acceleration achieved by applying graphics hardware to computer vision to create a Mediated Reality.

  • Socially wise mediated reality for holistic smart environments

    Computing technology is radically changing the manner in which we work and communicate with computers. Ubiquitous Virtual Reality (ubiquitous VR) has been researched in order to apply the concept of virtual reality and its technology to ubiquitous computing. In this paper, we present the characteristics of ubiquitous VR. The main characteristics are classified into Reality-Virtuality, Static-Dynamic context, and Personal-Social activity continuums. Within these continuums, ubiquitous VR is represented as socially wise mediated reality, which supports human social connections with the highest-level user context (wisdom) in mixed reality environments. Various paradigms, including ubiquitous VR, are classified into these continuums to clarify the position and meaning of ubiquitous VR. Future research directions for ubiquitous VR are also discussed at the end of this paper. We expect that the main characteristics will guide future research in ubiquitous virtual reality.
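
The GPU-based system described in "A parallel mediated reality platform" above can be illustrated with a minimal, hypothetical sketch: a per-frame visual filter offloaded to the GPU so the CPU stays free for user applications. CuPy is used here only as a convenient stand-in for GPU image processing; the library choice, function names, and the toy filter are assumptions for illustration, not the paper's implementation.

import numpy as np
import cupy as cp  # assumption: a CUDA-capable GPU and CuPy are installed

def mediate_frame(frame_rgb: np.ndarray, gain: float = 1.2) -> np.ndarray:
    """Upload a camera frame, apply a simple per-pixel filter on the GPU, download the result."""
    gpu = cp.asarray(frame_rgb, dtype=cp.float32)        # host -> device
    gray = gpu.mean(axis=2, keepdims=True)               # luminance estimate
    mediated = gray + gain * (gpu - gray)                 # toy "mediation": boost chroma
    return cp.asnumpy(cp.clip(mediated, 0, 255)).astype(np.uint8)  # device -> host

if __name__ == "__main__":
    frame = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)   # placeholder camera frame
    print(mediate_frame(frame).shape)

The point of the sketch is the per-frame data flow (upload, filter, download), which is what leaves the CPU available while the GPU does the pixel work.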

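The physiologically mediated reality idea in "Exploring humanistic intelligence through physiologically mediated reality" above can be sketched as a mapping from electrocardiographic timing to a visual filter parameter. This is a toy illustration under stated assumptions (R-peak timestamps are already available and frames are RGB); the heart-rate computation, colour mapping, and function names are hypothetical, not the authors' design.

import numpy as np

def heart_rate_bpm(r_peak_times_s: np.ndarray) -> float:
    """Instantaneous heart rate from consecutive ECG R-peak timestamps (seconds)."""
    rr_intervals = np.diff(r_peak_times_s)
    return 60.0 / rr_intervals.mean()

def physiologically_mediate(frame_rgb: np.ndarray, bpm: float, rest_bpm: float = 60.0) -> np.ndarray:
    """Warm the image as heart rate rises above rest; cool it as it falls below."""
    shift = np.clip((bpm - rest_bpm) / 100.0, -0.3, 0.3)
    out = frame_rgb.astype(np.float32)
    out[..., 0] *= 1.0 + shift   # red channel up with higher heart rate (RGB assumed)
    out[..., 2] *= 1.0 - shift   # blue channel down
    return np.clip(out, 0, 255).astype(np.uint8)

peaks = np.array([0.0, 0.8, 1.6, 2.4])                 # simulated R-peaks (75 bpm)
frame = np.full((240, 320, 3), 128, dtype=np.uint8)    # placeholder camera frame
mediated = physiologically_mediate(frame, heart_rate_bpm(peaks))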

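The "Mediated reality photography" pipeline above (graph-cut segmentation of the disturbing object, perspective transform of a CCTV reference image, and replacement of the occluded region) can be sketched with OpenCV. The use of GrabCut as the graph cut, the point correspondences, and the function name are assumptions for illustration; this is not the authors' code.

import cv2
import numpy as np

def remove_disturbing_object(ar_frame, cctv_ref, object_rect, src_pts, dst_pts):
    # Graph-cut segmentation of the disturbing object inside a user-supplied rectangle.
    mask = np.zeros(ar_frame.shape[:2], np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(ar_frame, mask, object_rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)
    obj = ((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)).astype(np.uint8)

    # Perspective-transform the CCTV background image to fit the AR camera view,
    # using matched point correspondences between the two views.
    H, _ = cv2.findHomography(src_pts, dst_pts, cv2.RANSAC)
    warped_bg = cv2.warpPerspective(cctv_ref, H, (ar_frame.shape[1], ar_frame.shape[0]))

    # Replace the object's pixels with the warped background ("diminished reality").
    result = ar_frame.copy()
    result[obj == 1] = warped_bg[obj == 1]
    return result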

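The attenuation-recovery idea in "Object composition identification via mediated-reality supplemented radiographs" above reduces, under strong simplifying assumptions, to fitting the Beer-Lambert law per energy channel, I_E(x, y) = I0_E * exp(-mu_E * t(x, y)), where t(x, y) is the path length through the homogeneous object taken from the digital phantom. The closed-form least-squares estimator below is illustrative only; the paper simulates the full imaging system and uses iterative solvers.

import numpy as np

def estimate_mu_per_channel(I, I0, thickness):
    """I: (E, H, W) multi-channel radiograph; I0: (E,) open-beam intensities;
    thickness: (H, W) path-length map from the digital phantom."""
    t = thickness.ravel()
    valid = t > 0
    mus = []
    for e in range(I.shape[0]):
        y = np.log(I0[e] / np.clip(I[e].ravel(), 1e-12, None))   # equals mu_E * t per pixel
        mus.append(np.dot(t[valid], y[valid]) / np.dot(t[valid], t[valid]))  # least squares
    return np.array(mus)   # estimated linear attenuation coefficient per energy bin

# Tiny synthetic check with a known attenuation profile.
t_map = np.full((64, 64), 2.0)                     # 2 cm of material everywhere
mu_true = np.array([0.50, 0.35, 0.22])             # per-channel attenuation (1/cm)
I0 = np.array([1e5, 8e4, 6e4])
I_meas = I0[:, None, None] * np.exp(-mu_true[:, None, None] * t_map[None])
print(estimate_mu_per_channel(I_meas, I0, t_map))  # ~ [0.50, 0.35, 0.22]
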
Standards related to Mediated Reality


No standards are currently tagged "Mediated Reality"