IMTC is instrumental in enabling research and education projects related to the Aware Home Research Initiative (AHRI). AHRI is an interdisciplinary research effort involving numerous faculty members from several schools and other organizations at Georgia Tech. IMTC researcher Brian D. Jones is Director of AHRI and manager of the Aware Home facility.
On November 26, 2001, the Georgia Centers for Advanced Telecommunications Technology (GCATT), Georgia Tech, and Shepherd Center received a $5 million, five-year federal grant to develop applications of wireless technologies that enhance the independence of people with physical and cognitive disabilities. The grant's stated mission: "To promote universal access to mobile wireless technologies and explore their innovative applications in addressing the needs of people with disabilities."
In collaboration with Dr. Frank Durso in Georgia Tech's School of Psychology and human factors researchers at the FAA, Scott Robertson at IMTC has developed the NextGen air traffic control simulator, which is used in human factors research and controller training. The NextGen simulator uses a 3D game engine (Unity3D) and a physics library (PhysX) to model simplified but realistic airplane flight characteristics.
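The description above does not detail the simulator's internals, and the actual system is built in Unity3D. Purely as an illustration of the kind of simplified point-mass kinematics an air traffic control simulator might use, here is a minimal sketch in Python; the `Aircraft` state and `step` function are hypothetical, not taken from the NextGen codebase.

```python
import math
from dataclasses import dataclass

@dataclass
class Aircraft:
    """Minimal point-mass state for one simulated airplane (illustrative units)."""
    x: float           # east position, meters
    y: float           # north position, meters
    altitude: float    # meters
    heading: float     # degrees, 0 = north, increasing clockwise
    speed: float       # ground speed, m/s
    climb_rate: float  # m/s

def step(ac: Aircraft, dt: float) -> None:
    """Advance the aircraft state by dt seconds using simple kinematics."""
    rad = math.radians(ac.heading)
    ac.x += ac.speed * math.sin(rad) * dt  # heading 90 deg moves east
    ac.y += ac.speed * math.cos(rad) * dt  # heading 0 deg moves north
    ac.altitude += ac.climb_rate * dt

# Example: level flight due east at 200 m/s for 10 seconds
plane = Aircraft(x=0.0, y=0.0, altitude=3000.0,
                 heading=90.0, speed=200.0, climb_rate=0.0)
step(plane, 10.0)
```

A real simulator would layer turn-rate limits, acceleration, and physics-engine collision handling on top of a state update like this; the sketch only shows the basic timeline-advance idea.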
Lorenzo Ghiberti worked for twenty-seven years on his masterpiece, “The Gates of Paradise,” a pair of bronze doors for the San Giovanni Baptistery in Florence, Italy. Now, after more than twenty-five years of work at Florence’s Opificio delle Pietre Dure, the restoration of Ghiberti’s Gates of Paradise is nearing completion. The doors consist of ten panels (approximately 32" x 32") and numerous frieze elements bordering the panels. The panels, ordered from left to right and top to bottom, feature a sequence of scenes from the Old Testament crafted in relief.
In collaboration with Highlands Historic Consulting (HHC) and the National Monuments Foundation (NMF), IMTC has developed an innovative, immersive interaction experience for the Millennium Gate Philanthropy Gallery (on the lower floor of the Millennium Gate). Rodney Cook, Jr., of the National Monuments Foundation was looking for a partner capable of devising and delivering this full-gallery experience.
We have developed a software framework for the creation of live performance simulations by non-technologists. This toolkit simulates various types of cultural performances using motion capture, 3D animation, virtual reality, and reactive agents. The software allows the developer to define the components of a performance, such as actors, audience members, and venue, using a standard score paradigm. User interactions and reactive agent scripting can also be defined through this interface.
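The source does not publish the framework's actual data model, but the score paradigm it describes, laying performance components out on a timeline, can be sketched roughly as follows; the `Score` and `Cue` classes and their method names are illustrative assumptions, not the framework's real API.

```python
from dataclasses import dataclass, field

@dataclass
class Cue:
    """One scored event: at `time` seconds, `performer` executes `action`."""
    time: float
    performer: str  # an actor, audience member, or reactive agent
    action: str

@dataclass
class Score:
    """A performance laid out on a timeline, in the spirit of a musical score."""
    venue: str
    cues: list = field(default_factory=list)

    def add(self, time: float, performer: str, action: str) -> None:
        self.cues.append(Cue(time, performer, action))

    def events_until(self, t: float) -> list:
        """Return cues scheduled at or before time t, in timeline order."""
        return sorted((c for c in self.cues if c.time <= t),
                      key=lambda c: c.time)

# Example: a small outdoor performance defined declaratively
score = Score(venue="village square")
score.add(5.0, "dancer_1", "enter stage left")
score.add(0.0, "drummer", "begin rhythm")
score.add(12.0, "audience_agent", "react: applaud")
```

The point of a score-style representation is that a non-technologist can describe who does what, and when, without writing simulation code; the runtime then drives the motion-capture playback and reactive agents from the timeline.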
For more than a decade, “presence” has been a key concept for understanding and evaluating the effectiveness of virtual environments. VR researchers have used the term to describe the mental state of the user in response to being immersed in a virtual world, and typically equate presence with a sense of “being in the virtual world” or “a lack of a sense of mediation.” For AR systems, we are interested in how to create systems where the user loses the sense of mediation and begins to respond to a blended physical/virtual environment as if it were a single “world.”
Tangible user interfaces (TUIs) can create engaging and useful interactive systems. However, along with the power of these interfaces come challenges: they are often so specialized and novel that building a TUI system involves working at a low level with custom hardware and software. As a result, the community of people capable of creating TUIs is limited.
In this project we explored a concept for augmented reality entertainment, called AR Karaoke, in which users perform their favorite dramatic scenes with virtual actors. AR Karaoke is the acting equivalent of traditional karaoke; the goal is to facilitate an acting experience that is entertaining for both the user and the audience. The main challenge in creating an AR Karaoke prototype is developing an easy-to-learn user interface that helps the performer understand the timing, body movement, and dialog for their character.
This project was a collaboration with Blair MacIntyre and the Augmented Environments Lab (AEL).
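The AR Karaoke cueing problem, prompting the performer with the right line and movement at the right moment, resembles the scrolling-lyrics mechanic of ordinary karaoke. The source does not describe the prototype's implementation, so the following is only a hypothetical sketch of a timed cue lookup; `DialogCue`, `current_cue`, and the example scene are all invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DialogCue:
    """A scripted line with its start time and a brief stage direction."""
    start: float   # seconds into the scene
    line: str      # dialog text to prompt the performer
    movement: str  # body-movement hint displayed alongside the line

def current_cue(script: list, t: float) -> Optional[DialogCue]:
    """Return the most recent cue at time t (karaoke-style), or None before the scene starts."""
    active = None
    for cue in sorted(script, key=lambda c: c.start):
        if cue.start <= t:
            active = cue
        else:
            break
    return active

# Hypothetical two-cue scene
scene = [
    DialogCue(0.0, "Is this a dagger which I see before me?", "step forward"),
    DialogCue(6.0, "Come, let me clutch thee.", "reach out slowly"),
]
```

An AR display would render `current_cue(scene, t)` each frame, registered near the virtual actor, so the performer can read timing, dialog, and movement without breaking character.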