Responsive Objects, Surfaces and Spaces (ROSS)



Creative design and technology research program

Defining collaborative design solutions where emerging technologies weave physically and digitally across all facets of our everyday experiences with performance, knowledge, play, and the environment.

Research Groups

ACME Lab: A Creativity Machine Environment Lab
Faculty: Ellen Yi-Luen Do
Description: At A Creativity Machine Environment (ACME) Lab, Professor Do is committed to building better design tools, starting from an understanding of the human intelligence involved in the design process and leading to improved interfaces between designers and computers. Her research explores new modalities of communication, collaboration, and coordination, as well as physical and virtual worlds that push the current boundaries of computing environments for design.


Digital World and Image Group
Faculty: Michael Nitsche
Description: The Digital World and Image Group focuses on two main areas: virtual spaces and real-time imagery gathered from them. We see game spaces and game media as important forms of self-expression, which is why we work to improve creative access and the expressive range available in interactive digital media such as games. Research combines theory, analysis, and practical experimentation.


Emergent Game Group
Faculty: Celia Pearce
Description: The Emergent Game Group (EGG) designs and researches games that facilitate emergent behavior. We study how patterns of gameplay express broader cultural norms, such as political hierarchy, and how these patterns possess holistic properties not found in their parts. We want to know how people transform game elements into social practices.


Synaesthetic Media Lab
Faculty: Ali Mazalek
Description: Synlab explores emerging modalities in new media. Our research focuses on tangible interaction and sensing technologies that support creative expression bridging the physical and digital worlds. Applications range across media arts, entertainment and educational domains.


Projects

ArchiTales
Faculty: Ali Mazalek, Tristan Al-Haddad and Claudia Winegarden
Students: Martin Bednar, Mehdi Ben Yahmed, Jin Ah Chon, Jakob Crowder, Daniel Gibson, Sergio Goldenberg, April Headen, Chih-Chieh Hsu, Emily Kiel, Amelia Mendez, Jacob Porter, Ritesh Rathi, Martin Rojas, Joy Salter, Stephanie Sellers, Yang Ting Shen, Jasjit Singh, Kurt Stilwell, Jacob Tompkins, Joshua Tuminella, Theodore Ullrich, Cooper Welch, Sarah Williams, Crystal Wrenn, Steph Yang, Arseni Zaitsev
Description: Tables are artifacts around which people gather. They become organized spaces of exchange and consumption: kitchens are organized around the dining table, meeting rooms around the conference table, and living spaces around the coffee table. Tables perform two complementary and simultaneous tasks: bringing people together to promote intimacy and holding them just far enough apart to provide security. As technology becomes a vehicle for tangible interaction, tables establish the framework for these social encounters. The Story Table is a symbiosis of two social spaces, story and table, collapsed onto one another. Created through a process of co-construction of digital and physical media, the Story Table is an interactive installation that invites shared engagement with cinematically inspired narrative expressions that unfold on its surface and in the space around it.
Date: Since Spring 2008 (ongoing)


Flourishing Future
Faculty: Elise van den Hoven
Students: Kim, Imke, Jeanine
Description: Flourishing Future is a digital tabletop game developed to support collaboration among children. The players interact with the game on a digital tabletop by means of tangible tools. The goal of Flourishing Future is to develop an environmentally friendly city, whose environmental status is reflected in the number of leaves growing on a tree. To achieve this goal, the children can place objects on the board, for example a tree or an airport; each object influences the environmental friendliness positively or negatively (see the sketch after this entry). Each child gets their own set of objects and a related game character, and dividing the objects among the children encourages them to collaborate to reach the goal. The game was developed as a platform for three research purposes. The first is sound analysis: auditory icons and physical sounds are compared to see which kind of feedback on their actions the children prefer. The second tests the effect of different material properties on the children's haptic experience: the children haptically explore four tools of different hardness while expressing their perceptions. The third examines collaborative learning: two sets of tangible tools are compared on the extent to which they invoke collaborative interaction between the children.
Date: Fall 2007
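A minimal sketch, in Python, of the scoring mechanic described above: each placed object carries a signed environmental impact, and the summed impact maps to the number of leaves on the tree. All names and values here are illustrative assumptions, not taken from the project's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class GameObject:
        name: str
        impact: int  # positive = environmentally friendly, negative = polluting

    MAX_LEAVES = 20  # assumed upper bound on how many leaves the tree can show

    def leaves_for(placed: list[GameObject]) -> int:
        """Map the summed impact of the placed objects to a leaf count."""
        score = sum(obj.impact for obj in placed)
        return max(0, min(MAX_LEAVES, MAX_LEAVES // 2 + score))

    board = [GameObject("tree", +3), GameObject("airport", -5), GameObject("wind turbine", +4)]
    print(leaves_for(board))  # 12: the city is slightly greener than neutral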


Gamewell
Faculty: Ali Mazalek
Students: Ari Velazquez, Dan Gibson, Steph Yang, Jimmy Truesdell, Peter Watanabe, Tatum Clanton, Andy Korzik
Description: The research focuses on tangible interaction and sensing technologies that support creative expression bridging the physical and digital worlds. The Gamewell project brings digital content and computational game engines to the table to enable new forms of multi-player tabletop game play. Gamewell is a platform and manager for tangible and multi-touch tabletop games that allows us to explore novel interaction methods and game genres for digital tabletops (a sketch of such a manager follows this entry). Research questions include: which interaction methods generalize across different tabletop games, which are specific to certain games or game types, what we can learn from past tabletop or arcade game forms, and how design methods can be redefined for this new modality. Gamewell is developed in XNA. The first games include ColorCross, a Twister-inspired cooperative game, and Space Vectors, a multi-player tabletop strategy game.
Date: Since Spring 2008 (ongoing)
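The actual Gamewell platform is built in XNA; the following Python sketch is only a language-agnostic illustration, with hypothetical class and method names, of how a tabletop game manager might route sensed touch and tangible-object events to the currently active game.

    from abc import ABC, abstractmethod

    class TabletopGame(ABC):
        """A game plugs into the platform by handling surface input events."""

        @abstractmethod
        def on_touch(self, x: float, y: float) -> None: ...

        @abstractmethod
        def on_object(self, tag_id: int, x: float, y: float) -> None: ...

        @abstractmethod
        def update(self, dt: float) -> None: ...

    class GameManager:
        """Owns the active game and forwards sensed input to it."""

        def __init__(self) -> None:
            self.active: TabletopGame | None = None

        def launch(self, game: TabletopGame) -> None:
            self.active = game

        def dispatch_touch(self, x: float, y: float) -> None:
            if self.active is not None:
                self.active.on_touch(x, y)

        def dispatch_object(self, tag_id: int, x: float, y: float) -> None:
            if self.active is not None:
                self.active.on_object(tag_id, x, y)

Separating the manager from the games is what lets interaction methods that generalize across games live in one place, while game-specific behavior stays inside each game implementation.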


Next Generation Play (NGP)
Faculty: Janet Murray, Michael Nitsche, Celia Pearce
Students: Shashank Raval, HeeRin Lee, Sergio Goldenberg
Description: The project combines existing TV IP with ubiquitous 4G handheld technology. NGP connects players not only to a virtual community but also to their real neighborhood via our localized multiplayer concept. We use the players' TV preferences to drive dynamic world generation, producing 2D multiplayer game levels on the fly.
Date: Since Fall 2007 (ongoing)


Tangible User Interfaces for Real-time 3D Worlds (TUI3D)
Faculty: Ali Mazalek, Michael Nitsche
Students: Tarandeep Gill, Tandav Krishna, Shashank Raval
Description: 3D interactive performance spaces, such as those used for machinima, lack intuitive control mechanisms. Set direction and acting are limited by the technical toolsets available to creators, as the tools were designed for creating video games rather than cinematic works; they do a poor job of capturing the performative expression that characterizes the more mature medium of film. The TUI3D project is joint research between Synlab, the Experimental Games Lab, and the Machinima Group at Georgia Tech that addresses the production and performative challenges involved in creating machinima. We are developing a suite of tangible interfaces that can be used to create and control three aspects of 3D virtual environments in real time: character, camera, and space. Tangible interfaces can help bridge the gap between computer film creators and the established base of traditional film creatives and their knowledge base. We also anticipate deeper and more convincing expression with these more user-centered tools.
Support: Supported in part by NSF
Date: Since Fall 2006 (ongoing)


Tangible Comics
Faculty: Ali Mazalek
Students: Ozge Samanci, Yanfeng Chen
Description: Tangible Comics is a computer-vision-based, full-body interactive storytelling environment that also functions as a comics generator. Prevailing applications of full-body computer vision have not used the full storytelling or performance potential of these environments. Our aim is to produce an environment that creates a space for redefining the conventions of comics, performance, film, photography, and animation. In relation to that, we are exploring the design problems that can arise when computer vision technology is contextualized in an interactive storytelling environment, and in tangible interfaces for controlling characters in 3D game environments. The goal is to enrich the expressive range of virtual worlds and simplify animation.
Date: Since Fall 2006 (ongoing)


Totti
Faculty: Elise van den Hoven
Students: Manon Spermon, Marigo Heijboer
Description: Totti is a game developed for (high-functioning) autistic children. The goal of the game is to increase the level of collaboration between the children, since autistic children tend to have difficulties with social contact and relationships in general. The game is developed on a digital tabletop, which makes it possible to enforce the rules of the game. To encourage collaboration, the players share one overall goal that must be achieved. Each turn, the players must discuss the possible actions and confirm them for the game to proceed. The game objects identify the different powers of the Gods and help visualize the collaboration process, as players exchange objects and place them in different spots.
Date: Fall 2007


TViews Table
Faculty: Ali Mazalek, Matt Reynolds, Glorianna Davenport
Description: As digital media applications continue to evolve, there is a need for new kinds of platforms that can support shared media interactions for everyday consumers. The TViews Table is a multi-user digital/tangible media table that supports interaction through the real-time tracking of tagged tangible objects on a coincident display surface (a sketch of the data such tracking delivers to applications follows this entry). The first implementation of TViews used electromagnetic sensing technology combined with an overhead-projected display. Later versions incorporate an extensible acoustic-based sensing architecture that functions through the glass surface of an embedded display and enables real-time tracking of a virtually unlimited set of uniquely identified wireless objects, which can be used on the surface of any similar table. These objects can be physically customized to suit particular applications and can provide additional functionality through external input and output elements on the objects themselves. TViews has been used to develop a variety of media content applications, including games, story engines, and map-based media content browsers.
Support: Supported in part by Samsung
Date: Since 2003
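A minimal Python sketch of the kind of per-object tracking data an application built on a table like TViews might consume: each tagged object reports a unique identifier and a position on the display surface at every sensing update. The field and function names are hypothetical, not part of the TViews API.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class ObjectUpdate:
        tag_id: int       # unique identifier of the tangible object
        x: float          # position on the display surface, normalized to [0, 1]
        y: float
        timestamp: float  # seconds since the session started

    def on_update(update: ObjectUpdate) -> None:
        """Application callback: move the media item bound to this object."""
        print(f"object {update.tag_id} moved to ({update.x:.2f}, {update.y:.2f})")

    # A media browser would register a handler like this with the table's
    # tracking layer and bind tag IDs to photos, map markers, or game pieces.
    handler: Callable[[ObjectUpdate], None] = on_update
    handler(ObjectUpdate(tag_id=7, x=0.42, y=0.63, timestamp=1.5))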



A complete list of ROSS research groups and projects is available online.