Figure 28: Interactional Channels, Projection, and Mapping
Utilizing the interface database and its settings, the user will interact with objects/spaces, from both interior and exterior viewpoints, via three types of interactional channels: the visual, the auditory, and the physical. These interactional channels will overlap perceptually through parallel signaling (e.g.: an object that is simultaneously visually "long" and auditorily high-pitched, etc.)
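Parallel signaling, as described above, amounts to fanning a single object attribute out to several channel renderers at once. The following is a minimal sketch of that idea; every class name, the attribute-to-pitch mapping, and the rendering strings are illustrative assumptions, not part of the original design.

```python
# Hypothetical sketch of parallel signaling: one object attribute is
# broadcast to every registered channel renderer at once, so the visual
# and auditory channels present the same property in parallel.

class ChannelRenderer:
    """Base class for one perceptual channel (visual, auditory, physical)."""
    def render(self, attribute: str, value: float) -> str:
        raise NotImplementedError

class VisualRenderer(ChannelRenderer):
    def render(self, attribute, value):
        # e.g. "length" becomes an elongated visual form
        return f"visual: draw {attribute} at scale {value:.1f}"

class AuditoryRenderer(ChannelRenderer):
    def render(self, attribute, value):
        # the same attribute maps to pitch: larger value -> higher pitch
        return f"auditory: play pitch {220 * value:.0f} Hz for {attribute}"

class ParallelSignaler:
    """Fans one attribute update out to all channels simultaneously."""
    def __init__(self, renderers):
        self.renderers = renderers
    def signal(self, attribute, value):
        return [r.render(attribute, value) for r in self.renderers]

signaler = ParallelSignaler([VisualRenderer(), AuditoryRenderer()])
for line in signaler.signal("length", 2.0):
    print(line)
```

A physical renderer (e.g. haptic feedback) could be appended to the same list without changing the signaling path, which is what lets the channels overlap perceptually.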

The visual interactional channel will consist of two forms of interface visualization: the projected and the mapped.

The first form of interface visualization is projection. The projected interface consists of a virtual, user-centered, radially projectable array of interactional screens or objects. This will allow for cognitive interaction through elements such as navigational maps, lists, and informational, feedback, or view screens.
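A user-centered radial array of this kind can be sketched as a layout function that spaces the screens at equal angles around the user's position. The function name, the fixed radius, and the 2-D simplification are all assumptions for illustration.

```python
# Hypothetical sketch of a user-centered radial projection: N interface
# screens are placed at equal angles on a circle around the user.
import math

def radial_layout(user_pos, screens, radius=1.5):
    """Return (screen, (x, y)) placements evenly spaced around the user."""
    n = len(screens)
    placements = []
    for i, screen in enumerate(screens):
        angle = 2 * math.pi * i / n
        x = user_pos[0] + radius * math.cos(angle)
        y = user_pos[1] + radius * math.sin(angle)
        placements.append((screen, (round(x, 3), round(y, 3))))
    return placements

layout = radial_layout((0.0, 0.0), ["nav map", "list", "feedback screen"])
```

Because the array is centered on the user rather than on the space, it can travel with the user across spaces, unlike the mapped elements described next.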

Some of these projected interface elements may be paired with, and controlled by, real parallel physical interactional controls through parallel signaling.

The second form of interface visualization is mapping. As opposed to radially-projected screens and objects, mapping will allow the user to experience direct, object/space-centered interactivity: interactional objects or attributes are mapped directly into the current space and onto the current objects by recognizing and utilizing metainformational hooks. Smart doors, either superspatial or subspatial, that automatically attach themselves at appropriate/specified, recognizable places within spaces or on objects would fall into this category, as would video view/control screens that automatically map themselves onto objects of class-type video.
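The hook mechanism described above can be sketched as a registry that records which metainformational hook (here simplified to an object's class type) each interface element recognizes, plus a mapper that scans the current space and binds matching elements onto matching objects. All class names, hook names, and the registry itself are illustrative assumptions.

```python
# Hypothetical sketch of mapping via metainformational hooks: elements
# register the class-type hook they attach to, and the mapper binds
# them onto any object in the space that exposes that hook.

class SpaceObject:
    def __init__(self, name, class_type):
        self.name = name
        self.class_type = class_type  # the metainformational hook

# element name -> the class-type hook it recognizes (assumed registry)
HOOK_REGISTRY = {
    "video view/control screen": "video",
    "smart door": "portal",
}

def map_elements(space_objects):
    """Attach each registered element to every object exposing its hook."""
    bindings = []
    for obj in space_objects:
        for element, hook in HOOK_REGISTRY.items():
            if obj.class_type == hook:
                bindings.append((element, obj.name))
    return bindings

space = [SpaceObject("wall monitor", "video"), SpaceObject("archive", "portal")]
print(map_elements(space))
```

The point of the registry is that neither the space nor the object needs to know about the interface element; recognition happens entirely through the hook, which is what makes the attachment automatic.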

The physical interactional channel will consist of two main categories: the indirect physical interface (e.g.: keyboards, switches, sliders, etc.) and the direct physical interface, comprised of virtual physical controls and feedback devices (e.g.: gloves, bodysuits, etc.). As noted above for the projected interface, these physical interactional elements may be paired with parallel projected interface elements through parallel signaling.
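The pairing between a physical control and its projected counterpart can be sketched as a shared binding that keeps two views of one value in sync: moving the hardware redraws the projection, and dragging the projection drives the hardware. The class name, method names, and status strings are all illustrative assumptions.

```python
# Hypothetical sketch of pairing a physical control with a projected
# interface element through a single shared value.

class PairedControl:
    """One value shared between a physical control and a projected element."""
    def __init__(self, name, value=0.0):
        self.name = name
        self.value = value

    def physical_move(self, new_value):
        # the user moves the real control; the projected element follows
        self.value = new_value
        return f"projected '{self.name}' redrawn at {self.value}"

    def projected_drag(self, new_value):
        # the user drags the projected element; the real control is driven
        self.value = new_value
        return f"physical '{self.name}' actuated to {self.value}"

volume = PairedControl("volume slider")
print(volume.physical_move(0.8))   # projection follows the hardware
print(volume.projected_drag(0.5))  # hardware follows the projection
```

This is the inverse view of the pairing described under the projected interface: the same parallel-signaling path carries updates in both directions.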

The auditory interactional channel, except in cases of voice recognition, will function primarily to transmit augmentational information regarding the status and/or location of objects, or, quite possibly, the user. This sound will be placed three-dimensionally when possible, to reinforce the perception of spatial relationships and meaning.
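A minimal stand-in for such spatial placement reduces a source's position relative to the listener to a stereo pan and a distance attenuation. The panning law, the inverse-distance rolloff, and the 2-D simplification below are assumptions; a full implementation would use proper spatial-audio rendering.

```python
# Hypothetical sketch of three-dimensional sound placement, reduced to
# stereo pan plus distance attenuation for illustration.
import math

def place_sound(listener, source):
    """Return (pan, gain): pan in [-1, 1] (left/right), gain in (0, 1]."""
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    distance = math.hypot(dx, dy)
    if distance == 0:
        return 0.0, 1.0                # source at the listener's position
    pan = dx / distance                # -1 = hard left, +1 = hard right
    gain = 1.0 / (1.0 + distance)      # simple inverse-distance rolloff
    return round(pan, 3), round(gain, 3)

# a status tone for an object one unit to the listener's right
print(place_sound((0.0, 0.0), (1.0, 0.0)))
```

Placing the tone at the object's position is what lets a purely auditory signal carry the locational information described above.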