The set of deliverables in version 0.03 has been scaled back from previous proposals. Most importantly, direct sound streaming and backward compatibility are now listed as completely optional, replaced with a much simpler-to-implement MIDI server. This reduction in system complexity comes in response to unexpected delays in using the iRoom: a delay of an extra week in getting hold of the C++ version of the Event Heap; occasional minor bugs in the Event Heap software; a continued delay in acquiring example code for streaming data from one machine to another outside of the Event Heap; and a larger than expected time expenditure on actually installing the hardware. In retrospect, delays of this kind were to be expected; they simply require a little reassessment now. Here are the newly revised (as of 11/19/99) deliverables:
Primary (All Required)
- An Event Heap-driven API for client sound requests (a rough sketch appears after this list).
- A 3-D audio server for driving sound output once a sound request is received from a client.
- One or more small client applications that demonstrate the 3-D audio functionality of the room.
- A MIDI sound server.
- An installed audio hardware system within the interactive room.
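As a rough illustration of what a client sound request could look like, the sketch below posts a request event carrying a sound name and a 3-D position. The EventHeap and Event classes, the putEvent call, the field names, and the server name are all placeholders for illustration, not the actual C++ Event Heap API.

    // Hypothetical sketch of a client posting a 3-D sound request.
    // The EventHeap/Event classes, field names, and server name below are
    // placeholders, not the real Event Heap C++ interface.
    #include <string>

    struct Event {
        std::string type;        // kind of request, e.g. "SoundRequest"
        std::string soundFile;   // sound to play
        double x, y, z;          // desired position of the sound in the room
    };

    class EventHeap {
    public:
        explicit EventHeap(const std::string& server) { /* connect to the server */ }
        void putEvent(const Event& e)                 { /* send the event */ }
    };

    int main() {
        EventHeap heap("iroom-server");                   // placeholder machine name
        Event req{"SoundRequest", "chime.wav", 1.0, 2.0, 0.5};
        heap.putEvent(req);                               // the audio server would pick this up
        return 0;
    }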
Secondary (Optional If Time Permits)
- A protocol for streaming live audio from a client machine directly to a connected server.
- A virtual device driver as a means of backward compatibility for all Windows applications on the system.
- Analog sound patching as a means of backward compatibility for all Windows applications on the system.
Class Demo
The final class presentation will consist of a demo application that ties much of the primary system functionality together. The application should demonstrate these three aspects of the system:
- 3-D Spatial Sound Generation
- Practical Use
- System Scalability
We will make use of a "spatial-sound-board" in which we can "position" different sounds somewhere inside the room. Semi-dynamic sounds will be generated using MIDI commands.
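To give a flavor of how those semi-dynamic sounds might be produced, here is a minimal sketch that builds the raw MIDI bytes for a note-on/note-off pair; how the bytes reach the MIDI sound server, and that server's actual interface, are left open.

    // Minimal sketch: raw MIDI bytes for a note-on / note-off pair.
    // Delivery of these bytes to the MIDI sound server is not shown.
    #include <cstdint>
    #include <vector>

    std::vector<uint8_t> noteOn(uint8_t channel, uint8_t note, uint8_t velocity) {
        return { static_cast<uint8_t>(0x90 | (channel & 0x0F)), note, velocity };
    }

    std::vector<uint8_t> noteOff(uint8_t channel, uint8_t note) {
        return { static_cast<uint8_t>(0x80 | (channel & 0x0F)), note, 0 };
    }

    int main() {
        auto on  = noteOn(0, 60, 100);   // middle C, moderately loud
        auto off = noteOff(0, 60);       // release the same note
        // ...send 'on', wait, then send 'off' via the MIDI server...
        return 0;
    }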
The presentation will also include a brief description of the internal architecture, along with a guide to client use, so that others with ongoing projects will walk away knowing how to use the system in their own applications.
The following is a list of deliverables that we will have by the final demo date:
At the conclusion of our project we will have developed three software components: the PDA client application; the proxy server that interfaces the PDA client with the Event Heap; and the application server that interfaces PowerPoint with the Event Heap (working with the PPT group).
The PDA Client Application will have four main pieces: the slide viewer/annotator; the outline viewer; the question poster; and the meta-information extractor.
The slide viewer/annotator will allow the user to view and take notes on the presentation slides obtained from the server. The user will be able to store the annotations in stroke vector format to better facilitate handwriting recognition (in the future).
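One plausible shape for that stroke-vector data is sketched below; the field names and layout are assumptions for illustration, not the format the client will necessarily use.

    // Hypothetical stroke-vector annotation format; field names are assumptions.
    #include <vector>

    struct Point  { int x, y; };                  // pen position in slide coordinates
    struct Stroke { std::vector<Point> points; }; // one continuous pen-down trace

    struct Annotation {
        int slideNumber;                          // slide the strokes belong to
        std::vector<Stroke> strokes;              // strokes kept separate to aid
                                                  // later handwriting recognition
    };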
The outline viewer will allow the user to view the structure of the presentation to ease slide navigation.
The question poster will only allow simple questions to be transmitted. Voting on questions may be implemented as an extension in the future.
The meta-information extractor will store URLs and email addresses from the presentation.
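As a rough sketch of how such extraction could work, the example below pulls URL and email patterns out of a string of slide text; the regular expressions are deliberately simplified and the input is purely illustrative.

    // Simplified sketch of extracting URLs and e-mail addresses from slide text.
    #include <iostream>
    #include <regex>
    #include <string>

    int main() {
        std::string text = "See http://example.com or mail someone@example.com";
        std::regex url(R"((https?://\S+))");
        std::regex mail(R"((\S+@\S+\.\S+))");

        for (auto it = std::sregex_iterator(text.begin(), text.end(), url);
             it != std::sregex_iterator(); ++it)
            std::cout << "URL: " << it->str() << "\n";

        for (auto it = std::sregex_iterator(text.begin(), text.end(), mail);
             it != std::sregex_iterator(); ++it)
            std::cout << "Email: " << it->str() << "\n";
        return 0;
    }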
All software will run on Windows devices.
Our demonstration will be an active presentation involving all the features described above.
We will deliver the following pieces of software:
-The SmartPPT Presentation Manager application for authoring and executing a smart presentation.
-The SmartPPT Daemon (SmartPPTd), an application that resides in the system tray, launching and controlling PowerPoint from commands it receives from the SmartPPT PM via the Event Heap.
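To make the daemon's role concrete, here is a highly simplified sketch of a command dispatch loop; the command names, the getNextCommand stub, and the PowerPointController interface are placeholders, and the real Event Heap subscription and PowerPoint automation calls are omitted.

    // Placeholder sketch of SmartPPTd's command dispatch loop.
    // getNextCommand() and PowerPointController stand in for the real
    // Event Heap subscription and PowerPoint automation code.
    #include <string>

    struct PowerPointController {
        void openPresentation(const std::string& file) { /* launch PowerPoint with 'file' */ }
        void nextSlide()                               { /* advance one slide */ }
        void previousSlide()                           { /* go back one slide */ }
    };

    // Stub: the real version would block until a command event arrives on the heap.
    std::string getNextCommand() { return "next"; }

    void runDaemon() {
        PowerPointController ppt;
        while (true) {
            std::string cmd = getNextCommand();
            if      (cmd == "next")  ppt.nextSlide();
            else if (cmd == "prev")  ppt.previousSlide();
            else if (cmd.rfind("open ", 0) == 0) ppt.openPresentation(cmd.substr(5));
        }
    }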
Our demo will come in the form of a Smart PowerPoint presentation describing the features of our system while at the same time demonstrating those features. We will effectively utilize all three SmartBoards and will simulate the standard experience of a presenter bringing in his or her laptop and running the presentation from there.
Data
1. The WM database.
2. The application that will parse the Event Log and update the Current Configuration.
3. The application that will create Snapshots and enable the restoration of User Sessions.
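As a minimal illustration of what a Snapshot and its save/restore round trip might look like, here is a sketch; the fields and the tab-separated file format are assumptions, not the actual WM database schema.

    // Hypothetical snapshot record; fields and file format are assumptions.
    #include <fstream>
    #include <string>
    #include <vector>

    struct AppState {
        std::string application;   // name of the running application
        std::string machine;       // machine it was running on
        std::string document;      // document or URL it had open
    };

    using Snapshot = std::vector<AppState>;

    void saveSnapshot(const Snapshot& snap, const std::string& path) {
        std::ofstream out(path);
        for (const auto& s : snap)
            out << s.application << '\t' << s.machine << '\t' << s.document << '\n';
    }

    Snapshot loadSnapshot(const std::string& path) {
        Snapshot snap;
        std::ifstream in(path);
        AppState s;
        while (std::getline(in, s.application, '\t') &&
               std::getline(in, s.machine, '\t') &&
               std::getline(in, s.document))
            snap.push_back(s);
        return snap;
    }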
The final software will be a demonstration of a game, comprising a server application (which keeps track of all the game details and allows the game to be saved and restored) and a client application (to be installed and run on each laptop to keep track of player information).
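As an illustration of the client/server split, the sketch below shows a hypothetical player-information record that a laptop client might send to the game server; the fields are assumptions about the game, not its actual design.

    // Hypothetical player-information record sent from a laptop client
    // to the game server; field names are assumptions.
    #include <string>

    struct PlayerUpdate {
        std::string playerName;   // who this update is about
        int score;                // current score tracked on the client
        int position;             // player's position in the game
    };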
Our demo will show the game in progress, with some time spent on exploring
the various public and private displays.