Human-Computer Interaction

ScratchCad

The ScratchCad project began in June 2002. Its goal is to develop, test, and evaluate multi-modal (speech and gesture) interfaces for 3D authoring and navigation. We expect to develop an easy-to-use interface that will allow naïve users to create and navigate crude 3D models using speech and gesture. This multi-modal interface will be of interest to anyone working in 3D design, particularly those using 3D software for early design in automotive, manufacturing, architecture, and theatre set design, among other application areas.

Research Contact: Alain Désilets
Business Contact: Dr. George Forester