

PyMT: New and Advanced Interaction Widgets
GSoC 2010 Proposal

Christopher Denter
dennda@the-space-station.com

April 9, 2010



About Me

Hi, my name is Christopher. I am 21 years of age and live and study in Koblenz, Germany. My field of study is computational visualistics, which is computer science with a focus on computer graphics, image processing, AR/VR, design, software ergonomics, and the automated creation, manipulation and analysis of images in general.

I have recently handed in my thesis, titled Developing a Multi-Touch Application for Medical Volume Visualization. In the course of the thesis, I designed and implemented an application for volume rendering and evaluated the efficiency and ease of use of multitouch gestures in this context. The program was implemented in PyMT. Additionally, a 40" DSI MT table was built that is now living in my flat.

I am now taking courses for my master's, including topics like gesture tracking. In the past I have worked for many different open source projects such as Ubuntu, MoinMoin, Zine/Pocoo and MeMaker. I am very familiar with Python and have contributed patches to PyMT already. In 2008 and 2009 I successfully participated in GSoC as a student for MoinMoin. In 2010 I decided to apply for NUI Group instead because I am very interested in HCI. Additionally, NUI happens to fit what I study perfectly, so I also have quite some knowledge in these fields. My extensive experience with Python, PyMT, computer graphics, image processing, and open source collaboration models (git, bug tracking, etc.), together with the fact that I am really interested and eager to accomplish what I propose, make me a good choice as a student.

• Name: Christopher Denter

• E-Mail: dennda@the-space-station.com

• Age: 21

• Timezone: Germany (Berlin)

• Web: the-space-station.com & http://nuigroup.com/forums/member/5685/

• Occupation: Working towards my MSc in computational visualistics

• Open Source Development Experience: Vast

• Proposal Tags: PyMT, Widgets, 3D, Text Input, Optical Flow



Figure 1: Mockup of a Swype-like keyboard. By moving a finger across the keys, the words to be typed are deduced automatically from the keys on the path.

Project Proposal

While PyMT already contains several widgets and facilities that can be used for UI creation, text input, 3D drawing and user interaction, few of these fully utilize the potential of MT hardware. Actually, most of the already existing widgets are reimplementations of what we already know from the WIMP world.

The goal of this proposal is to design and implement a plethora of novel widgets that allow users to express their intent efficiently and intuitively. By taking into account information not only from several fingertips, but also the user's hands and their orientation (gesture tracking, optical flow), I feel that it is possible to realize widgets that allow for higher input bandwidth. These widgets should be well documented and easily reusable in different contexts by developers.

In the following, a list of examples is given to make clear what I envision. Of course, this list is subject to discussion and not necessarily complete.[1]

Text Input

One main area that is still very problematic in MT scenarios is text input. No input technique available today is even somewhat comparable to ordinary keyboards when it comes to typing speed. While one GSoC is certainly not enough to research this fully, it would still be nice to have a widget that allows relatively fast text input. Several already available approaches such as Dasher, Tikinotes and Swype will be evaluated. At least one of the approaches (likely Swype) will be implemented (cf. figure 1). The problem with Swype is that it again is a single-touch input method. Optimization possibilities with respect to the higher input bandwidth offered by multitouch will therefore be researched. It might even make sense to implement a virtual keyboard that is akin to ordinary keyboards, but takes into account the positions and orientations of the hands as well as the reachability of keys by each finger (see figure 2). When it comes to actual usage, the widgets should not strain the user from an ergonomic point of view. I really think that a solution that allows for quick text input would significantly improve interaction with multitouch devices. Ideally, the additional hardware keyboard that still often accompanies many MT setups would become obsolete.

[1] Originally I planned to submit all of this in one large proposal. However, Mathieu asked me to split the four major tasks up and apply for each of them separately. If one such task can be completed faster than planned, I will continue to work on one of the other topics.

Figure 2: Mockup of a split keyboard. Each finger can reach certain keys. The sizing and padding of the keys can be determined in an optional calibration step that measures the dimensions of the user's hand. Each half follows the hand's movements and orientation changes.

Deliverables for this task are:

• Swype-like Keyboard

• Intelligence features (Missed a key? No problem.)

• Word suggestions (Look-ahead suggestions: Missi -> Mississippi)

• Split keyboard (One half for each hand, following the position and rotation of the user's hands)

• Adaptive layout of the keys depending on the hand's properties (Ideally, this would make blind typing possible.)

• Documentation and a set of examples

• Typing speed benchmarks
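To illustrate the deduction step sketched in figure 1: a word can be treated as a candidate if its letters occur, in order, along the sequence of keys the finger crossed, anchored at the path's first and last key. A minimal sketch (the function name, the toy dictionary and the length-based ranking are placeholders; the real widget would use a frequency-weighted lexicon and handle double letters):

```python
def match_path(path, dictionary):
    """Return dictionary words whose letters appear, in order, along the
    key path, anchored at the path's first and last key."""
    candidates = []
    for word in dictionary:
        if not word or word[0] != path[0] or word[-1] != path[-1]:
            continue
        # Check that the word's letters form a subsequence of the path;
        # 'in' on an iterator consumes it, so order is enforced.
        keys = iter(path)
        if all(letter in keys for letter in word):
            candidates.append(word)
    # Prefer longer words: a crude stand-in for frequency-based ranking.
    return sorted(candidates, key=len, reverse=True)
```

For the path c-a-r-t, for example, both "cart" and "cat" survive the filter, and the ranking step decides which one is suggested first.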



Figure 3: Mockup of a menu widget. The user invokes the menu by performing a gesture. The menu is then drawn and the user selects one of its categories. That category's items are then listed and the user selects one of them, which causes the target operation to be carried out.

Menu Widgets

Some more generic widgets that e.g. allow for selection among different options will be implemented. For example, circular on-the-spot widgets (as seen in [Wu03] or the Multitouch Menu [MTM08]) can be useful in many contexts and applications. A fraction of the widgets that will be implemented is briefly presented below.

A widget that allows selection of different options among several (e.g. four) categories will be implemented. It can be invoked by performing a gesture (or a double-tap) and appears right at the fingertips.

Another widget, invisible by itself, can be used by the developer to connect gestures with functionality. For example, a gesture in which three fingers are swiped from right to left can be interpreted as a backwards command. The widget's job is the recognition of such gestures and the execution of a callback method or function. The benefit for the developer is less code and, ideally, a common set of gestures for similar tasks among different applications.[2]

A big problem with multitouch hardware is the low resolution of the touch points due to the fingers' thickness. Another set of widgets would address this issue in a generic way. For example, to precisely adjust a slider's value, one finger can be used to step through the available values at normal speed. When using two fingers to slide, the value adjusts twice as slowly; analogously, for three fingers it would be three times as slow. To precisely position a touch (the touch.pos), an optional D-pad widget can be added to slowly shift a touch to the left, right, top or bottom.
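The invisible gesture-to-command widget could, at its core, be a small dispatcher that maps recognized gesture names to callbacks. A hedged sketch, independent of PyMT's actual gesture API (the class and gesture names are assumptions for illustration):

```python
class GestureDispatcher:
    """Map recognized gesture names (e.g. 'three-finger-swipe-left')
    to callbacks, so applications can share a common gesture vocabulary."""

    def __init__(self):
        self._bindings = {}

    def bind(self, gesture_name, callback):
        """Register a callback for a gesture; several callbacks may share one gesture."""
        self._bindings.setdefault(gesture_name, []).append(callback)

    def dispatch(self, gesture_name, *args):
        """Invoke every callback bound to the gesture; return how many fired."""
        callbacks = self._bindings.get(gesture_name, [])
        for callback in callbacks:
            callback(*args)
        return len(callbacks)
```

A developer would then write, e.g., `dispatcher.bind('three-finger-swipe-left', go_back)` once, and the recognition layer calls `dispatch` whenever it detects that gesture.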

Deliverables for this task are:

• Circular menu widget

• Gesture to command widget

• Precisely adjustable sliders

[2] Although the gestures themselves would, obviously, not be hardcoded.



• D-pad-like touch position correction widget

• Documentation and a set of examples

• And many more widgets
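The finger-count slowdown described above for the precision slider boils down to simple arithmetic; a sketch under assumed names (a real widget would derive the drag distance from PyMT touch events rather than take it as a parameter):

```python
def slider_delta(drag_distance, value_per_pixel, num_fingers):
    """Translate a drag into a value change: each additional finger slows
    the adjustment proportionally (two fingers -> half speed, three ->
    a third, and so on)."""
    if num_fingers < 1:
        raise ValueError("at least one finger is required")
    return drag_distance * value_per_pixel / num_fingers
```

So a 100-pixel drag that changes the value by 50 with one finger changes it by only 25 with two fingers, giving the user fine-grained control without a mode switch.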

3D Support

PyMT's capabilities when it comes to manipulation of 3D objects are very limited. Existing modifications will be considered and eventually enhanced and merged. At the end of GSoC, it will be possible to interact with 3D scenes more easily. This includes the ability to select 3D objects in a 3D scene, to alter for example the geometry of such objects and, finally, to navigate in said scene.

Both 2D and 3D should be usable in the same scene (e.g. mixing a flight simulator with a heads-up display). It might make sense to implement an additional abstract base class specifically tailored to the needs of 3D objects, including for example collision methods (similar to collide_point) with a generic default implementation.

Challenges of this proposal are (for example) problems with occlusion of objects and low touch resolution (which is especially annoying when one wants to select one of several widgets that are in close proximity to each other but far away from the user's point of view). A solution for the former problem might be using transparency and displaying a list of all objects that lie on the ray (although this might not be supported by OpenGL's picking functionality and might require a custom implementation). The latter could perhaps be solved by depth-dependent tolerance regions (i.e., the farther away a widget is, the bigger its tolerance region in relation to its size on the screen) and an optional list of potential hits when a user touches the intersecting tolerance regions of more than one widget.
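The depth-dependent tolerance idea can be sketched as a picking helper that widens each widget's hit region with its distance from the viewer and returns every candidate, so the UI can present the proposed disambiguation list. All names, the linear growth model and the tuple layout are assumptions for illustration:

```python
import math

def pick(touch, widgets, base_tolerance=10.0, growth=0.5):
    """Return the names of all widgets whose depth-scaled tolerance region
    contains the touch, nearest first. Each widget is described as
    (name, (x, y), screen_radius, depth) in screen coordinates."""
    hits = []
    for name, (x, y), radius, depth in widgets:
        # The farther away the widget, the larger its tolerance region.
        tolerance = base_tolerance + growth * depth
        distance = math.hypot(touch[0] - x, touch[1] - y)
        if distance <= radius + tolerance:
            hits.append((depth, name))
    return [name for depth, name in sorted(hits)]
```

When the returned list contains more than one widget, the application would show it to the user instead of silently picking one.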

Deliverables for this task are:

• Merge of the existing patches

• Abstract base class for 3D widgets

• Support for selection and modification of 3D widgets

• Documentation and a set of examples



Optical Flow

When interacting with on-screen objects, the well-known selection and manipulation commands (such as drawing a rectangle around objects and touching a button to move them) are neither intuitive nor efficient. Techniques that utilize optical flow seem very appealing here.

Base classes for widgets that can be manipulated by optical flow input will therefore be implemented. This likely requires a new type of input data to reflect the apparent motion between subsequent frames in the camera image stream. Adjustments for this are necessary in both PyMT and the tracking application. While the data coming from the tracking application can be simulated, I will attempt to contribute a real patch. Additionally, a PyMT input provider to deal with the optical flow input data will be implemented.

With these changes made, it will be possible to make use of even more natural gestures for user interaction. E.g., in a chess game, the users can use their whole hands and even arms to move the pieces back to their side of the table when a game ends, as opposed to moving them all individually or by drawing selection rectangles.
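As a toy illustration of the kind of data such an input provider would consume, the dominant motion between two frames can be estimated by minimizing the mean absolute difference over small integer shifts; real optical flow methods (e.g. Lucas-Kanade) compute per-pixel motion and are far more sophisticated than this global sketch:

```python
def estimate_flow(prev, curr, max_shift=2):
    """Estimate the dominant (dx, dy) motion between two equally sized
    grayscale frames (lists of rows) by minimizing the mean absolute
    difference over small integer shifts: the crudest block matching.
    Assumes max_shift is small relative to the frame size."""
    h, w = len(prev), len(prev[0])
    best_score, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            sad, count = 0, 0
            # Only compare pixels where both shifted frames overlap.
            for y in range(max(0, -dy), min(h, h - dy)):
                for x in range(max(0, -dx), min(w, w - dx)):
                    sad += abs(curr[y + dy][x + dx] - prev[y][x])
                    count += 1
            score = sad / count  # normalize so small overlaps are not favored
            if best_score is None or score < best_score:
                best_score, best_shift = score, (dx, dy)
    return best_shift
```

The input provider would translate such vectors into motion events that optical-flow-aware widgets consume, just as touch providers deliver touch events today.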

Deliverables for this task are:

• A patch for the tracking application to support optical flow

• Alternatively, datasets that simulate optical flow input can be generated

• Optical flow input provider for PyMT

• Abstract base class (or mixin) for widgets that accept optical flow input

• Support for 3D vector arithmetic

• Documentation and a set of examples



References

[Wu03] Wu, Mike and Balakrishnan, Ravin. Multi-finger and whole hand gestural interaction techniques for multi-user tabletop displays. In UIST '03: Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, 2003.

[MTM08] Bailly, Gilles and Demeure, Alexandre and Lecolinet, Eric and Nigay, Laurence. Multi-Touch Menu (MTM). In Proceedings of the 20th International Conference of the Association Francophone d'Interaction Homme-Machine, 2008.
