Getting started
The framework lets you connect a series of blocks together to process touch input. Blocks may be used to produce cursors representing contacts with the touch surface, to filter these cursors for further processing, to remove useless events, to re-schedule further processing on other threads, to dispatch/associate the cursors to different touch targets (i.e., regions) on the screen, to recognize gestures, to add inertia, etc.
Typically, a block can only be queued to a single other block. However, stateless blocks could potentially be queued to several other blocks. Also, several blocks can be queued to the same block. The flow of touch data (e.g., cursors) can therefore be seen as a river with its bifurcations.
The input of the blocks is the output of the queued block(s).
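To illustrate the idea, here is a minimal, self-contained sketch of the block concept. Note that `Block`, `ForwardingBlock`, `queue` and `forward` below are illustrative stand-ins, not the framework's actual interfaces: a block forwards its output to every block queued to it, so that its output becomes their input.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative stand-in for the framework's block concept; the real API's
// interfaces and method names may differ.
interface Block<I> {
    void process(I input);
}

// A block that forwards its output to all blocks queued to it, mirroring
// how a block's output becomes the queued blocks' input.
abstract class ForwardingBlock<I, O> implements Block<I> {
    private final List<Block<O>> queued = new ArrayList<>();

    ForwardingBlock<I, O> queue(Block<O> next) {
        queued.add(next);
        return this;
    }

    protected void forward(O output) {
        for (Block<O> next : queued) {
            next.process(output);
        }
    }
}

public class ChainSketch {
    // A trivial processing block: doubles its input, then forwards it.
    static class Doubler extends ForwardingBlock<Integer, Integer> {
        @Override public void process(Integer input) { forward(input * 2); }
    }

    public static int lastResult;

    public static void main(String[] args) {
        Doubler doubler = new Doubler();
        // A leaf block acting as a listener (like a gesture listener).
        doubler.queue(x -> lastResult = x);
        doubler.process(21);
        System.out.println(lastResult); // 42
    }
}
```

Because a block keeps a list of queued blocks, the same output can branch to several downstream blocks, which is what produces the "river with bifurcations" shape described above.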
Here is an example of chain of blocks:
```
1 TuioInputSource => 1 BoundingBoxCursorFilter => 1 NoChangeCursorFilter => 1 CursorToTouchTargetDispatcher => * TouchTargetFilter => 1..* GestureRecognizer => 0..1 InertiaFilter => 1..* GestureListener
```
This example contains one TuioInputSource block which creates cursors from TUIO events. These are first processed by the BoundingBoxCursorFilter in order to minimize the noise of cursor movements. The output cursors are then processed by the NoChangeCursorFilter to eliminate redundant cursor events, and then by the CursorToTouchTargetDispatcher in order to find touched targets on the screen. The output is then processed by one or several GestureRecognizers in order to recognize different gestures from the cursors. The output gestures may then be processed by an InertiaFilter (for instance, when scrolling). The output is finally processed by gesture listeners to actually perform the action on the touch targets (for instance, changing the viewport of a scrolled list, panning a map, etc.).
The chain of blocks can be constructed in code using either the regular API or the builder API. Depending on your needs and taste, you may choose one or the other.
The regular API provides more flexibility: you create each block, configure it, and connect it to the others. It is simple, but may still result in a relatively large amount of code for complex block chains.
Here is how to code the above example using the regular API:
TODO
The builder API is a layer on top of the regular API and provides more convenience by using a simple DSL. This API offers lower type safety, as you may get class cast exceptions at runtime if you are not careful, but it can help reduce the amount of code needed to construct the chain of blocks.
The DSL is currently quite limited, but will evolve in the future and already provides convenience and fluidity.
Here is how to code the above example using the builder API:
```java
ChainBuilder
    .queue(new TuioSource())
    .queue(new BoundingBoxCursorFilter())
    .queue(new NoChangeCursorFilter())
    .queue(new CursorToComponentDispatcher())
    .queue(new IncludeTouchTargetFilter(myTarget1, myTarget2))
    .queue(new TapGestureRecognizer())
    .queue(new MyTapAdapter());
```
Below is a description of the blocks you can find in the MultitouchFramework.
| Block | Input | Output | Remark |
|---|---|---|---|
| TuioSource | - | CursorUpdateEvent | |
| BoundingBoxCursorFilter | CursorUpdateEvent | CursorUpdateEvent | |
| NoChangeCursorFilter | CursorUpdateEvent | CursorUpdateEvent | |
| SimpleCursorToTouchTargetDispatcher | CursorUpdateEvent | CursorUpdateEvent | Will soon have a shorter name |
| DragRecognizer | CursorUpdateEvent | DragEvent | |
| PinchSpreadRecognizer | CursorUpdateEvent | PinchSpreadEvent | |
| TapRecognizer | CursorUpdateEvent | TapEvent | |
| IncludeTouchTargetFilter | Any TouchEvent | Same as input | |
| ExcludeTouchTargetFilter | Any TouchEvent | Same as input | |
| IncludeUserFilter | Any TouchEvent | Same as input | |
| ExcludeUserFilter | Any TouchEvent | Same as input | |
| EDTScheduler | Any TouchEvent | Same as input | |
| PrintStreamAdapter | Any TouchEvent | - | |
Input blocks are the source blocks that produce cursors representing contacts with the touch surface.
The following input blocks are currently available:
- TuioSource: This block receives input data from a TUIO server and produces Cursors to be processed by the subsequent blocks.
The following input blocks are expected in the future:
- WindowsTouchSource: This block is meant to produce Cursors from Windows Touch events. It is therefore dedicated to the Windows platform only.
- OSXSource: This block is meant to produce Cursors from a Magic Trackpad, Magic Mouse or simply a MacBook's trackpad. It is therefore dedicated to the OS X platform only.
Filter blocks typically process the cursors to transform them, to smoothen their movements, etc.
The following filter blocks are currently available:
- BoundingBoxCursorFilter: This block filters the cursors to reduce the tiny noisy movements that often occur even when the point of contact on the touch surface is not moving. This is often the case using TUIO input on some touch tables, for instance. The output cursors of this block are more stable. This can be a cheap (in terms of performance) alternative to a low-pass filter that would only allow movements of low frequency.
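  The core idea can be sketched as follows. This is only an illustration of the bounding-box principle, not the framework's actual implementation: a new position is forwarded only when it leaves a small box centered on the last forwarded position, so jitter inside the box is swallowed.

  ```java
  // Illustrative sketch of the bounding-box idea (not the framework's
  // actual implementation).
  public class BoundingBoxSketch {
      private final int halfSize; // half the side length of the box
      private int lastX, lastY;
      private boolean first = true;

      public BoundingBoxSketch(int halfSize) { this.halfSize = halfSize; }

      /** Returns true if the position should be forwarded downstream. */
      public boolean accept(int x, int y) {
          if (first || Math.abs(x - lastX) > halfSize || Math.abs(y - lastY) > halfSize) {
              lastX = x;
              lastY = y;
              first = false;
              return true;
          }
          return false; // noise inside the box: swallow the event
      }
  }
  ```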
- NoChangeCursorFilter: This block filters out events from the previous block by not triggering the subsequent blocks when the output event would be exactly the same as the previous output event. This removes, for instance, all STATIONARY events that are often triggered in touch frameworks. This block greatly reduces the number of events and therefore increases the overall performance.
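  The principle is simple de-duplication of consecutive events. The sketch below is illustrative only (the real filter compares cursor events, not arbitrary objects): an event is forwarded only if it differs from the last forwarded one.

  ```java
  import java.util.Objects;

  // Illustrative sketch of the no-change idea (not the framework's actual
  // implementation).
  public class NoChangeSketch<E> {
      private E last;
      private boolean hasLast = false;

      /** Returns true if the event differs from the last forwarded event. */
      public boolean accept(E event) {
          if (hasLast && Objects.equals(event, last)) {
              return false; // redundant event (e.g. STATIONARY): drop it
          }
          last = event;
          hasLast = true;
          return true;
      }
  }
  ```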
- IncludeTouchTargetFilter: This block filters out all the cursors that are not touching the specified touch targets. This block can be used, for instance, after a cursor-to-touch-target dispatcher and before gesture recognizers to make sure that gesture recognition is performed only on some touch targets. Note that in order to avoid unnecessary processing, you may use this block as early as possible in the chain.
- ExcludeTouchTargetFilter: This block filters out all the cursors that are touching the specified touch targets. This block can be used, for instance, after a cursor-to-touch-target dispatcher and before gesture recognizers to make sure that gesture recognition is not done on every single touch target. Note that in order to avoid unnecessary processing, you may use this block as early as possible in the chain.
- IncludeUserFilter: This block filters out all the cursors that are not performed by the specified users. Note that in order to avoid unnecessary processing, you may use this block as early as possible in the chain.
- ExcludeUserFilter: This block filters out all the cursors that are performed by the specified users. Note that in order to avoid unnecessary processing, you may use this block as early as possible in the chain.
Dispatch blocks are actually filters that associate the cursors to specific touchable targets on the screen, to specific users around a touch table, etc. They are useful, for instance, when the input cursors are only generated per device instead of per touch target or per GUI component. This is particularly the case with TUIO input.
The following dispatch blocks are currently available:
- SimpleCursorToTouchTargetDispatcher: This block dispatches cursors to pre-registered touch targets on the screen. It is not bound to any GUI framework.
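Dispatching essentially amounts to hit-testing each cursor position against pre-registered regions. The sketch below is illustrative only (the `register` and `dispatch` names are stand-ins, not the framework's API):

```java
import java.awt.Rectangle;
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of cursor-to-touch-target dispatch (not the
// framework's actual implementation): targets are pre-registered as named
// screen regions and each cursor position is hit-tested against them.
public class DispatchSketch {
    private final Map<String, Rectangle> targets = new LinkedHashMap<>();

    public void register(String name, Rectangle bounds) {
        targets.put(name, bounds);
    }

    /** Returns the first registered target containing the point, or null. */
    public String dispatch(int x, int y) {
        for (Map.Entry<String, Rectangle> e : targets.entrySet()) {
            if (e.getValue().contains(x, y)) {
                return e.getKey();
            }
        }
        return null; // cursor touches no registered target
    }
}
```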
The following dispatch blocks are expected in the future:
- CursorToComponentDispatcher: This block dispatches cursors to AWT/Swing components.
- CursorToNodeDispatcher: This block dispatches cursors to JavaFX nodes in a scene graph.
- CursorToUserDispatcher: This block dispatches cursors to different users based on the touch target of the screen they interact with. This can be useful if your touch device cannot make the distinction between different users.
Gesture recognition blocks generate GestureEvents when gestures are recognized. Typically, one block recognizes only one gesture.
The following gesture recognition blocks are currently available:
- TapRecognizer: This block recognizes taps performed with a configurable number of fingers. This can be used, for instance, to activate a button (touch target), to select an object on a map, etc.
- DragRecognizer: This block recognizes a drag gesture performed with a configurable number of fingers. This can be used, for instance, to drag an object (touch target) on the screen, to pan a map, etc.
- PinchSpreadRecognizer: This block recognizes a pinch/spread gesture performed with a configurable number of fingers. This can be used, for instance, to collapse/expand an object (touch target) on the screen, to zoom out/in on a map, etc.
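For the two-finger case, the math behind pinch/spread recognition can be sketched as follows. This is an illustration of the underlying geometry, not the framework's actual implementation, and the method names are stand-ins: the zoom factor is the ratio of the current finger distance to the distance when the gesture started.

```java
// Illustrative sketch of two-finger pinch/spread geometry (not the
// framework's actual implementation).
public class PinchSpreadSketch {
    /** Euclidean distance between the two finger positions. */
    public static double distance(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    /** Factor > 1 means spread (zoom in), factor < 1 means pinch (zoom out). */
    public static double zoomFactor(double initialDistance, double currentDistance) {
        return currentDistance / initialDistance;
    }
}
```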
The following gesture recognition blocks are expected in the future:
- RotateRecognizer: This block recognizes rotations performed with a configurable number of fingers. This can be used, for instance, to rotate an object on a map, to rotate a map, etc.
The following scheduling blocks are currently available:
- EDTScheduler: This block re-schedules the processing of the touch events by the queued blocks on the Swing Event Dispatch Thread. This can be used, for instance, to handle touch events performed on Swing components, to paint on a canvas, etc. Note that in order to minimize the impact of processing touch events on the EDT and keep the user interface of a Swing application as responsive as possible, it is best to use it in the leaf blocks of the chain.
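  The rescheduling itself relies on standard Swing facilities. The sketch below uses the real `SwingUtilities` API, but is only an illustration of what an EDT-rescheduling block does, not the framework's actual implementation:

  ```java
  import javax.swing.SwingUtilities;

  // Illustrative sketch of EDT rescheduling (not the framework's actual
  // implementation): downstream processing is handed to the Swing Event
  // Dispatch Thread instead of running on the input thread.
  public class EdtSketch {
      public static void processOnEdt(Runnable downstream) {
          if (SwingUtilities.isEventDispatchThread()) {
              downstream.run(); // already on the EDT: run directly
          } else {
              SwingUtilities.invokeLater(downstream); // reschedule on the EDT
          }
      }
  }
  ```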
The following debug blocks are currently available:
- PrintStreamAdapter: This block prints out the received touch events to a specific PrintStream. This can be used, for instance, to print events on the standard output or standard error.
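  Such an adapter is a leaf block that simply writes events to a configurable stream. The sketch below is illustrative only (the class and method names are stand-ins, not the framework's API):

  ```java
  import java.io.PrintStream;

  // Illustrative sketch of a print adapter (not the framework's actual
  // implementation): a leaf block that writes every received event to a
  // configurable PrintStream.
  public class PrintSketch {
      private final PrintStream out;

      public PrintSketch(PrintStream out) { this.out = out; }

      public void process(Object event) {
          out.println(event); // e.g. System.out or System.err
      }
  }
  ```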
The demo application can be found in the multitouchframework-demo package.
In order to run it, just check it out from git, compile it and run the MultitouchFrameworkDemo. For a better understanding of how the framework works, you may have a look at the initChain() method of the MultitouchFrameworkDemo class.
Note that the demo application makes use of a TUIO client listening on the default port 3333. So you will need a TUIO server running on your machine. The TUIO server may be provided with your touch device drivers or may be part of another standalone product. But it is not part of the MultitouchFramework.
Also, the implementation of touch targets and the containing canvas of the demo application are only there for demo purposes and are not actually part of the framework.
Finally, note that the demo application does not make use of the mouse and keyboard to manipulate the touch targets. At a later stage, the framework will provide blocks to convert mouse and keyboard events to cursor updates/tap/drag/rotate/etc. events.