BETA MultiScreen (not finished)



# What is the MultiScreen Agent? (UNDER DEVELOPMENT)

The MultiScreen Agent was developed by Malte Weiss and Simon Voelker. We created the MultiScreen Agent to be able to use multiple cameras and multiple projectors in the BendDesk system. However, the MultiScreen Agent can be used for any kind of multi-touch table at our chair. In fact, the MultiScreen Agent has replaced the MultiTouch Agent and the classic MultiTouchFramework as our standard touch detection system.
The MultiScreen architecture consists of two parts:

1. MultiScreen Agent
  • The MultiScreen Agent enables you to calibrate the screens used to display the interface and the cameras that are used to detect the touches.
  • The Agent distributes the touches via the NSNotificationCenter to your application.
  • A tutorial on calibrating the MultiScreen Agent will be added below. (Coming soon...)
2. Your own application
  • The touches sent by the MultiScreen Agent are received by the MultiScreenServer framework.
  • Using the MultiScreenRenderer allows you to render into the calibrated screen.
  • How to add both of these frameworks to your project is described below.

Before setting up the frameworks and your application, please read this important information first:
  • Please never change anything in the MultiScreen Agent project or in the framework projects without our permission.
  • If you are using the MultiScreen Agent, please send me an e-mail (Simon Voelker) so that we have an overview of who is using the framework.
  • If you find any bugs, please report them to me by e-mail (Simon Voelker).


# HowTo: Set up the MultiScreen Agent

  • Getting the sources and setting up the frameworks
  • First of all, you have to check out the MultiTouchFramework (for information about Git, visit the Git wiki page):
    • git clone ssh://oliver.informatik.rwth-aachen.de/Public/Research%20Projects/MultiTouchFramework/Software/Sources/MultiTouchFramework.git
  • You find the MultiScreen Agent project under:
    • MultiTouchFramework/Agents/MultiScreen/
  • The MultiScreen Agent application can be found under:
    • MultiTouchFramework/Agents/MultiScreen/build/Release/
  • As you will use this Agent very often, it is handy to create an alias of the Agent and drag it to a location of your choice.
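The steps above can be sketched in the shell as follows (the name of the Agent's .app bundle is an assumption here — check the actual contents of the Release folder; a symbolic link serves the same purpose as a Finder alias):

```shell
# Check out the framework sources (requires SSH access to the chair's server)
git clone "ssh://oliver.informatik.rwth-aachen.de/Public/Research%20Projects/MultiTouchFramework/Software/Sources/MultiTouchFramework.git"

# Link the built Agent to a handy location, e.g. the Desktop.
# The bundle name "MultiScreen Agent.app" is an assumption -- check build/Release/.
ln -s "$PWD/MultiTouchFramework/Agents/MultiScreen/build/Release/MultiScreen Agent.app" ~/Desktop/
```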




# HowTo: Calibrate the MultiScreen Agent


Coming soon...


# HowTo: Create your MultiScreen application


  • First, you have to compile two more frameworks that your applications need:
    • MultiTouchFramework/Frameworks/MultiScreenRenderer/
    • MultiTouchFramework/Frameworks/MultiTouchServer/
  • During the compile process, these frameworks are copied to /Users/Shared/Frameworks/
  • If you want to use the SLAP framework, you have to check out SLAP, which is still an SVN repository (we will port it to Git in the next couple of weeks):
    • svn checkout svn+ssh://oliver.informatik.rwth-aachen.de/svn/SLAP/
  • To use the SLAP framework, compile /SLAP/trunk/Software/SLAPFrameworkGL
  • Like the other frameworks, you will find the SLAP framework under /Users/Shared/Frameworks/
  • Now you are ready to create your MultiScreen application
  • We added a demo application that you can find under:
    • MultiTouchFramework/Demos/MultiScreenDemo/


# Create a project and add the frameworks

  • First of all, create a new Xcode project (we called our project "MultiScreenDemo").
  • Secondly, add the MultiScreenServer, the MultiScreenRenderer, and the MultiTouch framework to your project.
  • In order to use OpenGL in your project, add the OpenGL framework as well.
  • There are several ways to add frameworks to an Xcode project. In this tutorial, we stick to the following procedure:
  • OpenGL:
    • Right-click on the Frameworks folder in your Xcode project.
    • Choose Add->Existing Frameworks...
    • From the displayed list, select the OpenGL.framework.
    • Click on "Add".
  • MultiScreenServer framework, MultiScreenRenderer framework and MultiTouch framework:
    • Right-click on the Frameworks folder in your Xcode project.
    • Choose Add->Existing Frameworks...
    • Click on the "Add other..." button.
    • Select /Users/Shared/Frameworks/MultiScreenServer.framework (for the MultiScreenServer framework).
    • Repeat these steps for the MultiScreenRenderer framework or for the SLAP framework (Note: you only need either the SLAP framework or the MultiScreenRenderer.):
      • /Users/Shared/Frameworks/MultiScreenRenderer.framework
      • /Users/Shared/Frameworks/SLAPFrameworkGL.framework
    • Finally, you have to create a new copy build phase in your project:
      • Click on the arrow next to "Targets" (in the Groups and Files view) and select your target (in our case this is "MultiScreenDemo").
      • Right-click on "MultiScreenDemo"->Add->New Build Phase->New Copy File Build Phase.
      • Choose "Frameworks" as Destination, leave Path blank, and close the window.
      • A new folder called "Copy Files" is created under your target.
      • You can rename this folder to "Copy Frameworks".
      • Finally, drag the frameworks "MultiScreenServer.framework", "MultiScreenRenderer.framework", and "SLAPFrameworkGL.framework" to this new folder.


# Basic program

  • Now you have to add some code to get the application running.
  • The entire code can be placed in any of your files. In this tutorial, we use the AppDelegate class, i.e., "MultiScreenDemoAppDelegate" in our case.


# Header file

  • Import the frameworks with:
    • #import <MultiScreenClient/MultiScreenServer.h>
    • #import <MultiScreenRenderer/MultiScreenRenderer.h>
  • Set up the protocols:
    • @interface MultiScreenDemoAppDelegate : NSObject <NSApplicationDelegate,MTTouching,MultiScreenRendering>
  • MTTouching is for receiving the touch events.
  • MultiScreenRendering is needed for rendering into the screens.
  • We will explain how to use these protocols in the next sections.
  • Add an instance variable and a property for accessing the MultiScreenRenderer (the MultiScreenServer is accessed via its shared instance, so it needs no variable of its own):
    • MultiScreenRenderer* _multiScreenRenderer;
    • @property (nonatomic,retain) MultiScreenRenderer* multiScreenRenderer;
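Putting the pieces above together, a minimal header could look like this. This is a sketch following the demo project's naming; the imports are the ones listed on this page, and the class name is only an example:

```objc
// MultiScreenDemoAppDelegate.h
#import <Cocoa/Cocoa.h>
#import <MultiScreenClient/MultiScreenServer.h>
#import <MultiScreenRenderer/MultiScreenRenderer.h>

// MTTouching delivers touch events; MultiScreenRendering drives drawing.
@interface MultiScreenDemoAppDelegate : NSObject <NSApplicationDelegate, MTTouching, MultiScreenRendering>
{
    MultiScreenRenderer* _multiScreenRenderer;
}

@property (nonatomic, retain) MultiScreenRenderer* multiScreenRenderer;

@end
```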

# Source file

  • Add the @synthesize directive for the multiScreenRenderer property:
    • @synthesize multiScreenRenderer = _multiScreenRenderer;
  • Now, create the MultiScreenRenderer object and set the delegate of the MultiTouchServer:
    • [[MultiTouchServer sharedServer] startWithDelegate:self];
    • self.multiScreenRenderer = [[[MultiScreenRenderer alloc] initWithRenderDelegate:self] autorelease];
  • Passing self means that all touch and rendering methods are called on the AppDelegate.
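In the demo's AppDelegate, this setup would typically run when the application has finished launching. A sketch, assuming the signatures listed on this page:

```objc
// MultiScreenDemoAppDelegate.m
#import "MultiScreenDemoAppDelegate.h"

@implementation MultiScreenDemoAppDelegate

@synthesize multiScreenRenderer = _multiScreenRenderer;

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    // Register self as the receiver of touch events from the Agent.
    [[MultiTouchServer sharedServer] startWithDelegate:self];

    // Create the renderer; its delegate's -draw method is called each frame.
    self.multiScreenRenderer =
        [[[MultiScreenRenderer alloc] initWithRenderDelegate:self] autorelease];
}

@end
```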

# The MTTouching protocol

  • In order to receive touches from the MultiScreen Agent, you must implement the following three methods:
    • -(void) touchesBegan:(NSSet *)touches withEvent:(MTEvent *)event;
    • -(void) touchesMoved:(NSSet *)touches withEvent:(MTEvent *)event;
    • -(void) touchesEnded:(NSSet *)touches withEvent:(MTEvent *)event;
  • Each of these methods receives a set of MTTouch objects. The properties and methods of MTTouch are listed below:
  • Properties:
    • (NSTimeInterval) startTimestamp
      • Returns the time when the touch was detected for the first time.
    • (NSTimeInterval) timestamp
      • Returns the current timestamp of the touch.
    • (MTTouchPhase) phase
      • Returns the phase of the touch, which is one of:
      • MTTouchPhaseBegan, i.e., a finger has touched the surface.
      • MTTouchPhaseMoved, i.e., a finger has moved on the surface.
      • MTTouchPhaseStationary, i.e., a finger is touching the surface but has not moved since the previous event.
      • MTTouchPhaseEnded, i.e., a finger has left the surface.
      • MTTouchPhaseCancelled, i.e., the touch did not end, but tracking had to stop.
    • (BOOL) tap
      • Returns YES if the touch is a tap.
    • (NSUInteger) tapCount
      • Returns the number of taps detected for the touch (e.g., a double tap).
    • (NSView *)view
      • Returns the view within which the touch was detected. For retrieving the location of a touch, you need its view.
    • (NSDictionary *) touchInfo
      • Returns details about the touch in a dictionary with the following key/value pairs:
      • radius | NSNumber, the radius of the touch
      • type | NSNumber, the type of the touch
      • mainAxis | NSData, the main axis of the touch. Each touch is represented by an ellipse; the longer axis of the ellipse is the main axis.
      • axisLengthRation | NSNumber, the ratio of the lengths of the two ellipse axes.
  • Methods:
    • -(NSPoint) startLocationInView:(NSView *)view;
      • Returns the start location of the touch in the coordinate system of the given view.
    • -(NSPoint) locationInView:(NSView *)view;
      • Returns the current location of the touch in the coordinate system of the given view.
    • -(NSPoint) previousLocationInView:(NSView *)view;
      • Returns the previous location of the touch in the coordinate system of the given view.
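A minimal implementation of the three MTTouching methods might look like the sketch below. The logging is purely illustrative; it only assumes the MTTouch properties and methods listed above:

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(MTEvent *)event
{
    for (MTTouch* touch in touches) {
        // Query the location in the view the touch was detected in.
        NSPoint p = [touch locationInView:[touch view]];
        NSLog(@"Touch began at (%f, %f), tapCount: %lu",
              p.x, p.y, (unsigned long)[touch tapCount]);
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(MTEvent *)event
{
    for (MTTouch* touch in touches) {
        NSPoint current  = [touch locationInView:[touch view]];
        NSPoint previous = [touch previousLocationInView:[touch view]];
        NSLog(@"Touch moved by (%f, %f)",
              current.x - previous.x, current.y - previous.y);
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(MTEvent *)event
{
    for (MTTouch* touch in touches) {
        if ([touch tap]) {
            NSPoint p = [touch locationInView:[touch view]];
            NSLog(@"Tap at (%f, %f)", p.x, p.y);
        }
    }
}
```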

# The MultiScreenRendering protocol

  • To render OpenGL content into your screen, implement:
    • -(void) draw;
    • This method is called by a timer in the OpenGL render process, every 0.005 seconds.
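As an example, a -draw method that clears the screen and draws a single quad could look like this. It is a sketch in plain OpenGL; it assumes the renderer has already made the GL context current and set up the viewport before calling it:

```objc
#import <OpenGL/gl.h>

- (void)draw
{
    // Called by the renderer's timer every 0.005 seconds.
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // Draw an orange quad (coordinates are placeholders).
    glColor3f(1.0f, 0.5f, 0.0f);
    glBegin(GL_QUADS);
        glVertex2f(100.0f, 100.0f);
        glVertex2f(200.0f, 100.0f);
        glVertex2f(200.0f, 200.0f);
        glVertex2f(100.0f, 200.0f);
    glEnd();
}
```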

# MultiScreenServer framework

  • The MultiScreenServer offers several methods:
  • -(id)initWithTouchReceiver:(id)delegate;
    • As already mentioned above, this method initializes the MultiScreenClient and sets the delegate that receives the touch events.
  • -(void)stopTouchDetection;
    • This method stops the touch detection.
  • -(void)startTouchDetection;
    • This method starts the touch detection.
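For example, touch detection could be paused while a modal dialog is open. This is a hypothetical use: `settingsWindow` is an assumed outlet, and the start/stop methods are assumed to be called on the shared server instance:

```objc
- (void)showSettingsDialog
{
    // Ignore touches while the dialog is open.
    [[MultiTouchServer sharedServer] stopTouchDetection];
    [NSApp runModalForWindow:self.settingsWindow];
    [[MultiTouchServer sharedServer] startTouchDetection];
}
```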

# MultiScreenRenderer framework

  • -(id)initWithRenderDelegate:(id)delegate;
    • This method initializes the MultiScreenRenderer and sets the delegate that is called during the rendering process.
  • - (void) updateAllScreens;
    • Updates all screens.
  • -(NSSize)getGUISize;
    • This method returns the size of the graphical user interface.
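One plausible use of getGUISize is to match your OpenGL projection to the calibrated interface. This fragment is a sketch; it assumes the multiScreenRenderer property from the setup above and that the GL context is current when -draw runs:

```objc
// Inside your -draw implementation:
// match the 2D projection to the size of the calibrated GUI.
NSSize guiSize = [self.multiScreenRenderer getGUISize];
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, guiSize.width, 0.0, guiSize.height, -1.0, 1.0);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
```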




