Mouse Gestures/Touch

The RPi can be configured with touchscreen TFTs, and many modern OSs also support touch. It would be nice if the next version supported touch input.

Comments

  • The current version already supports touch input (including gestures) on the Android and HTML5 platforms. On other platforms, gestures are translated from mouse input.

    You can check this example: https://github.com/raysan5/raylib/blob/develop/examples/core_gestures_detection.c (a condensed sketch follows below).

    Unfortunately, I don't have an RPi touchscreen TFT to test touch input...
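
    For reference, a condensed sketch along the lines of that gestures example (not a verbatim copy; see core_gestures_detection.c for the full version, which covers more gesture types):

      #include "raylib.h"

      int main(void)
      {
          InitWindow(800, 450, "gestures test");
          SetTargetFPS(60);

          while (!WindowShouldClose())
          {
              int gesture = GetGestureDetected();    // Latest gesture registered
              Vector2 touch = GetTouchPosition(0);   // Position of touch point 0

              BeginDrawing();
                  ClearBackground(RAYWHITE);
                  if (gesture == GESTURE_TAP) DrawCircleV(touch, 30, MAROON);
                  else if (gesture == GESTURE_DRAG) DrawCircleV(touch, 30, LIME);
              EndDrawing();
          }

          CloseWindow();
          return 0;
      }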
  • Thanks!
  • Ray, I have been playing with touch input on a TFT connected to a Raspberry Pi.

    I am working outside of X, using raylib directly on the framebuffer.

    The main problem is that if you process touch events as a mouse, the movements are relative when you need them absolute, and if you process the raw events, you need to handle the scaling, de-jittering, etc.

    The best approach I have found so far is to use TSLIB to calibrate the TFT screen, and then to run the stand-alone ts_uinput utility as a daemon. This generates two additional input devices beneath /dev/input/: a mouse device and an event stream (/mouse2 and /event4 on my particular system).

    The /dev/input/mouse2 device is of no use because the movements are relative, but /dev/input/event4 is useful because the events are passed through TSLIB and so arrive scaled and de-jittered.

    It is a very simple event stream that just contains absolute x & y values plus pressure.

    I modified core.c in raylib and added a touchThread similar to the existing mouseThread, using it to read the /dev/input/event4 stream, convert the x and y values from the events into the existing absolute mousePosition.x and mousePosition.y values, and use pressure > 0 to simulate clicking the left mouse button by setting currentMouseState[0] to 1 or 0.
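
    A rough sketch of the idea (the mousePosition and currentMouseState globals mirror the description above and are declared here only so the fragment stands alone; the device name is system-specific):

      #include <fcntl.h>        // open()
      #include <unistd.h>       // read(), close()
      #include <linux/input.h>  // struct input_event, EV_ABS, ABS_X, ABS_Y, ABS_PRESSURE

      typedef struct { float x, y; } Vector2;     // Stand-in for raylib's Vector2

      static Vector2 mousePosition = { 0 };       // In core.c: existing mouse position global
      static char currentMouseState[3] = { 0 };   // In core.c: existing button state global

      // Thread function, launched with pthread_create() like the existing mouseThread;
      // events arriving here have already been scaled and de-jittered by TSLIB
      static void *TouchThread(void *arg)
      {
          (void)arg;
          struct input_event ev;
          int fd = open("/dev/input/event4", O_RDONLY);   // Device name varies per system
          if (fd == -1) return NULL;

          while (read(fd, &ev, sizeof(ev)) == sizeof(ev))
          {
              if (ev.type != EV_ABS) continue;

              if (ev.code == ABS_X) mousePosition.x = (float)ev.value;
              else if (ev.code == ABS_Y) mousePosition.y = (float)ev.value;
              else if (ev.code == ABS_PRESSURE) currentMouseState[0] = (ev.value > 0)? 1 : 0;  // Simulate left button
          }

          close(fd);
          return NULL;
      }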

    This works fairly well and means I don't need to worry about scaling or de-jittering the event stream, as it passes through TSLIB first. It also means my application doesn't need to be linked to the TSLIB library, and neither does raylib.

    If I linked my application directly to TSLIB, I would need to process the touch signals outside of raylib, whereas it is much easier to use the existing raylib calls for collision detection etc., and it doesn't seem to make sense to link raylib to TSLIB for the rare occasion that somebody wants to use a TFT screen.
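
    For example, since the touch thread feeds the normal mouse state, a tap can be handled with the standard raylib calls, something like:

      #include "raylib.h"

      int main(void)
      {
          InitWindow(480, 320, "touch button");
          SetTargetFPS(60);

          Rectangle button = { 100, 100, 200, 50 };

          while (!WindowShouldClose())
          {
              // A touch press arrives as a left mouse click at an absolute position
              bool pressed = CheckCollisionPointRec(GetMousePosition(), button) &&
                             IsMouseButtonPressed(MOUSE_LEFT_BUTTON);

              BeginDrawing();
                  ClearBackground(RAYWHITE);
                  DrawRectangleRec(button, pressed? MAROON : LIGHTGRAY);
              EndDrawing();
          }

          CloseWindow();
          return 0;
      }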

    I would submit a pull request via GitHub, but I am a little concerned that what I have developed is not a general-purpose solution: it involves some work on the user's part (installing TSLIB, calibrating the TFT screen, running ts_uinput as a daemon), and I also cannot guarantee what the new event stream under /dev/input/ will be called.

    I would be interested to know whether you think I should submit a pull request so that, at least, you can see what I have done.

    Richard (Morphology)
  • Hi Richard! Wow! You are making amazing progress with raylib on the Raspberry Pi! :smiley:

    I understand and share your concerns: it's not a general solution and it involves some work on the user's side. Actually, I have another similar case with the Oculus Rift platform; it requires specific non-general code and an external library to be linked... still thinking about how to manage that... maybe it could be moved to a specific example, outside of the main code-base...

    Right now I'm preparing a custom configuration system for building raylib, to allow users to configure raylib for their needs (platform, supported file formats, supported features...); maybe this could just be added as an option.
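
    For illustration, the kind of build option this could become (the flag name is hypothetical; the real configuration system may end up looking different):

      // Hypothetical config flag; the actual name and mechanism would
      // depend on the final configuration system
      #define SUPPORT_TSLIB_TOUCH    // Read the ts_uinput event stream for touch input

      #if defined(SUPPORT_TSLIB_TOUCH)
          // ...compile the touch-thread code path described above...
      #endif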

    About daemon usage: in general, I don't like daemons... but checking the TSLIB license (GPLv2), I'm not sure it could be linked statically, so a daemon is a valid approach.

    Just submit the pull request and we can continue the discussion there.