InputEventGesture in Godot – Complete Guide

Touch gestures have become an essential part of the modern user experience, particularly on mobile devices. They enable intuitive, efficient interactions that can significantly enhance the usability and enjoyment of games and apps. In the Godot Engine, a robust and versatile tool for game development, touch gestures are handled through the InputEventGesture class. This class is a cornerstone for developing games with touch support, enabling you to create smooth and responsive touch-based mechanics.

What is InputEventGesture?

The InputEventGesture class in Godot 4 is an abstract base class used to represent touch gesture events. It inherits from InputEventWithModifiers, which gives it the ability to recognize additional inputs, such as the Ctrl, Shift, or Alt keys being pressed in conjunction with the gesture. This functionality allows developers to create intricate touch-based controls for their games, providing a richer gaming experience on touchscreen devices.
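For instance, because gesture events carry modifier state, you can read Godot 4's modifier properties directly off the event. Here is a minimal sketch; the Ctrl+pinch behavior is purely an illustrative assumption:

```gdscript
func _input(event):
    # InputEventGesture inherits InputEventWithModifiers, so modifier keys
    # can be checked straight on the gesture event.
    if event is InputEventMagnifyGesture and event.ctrl_pressed:
        print("Ctrl + pinch, factor: ", event.factor)
```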

What is it for?

This class is responsible for detecting and managing gestures such as magnifying and panning, which are common in mobile games and applications. InputEventGesture provides the means to not only detect these gestures but also to determine their characteristics, such as the location and movement involved.

Why Should I Learn It?

Understanding InputEventGesture and its subclasses is key for Godot developers aiming to create games or apps with intuitive touch controls. By mastering gestures, you can make your games feel more natural and immersive on touchscreen devices. Whether you’re a beginner looking to expand your skill set or an experienced coder refining your expertise, diving into Godot’s touch input management is a critical step in developing modern, interactive applications.

Detecting Touch Gestures in Godot

To begin handling touch gestures in Godot, we first need to understand how to detect them. Generally, this is done through the _input() function, which is a callback for input events in your scripts. Let’s start by detecting a simple touch event.

func _input(event):
    if event is InputEventScreenTouch:
        if event.is_pressed():
            print("Screen has been touched at position ", event.position)

This code checks if the input event is a touch event (InputEventScreenTouch) and prints the coordinates where the screen was touched.

Handling Pinch and Zoom Gestures

For pinch and zoom gestures, you need to track two touch points and how their distance changes over time. Here’s how you might do it:

var touch_points = {} # index -> position of each active finger
var start_distance = -1.0

func _input(event):
    if event is InputEventScreenTouch:
        if event.is_pressed():
            touch_points[event.index] = event.position
        else:
            touch_points.erase(event.index)
            start_distance = -1.0 # reset when a finger lifts
    elif event is InputEventScreenDrag:
        touch_points[event.index] = event.position
        if touch_points.size() == 2:
            var positions = touch_points.values()
            var current_distance = positions[0].distance_to(positions[1])
            if start_distance < 0:
                start_distance = current_distance
            var zoom_factor = current_distance / start_distance
            print("Zoom factor: ", zoom_factor)

Here, we store every active touch point in a dictionary keyed by its finger index. While exactly two fingers are down, each drag event updates the stored positions and we recompute the distance between them. The first measurement becomes start_distance, and the zoom factor is the ratio of the current distance to that starting distance: values above 1 mean the fingers spread apart, values below 1 mean they pinched together.

Recognizing Swipe Gestures

To recognize swipe gestures, we can track the movement of a single touch point. Let’s set up a simple swipe detection:

var touch_start_position = Vector2()
var touch_end_position = Vector2()

func _input(event):
    if event is InputEventScreenTouch and event.is_pressed():
        touch_start_position = event.position
    elif event is InputEventScreenTouch and not event.is_pressed():
        touch_end_position = event.position
        var swipe_vector = touch_end_position - touch_start_position
        print("Swipe vector: ", swipe_vector)

In this example, when the screen is touched, we record the position. When the touch ends, we calculate the swipe direction by subtracting the start position from the end position, creating a vector representing the swipe gesture.
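Often you want a discrete direction rather than a raw vector. As a sketch, you could classify the swipe by its dominant axis, with a minimum length threshold (the 100-pixel value is an arbitrary assumption to tune for your game):

```gdscript
const SWIPE_MIN_DISTANCE = 100.0 # pixels; an assumed threshold, tune per game

func classify_swipe(swipe_vector: Vector2) -> String:
    if swipe_vector.length() < SWIPE_MIN_DISTANCE:
        return "tap" # too short to count as a swipe
    if abs(swipe_vector.x) > abs(swipe_vector.y):
        return "right" if swipe_vector.x > 0.0 else "left"
    # Screen coordinates grow downward in Godot, so positive y is "down".
    return "down" if swipe_vector.y > 0.0 else "up"
```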

Handling Long Press Gestures

Long press gestures can be recognized by measuring the time between an InputEventScreenTouch being pressed and released. Here is an example:

var touch_start_time = 0
const LONG_PRESS_DURATION = 1.0 # in seconds

func _input(event):
    if event is InputEventScreenTouch:
        if event.is_pressed():
            touch_start_time = Time.get_ticks_msec()
        else:
            var touch_duration = Time.get_ticks_msec() - touch_start_time
            if touch_duration >= LONG_PRESS_DURATION * 1000:
                print("Long press detected")

When the touch begins, we record the start time in milliseconds. Upon release, we calculate how long the screen was touched. If the duration exceeds our predefined long-press threshold, we consider it a long press.

These examples cover the basic gestures such as touch, pinch, zoom, swipe, and long press. Experiment with these snippets and see how you can integrate them into your Godot projects to make engaging and interactive touch interfaces.

Continuing our exploration into handling touch gestures in Godot, let’s delve into more advanced functionality and code examples. These will provide a deeper understanding of how to manage touch inputs effectively within your projects.

Processing two-finger rotations can add a layer of interactivity in games and apps, allowing users to rotate objects or the camera. Here’s an example:

var touch_points = {} # index -> position of each active finger
var last_angle = INF # INF marks "no previous angle recorded"

func _input(event):
    if event is InputEventScreenTouch:
        if event.is_pressed():
            touch_points[event.index] = event.position
        else:
            touch_points.erase(event.index)
            last_angle = INF # reset when a finger lifts
    elif event is InputEventScreenDrag:
        touch_points[event.index] = event.position
        if touch_points.size() == 2:
            var positions = touch_points.values()
            var current_angle = (positions[1] - positions[0]).angle()
            if last_angle != INF:
                var rotation_delta = current_angle - last_angle
                print("Rotation delta: ", rotation_delta)
            last_angle = current_angle # update for continuous rotation

The script tracks both touch points in a dictionary keyed by finger index and measures the angle of the vector between them. As the user rotates their fingers, the difference between the current angle and the previous one gives the rotation delta, which you can apply to an object or the camera. Note that the delta can jump by roughly 2 * PI when the angle wraps around; wrapf() can normalize it if that matters for your game.

Multi-touch gestures can be quite complex. The following example uses a dictionary to keep track of multiple touch points for a gesture-based control scheme:

var touches = {} # index -> position

func _input(event):
    if event is InputEventScreenTouch:
        if event.is_pressed():
            touches[event.index] = event.position
        else:
            touches.erase(event.index)
    elif event is InputEventScreenDrag:
        touches[event.index] = event.position

The touches dictionary maps each finger's index to its latest position: a new touch adds an entry, a drag updates it, and a release removes it. Keying by index matters here, because a finger's position changes constantly; erasing by position (as an array-based approach would) fails as soon as the finger has moved since it was first recorded.

Detecting a double-tap gesture can be useful for actions like resetting a view or zooming in on a point. Here’s a simple implementation to detect double taps:

var last_tap_time = 0
const DOUBLE_TAP_MAX_DELAY = 0.3 # 300 milliseconds

func _input(event):
    if event is InputEventScreenTouch and event.is_pressed():
        var current_time = Time.get_ticks_msec()
        if current_time - last_tap_time < DOUBLE_TAP_MAX_DELAY * 1000:
            print("Double tap detected")
        last_tap_time = current_time

We track the time between two consecutive taps and compare it to a predefined threshold. If the second tap occurs within this threshold, we consider it a double-tap.
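A common refinement is to require the two taps to land near each other; otherwise two rapid taps on opposite sides of the screen would register as a double tap. A sketch (the 50-pixel radius is an assumed value to tune):

```gdscript
var last_tap_time = 0
var last_tap_position = Vector2()
const DOUBLE_TAP_MAX_DELAY = 0.3 # seconds
const DOUBLE_TAP_MAX_DISTANCE = 50.0 # pixels; assumed, tune per game

func _input(event):
    if event is InputEventScreenTouch and event.is_pressed():
        var current_time = Time.get_ticks_msec()
        var close_in_time = current_time - last_tap_time < DOUBLE_TAP_MAX_DELAY * 1000
        var close_in_space = event.position.distance_to(last_tap_position) < DOUBLE_TAP_MAX_DISTANCE
        if close_in_time and close_in_space:
            print("Double tap detected")
        last_tap_time = current_time
        last_tap_position = event.position
```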

Finally, detecting drag gestures typically involves tracking touch movement over a period of time:

var drag_start_position = Vector2()
var drag_in_progress = false

func _input(event):
    if event is InputEventScreenTouch:
        if event.is_pressed():
            drag_start_position = event.position
            drag_in_progress = true
        else:
            drag_in_progress = false

    if drag_in_progress and event is InputEventScreenDrag:
        var drag_vector = event.relative
        print("Drag vector: ", drag_vector)

In this snippet, a press starts the drag and a release ends it. While the drag is in progress, each InputEventScreenDrag carries event.relative, the movement since the previous drag event; apply it incrementally, or accumulate these deltas if you need the total drag distance.

These examples demonstrate how to handle a variety of touch gestures in Godot, providing the capability to create a rich and nuanced touch interface. As always with input management, the key is to marry responsiveness with the expected outcomes of any given interaction, creating an intuitive experience for the user.

Exploring further, we’ll look at how to use Godot’s InputEventPanGesture and InputEventMagnifyGesture for more nuanced and specific gestural inputs, which offer higher-level abstractions for common multi-touch actions.

To detect a pan gesture, which is a common requirement for camera movement or object positioning in mobile games, we can utilize the InputEventPanGesture event:

func _input(event):
    if event is InputEventPanGesture:
        var delta = event.delta
        print("Pan delta: ", delta)

This code prints the pan delta: how far the gesture has moved since the previous pan event, not since the initial touch point. Apply it incrementally, or accumulate it if you need the total pan distance.
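To put the delta to use, you might scroll a Camera2D by the pan movement. A minimal sketch, assuming a Camera2D child node at the path $Camera2D:

```gdscript
@onready var camera: Camera2D = $Camera2D # assumed node path

func _input(event):
    if event is InputEventPanGesture:
        # Move the camera with the pan; flip the sign if you prefer the
        # content (rather than the camera) to follow the finger.
        camera.position += event.delta
```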

Similarly, for pinch-to-zoom functionality, the InputEventMagnifyGesture comes in handy. This event provides a factor of the pinch movement, making it straightforward to implement zooming:

func _input(event):
    if event is InputEventMagnifyGesture:
        var factor = event.factor
        print("Zoom factor:", factor)

This script will print the magnification factor, which hovers around 1 for each event: values below 1 indicate a pinch (fingers moving together), while values above 1 indicate a spread (fingers moving apart).
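Because the factor is multiplicative, applying it is as simple as scaling the current zoom. A sketch, again assuming a Camera2D at $Camera2D and clamping to an arbitrary range:

```gdscript
@onready var camera: Camera2D = $Camera2D # assumed node path

func _input(event):
    if event is InputEventMagnifyGesture:
        # Multiply the current zoom by the gesture factor and clamp it
        # so the user cannot zoom infinitely far in or out.
        camera.zoom = (camera.zoom * event.factor).clamp(Vector2(0.25, 0.25), Vector2(4.0, 4.0))
```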

In addition to handling gestures in progress, we also need to manage how they end. It is crucial to ensure that game objects stop moving or scaling when the user's fingers leave the screen. The following code demonstrates how to handle the end of a touch event:

func _input(event):
    if event is InputEventScreenTouch and not event.is_pressed():
        handle_gesture_end()

The handler function handle_gesture_end() would contain the logic needed to finalize the active gesture.
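What that cleanup looks like depends on your game, but it typically resets whatever state the active gesture accumulated. A hypothetical sketch, assuming the start_distance baseline from the pinch example above:

```gdscript
func handle_gesture_end():
    # Hypothetical cleanup: reset the pinch baseline so the next
    # gesture starts measuring from scratch.
    start_distance = -1.0
```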

Knowing how to differentiate between single and multi-touch events can aid in building complex touch interfaces. With Godot, you can use the index of the InputEventScreenTouch to distinguish between touches:

func _input(event):
    if event is InputEventScreenTouch:
        if event.index == 0:
            pass # Handle the first touch here
        elif event.index == 1:
            pass # Handle the second touch here

This allows your game to handle each touch point separately, essential for gestures like rotating two on-screen objects independently.

To create a custom gesture, such as a triple tap, we can extend Godot’s input event system with a little creativity:

var last_tap_time = 0
var tap_count = 0
const TRIPLE_TAP_MAX_DELAY = 0.2 # 200 milliseconds

func _input(event):
    if event is InputEventScreenTouch and event.is_pressed():
        var current_time = Time.get_ticks_msec()
        if current_time - last_tap_time < TRIPLE_TAP_MAX_DELAY * 1000:
            tap_count += 1
            if tap_count == 3:
                print("Triple tap detected")
                tap_count = 0 # reset after the third tap
        else:
            tap_count = 1 # reset counter if the delay is too long
        last_tap_time = current_time

The code above builds on the double-tap logic by adding a counter. If three taps are detected within the allowed time frame, we consider it a triple tap.

While these examples offer a range of touch gesture implementations, it’s important to consider the context and mechanics of your game when deciding which gestures to implement and how they should function. Experiment with the sensitivity and responsiveness of each gesture to ensure they align with your gameplay and provide the best user experience.

Continue Your Game Development Journey

Mastering touch gestures in Godot is just the beginning of your game development adventure. To keep learning and expanding your skills, our Godot Game Development Mini-Degree is the perfect next step. This program provides an in-depth exploration into Godot 4, offering a wealth of knowledge on creating cross-platform games with engaging gameplay mechanics, from platformers to RPGs.

Whether you’re a complete novice to coding or looking to enhance your existing expertise, our tailored courses cater to all levels of experience. You can progress at your own pace, building a portfolio of impressive projects along the way. And upon completion, you’ll receive a certificate to showcase your newfound abilities.

For a broader look at what we offer in Godot and game development, explore our variety of Godot courses. Here at Zenva, we’re committed to helping you go from beginner to professional, so dive in and keep learning – your game development journey is only just beginning!

Conclusion

By equipping yourself with the knowledge of touch gestures in Godot, you’ve unlocked a new dimension of interactive design for your games and applications. The paths to innovation and engaging gameplay are numerous, and with our Godot Game Development Mini-Degree, you’ll be able to explore these avenues fully. Rise to the challenge and infuse your projects with intuitive controls and captivating mechanics that delight your audience.

Remember, the journey of game development is one of constant learning and discovery. Every step you take, from getting familiar with basic touch inputs to crafting complex gesture-based interfaces, paves the way to creating those unforgettable gaming experiences. So keep coding, keep creating, and let us here at Zenva be a part of your success story in the exciting world of game development.
