# Touch support
Touch support is divided into:

- Low-level support, implemented in the `Touchscreen` class.
- High-level support, implemented in the `EnhancedTouch.Touch` class.
> **Note**: You should not use `Touchscreen` for polling. If you want to read out touches similar to `UnityEngine.Input.touches`, see `EnhancedTouch.Touch`. If you read out touch state from `Touchscreen` directly inside the `Update` or `FixedUpdate` methods, your app will miss changes in touch state.
Touch input is supported on Android, iOS, Windows, and the Universal Windows Platform (UWP).
> **Note**: To test your app on iOS or Android in the editor with touch input from your mobile device, you can use the Unity Remote.
## Touchscreen Device

At the lowest level, a touch screen is represented by an `InputSystem.Touchscreen` Device, which captures the touch screen's raw state. Touch screens are based on the `Pointer` layout.
To query the touch screen that was last used or last added, use `Touchscreen.current`.
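For instance, here is a minimal sketch (the class and method names are illustrative, not part of the API) that reads the primary touch position while guarding against the absence of a touch screen:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public static class TouchscreenQueries
{
    // Touchscreen.current is null when no touch screen Device has been added.
    public static void LogPrimaryTouchPosition()
    {
        var touchscreen = Touchscreen.current;
        if (touchscreen == null)
            return;

        Debug.Log($"Primary touch at {touchscreen.primaryTouch.position.ReadValue()}");
    }
}
```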
### Controls
In addition to the Controls inherited from `Pointer`, touch screen Devices implement the following Controls:
| Control | Type | Description |
|---|---|---|
| `primaryTouch` | `TouchControl` | A touch Control that represents the primary touch of the screen. The primary touch drives the Pointer representation on the Device. |
| `touches` | `ReadOnlyArray<TouchControl>` | An array of touch Controls that represents all the touches on the Device. |
A touch screen Device consists of multiple `TouchControl`s. Each of these represents a potential finger touching the Device. The `primaryTouch` Control represents the touch which is currently driving the `Pointer` representation, and which should be used to interact with the UI. This is usually the first finger that touches the screen.

`primaryTouch` is always identical to one of the entries in the `touches` array. The `touches` array contains all the touches that the system can track. This array has a fixed size, regardless of how many touches are currently active. If you need an API that only represents active touches, see the higher-level `EnhancedTouch.Touch` class.
Each `TouchControl` on the Device, including `primaryTouch`, is made up of the following child Controls:
| Control | Type | Description |
|---|---|---|
| `position` | `Vector2Control` | Absolute position on the touch surface. |
| `delta` | `Vector2Control` | The difference in position since the last frame. |
| `startPosition` | `Vector2Control` | The position where the finger first touched the surface. |
| `startTime` | `DoubleControl` | The time when the finger first touched the surface. |
| `press` | `ButtonControl` | Whether the finger is pressed down. |
| `pressure` | `AxisControl` | Normalized pressure with which the finger is currently pressed while in contact with the pointer surface. |
| `radius` | `Vector2Control` | The size of the area where the finger touches the surface. |
| `touchId` | `IntegerControl` | The ID of the touch. This allows you to distinguish individual touches. |
| `phase` | `TouchPhaseControl` | A Control that reports the current `TouchPhase` of the touch. |
| `tap` | `ButtonControl` | A button Control that reports whether the OS recognizes a tap gesture from this touch. |
| `tapCount` | `IntegerControl` | Reports the number of consecutive tap reports from the OS. You can use this to detect double- and multi-tap gestures. |
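As a sketch of how these child Controls can be read (for illustration only; remember the polling caveat above, and prefer the `EnhancedTouch.Touch` API for frame-by-frame queries), the following reads a few values from `primaryTouch`. The class and method names are illustrative:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public static class TouchControlExample
{
    // Reads a few child Controls of primaryTouch.
    public static void LogPrimaryTouchState()
    {
        var touch = Touchscreen.current?.primaryTouch;
        if (touch == null)
            return;

        Debug.Log($"phase={touch.phase.ReadValue()} " +
                  $"position={touch.position.ReadValue()} " +
                  $"pressure={touch.pressure.ReadValue()}");
    }
}
```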
## Using touch with Actions
You can use touch input with Actions, like any other `Pointer` Device. To do this, bind to the pointer Controls, such as `<Pointer>/press` or `<Pointer>/delta`. This gets input from the primary touch, as well as from any other non-touch pointer Devices.

However, if you want to get input from multiple touches in your Action, you can bind to individual touches by using Bindings like `<Touchscreen>/touch3/press`. Alternatively, use a wildcard Binding to bind one Action to all touches: `<Touchscreen>/touch*/press`.
If you bind a single Action to input from multiple touches, you should set the Action type to pass-through so the Action gets callbacks for each touch, instead of just one.
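A minimal sketch of such a pass-through Action bound to all touches (the class, field, and handler names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class MultiTouchActionExample : MonoBehaviour
{
    InputAction m_TouchPressAction;

    void OnEnable()
    {
        // Pass-through so the Action fires for every touch, not just the first.
        m_TouchPressAction = new InputAction(
            type: InputActionType.PassThrough,
            binding: "<Touchscreen>/touch*/press");
        m_TouchPressAction.performed += OnTouchPress;
        m_TouchPressAction.Enable();
    }

    void OnDisable()
    {
        m_TouchPressAction.Disable();
        m_TouchPressAction.performed -= OnTouchPress;
    }

    void OnTouchPress(InputAction.CallbackContext context)
    {
        // context.control is the press button; its parent is the TouchControl.
        Debug.Log($"Touch press on {context.control.parent.name}");
    }
}
```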
## EnhancedTouch.Touch Class
The `EnhancedTouch.Touch` class provides a polling API for touches similar to `UnityEngine.Input.touches`. You can use it to query touches on a frame-by-frame basis.

Because this API has to record touches as they happen, it comes with a certain overhead, so you must explicitly enable it. To do this, call `EnhancedTouchSupport.Enable()`:
```csharp
using UnityEngine.InputSystem.EnhancedTouch;

// ...

// Can be called from MonoBehaviour.Awake(), for example. Also from any
// RuntimeInitializeOnLoadMethod code.
EnhancedTouchSupport.Enable();
```
> **Note**: `Touchscreen` does not require `EnhancedTouchSupport` to be enabled. You only need to call `EnhancedTouchSupport.Enable()` if you want to use the `EnhancedTouch.Touch` API.
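As mentioned in the code comment above, you can also enable the API from `RuntimeInitializeOnLoadMethod` code; a minimal sketch (the class and method names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;

public static class EnhancedTouchBootstrap
{
    // Runs once at startup, before the first scene loads.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void EnableEnhancedTouch()
    {
        EnhancedTouchSupport.Enable();
    }
}
```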
The `EnhancedTouch.Touch` API is designed to provide access to touch information along two dimensions:
- By finger: Each finger is defined as the Nth contact source on a `Touchscreen`. You can use `Touch.activeFingers` to get an array of all currently active fingers.
- By touch: Each touch is a single finger contact with at least a beginning point (`PointerPhase.Began`) and an endpoint (`PointerPhase.Ended` or `PointerPhase.Cancelled`). Between those two points, an arbitrary number of `PointerPhase.Moved` and/or `PointerPhase.Stationary` records exist. All records in a touch have the same `touchId`. You can use `Touch.activeTouches` to get an array of all currently active touches. This lets you track how a specific touch moves over the screen, which is useful if you want to implement recognition of specific gestures.
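A minimal sketch of the by-finger dimension using `Touch.activeFingers` (the class name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class ActiveFingersExample : MonoBehaviour
{
    void OnEnable()
    {
        EnhancedTouchSupport.Enable();
    }

    void Update()
    {
        // Each active finger is a contact source currently touching the screen.
        foreach (var finger in Touch.activeFingers)
            Debug.Log($"Finger {finger.index} at {finger.screenPosition}");
    }
}
```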
See the `EnhancedTouch.Touch` API documentation for more details.
> **Note**: The `Touch` and `Finger` APIs don't generate GC garbage. The bulk of the data is stored in unmanaged memory that is indexed by wrapper structs, and all arrays are pre-allocated.
## Touch Simulation
Touch input can be simulated from input on other kinds of Pointer devices, such as Mouse and Pen devices. To enable this, either add the `TouchSimulation` `MonoBehaviour` to a `GameObject` in your scene, or call `TouchSimulation.Enable` somewhere in your startup code:
```csharp
using UnityEngine.InputSystem.EnhancedTouch;

// On a MonoBehaviour in your scene:
void OnEnable()
{
    TouchSimulation.Enable();
}
```
In the editor, you can also enable touch simulation by toggling "Simulate Touch Input From Mouse or Pen" on in the "Options" dropdown of the Input Debugger.
`TouchSimulation` adds a `Touchscreen` device and automatically mirrors input on any `Pointer` device to the virtual touchscreen device.
## Reading all touches
To get all current touches from the touchscreen, use `EnhancedTouch.Touch.activeTouches`, as in this example:
```csharp
using UnityEngine;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public void Update()
{
    foreach (var touch in Touch.activeTouches)
        Debug.Log($"{touch.touchId}: {touch.screenPosition},{touch.phase}");
}
```
> **Note**: You must first enable enhanced touch support by calling `EnhancedTouchSupport.Enable()`.
You can also use the lower-level `Touchscreen.current.touches` API.
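A sketch of reading that array directly, subject to the polling caveat at the top of this page (the class and method names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public static class LowLevelTouchExample
{
    // Iterates the fixed-size touches array on the Touchscreen device.
    // Note: per-frame state changes can be missed when polling this way.
    public static void LogPressedTouches()
    {
        var touchscreen = Touchscreen.current;
        if (touchscreen == null)
            return;

        foreach (var touch in touchscreen.touches)
        {
            if (touch.press.isPressed)
                Debug.Log($"Touch {touch.touchId.ReadValue()} at {touch.position.ReadValue()}");
        }
    }
}
```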