<topic_start>
Other interactive widgets
Flutter offers a variety of buttons and similar interactive widgets.
Most of these widgets implement the Material Design guidelines,
which define a set of components with an opinionated UI.
If you prefer, you can use GestureDetector to build
interactivity into any custom widget.
You can find examples of GestureDetector in
Managing state. Learn more about GestureDetector
in Handle taps, a recipe in the Flutter cookbook.
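As a sketch of that approach, a GestureDetector can wrap any widget and supply the callbacks you care about. The widget name below (`TappableBox`) is illustrative, not from the Flutter SDK:

```dart
import 'package:flutter/material.dart';

/// A hypothetical custom widget made interactive with GestureDetector.
class TappableBox extends StatefulWidget {
  const TappableBox({super.key});

  @override
  State<TappableBox> createState() => _TappableBoxState();
}

class _TappableBoxState extends State<TappableBox> {
  bool _active = false;

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      // Called when the user completes a tap on this widget.
      onTap: () => setState(() => _active = !_active),
      child: Container(
        width: 100,
        height: 100,
        color: _active ? Colors.green : Colors.grey,
        child: Center(child: Text(_active ? 'Active' : 'Inactive')),
      ),
    );
  }
}
```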
Tip:
Flutter also provides a set of iOS-style widgets called
Cupertino.
When you need interactivity, it's easiest to use one of
the prefabricated widgets. Here's a partial list:
<topic_end>
<topic_start>
Standard widgets
<topic_end>
<topic_start>
Material components
<topic_end>
<topic_start>
Resources
The following resources might help when adding interactivity
to your app.
Gestures, a section in the Flutter cookbook.
<topic_end>
<topic_start>
Taps, drags, and other gestures
This document explains how to listen for, and respond to,
gestures in Flutter.
Examples of gestures include taps, drags, and scaling.
The gesture system in Flutter has two separate layers.
The first layer has raw pointer events that describe
the location and movement of pointers (for example,
touches, mice, and styli) across the screen.
The second layer has gestures that describe semantic
actions that consist of one or more pointer movements.
<topic_end>
<topic_start>
Pointers
Pointers represent raw data about the user's interaction
with the device's screen.
There are four types of pointer events:
PointerDownEvent, PointerMoveEvent,
PointerUpEvent, and PointerCancelEvent.
On pointer down, the framework does a hit test on your app
to determine which widget exists at the location where the
pointer contacted the screen. The pointer down event
(and subsequent events for that pointer) is then dispatched
to the innermost widget found by the hit test.
From there, the events bubble up the tree and are dispatched
to all the widgets on the path from the innermost
widget to the root of the tree. There is no mechanism for
canceling or stopping pointer events from being dispatched further.
To listen to pointer events directly from the widgets layer, use a
Listener widget. However, generally,
consider using gestures (as discussed below) instead.
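As a minimal sketch of the Listener widget, the wrapper below logs each raw pointer event for its child (the class name `PointerLogger` is illustrative):

```dart
import 'package:flutter/widgets.dart';

/// Illustrative wrapper that logs raw pointer events for its child.
class PointerLogger extends StatelessWidget {
  const PointerLogger({super.key, required this.child});
  final Widget child;

  @override
  Widget build(BuildContext context) {
    return Listener(
      // Fired when a pointer contacts the screen over this widget.
      onPointerDown: (PointerDownEvent event) =>
          debugPrint('down at ${event.position}'),
      // Fired as the pointer moves while still in contact.
      onPointerMove: (PointerMoveEvent event) =>
          debugPrint('move to ${event.position}'),
      // Fired when the pointer lifts off the screen.
      onPointerUp: (PointerUpEvent event) =>
          debugPrint('up at ${event.position}'),
      child: child,
    );
  }
}
```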
<topic_end>
<topic_start>
Gestures
Gestures represent semantic actions (for example, tap, drag,
and scale) that are recognized from multiple individual pointer
events, potentially even multiple individual pointers.
Gestures can dispatch multiple events, corresponding to the
lifecycle of the gesture (for example, drag start,
drag update, and drag end):
Tap
Double tap
Long press
Vertical drag
Horizontal drag
Pan
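The drag lifecycle described above can be sketched with the horizontal-drag callbacks on GestureDetector (the helper function name is illustrative):

```dart
import 'package:flutter/widgets.dart';

/// Illustrative helper that reports each stage of a horizontal drag.
Widget buildDragReporter(Widget child) {
  return GestureDetector(
    // Drag start: a pointer has contacted the screen and begun to move.
    onHorizontalDragStart: (DragStartDetails d) =>
        debugPrint('drag start at ${d.globalPosition}'),
    // Drag update: called repeatedly as the pointer moves.
    onHorizontalDragUpdate: (DragUpdateDetails d) =>
        debugPrint('drag delta ${d.delta.dx}'),
    // Drag end: the pointer has lifted and the gesture is complete.
    onHorizontalDragEnd: (DragEndDetails d) =>
        debugPrint('drag end, velocity ${d.primaryVelocity}'),
    child: child,
  );
}
```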
<topic_end>
<topic_start>
Adding gesture detection to widgets
To listen to gestures from the widgets layer,
use a GestureDetector.
Note:
To learn more, watch this short
Widget of the Week video on the GestureDetector widget:
If you're using Material components,
many of those widgets already respond to taps or gestures.
For example, IconButton and TextButton
respond to presses (taps), and ListView
responds to swipes to trigger scrolling.
If you aren't using those widgets, but you want the
"ink splash" effect on a tap, you can use InkWell.
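As a sketch of the InkWell option, the widget below paints the Material ink splash when tapped (the class name `SplashLabel` is illustrative):

```dart
import 'package:flutter/material.dart';

/// Illustrative widget: InkWell gives the Material "ink splash" on tap.
class SplashLabel extends StatelessWidget {
  const SplashLabel({super.key});

  @override
  Widget build(BuildContext context) {
    // InkWell needs a Material ancestor to paint its splash on.
    return Material(
      child: InkWell(
        onTap: () => debugPrint('tapped'),
        child: const Padding(
          padding: EdgeInsets.all(12),
          child: Text('Tap me'),
        ),
      ),
    );
  }
}
```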
<topic_end>
<topic_start>
Gesture disambiguation
At a given location on screen,
there might be multiple gesture detectors.
All of these gesture detectors listen to the stream
of pointer events as they flow past and attempt to recognize
specific gestures. The GestureDetector widget decides
which gestures to attempt to recognize based on which of its
callbacks are non-null.
When there is more than one gesture recognizer for a given
pointer on the screen, the framework disambiguates which
gesture the user intends by having each recognizer join the gesture arena.