Selection & Input Methods

Master the implementation of Apple's selection and input patterns to create more intuitive interfaces.

Selection and input methods are at the heart of how people use Apple devices. Every interaction builds on natural human behaviors: pointing, touching, dragging, and speaking. These patterns go beyond simple clicks and taps; they create a connection between users and their content through carefully designed gestures, keyboard commands, and voice controls.

Good selection and input design makes complex actions feel simple. It helps people edit documents, manipulate objects, and navigate apps without thinking about the technology behind it. When implemented well, these interaction methods fade into the background, letting users focus entirely on their tasks. Understanding these foundations helps create interfaces that feel natural and responsive across Apple platforms.

Fundamentals of touch selection

Touch selection must respond instantly and feel natural. When users tap items, the interface should clearly show what's selected and what actions are possible.

Users expect various touch interactions when selecting items. Single taps select individual items. Simple drag gestures select multiple items. Adding a second finger while dragging modifies selection.

Avoid inventing custom selection patterns. Sticking to standard iOS behaviors helps users understand how selection works across all apps.

Different types of selection serve different needs:

  • Quick selection. Elements respond briefly and return to normal
  • Staying selected. Elements stay selected until tapped again
  • Multiple choices. Users can select many items at once using standard gestures[1]
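
The two-finger multi-selection gesture comes almost for free in list views. A minimal sketch, assuming a view controller that owns a plain UITableView; the data source setup is omitted for brevity:

```swift
import UIKit

// Sketch of standard multi-selection; the class name and table view setup are
// hypothetical, and the data source is omitted.
final class ItemListViewController: UIViewController, UITableViewDelegate {
    let tableView = UITableView()

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.delegate = self
        tableView.allowsMultipleSelectionDuringEditing = true
        view.addSubview(tableView)
    }

    // Opting in lets users start selecting multiple rows with a two-finger pan.
    func tableView(_ tableView: UITableView,
                   shouldBeginMultipleSelectionInteractionAt indexPath: IndexPath) -> Bool {
        true
    }

    // UIKit calls this when the two-finger gesture begins; entering editing mode
    // shows the standard selection indicators.
    func tableView(_ tableView: UITableView,
                   didBeginMultipleSelectionInteractionAt indexPath: IndexPath) {
        tableView.setEditing(true, animated: true)
    }
}
```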

Pro Tip! Check that selected items remain clearly visible in both light and dark modes.

Implementing long-press gestures

Long-press gestures add depth to touch selection. This gesture reveals contextual actions when users press and hold interface elements, similar to right-clicking on desktop systems.

The timing of long-press gestures matters for usability. iOS recognizes a long press after 0.5 seconds by default — long enough to be intentional but short enough to feel responsive. Shorter durations can cause accidental triggers, while longer ones make the interface feel slow.

Apps must show clear visual feedback during long-press interactions. Users should see when the gesture starts, how long to hold, and what will happen when they release it. This feedback helps prevent confusion and supports learning.

Different elements support various long-press behaviors:

  • Preview and open. Shows content preview on press, opens fully on swipe-up
  • Context menus. Reveals relevant actions for the pressed item
  • Custom actions. Triggers specific features like rearranging or editing
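
A minimal sketch of a long-press recognizer using the 0.5-second default, with a confirming haptic at the moment the gesture begins; the view setup and the presentContextualActions() helper are hypothetical:

```swift
import UIKit

// Hypothetical view controller that attaches a long-press gesture to an item view.
final class ItemViewController: UIViewController {
    let itemView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(handleLongPress(_:)))
        longPress.minimumPressDuration = 0.5   // the system default, stated explicitly
        itemView.addGestureRecognizer(longPress)
        view.addSubview(itemView)
    }

    @objc private func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began else { return }
        // A light haptic confirms the hold registered before any UI appears.
        UIImpactFeedbackGenerator(style: .medium).impactOccurred()
        presentContextualActions()
    }

    private func presentContextualActions() {
        // App-specific contextual UI goes here.
    }
}
```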

Pro Tip! Add haptic feedback to long-press gestures — it helps users know exactly when they've held long enough.

Designing for keyboard input

Keyboard input needs careful handling in iOS and macOS apps. Text fields must support basic editing functions, custom input methods like emoji keyboards, and text suggestions that help users type faster.

Input fields should clearly show their purpose and state. The keyboard type should match the expected input — showing number pads for numeric fields, email keyboards for email addresses, and adding return key actions that make sense for each context.
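
A minimal sketch of purpose-driven keyboard configuration; the fields are hypothetical and the exact settings depend on the form:

```swift
import UIKit

// Hypothetical contact-form fields showing keyboard types matched to their purpose.
func makeContactFields() -> (email: UITextField, phone: UITextField) {
    let emailField = UITextField()
    emailField.placeholder = "Email address"
    emailField.keyboardType = .emailAddress      // surfaces the "@" and "." keys
    emailField.textContentType = .emailAddress   // enables AutoFill suggestions
    emailField.autocapitalizationType = .none
    emailField.returnKeyType = .next             // "Next" reads naturally mid-form

    let phoneField = UITextField()
    phoneField.placeholder = "Phone number"
    phoneField.keyboardType = .phonePad          // digits-only number pad
    phoneField.textContentType = .telephoneNumber

    return (emailField, phoneField)
}
```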

Text input needs proper validation and formatting. Apps should check input as users type, show clear error messages, and format text automatically when appropriate — like adding hyphens to phone numbers or validating email formats.

Common text input patterns include:

  • Basic input. Single-line fields for names, search terms, and simple data
  • Structured input. Fields that format text as users type, like phone numbers
  • Rich text input. Multi-line fields with formatting options
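
As one example of structured input, here is a minimal format-as-you-type sketch that inserts hyphens into a simple xxx-xxx-xxxx phone pattern; the delegate class is hypothetical, and real apps should use locale-aware formatting:

```swift
import UIKit

// Hypothetical delegate that reformats the field's text on every keystroke.
final class PhoneFieldDelegate: NSObject, UITextFieldDelegate {
    func textField(_ textField: UITextField,
                   shouldChangeCharactersIn range: NSRange,
                   replacementString string: String) -> Bool {
        let current = (textField.text ?? "") as NSString
        let updated = current.replacingCharacters(in: range, with: string)
        let digits = updated.filter(\.isNumber).prefix(10)

        // Rebuild the display string with hyphens after the 3rd and 6th digits.
        var formatted = ""
        for (index, digit) in digits.enumerated() {
            if index == 3 || index == 6 { formatted.append("-") }
            formatted.append(digit)
        }
        textField.text = formatted
        return false   // the delegate sets the text itself
    }
}
```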

Pro Tip! Always show what keyboard type will appear before users tap a text field — like using email or number icons.

Context menus in iOS


Context menus help users quickly access common actions. When users long-press an element, iOS displays a focused list of relevant commands for that item.

Every action in a context menu needs a clear icon and brief label. Actions should follow a consistent order — most used options are at the top, and destructive actions like Delete are at the bottom in red.

Menu content must reflect the item's current state. If a message is unread, the menu shows "Mark as Read." After users tap this action, the same menu will show "Mark as Unread" instead.

Common context menu patterns include:

  • Preview options. Show content previews with additional swipe actions
  • Basic actions. Common commands like Share, Copy, Move
  • Toggles. State-changing actions like Pin, Flag, Mark as Read
  • Destructive actions. Delete or Remove options appear in red[2]
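
A minimal sketch that brings these patterns together; the Message model and the action handlers are hypothetical placeholders:

```swift
import UIKit

// Hypothetical message model and cell; menu content reflects the item's state,
// and the destructive action sits last.
struct Message { var isRead: Bool }

final class MessageCell: UITableViewCell, UIContextMenuInteractionDelegate {
    var message = Message(isRead: false)

    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        addInteraction(UIContextMenuInteraction(delegate: self))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func contextMenuInteraction(_ interaction: UIContextMenuInteraction,
                                configurationForMenuAtLocation location: CGPoint) -> UIContextMenuConfiguration? {
        UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { [weak self] _ in
            // The title adapts to the current read state.
            let readTitle = (self?.message.isRead == true) ? "Mark as Unread" : "Mark as Read"
            let toggleRead = UIAction(title: readTitle,
                                      image: UIImage(systemName: "envelope")) { _ in /* toggle read state */ }
            let share = UIAction(title: "Share",
                                 image: UIImage(systemName: "square.and.arrow.up")) { _ in /* present share sheet */ }
            // Destructive actions render in red and belong at the bottom.
            let delete = UIAction(title: "Delete",
                                  image: UIImage(systemName: "trash"),
                                  attributes: .destructive) { _ in /* delete the message */ }
            return UIMenu(children: [toggleRead, share, delete])
        }
    }
}
```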

Pro Tip! Use action verbs for menu items — "Share" instead of "Sharing" and "Copy" instead of "Copy to."

Selection feedback patterns


Selection feedback helps users understand what's happening on screen. Every tap, long-press, or keyboard input needs a clear visual response to show that iOS detected the interaction.

Apps must use standard iOS feedback styles. Common feedback patterns include:

  • Highlight states. Selected items change color or add borders
  • Press states. Buttons and controls darken while touched
  • Focus indicators. Text fields show where input will appear
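
Collection and table view cells can derive these states from their configuration state, so the visual change tracks the touch immediately. A minimal sketch, assuming iOS 15 or later; the specific colors are illustrative:

```swift
import UIKit

// Hypothetical cell whose background reflects press and selection states.
final class SelectableCell: UICollectionViewCell {
    override func updateConfiguration(using state: UICellConfigurationState) {
        var background = UIBackgroundConfiguration.listGroupedCell()

        if state.isHighlighted {
            // Press state: darken while the touch is down.
            background.backgroundColor = .systemGray4
        } else if state.isSelected {
            // Highlight state: tinted fill plus a visible border.
            background.backgroundColor = .tintColor.withAlphaComponent(0.15)
            background.strokeColor = .tintColor
            background.strokeWidth = 2
        } else {
            background.backgroundColor = .secondarySystemGroupedBackground
        }
        backgroundConfiguration = background
    }
}
```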

Feedback timing affects how responsive an app feels. Visual changes should appear instantly when users select something. Delayed or missing feedback makes users doubt whether their action registered.

Pro Tip! Test selection feedback with different screen color modes — what works in light mode might be hard to see in dark mode.

Text selection mechanics


Text selection helps users precisely choose and manipulate written content. From selecting a single word to highlighting entire paragraphs, iOS offers several ways to interact with text.

Good text selection needs clear visual indicators. In native iOS apps, selected text appears highlighted with drag handles at both ends. Each app can use its own highlight color while keeping the standard selection mechanics.

Text selection offers instant editing options. Once the text is selected, a menu appears with relevant actions like Cut, Copy, or Look Up. This menu adapts based on the selected content and available actions.

Common text selection patterns include:

  • Word selection. Double tap selects a single word
  • Paragraph selection. Triple tap selects a whole paragraph
  • Drag handles. Selection markers at each end adjust the selection range
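
A minimal sketch of adapting that edit menu, assuming iOS 16 or later; the "Highlight" action is a hypothetical app-specific addition next to the system's Cut, Copy, and Look Up items:

```swift
import UIKit

// Hypothetical text view delegate that appends one custom action to the edit menu.
final class NotesTextViewDelegate: NSObject, UITextViewDelegate {
    func textView(_ textView: UITextView,
                  editMenuForTextIn range: NSRange,
                  suggestedActions: [UIMenuElement]) -> UIMenu? {
        let highlight = UIAction(title: "Highlight",
                                 image: UIImage(systemName: "highlighter")) { _ in
            // Apply app-specific highlighting to the selected range.
        }
        // Keep the standard actions and append the custom one.
        var actions: [UIMenuElement] = suggestedActions
        actions.append(highlight)
        return UIMenu(children: actions)
    }
}
```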

Haptic feedback patterns

Haptic feedback adds a physical response to selection and input. When users interact with UI elements, tiny vibrations confirm their actions and make digital interactions feel more tactile.

iOS provides standard haptic patterns that users already know. Success feedback feels different from error feedback, and selection changes feel different from alerts. Using these system patterns helps users understand what's happening without looking at the screen.

Each haptic pattern needs a clear purpose. Light taps work for selection changes, while longer vibrations suit important alerts. Apps should use haptics to enhance the experience, not distract from it.

Common haptic patterns include:

  • Selection feedback. Brief tap when items are selected or changed
  • Action feedback. Distinct buzz when actions complete or fail
  • Alert feedback. Notable vibration for important notifications
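
A minimal sketch mapping these patterns onto the standard UIKit feedback generators:

```swift
import UIKit

// Selection feedback: a brief tap as the selection changes.
func playSelectionFeedback() {
    let generator = UISelectionFeedbackGenerator()
    generator.prepare()            // warms up the Taptic Engine to reduce latency
    generator.selectionChanged()
}

// Action feedback: distinct patterns for completion and failure.
func playActionFeedback(success: Bool) {
    UINotificationFeedbackGenerator().notificationOccurred(success ? .success : .error)
}

// Alert feedback: a more noticeable pattern for important notifications.
func playAlertFeedback() {
    UINotificationFeedbackGenerator().notificationOccurred(.warning)
}
```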

Pro Tip! Test haptics with sound off — feedback should feel natural and meaningful without audio cues.

Input accessories and toolbars


Input accessories are UI elements that appear above the iOS keyboard. They help users type faster and format text without switching between different screens or menus.

The most common input accessory is word prediction. iOS suggests up to three words that users might type next, based on what they're currently typing. These suggestions update in real-time and learn from typing patterns.

Apps can also show custom toolbars based on context. Text editors might display formatting options like bold and italics. Note-taking apps often include checklist and attachment tools. The accessory content should help users complete their current task efficiently.

Common input accessory patterns include:

  • Word predictions. Shows likely next words as users type
  • Quick corrections. Offers fixes for typos and common mistakes
  • Formatting tools. Basic styling options for rich text editing
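
A minimal sketch of a formatting toolbar attached as an input accessory view; the editor class and the formatting handlers are hypothetical placeholders:

```swift
import UIKit

// Hypothetical note editor with a bold/italic toolbar above the keyboard.
final class NoteEditorViewController: UIViewController {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        let toolbar = UIToolbar(frame: CGRect(x: 0, y: 0,
                                              width: view.bounds.width, height: 44))
        toolbar.items = [
            UIBarButtonItem(image: UIImage(systemName: "bold"), style: .plain,
                            target: self, action: #selector(toggleBold)),
            UIBarButtonItem(image: UIImage(systemName: "italic"), style: .plain,
                            target: self, action: #selector(toggleItalic)),
            UIBarButtonItem.flexibleSpace(),
            UIBarButtonItem(barButtonSystemItem: .done, target: self,
                            action: #selector(dismissKeyboard))
        ]
        textView.inputAccessoryView = toolbar   // appears above the keyboard
        view.addSubview(textView)
    }

    @objc private func toggleBold() { /* apply bold to the selection */ }
    @objc private func toggleItalic() { /* apply italics to the selection */ }
    @objc private func dismissKeyboard() { textView.resignFirstResponder() }
}
```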

Pro Tip! Use iOS system fonts in input accessories — they match the keyboard's visual style.

Keyboard shortcuts in macOS

Apps must handle all selection methods equally well. Whether users select with keyboard, mouse, or trackpad, the behavior and visual response should stay the same.

When users select with keyboard shortcuts, content should be highlighted just as it is with mouse selection. This helps users trust that selection works the same way no matter which input method they use.

Apps also need to show which keyboard shortcuts are available. Common places for this are menus and tooltips, where users can discover shortcuts while they work. This is especially important for selection actions that aren't otherwise visible in the interface.

Common keyboard selection patterns include:

  • Same visuals. The selection looks identical across all input methods
  • Clear discovery. Shortcuts appear in obvious places like menus
  • Easy switching. Users can mix keyboard and mouse freely
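
A minimal SwiftUI sketch for macOS; placing the command in a menu makes the shortcut discoverable automatically. The menu title, shortcut, and action are hypothetical choices, and the same code path should drive pointer-based selection too:

```swift
import SwiftUI

// Hypothetical macOS app exposing a selection command in the menu bar,
// where the keyboard shortcut is shown next to the item.
@main
struct ItemBrowserApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commands {
            CommandMenu("Selection") {
                Button("Select All Items") {
                    // Perform the same selection the pointer path would.
                }
                .keyboardShortcut("a", modifiers: [.command, .shift])   // shown as ⇧⌘A in the menu
            }
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Item list goes here") }
}
```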

Pro Tip! Test selection with both pointer and keyboard to ensure consistent behavior and feedback.

Mouse and trackpad interactions


Mac apps need smooth selection support for the mouse and trackpad. While touch dominates on iOS, precise pointer control remains key for Mac interfaces.

Selection behaviors must stay consistent between input devices. Trackpad gestures should feel as natural as mouse clicks, and both need clear visual feedback. Pointer styles should change to show when selection is possible.

Apps should support basic and advanced selection methods. Simple clicks select single items, while drag operations or modifier keys help select multiple items. The pointer's appearance and behavior should make these options clear.

Common pointer selection patterns include:

  • Pointer states. The cursor changes shape over selectable items
  • Click feedback. Selected items are highlighted instantly
  • Drag selection. Users can select multiple items by dragging
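
A minimal AppKit sketch of a selectable item view that changes the cursor on hover and redraws immediately on click; the class name and colors are illustrative:

```swift
import AppKit

// Hypothetical selectable item view: pointing-hand cursor on hover,
// instant highlight on click.
final class SelectableItemView: NSView {
    var isSelected = false {
        didSet { needsDisplay = true }   // click feedback appears on the next draw pass
    }

    // Show a pointing-hand cursor whenever the pointer is over the view.
    override func resetCursorRects() {
        addCursorRect(bounds, cursor: .pointingHand)
    }

    override func mouseDown(with event: NSEvent) {
        isSelected.toggle()
    }

    override func draw(_ dirtyRect: NSRect) {
        (isSelected ? NSColor.selectedContentBackgroundColor
                    : NSColor.controlBackgroundColor).setFill()
        dirtyRect.fill()
    }
}
```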

Pro Tip! Change the pointer style when hovering over selectable items — it helps users know what they can interact with.

Voice control integration

Voice control makes apps more accessible. To support voice selection properly, apps need to follow key accessibility principles in their design.

Every selectable item must have a clear accessibility label. These labels help voice control identify elements correctly and let users interact with them using voice commands. Labels should be concise and describe the item's purpose.

Voice selection should work like any other input method. When users select items by voice, apps should show the same visual feedback and perform the same actions as they would for touch or mouse input.

Common voice selection patterns include:

  • Clear labeling. Every interactive element needs an accessibility label
  • Consistent feedback. Same visual response as other input methods
  • Logical grouping. Related items should be near each other for easier selection
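
A minimal UIKit sketch of labeling a selectable control so Voice Control can target it by name; the control and label are hypothetical:

```swift
import UIKit

// Hypothetical favorites control: the concise label lets Voice Control users say
// "Tap Favorites", and the trait mirrors the visual selection state.
func makeFavoriteButton(isSelected: Bool) -> UIButton {
    let button = UIButton(type: .system)
    button.setImage(UIImage(systemName: isSelected ? "star.fill" : "star"), for: .normal)
    button.accessibilityLabel = "Favorites"
    if isSelected {
        button.accessibilityTraits.insert(.selected)
    }
    return button
}
```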
