Mobile-Specific Functions & Interactions

Learn how to design for mobile devices based on their functionality and common usage patterns.

When creating mobile apps, designers need to consider mobile device features and functionality (small size, touchscreen, inbuilt camera, etc.), and how people use them (often on the go). These factors determine the differences in interactions between mobile and desktop versions of products.

For example, the desktop version of Google Translate offers text input as default. This makes sense, as personal computers come with a physical keyboard — usually the preferred input method.

The Google Translate app, however, is quite different. It supports text input but also has a prominent Paste button to avoid making mobile users type. Moreover, it leverages two inbuilt functionalities of smartphones — camera and microphone — to allow visual and sound input. As a result, the app is really easy to use.

Therefore, consider both the functionality of mobile devices and how people use them to create mobile-friendly apps.

Touchscreen gestures

Touchscreens allow users to interact with the device by touching the display with their fingers and thumbs. Besides simple taps, most touchscreens and mobile operating systems support gestures.

Gestures are physical movements that activate a specific control within the design.[1] Most gestures are hand movements but the term also includes shaking, tilting, or moving the device.

Some of the most common gestures and their respective actions include:

  • Tapping — touching the surface briefly.
  • Double-tapping — touching the surface twice in quick succession (often to zoom in).
  • Dragging — moving a finger along the surface without breaking contact.
  • Pinching and spreading — touching the surface with two fingers and moving them together (pinch) or apart (spread).
  • Pressing — touching the surface and holding.
  • Flicking — dragging quickly and releasing, typically to scroll with momentum.

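These distinctions usually come down to timing and movement thresholds. Here is a minimal Python sketch with illustrative threshold values (real platforms tune these per device):

```python
# A minimal sketch of telling common gestures apart from raw touch data.
# All threshold values are illustrative assumptions, not platform values.
TAP_MAX_DURATION = 0.2    # seconds; longer stationary touches become a press
DOUBLE_TAP_WINDOW = 0.3   # max seconds since the last tap to count as a double-tap
DRAG_MIN_DISTANCE = 10.0  # pixels of travel before a touch becomes a drag

def classify_touch(down_t, up_t, moved_px, prev_tap_t=None):
    """Classify one touch from its timing, travel, and the previous tap time."""
    if moved_px >= DRAG_MIN_DISTANCE:
        return "drag"
    if up_t - down_t > TAP_MAX_DURATION:
        return "press"
    if prev_tap_t is not None and down_t - prev_tap_t <= DOUBLE_TAP_WINDOW:
        return "double-tap"
    return "tap"

def pinch_scale(start_dist, current_dist):
    """Zoom factor from the distance between two fingers: <1 pinch, >1 spread."""
    return current_dist / start_dist
```

A quick touch with little movement classifies as a tap; the same touch held longer becomes a press, and significant travel turns it into a drag regardless of duration.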
Offer shortcut gestures to supplement and not replace interface-based navigation and actions. For example, on Instagram, you can like the picture by either double-tapping it or pressing the Heart icon.

Pro Tip! Avoid assigning standard gestures to non-standard actions, as this will only confuse users.

Pressure-sensitive displays sense how much force users apply to the screen. Brands that have shipped phones with this feature include Samsung, ZTE, Meizu, Huawei, and Google.[2] On supported devices, users can access extra functions by applying varying levels of pressure to the touchscreen. For example, in drawing apps on such devices, users can create lines of varying thickness depending on how hard they press on the screen. Another example is Apple's 3D Touch, which offers quick actions directly from the iPhone home screen, creating navigation shortcuts.
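On pressure-sensitive hardware, pressure typically arrives as a normalized reading. A hedged sketch of the drawing-app example, assuming a simple linear mapping from pressure to stroke width:

```python
def stroke_width(pressure, min_w=1.0, max_w=12.0):
    """Map a normalized pressure reading (0.0-1.0) to a stroke width,
    as a drawing app on a pressure-sensitive display might.
    The linear mapping and width range are illustrative choices."""
    p = max(0.0, min(1.0, pressure))  # clamp out-of-range sensor values
    return min_w + p * (max_w - min_w)
```

Real drawing apps often apply a response curve rather than a straight line, so light touches feel more controllable.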

On-screen keyboard

[Images: on-screen keyboard bad practice vs. best practice]

The on-screen keyboard is the primary input mechanism on mobile devices. A designer’s job is to make the typing experience as effortless as possible for users.

When designing keyboards for an app, try answering the following questions:

  • Does the interaction require a keyboard or can you use other controls (radio buttons, picklists, etc.) instead?
  • What type of keyboard should you use (default, numeric, or custom)?
  • What actions trigger and dismiss the keyboard?
  • Will the keyboard block any items from view?
  • Should the keyboard leverage autocomplete or predictive text? (For example, in a language-learning app like Duolingo, predictive text can make learning less effective.)[3]
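The answers to these questions can be captured as a small configuration per input field. The sketch below is a hypothetical mapping, not a real platform API; the field names and options are illustrative:

```python
# Hypothetical mapping from a field's purpose to an on-screen keyboard
# configuration, echoing the questions above. Keys and options are
# illustrative, not a real platform API.
KEYBOARD_FOR_FIELD = {
    "name":         {"type": "default", "autocomplete": True,  "predictive": True},
    "phone":        {"type": "numeric", "autocomplete": False, "predictive": False},
    "otp_code":     {"type": "numeric", "autocomplete": False, "predictive": False},
    # e.g. a language-learning answer box, where predictive text would help too much
    "vocab_answer": {"type": "default", "autocomplete": False, "predictive": False},
}

DEFAULT_KEYBOARD = {"type": "default", "autocomplete": True, "predictive": True}

def keyboard_for(field):
    """Return the keyboard configuration for a field, falling back to default."""
    return KEYBOARD_FOR_FIELD.get(field, DEFAULT_KEYBOARD)
```

Centralizing this choice keeps keyboard behavior consistent across screens instead of being decided ad hoc per form.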
Location services

Nowadays, both mobile devices and personal computers have location-tracking services. Desktops mostly use Wi-Fi to detect users’ location (which isn't always accurate) while most mobile devices use GPS.[4]

Mobile users take their devices with them everywhere. Adding location awareness to your app offers users a more contextual experience.

Google Maps uses location awareness well by asking users to share their current location, which becomes the default starting point of all itineraries.

The app also tracks users' movement history, requests feedback on places they've been to, and helps them orient themselves in the street with the help of AR in Live View.
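Under the hood, features like "start from my current location" reduce to comparing coordinates. A self-contained sketch using the haversine great-circle distance (the helper names are my own, not from any mapping API):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon coordinates."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km = mean Earth radius

def nearest_place(current, places):
    """Pick the saved place closest to the user's current (lat, lon)."""
    return min(places, key=lambda p: haversine_km(*current, p["lat"], p["lon"]))
```

With the user's location in hand, an app can sort saved places, suggest the nearest one, or pre-fill it as a route's starting point.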

Audio

As most mobile devices have built-in speakers, it's never been easier to integrate audio into the user experience.

Mobile devices allow apps to interact with users through visual, haptic, and sound signals. It's not necessary to use all three — in fact, too many sounds can drive users away.

Sound can provide feedback or enhance the user experience when applied to strategic moments. For example, the Duolingo app uses different sounds for correct and incorrect answers and a celebratory sound at the end of each lesson.

Limit the frequency of decorative sound to reduce user fatigue. Material Design guidelines recommend avoiding using sound for:

  • UIs that require privacy or discretion
  • Users who have requested no interruptions
  • Actions that are performed frequently[5]
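These recommendations can be expressed as a simple decision function. A sketch with illustrative setting names; a real app would read the system's privacy and do-not-disturb state:

```python
def should_play_sound(event, settings, recent_plays, now, min_interval=2.0):
    """Decide whether to play a UI sound, following the guidance above.
    `settings` keys and the throttle interval are illustrative assumptions."""
    # Respect privacy/discretion contexts and "no interruptions" requests.
    if settings.get("privacy_mode") or settings.get("do_not_disturb"):
        return False
    # Throttle sounds for actions performed frequently.
    last = recent_plays.get(event)
    if last is not None and now - last < min_interval:
        return False
    recent_plays[event] = now
    return True
```

The same structure works for any decorative feedback channel: a hard gate for context, then a rate limit for repetition.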
Microphone

[Images: microphone bad practice vs. best practice]

The microphone isn't a unique mobile feature — nowadays, all laptops come with built-in mics, and external devices can easily be connected to desktops.

However, it's more difficult to type on an on-screen keyboard than on a physical one. This is one of the reasons the microphone's popularity as an input method on smartphones is growing.

WhatsApp reports that its users send an average of 7 billion voice messages every day.[6] Microphones are also used to access virtual assistants like Siri or Google Assistant.

When designing mobile apps, consider using voice input as a supplement to keyboard input.

Camera

Cameras aren't just for taking pictures anymore. As smartphone cameras improve, their use cases expand beyond traditional photography.

Some less conventional uses of mobile cameras include:

  • Scanning barcodes and QR codes for quick information retrieval
  • Utilizing optical character recognition (OCR) to translate text instantly
  • Implementing augmented reality to enable virtual try-ons for clothes or to visualize furniture in your home
  • Creating 3D scans for various applications
  • Scanning bank cards to automate the entry of details
  • Tracking heartbeat and measuring stress levels for health and wellness monitoring[7]
Push notifications

[Images: push notifications bad practice vs. best practice]

Push notifications are messages that are sent directly to users' devices. They appear on lock screens and are added to the notification alerts on a smartphone or tablet.

They are a powerful tool to increase user engagement. The biggest social media apps, like TikTok and Instagram, send their users notifications constantly. Notifications work as part of the apps' reward system: for example, a notification that a friend liked their picture gives users a small dopamine boost and draws them back into the app.

However, more doesn't always translate to better — especially in the case of push notifications. Think of them as tapping users on the shoulder. It's fine if you have something important or relevant to say. If not, you'll only annoy users, and they might turn off notifications altogether or even uninstall your app.
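One way to act on this is to filter low-priority pushes before sending. A minimal sketch; the priority labels and daily cap are illustrative assumptions, not any platform's policy:

```python
def allow_notification(priority, sent_today, daily_cap=3):
    """A minimal relevance filter: always deliver high-priority messages
    (e.g. a direct message from a friend), but cap low-priority
    'engagement' pushes per day. Thresholds are illustrative."""
    if priority == "high":
        return True
    return sent_today < daily_cap
```

Even a crude cap like this encodes the "tap on the shoulder" principle: important messages always get through, while filler is bounded.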

Tactile feedback

Tactile, or haptic, feedback is a physical response produced when using a device — typically, a vibration or pulse. For example, the subtle vibration generated when pressing a key on a virtual keyboard is haptic feedback. It confirms that the system has registered the action.

Using haptic feedback can result in richer and more engaging UX, especially when it's challenging to visually confirm users' actions.
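One practical detail is throttling: firing a pulse for every event during fast typing can feel like a continuous buzz. A sketch of a debounced trigger, where the recorded pulse stands in for a platform vibration call:

```python
class HapticFeedback:
    """Sketch of deciding when to trigger a haptic pulse. Recording into
    `pulses` stands in for a platform vibration API call; the minimum
    gap value is an illustrative assumption."""
    def __init__(self, min_gap=0.03):
        self.min_gap = min_gap  # seconds between pulses; avoids constant buzz
        self.last = None
        self.pulses = []

    def key_pressed(self, now):
        """Fire a pulse for this keypress unless one fired very recently."""
        if self.last is None or now - self.last >= self.min_gap:
            self.pulses.append(now)  # the real vibration call would go here
            self.last = now
            return True
        return False
```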

Orientation

Smartphones use multiple sensors to detect orientation:

  • Gyroscope: Measures rotation and twist
  • Magnetometer: Determines direction like a compass
  • Accelerometer: Detects movement and gravitational forces

This sensor combination enables accurate device positioning, allowing apps to automatically adjust their layouts. Common applications include:

  • Photo galleries expanding images to full width in landscape
  • Video players maximizing screen space for content
  • Maps rotating to match the user's orientation
  • Games adapting controls based on device position

While automatic orientation changes can enhance the viewing experience, they should be purposeful and match user behavior. For example, video apps typically switch to landscape for fullscreen playback but maintain portrait mode for browsing.
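Coarse orientation can be inferred from which axis gravity dominates in the accelerometer reading. A sketch assuming the common mobile axis convention (+x right, +y up in portrait); the landscape naming varies by platform:

```python
def orientation(ax, ay):
    """Infer coarse device orientation from the accelerometer's gravity
    components in m/s^2 (about 9.8 total at rest). Axis convention assumed:
    +x points right and +y points up when the device is held in portrait.
    Landscape left/right labels are an illustrative convention."""
    if abs(ay) >= abs(ax):
        return "portrait" if ay >= 0 else "portrait-upside-down"
    return "landscape-left" if ax > 0 else "landscape-right"
```

Real systems add hysteresis so the layout doesn't flip back and forth when the device is held near 45 degrees.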

Accelerometer

[Images: accelerometer bad practice vs. best practice]

The accelerometer is a core mobile sensor that detects device movement and orientation in three-dimensional space. Its capabilities enable:

Movement tracking:

  • Detects device tilts and rotations
  • Measures motion intensity and direction
  • Recognizes specific gesture patterns

Common applications:

  • Automatic screen rotation based on device position
  • Motion controls in mobile games
  • Step counting and activity tracking in fitness apps
  • Navigation adjustments in map applications
  • Gesture-based commands like "shake to undo"

The accelerometer works alongside other sensors (gyroscope, compass) to provide precise motion data, enabling more natural and intuitive interactions with mobile devices.
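A "shake to undo" detector illustrates how raw accelerometer samples become a gesture: look for several spikes in acceleration magnitude within a short window. All thresholds here are illustrative assumptions:

```python
from math import sqrt

def detect_shake(samples, threshold=25.0, min_peaks=3, window=1.0):
    """Detect a 'shake' gesture from (t, ax, ay, az) accelerometer samples
    in m/s^2. Counts acceleration-magnitude spikes above `threshold` and
    checks whether `min_peaks` of them fall within `window` seconds.
    Threshold, peak count, and window are illustrative values."""
    peaks = [t for (t, ax, ay, az) in samples
             if sqrt(ax * ax + ay * ay + az * az) > threshold]
    return any(peaks[i + min_peaks - 1] - peaks[i] <= window
               for i in range(len(peaks) - min_peaks + 1))
```

Requiring multiple spikes in a short window is what separates a deliberate shake from a single bump or a phone dropped on a couch.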

Bluetooth

Bluetooth enables short-range wireless communication between devices. While crucial for mobile connectivity, its applications extend beyond just audio.

Key capabilities:

  • Connects devices within a range of about 30 feet (10 meters)
  • Enables data transfer between paired devices
  • Supports multiple simultaneous connections
  • Functions without internet connectivity

Common applications:

  • Audio streaming to wireless headphones/speakers
  • File sharing between nearby devices
  • Connection to wearables and fitness trackers
  • Smart home device control
  • Car entertainment system integration

The rise of Bluetooth accessories, particularly wireless earbuds, has influenced hardware design—many smartphones now omit headphone jacks entirely in favor of wireless connectivity.

NFC

Near field communication (NFC) in mobile devices allows for secure, quick data exchanges when two devices are close — usually within a few centimeters. You may have used it without realizing it, especially with contactless payments. With NFC, mobile payment apps like Apple Pay, Google Pay, or Samsung Pay let you tap your phone to a payment terminal instead of using cash or cards, making transactions convenient and secure. Beyond payments, NFC is used to transfer files, pair devices (like headphones or speakers), or quickly share contact information.
