Building Echo: What App Developers Need to Know About the New Siri Remote
One of the most obvious differences between the new and old Apple TV is the new Siri Remote — it’s much more than just a simple IR remote. It offers a touch surface capable of recognizing gestures, a microphone input, and includes a gyroscope and accelerometer. When we first got our hands on it, it reminded us of the original Wii Remote — which was one of the reasons we decided to build a game for Apple TV.
As we continue our series on how we developed the Echo game for Apple TV in three weeks, we wanted to share what we learned about developing for the Siri Remote. Here are the three most important things app developers need to understand about it:
The difference between touch-based interaction and focus-based interaction
On iOS devices, a user interacts directly with a touchscreen. If you touch a button, an action will be triggered on that button — and you get immediate and clear visible feedback of your action.
BUILDING ECHO: A blog series on developing for the Apple TV
- Introducing Echo: A First Foray into Apple TV App Development
- How We Approached Design on the Apple TV
- From iOS to tvOS: The Key Differences
- The New Siri Remote: What Developers Need to Know
- How to Succeed (or Fail) with Your Apple TV App Release Process
On Apple TV, the interaction is indirect — you touch the remote in order to interact with the TV interface. This is accomplished with a focus-based interaction model in which only one item on screen is in focus at a time and all actions are mapped to that item. Navigation is done by swiping the Siri Remote’s touch surface, which triggers scrolling among the items on screen. The focused item is always visible and distinguished. Clicking the touch surface selects the focused item, so if you click on a movie or TV show, you see second-level information about it with on-screen buttons to buy, rent, etc.
Focus is controlled by the Focus Engine, and the UIKit framework has been updated for tvOS to support it. Standard components such as UIButton, UITextField, UITableView, UICollectionView, UITextView, UISegmentedControl, and UISearchBar handle focus by default. In other words, by using these “off-the-shelf” components in your app, you automatically get the standard focus animations that give users the visual feedback they need to navigate your app. But if you modify any of the UIKit components even slightly (or create new custom elements), you have to implement custom focus behavior, and create the animations, in your own code — which takes considerably more development time and effort.
UIControl objects also have a new state called Focused, and Interface Builder lets you set up the appearance for this state. For a UIButton, you can change the title properties (title text, font, text color, shadow color), image and background image. When the state of the control changes to Focused, these properties take effect automatically.
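For custom elements, the focus behavior described above comes down to two overrides on UIView. Here's a minimal sketch of a custom focusable view; the class name, scale factor, and shadow values are our own choices, not Echo's actual code:

```swift
import UIKit

// A hypothetical custom focusable view for tvOS.
class FocusableCardView: UIView {
    // Custom views must opt in to focus; stock UIKit controls already do.
    override var canBecomeFocused: Bool {
        return true
    }

    // Called by the Focus Engine whenever focus moves to or away from this view.
    // Animations added via the coordinator run alongside the system transition.
    override func didUpdateFocus(in context: UIFocusUpdateContext,
                                 with coordinator: UIFocusAnimationCoordinator) {
        super.didUpdateFocus(in: context, with: coordinator)
        coordinator.addCoordinatedAnimations({
            if context.nextFocusedView === self {
                // Grow and cast a shadow when focused.
                self.transform = CGAffineTransform(scaleX: 1.1, y: 1.1)
                self.layer.shadowOpacity = 0.4
            } else {
                // Return to normal when focus moves away.
                self.transform = .identity
                self.layer.shadowOpacity = 0.0
            }
        }, completion: nil)
    }
}
```

Even a simple version like this illustrates the extra work: for every custom element you must decide what focused and unfocused look like, and animate between them yourself.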
The Siri Remote touch surface can recognize gestures
The Siri Remote is a very different kind of TV remote control, but from a developer point of view, it’s similar to an iPhone in how it recognizes gestures.
The UIGestureRecognizer class from iOS has been updated for Apple TV apps. Instead of checking for touches on a screen, you check for presses: use the allowedPressTypes property to specify which buttons should trigger the recognizer. Tap gesture recognizers use this property to identify which button the user pressed.
Clicking the touch surface registers as a press of the Select button; the Menu and Play/Pause buttons are the other two buttons that can detect presses.
The touch-sensitive surface also works as a directional pad that can identify taps at the edges of the surface in four directions: up, down, left and right. A Swipe gesture recognizer can likewise use its direction property to determine which of the four directions a user swipes.
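Putting those pieces together, here's a sketch of wiring up remote input in a tvOS view controller. The handler names are hypothetical; allowedPressTypes and direction are the standard UIGestureRecognizer and UISwipeGestureRecognizer properties:

```swift
import UIKit

// Hypothetical view controller showing Siri Remote gesture setup.
class RemoteInputViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Tap recognizer that fires only on Play/Pause presses.
        let playPauseTap = UITapGestureRecognizer(target: self, action: #selector(playPausePressed))
        playPauseTap.allowedPressTypes = [NSNumber(value: UIPress.PressType.playPause.rawValue)]
        view.addGestureRecognizer(playPauseTap)

        // Tap recognizer for taps at the top edge of the touch surface.
        let upTap = UITapGestureRecognizer(target: self, action: #selector(upEdgeTapped))
        upTap.allowedPressTypes = [NSNumber(value: UIPress.PressType.upArrow.rawValue)]
        view.addGestureRecognizer(upTap)

        // Swipe recognizer for leftward swipes across the touch surface.
        let swipeLeft = UISwipeGestureRecognizer(target: self, action: #selector(swipedLeft))
        swipeLeft.direction = .left
        view.addGestureRecognizer(swipeLeft)
    }

    @objc func playPausePressed() { /* toggle playback */ }
    @objc func upEdgeTapped() { /* move focus or scroll up */ }
    @objc func swipedLeft() { /* navigate left */ }
}
```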
If your app calls for additional types of control, the touch surface can also work with other gesture recognizers, including Pinch, Rotation and Pan.
Interface Builder has been updated for all the gesture recognizers used with tvOS.
At its core, the Siri Remote is a game controller
With a combination of physical buttons, touch sensitivity and motion control, we think the Siri Remote makes a great living room game controller.
To make it easier for game developers, Apple created a new micro gamepad profile specifically for the Siri Remote. It’s important to note that if you’re going to build a game for Apple TV, you must support the Siri Remote. (Apple also offers an extended gamepad profile so third-party manufacturers can build more advanced controllers, but games must still support the Siri Remote.)
Here are the basic features in the Siri Remote profile that apps can choose to implement:
- The touchpad on the remote is essentially a directional pad (D-pad). It provides an application with analog input data.
- The Siri Remote works in either portrait or landscape orientation. Developers choose whether input data flips when users change the remote’s orientation.
- The touchpad can also serve as a digital button — activated when a user firmly presses on the touchpad.
- The Play/Pause button on the remote is also a digital button.
- The Menu button on the remote can be used to pause gameplay using the pause handler in the controller object.
- The remote supports motion data with the GCMotion profile, leveraging the built-in accelerometer and gyroscope. The remote, however, cannot interpret rotation.
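The features above map onto the GameController framework's GCMicroGamepad profile. Here's a sketch of hooking them up; the function name and handler bodies are our own placeholders, while the properties themselves (allowsRotation, reportsAbsoluteDpadValues, dpad, buttonA, buttonX) are part of the real API:

```swift
import GameController

// Hypothetical setup for a connected Siri Remote.
func configureRemote(_ controller: GCController) {
    guard let pad = controller.microGamepad else { return }

    // Let D-pad input follow the remote's orientation (portrait vs. landscape).
    pad.allowsRotation = true

    // false (the default): D-pad values are relative to where the finger
    // first lands; true: values are absolute positions on the touch surface.
    pad.reportsAbsoluteDpadValues = false

    // Analog input from the touch surface, -1.0...1.0 on each axis.
    pad.dpad.valueChangedHandler = { _, x, y in
        // e.g. steer the player toward (x, y)
    }

    // Digital buttons: buttonA is the touchpad click, buttonX is Play/Pause.
    pad.buttonA.pressedChangedHandler = { _, _, pressed in
        if pressed { /* touchpad clicked */ }
    }
    pad.buttonX.pressedChangedHandler = { _, _, pressed in
        if pressed { /* Play/Pause pressed */ }
    }
}
```

In practice you would call this from the framework's controller-connection notification, once a GCController representing the remote appears.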
For our Echo game, here’s how we took advantage of the micro gamepad profile:
The Echo game consists of three basic user interactions with the remote, as players try to mimic a sequence of actions they see on screen: Click, Swipe and Shake. We used Tap and Swipe gesture recognizers for the remote’s Click and Swipe actions, respectively. For the third action, the Shake, we used the accelerometer motion data — and this was the most challenging (and interesting) feature to calibrate.
Motion data is constantly being reported by the accelerometer and gyroscope. It was difficult at first to properly interpret the data, but after some trial and error and lots of fine-tuning, we managed to make this feature work pretty well. We wrote a Shake detector to establish parameters for what our game would define as a single Shake — and ignore other subtle movements and data. This Shake detector provided a margin of error — so that some movements would be ignored by the game instead of interpreting them as player mistakes and ending the game.
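The core idea of a detector like the one described above can be sketched in a few lines. This is not Echo's actual code; the threshold and cooldown values are illustrative assumptions, and in a real app you would feed it samples from the remote's motion profile:

```swift
import Foundation

// A minimal shake detector sketch: a sample counts as a shake when its
// acceleration magnitude crosses a threshold, and a cooldown window keeps
// one physical shake from registering multiple times.
struct ShakeDetector {
    let threshold: Double        // magnitude (in g's) that counts as a shake
    let cooldown: TimeInterval   // seconds to ignore samples after a shake
    private var lastShake: TimeInterval = -.infinity

    init(threshold: Double = 1.5, cooldown: TimeInterval = 0.5) {
        self.threshold = threshold
        self.cooldown = cooldown
    }

    // Feed one accelerometer sample (x, y, z in g's) with its timestamp.
    // Returns true only when the sample exceeds the threshold outside the
    // cooldown window; gentler movements are ignored.
    mutating func process(x: Double, y: Double, z: Double, at time: TimeInterval) -> Bool {
        let magnitude = (x * x + y * y + z * z).squareRoot()
        guard magnitude > threshold, time - lastShake > cooldown else { return false }
        lastShake = time
        return true
    }
}
```

The threshold provides the margin of error: a remote at rest reads roughly 1g from gravity alone, so anything below the threshold is treated as noise rather than a player mistake.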
Using the Siri Remote with the ArcTouch Echo App
There are lots of opportunities for app development with the Siri Remote. We’re excited to see how app developers take advantage of this new platform. We’d love any feedback about your experiences with our Echo game and the Siri Remote — so try it out (it’s free on Apple TV), and tweet to us using #EchoGame or comment on this post below.