Engineering
9 Jul 2021
Enabling Now-Playing and Earphone Button Controls for an Audio App
Building an app that supports audio playback with controls to record, play, pause, and skip to a position on a track is a non-trivial affair. In my experience the most challenging aspect of the process is building the UI for this functionality. An audio playback interface can have quite a few moving parts that need to be kept in sync: play/pause button state, slider position, time elapsed, time remaining, and animations for the audio playback, to name just a few.
Developers can get so focused on ensuring the UI in their app works perfectly that they forget about the other OS-provided interfaces that a) can change their app's audio state, and b) have their own UI that needs to be kept up to date. This article will look at two of these interfaces: the Now-Playing interface, visible from the lock screen and Control Centre, and the audio controls provided by Apple earphones.
Integrating with these interfaces is not a requirement for an audio app - you can still get your app submitted without them. But if you want to build a high-quality app, then doing so is a no-brainer. In this article I will walk you through how to integrate your app with these interfaces.
Now-Playing interface
To see how the Now-Playing interface works, open an app like Apple Music or Spotify and start playing a track. While the track is playing, open Control Centre (swipe down from the top-right corner of the screen, or up from the bottom on devices with a Home button). You should see a block with the name of the track and some controls (see image 1).
This is the Now-Playing control. As of iOS 14.3, tapping the control opens an expanded view (see image 2) with additional controls for volume and a slider to seek to a position on the track.
A very similar view, with identical functionality, is presented to the user from the lock screen when audio is being played (see image 3).
These are all different flavours of the same Now-Playing interface.
High-level overview
In order to keep the state of your app and the Now-Playing interface in sync you'll need to manage state in two directions: from the Now-Playing interface to your app, as well as from your app to the Now-Playing interface. For the former you'll use `MPRemoteCommandCenter`, and for the latter you'll use `MPNowPlayingInfoCenter`.
MPRemoteCommandCenter
`MPRemoteCommandCenter` is an Apple-provided class that allows us to integrate our audio app with other audio control interfaces. The class is designed to be used as a singleton, so be sure to use its `shared()` method to access its instance.
Register for remote commands
At launch your app needs to register for all the `MPRemoteCommand`s it needs. For example, if your app implements a play/pause button, register the `togglePlayPauseCommand` using the `addTarget(handler:)` method.
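A sketch of that registration is shown below; `PlaybackController` and its `audioPlayer` property are assumptions standing in for whatever object drives playback in your app.

```swift
import AVFoundation
import MediaPlayer

final class PlaybackController {
    // Assumed to be configured elsewhere with the track being played.
    var audioPlayer: AVAudioPlayer?

    func registerRemoteCommands() {
        let commandCenter = MPRemoteCommandCenter.shared()

        // Handle play/pause toggles coming from the Now-Playing interface
        // and from the earphone button.
        commandCenter.togglePlayPauseCommand.addTarget { [weak self] _ in
            guard let player = self?.audioPlayer else { return .commandFailed }
            if player.isPlaying {
                player.pause()
            } else {
                player.play()
            }
            return .success
        }
    }
}
```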
The `addTarget` handler takes an `MPRemoteCommandEvent` and returns an `MPRemoteCommandHandlerStatus`. The `MPRemoteCommandEvent` contains extra info about the event, and the returned `MPRemoteCommandHandlerStatus` tells the Now-Playing interface whether or not the command was handled successfully.
There are a host of commands for which you can register. To see the full list, inspect the `MPRemoteCommandCenter` class.
It's important to understand that the commands for which you register determine which features are visible to the user on the Now-Playing interface. For example, registering for `changePlaybackPositionCommand` will add a scrubber to the Now-Playing view, allowing the user to seek to a position on the audio track. Ideally you want to match the functionality provided in your app with that presented on the Now-Playing view.
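As a rough sketch (again assuming an `AVAudioPlayer` drives playback in your app), handling the scrubber command might look like this:

```swift
import AVFoundation
import MediaPlayer

// Registering for changePlaybackPositionCommand adds a scrubber to the
// Now-Playing view. `player` stands in for your app's own audio player.
func enableScrubbing(for player: AVAudioPlayer) {
    MPRemoteCommandCenter.shared().changePlaybackPositionCommand.addTarget { event in
        guard let positionEvent = event as? MPChangePlaybackPositionCommandEvent else {
            return .commandFailed
        }
        // Seek to the position the user chose on the scrubber.
        player.currentTime = positionEvent.positionTime
        return .success
    }
}
```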
Other useful things to know:
In some cases you may want to disable (but not remove) a control on the Now-Playing interface. You can do this using the `isEnabled` property on each `MPRemoteCommand`.
If your app implements a play/pause button that toggles when tapped, then the correct command to register for is `togglePlayPauseCommand`, not `playCommand` and `pauseCommand`. Adding either of the latter two adds a button to the Now-Playing interface that does not toggle.
If your app has multiple audio interfaces with different features on each, then you may want to change the Now-Playing interface depending on which view is visible to the user in the app. You can do this by dynamically registering and deregistering for `MPRemoteCommand`s using the `addTarget` and `removeTarget` methods.
If you register for the `skipForwardCommand` or `skipBackwardCommand`, you need to set the `preferredIntervals` value on each command.
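Something like the snippet below would do it; the 15-second interval is just an example value, and ideally it should match the skip interval your in-app buttons use.

```swift
import MediaPlayer

let commandCenter = MPRemoteCommandCenter.shared()

// The Now-Playing skip buttons will display these intervals.
commandCenter.skipForwardCommand.preferredIntervals = [15]
commandCenter.skipBackwardCommand.preferredIntervals = [15]
```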
So far we have covered state flow in one direction - from events triggered on the Now-Playing interface to your app. We now need to turn our attention to managing state changes in the opposite direction. In what follows we'll look at how to keep the Now-Playing interface aligned with the state of your app.
MPNowPlayingInfoCenter
`MPNowPlayingInfoCenter` is an Apple-provided class that allows us to manage the state of the Now-Playing interface. The class is designed to be used as a singleton, so be sure to use its `default()` method to access its instance.
Updating the state of the Now-Playing interface is quite simple. `MPNowPlayingInfoCenter` has a `nowPlayingInfo` property of type `[String: Any]?`. Updating this dictionary will trigger an update of the Now-Playing state automatically.
Importantly, `MPNowPlayingInfoCenter` expects only certain strings to be used as keys in the `nowPlayingInfo` dictionary, a full list of which is available here. Also, be sure to set the values of those keys to the appropriate types.
For example, the code below sets the title (of type `String`) and the playback duration (of type `Float`).
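The values here are placeholders; in a real app you would use the metadata of the track being played.

```swift
import MediaPlayer

// Placeholder metadata for the currently playing track.
var nowPlayingInfo = [String: Any]()
nowPlayingInfo[MPMediaItemPropertyTitle] = "Voice note 1"
nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = Float(128)

// Assigning the dictionary triggers the Now-Playing interface to update.
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
```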
For a more detailed look at how to enable your app to interact with the Now-Playing interface, check out Apple's demo app on the subject.
Earphone button interface
The controls on Apple earphones represent yet another interface that a well-designed audio app should be able to handle. The good news is that if your app has been set up to handle commands via `MPRemoteCommandCenter`, then it is also ready to handle the events sent from the controls on Apple earphones. Pressing the earphone button once triggers the `togglePlayPauseCommand`, and pressing it twice triggers the `nextTrackCommand`.
Other interfaces
This article has covered the two most essential remote interfaces, but there are others. If your app supports audio playback via Bluetooth-enabled devices then you will need to cater for remote commands sent from those devices too. The APIs covered in this article will be your starting point.
Conclusion
The Now-Playing interface and the events triggered by Apple earphone controls are just two of the remote interfaces for which a well-designed audio app needs to cater. At a bare minimum your audio app should support this functionality. Hopefully this article will help you achieve that goal.
PS: As a fun exercise, download a few voice note apps and look for the following:
Does the app integrate with the Now-Playing interface?
Are the controls contained in the app mirrored in the Now-Playing interface? Which ones are missing?
Does the Now-Playing interface correctly reset itself when audio stops?
See if you can get the Now-Playing interface out of sync with the app's state, and vice versa.