
Case study: Playing Video in Your iOS Application

Mike Butusov
Head of Marketing at TechMagic. Passionate about startups, digital ecosystem, cloud technologies, and innovations.

Apple has developed a wide range of essential solutions for playing video content on iOS devices, including quite a few frameworks. Here are some of the frameworks worth mentioning:

  • AVKit — provides a high-level interface for playing video content;
  • AV Foundation — provides full access to the media stack and allows building complex media applications;
  • Core Media — provides a low-level C interface for managing and playing audio-visual media;
  • Core Video — provides a pipeline model for digital video and makes it simpler for developers to access and manipulate individual frames.

In this article, I would like to draw your attention to the advantages and disadvantages of AVKit and AVFoundation. In addition, I will talk about the importance of following the iOS Human Interface Guidelines while building your media app.

AVKit

The AVKit framework became available with the release of iOS 8. Apple designed it to combine the power of AVFoundation with a standardized user interface. It contains two basic classes:

  • AVPlayerViewController;
  • AVPictureInPictureController.

AVPlayerViewController is essentially a replacement for MPMoviePlayerController (another native iOS class for playing video, which Apple deprecated in iOS 9). It has a simple but powerful interface: you can control the video gravity and the visibility of the playback controls, and because playback is driven by an instance of AVPlayer, you get all the power of AVFoundation.
AVPlayerViewController doesn’t allow specifying custom controls, but it has a control panel with the standardized Apple user experience—play/pause, video scrubbing, timeline, etc.
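To make this concrete, here is a minimal sketch of presenting an AVPlayerViewController from an existing view controller; the video URL is a placeholder, not taken from the article:

```swift
import UIKit
import AVKit
import AVFoundation

final class VideoViewController: UIViewController {

    // Placeholder URL used only for illustration.
    private let videoURL = URL(string: "https://example.com/sample.mp4")!

    @objc func playTapped() {
        // AVPlayer drives playback; AVPlayerViewController supplies the
        // standard Apple playback UI (play/pause, scrubbing, timeline).
        let player = AVPlayer(url: videoURL)
        let playerController = AVPlayerViewController()
        playerController.player = player
        playerController.videoGravity = .resizeAspect   // control video gravity
        playerController.showsPlaybackControls = true   // standard controls on/off

        present(playerController, animated: true) {
            player.play()
        }
    }
}
```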

AVPictureInPictureController was introduced with the release of iOS 9 and is available only on iPad. The idea of watching a video while doing something else is nothing new, but it provides a great user experience. For instance, you can read a book and enjoy a new WWDC session at the same time.
It is necessary to keep this feature under user control, because any app invoking PiP without direct user permission will be shut down.
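Here is a rough sketch of wiring up Picture in Picture so that it starts only from an explicit user action; the AVPlayerLayer is assumed to already be installed in your view hierarchy:

```swift
import AVKit
import AVFoundation

final class PiPCoordinator {

    private var pipController: AVPictureInPictureController?

    // playerLayer is assumed to be an AVPlayerLayer already attached to a visible view.
    func setUpPictureInPicture(for playerLayer: AVPlayerLayer) {
        // PiP is only available on supported devices.
        guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
        pipController = AVPictureInPictureController(playerLayer: playerLayer)
    }

    // Call this only from a direct user action (e.g. a button tap),
    // never automatically, to keep the feature under user control.
    func userTappedPiPButton() {
        guard let pip = pipController else { return }
        if pip.isPictureInPictureActive {
            pip.stopPictureInPicture()
        } else {
            pip.startPictureInPicture()
        }
    }
}
```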

AVFoundation

The AVFoundation framework is designed not only for playing media content. It combines a set of classes to work with time-based media:

  • play videos;
  • record videos with advanced settings like face and machine-readable code detection;
  • access hardware options like zooming, etc.;
  • edit videos (apply watermarks or compose a presentation from multiple video tracks).

The framework provides the best possible performance on all devices by taking advantage of multicore hardware and making extensive use of GCD and blocks.

AVFoundation offers a wide variety of features:

  • advanced access to media metadata;
  • advanced video scrubbing with thumbnail generation (see the sketch after this list);
  • advanced control of playback;
  • different aspect ratios for the video;
  • content streaming with the help of Airplay;
  • customizing your own controls for the video player;
  • building the desired user experience for interaction with media in your application;
  • managing subtitles;
  • creating complex media players;
  • playing multiple videos simultaneously (video gallery application).
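As an example of the thumbnail-generation feature mentioned above, here is a minimal sketch using AVAssetImageGenerator; the URL, times, and sizes are placeholders:

```swift
import AVFoundation
import UIKit

// Generates a thumbnail for each requested time, e.g. to back a scrubbing bar.
func generateThumbnails(for url: URL,
                        at seconds: [Double],
                        completion: @escaping (UIImage?) -> Void) {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true       // respect video orientation
    generator.maximumSize = CGSize(width: 200, height: 0) // scale down for thumbnails

    let times = seconds.map { NSValue(time: CMTime(seconds: $0, preferredTimescale: 600)) }
    generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
        guard result == .succeeded, let cgImage = cgImage else {
            DispatchQueue.main.async { completion(nil) }
            return
        }
        DispatchQueue.main.async { completion(UIImage(cgImage: cgImage)) }
    }
}
```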

Apart from the official Apple documentation, there is a great book, "Learning AV Foundation" by Bob McCune, dedicated entirely to this complicated framework.

The AVAudioSession class is another essential object: it is used to set the audio context for your app and to express the way your app interacts with the device. It is responsible for mixing or silencing audio.
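A minimal audio-session setup for a video app might look like this (assuming iOS 10+ APIs); the .playback category keeps audio going with the Silent switch on and silences other non-mixable audio:

```swift
import AVFoundation

// Configure the audio session before playback starts.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .moviePlayback, options: [])
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```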
Apart from this, it is important to keep in mind that operations such as preloading media or retrieving metadata from a file are heavy. You should therefore use the Key-Value Observing (KVO) mechanism to keep the user interface synchronized with the state of the media, and retrieve metadata with asynchronous preloading rather than blocking the main thread.
Another thing to remember is that setting up the media pipeline of an iOS device might take some time too. That is why features such as video autoplay are a bad solution.
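A sketch of these two techniques, loading asset keys asynchronously and observing the player item's status via KVO so playback starts only when the item is ready (the class and key names are illustrative):

```swift
import AVFoundation

final class PlayerLoader: NSObject {

    let player: AVPlayer
    private var statusObservation: NSKeyValueObservation?

    init(url: URL) {
        let asset = AVAsset(url: url)
        let item = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: item)
        super.init()

        // Load heavy properties asynchronously instead of blocking the main thread.
        asset.loadValuesAsynchronously(forKeys: ["duration", "commonMetadata"]) {
            var error: NSError?
            if asset.statusOfValue(forKey: "duration", error: &error) == .loaded {
                // Update duration labels, metadata UI, etc. on the main queue.
            }
        }

        // Observe readiness via KVO; show a thumbnail or progress indicator
        // until the item is actually ready to play.
        statusObservation = item.observe(\.status, options: [.new]) { item, _ in
            if item.status == .readyToPlay {
                // Safe to start playback or enable the play button here.
            }
        }
    }
}
```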
You should test all media applications on a real device to check for performance issues and delays that might not occur in the simulator.

YouTube video

To be honest, I cannot even imagine today's world of video without YouTube. You can easily integrate the native YouTube video player into your iOS application: it has a simple interface, and the integration is not complex at all.
Moreover, you can easily display the controls and timescale the way you like, start playback from a particular second, and hide the YouTube logo.
Under the hood, YouTube playback works through a web view: the library builds an iframe on top of the player API by creating a UIWebView and rendering the HTML. JavaScript is required even for a basic player.
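A minimal integration sketch, assuming the official youtube-ios-player-helper library has been added via CocoaPods (the video ID and player parameters are placeholders):

```swift
import UIKit
import youtube_ios_player_helper   // module name of the youtube-ios-player-helper pod

final class YouTubeViewController: UIViewController {

    private let playerView = YTPlayerView()

    override func viewDidLoad() {
        super.viewDidLoad()
        playerView.frame = view.bounds
        view.addSubview(playerView)

        // Standard iframe player parameters: play inline, start at second 30,
        // show controls, reduce YouTube branding.
        let playerVars: [String: Any] = [
            "playsinline": 1,
            "start": 30,
            "controls": 1,
            "modestbranding": 1
        ]
        playerView.load(withVideoId: "VIDEO_ID", playerVars: playerVars)
    }
}
```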
You can also build your own YouTube player that retrieves videos by parsing data; in practice, you can get away with XCDYouTubeKit, which uses AVKit to play YouTube videos and has a native iOS interface.
However, according to the YouTube Terms of Service, the only official way to play their videos is through the official YouTube player or player library, so the best solution here is to use the official product.


User Guidelines

While developing an app, it is important to follow the iOS Human Interface Guidelines.
In media applications, you should provide a proper volume slider. Besides this, you should respect user privacy.

Here is an extract from iOS Human Interface Guidelines: “When users unplug a headset or headphones or disconnect from a wireless device (or the device goes out of range or turns off), they don’t want to automatically share what they’ve been listening to with others. For this reason, they expect an app that is currently playing audio to pause, allowing them to explicitly restart playback when they’re ready.”

In addition to performance, privacy is another significant issue with autoplay. Instead of starting playback automatically, present a progress HUD or a pregenerated thumbnail until the media is ready to play.
You can observe route changes via AVAudioSessionRouteChangeNotification, which is posted by AVAudioSession. Its userInfo dictionary contains the reason for the change and the previous route.
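A sketch of observing route changes and pausing when the previous output device disappears; the player property is an assumed part of your own player class:

```swift
import AVFoundation

final class RouteChangeObserver: NSObject {

    private let player: AVPlayer

    init(player: AVPlayer) {
        self.player = player
        super.init()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(handleRouteChange(_:)),
                                               name: AVAudioSession.routeChangeNotification,
                                               object: AVAudioSession.sharedInstance())
    }

    @objc private func handleRouteChange(_ notification: Notification) {
        guard let info = notification.userInfo,
              let reasonValue = info[AVAudioSessionRouteChangeReasonKey] as? UInt,
              let reason = AVAudioSession.RouteChangeReason(rawValue: reasonValue) else { return }

        // Headphones unplugged or a wireless device disconnected:
        // pause so the user can explicitly resume playback.
        if reason == .oldDeviceUnavailable {
            player.pause()
        }
    }
}
```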
That is why you should handle different kinds of interruptions. Apple distinguishes between resumable and non-resumable interruptions depending on how temporary the break in the user's primary listening experience is. Resumable interruptions include plugging or unplugging a headset, an incoming VoIP call, or connecting to a wireless audio device; a non-resumable interruption is, for example, another app taking over media playback.
For instance, users expect the app to go silent during an incoming call; when the call is finished, the app is supposed to resume playing automatically. On the other hand, if the user paused the music playback before the call, the music should remain paused after the call ends.
In the case of a non-resumable interruption, an app with media controls should not resume playing.
Apple offers a notification mechanism to detect when interruptions begin and end (use AVAudioSessionInterruptionNotification).
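An interruption handler along the same lines, resuming only when the system indicates that resuming is appropriate:

```swift
import AVFoundation

final class InterruptionObserver: NSObject {

    private let player: AVPlayer

    init(player: AVPlayer) {
        self.player = player
        super.init()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(handleInterruption(_:)),
                                               name: AVAudioSession.interruptionNotification,
                                               object: AVAudioSession.sharedInstance())
    }

    @objc private func handleInterruption(_ notification: Notification) {
        guard let info = notification.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }

        switch type {
        case .began:
            // The system has already paused audio (e.g. incoming call); update the UI if needed.
            break
        case .ended:
            // Resume only for resumable interruptions, as signalled by .shouldResume.
            if let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt,
               AVAudioSession.InterruptionOptions(rawValue: optionsValue).contains(.shouldResume) {
                player.play()
            }
        @unknown default:
            break
        }
    }
}
```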
For applications dealing with YouTube and other video services, it is essential to check connection reachability and handle slow connections; otherwise, users will be confused by a blank screen.
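One way to avoid the blank-screen problem is a simple connectivity monitor. This sketch uses NWPathMonitor from the Network framework (an assumption of iOS 12+); older projects often rely on Apple's Reachability sample or a third-party library instead:

```swift
import Network

final class ConnectivityMonitor {

    private let monitor = NWPathMonitor()
    private let queue = DispatchQueue(label: "connectivity.monitor")

    func start(onChange: @escaping (Bool) -> Void) {
        monitor.pathUpdateHandler = { path in
            let isReachable = (path.status == .satisfied)
            DispatchQueue.main.async {
                // Show a "no connection" placeholder instead of a blank screen.
                onChange(isReachable)
            }
        }
        monitor.start(queue: queue)
    }

    func stop() {
        monitor.cancel()
    }
}
```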

Conclusions

Apple offers really powerful solutions for playing media in your iOS application. If you just want to play videos, you can use the native AVPlayerViewController, which provides the standard user experience. To get deeper access to media or build a complicated media player, you can dive into the power of AVFoundation, which offers a comprehensive set of options. For YouTube, you can use both out-of-the-box solutions and third-party libraries, but only if the features in your application comply with Google's terms of service.
Apple also encourages developers to follow the iOS Human Interface Guidelines, such as handling interruptions or headphone plugging and unplugging, and provides detailed documentation and mechanisms to deal with them.
