A Quick Start Guide for 3D Face Filters using the Svrf SDK

Artem Titoulenko

Posted on February 4, 2019

Getting Started on iOS

To demonstrate searching for and using Svrf Face Filters, we will create a basic iOS app that searches for a Face Filter and applies it to your face. For simplicity, we will focus exclusively on the iPhone X, which allows us to rely on ARKit to handle face detection, face mesh generation, and model morphing. We will cover:

  • Installing the SvrfSDK and setting up a Svrf API Key
  • Composing an interface with an ARSCNView, a search bar, and a UICollectionView to hold the search results
  • Creating a VirtualContentUpdater class which implements ARSCNViewDelegate
  • Writing a RemoteFaceFilter class which extends SCNNode and can load the 3D model of a SvrfMedia
  • Adding the plumbing to handle search bar input, showing/hiding the keyboard, etc.

Building the Interface

First, let's create a new single-view Xcode project named FaceFilterDemo.

Create a Single View App

Name the app FaceFilterDemo

Now let's create a basic layout for our application using Interface Builder. Our application needs an ARSCNView to display the ARKit scene, a search bar to enter queries, and a Collection View to display the results. Let's add those components and wire them up:

Wiring up the basic interface

Make sure that the ViewController implements the UISearchBarDelegate protocol, and that searchBar.delegate is set to self in viewDidLoad(). Set the background of the Search Results View to "Clear Color", set "Scroll Direction" to Horizontal, and turn on "Bounce Horizontally".

Setting bounce and flow direction on the search results view

We can then set the "Search Style" of the search bar to "Minimal" to make it translucent, so it looks like it's floating on top of the ARSCNView. We should also give it a more descriptive placeholder that shows the purpose of the search bar; let's set it to "Search for Face Filters".

Setting placeholder and search style on the search bar

Installing the SvrfSDK Pod

In order to query the Svrf API we need to install the SvrfSDK from CocoaPods. If you don't have CocoaPods installed, you should follow the installation guide on their website, or follow the SvrfSDK installation instructions to install the SDK directly into your application. In the root of the FaceFilterDemo folder create a file named Podfile:

# Podfile
platform :ios, '12.0'

target 'FaceFilterDemo' do
  use_frameworks!

  # Pods for FaceFilterDemo
  pod 'SvrfSDK'
end

After you have saved the Podfile, close Xcode, open a terminal, navigate to the root of the project directory, and run pod install. This will install the SvrfSDK pod and its dependencies. Now instead of opening the Xcode Project, you should open the FaceFilterDemo.xcworkspace file in the project root directory. You'll know you did it right if you see a "Pods" project alongside the "FaceFilterDemo" project in Xcode.
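
To recap in terminal commands, from the project root:

pod install
open FaceFilterDemo.xcworkspace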

Creating and Adding a Svrf API Key

Accessing the API requires an API Key. You can get an API Key by logging into the Svrf website and navigating to your User Settings. At the bottom you can name your key and describe what it will be used for.

Once you create the key, you will be presented with your API Key:

Don't worry, this key is long gone

Add a property named SVRF_API_KEY to the Info.plist file whose value is your new API Key:

Authenticating with the Svrf API

To test that the key works correctly and to authenticate with the API, we import the SvrfSDK framework and call the SvrfSDK.authenticate method. A good place to authenticate is when our application starts: in the -application:didFinishLaunchingWithOptions: method of our AppDelegate.

// AppDelegate.swift

import UIKit
import SvrfSDK

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
  var window: UIWindow?

  func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {

    SvrfSDK.authenticate(onSuccess: {
      print("Successfully authenticated with the Svrf API!")
    }, onFailure: { err in
      print("Could not authenticate with the Svrf API: \(err)")
    })

    return true
  }
}

After building and running the app (the Simulator is fine here, though the face-tracking steps later require a physical device), we should see the success text in the Xcode console:
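
Successfully authenticated with the Svrf API!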

Searching for Face Filters

Now that we are authenticated with the service, let's search for some Face Filters! First, wire up the search bar and the search results view as @IBOutlets in our ViewController. Then declare that the ViewController implements the UISearchBarDelegate protocol, set the searchBar's delegate to self, and implement searchBarSearchButtonClicked(_:). We can then search for 3D Face Filters with the SvrfSDK's +search(query:options:onSuccess:onFailure:) method:

// ViewController.swift

import UIKit
import ARKit
import SvrfSDK

class ViewController: UIViewController {
  @IBOutlet weak var sceneView: ARSCNView!
  @IBOutlet weak var searchBar: UISearchBar!
  @IBOutlet weak var searchResultsView: UICollectionView!

  override func viewDidLoad() {
    super.viewDidLoad()

    searchBar.delegate = self
  }
}

extension ViewController: UISearchBarDelegate {
  func searchBarSearchButtonClicked(_ searchBar: UISearchBar) {
    guard let query = searchBar.text else {
      return
    }

    print("Searching for \(query)")

    let searchOptions = SearchOptions(type: [._3d])
    SvrfSDK.search(query: query, options: searchOptions, onSuccess: { (allMedia) in
      print("got \(allMedia.count) results")
    }, onFailure: { (err) in
      print("Could not search for FaceFilters: \(err.title): \(err.description ?? "")")
    })
  }
}

Searching for "Movies" Face Filters should show you how many results were returned for that query, along with the names and canonical URLs for those filters:


Searching for Movies
Got 7 results
Lego Batman: https://www.svrf.com/vr/Lego-Batman/730872
Iron Man: https://www.svrf.com/vr/Iron-Man/730870
Loki: https://www.svrf.com/vr/Loki/663718
Flash: https://www.svrf.com/vr/Flash/730869
Starlord: https://www.svrf.com/vr/Starlord/720510
Spiderman Mask: https://www.svrf.com/vr/Spiderman-Mask/730873
Black Panther: https://www.svrf.com/vr/Black-Panther/730867

Other fun Face Filters to search for:

  • Breakfast
  • Animal
  • Meme
  • Cute
  • Comic Book

This is the most basic API usage, but there's a lot more that the SvrfSDK offers. You can fetch trending Face Filters, filter by category, and search through thousands of 360° photos and videos. You can get a feel for the content that we curate by browsing the Svrf website. Read onward to learn how to use these Face Filters in an ARKit session, where you can try on the superhero masks from the search results!
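
For example, fetching trending content is a single call. Here is a minimal sketch; we are assuming a getTrending method that mirrors search's callback style (the API exposes a /vr/trending endpoint), so check the SDK reference for the exact name and signature:

// Assumed API shape; verify against the SvrfSDK reference
SvrfSDK.getTrending(options: nil, onSuccess: { (allMedia) in
  print("Got \(allMedia.count) trending media")
}, onFailure: { (err) in
  print("Could not fetch trending media: \(err.title)")
})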

Displaying Search Results

Interface for a Search Result

Now that we know we can search for Face Filters, let's show them to the user in our Collection View so they can try them on. First, we'll show the user a preview of each Face Filter. Every SvrfMedia returned by the SvrfSDK.search method has a collection of rendered images under media.files.images, which we will use to show a preview of the Face Filter. You can see the additional properties that Media have in the Svrf API docs. UICollectionView requires elements which extend UICollectionViewCell, so we will make a class called SearchCollectionViewCell to manage the preview image view of a SvrfMedia. It will contain a single method, setupWith(media:), which takes a SvrfMedia, fetches the image from the URL at media.files?.images?._720x720, and sets the fetched image on the cell's image view.

    // SearchCollectionViewCell.swift

    import UIKit
    import SvrfSDK

    class SearchCollectionViewCell: UICollectionViewCell {
      @IBOutlet weak var imageView: UIImageView!

      func setupWith(media: SvrfMedia) {
        imageView.image = nil

        guard let previewImage = media.files?.images?._720x720,
          let imageUrl = URL(string: previewImage) else {

          print("Media has no 720x720 preview image")
          return
        }

        URLSession.shared.dataTask(with: imageUrl,
          completionHandler: { (data, _, error) in
            if error != nil {
              print("Could not fetch image: \(error!)")
              return
            }

            // Always touch UIKit on the main thread
            DispatchQueue.main.async {
              if let data = data, let remoteImage = UIImage(data: data) {
                self.imageView.image = remoteImage
              }
            }
        }).resume()
      }
    }

In Interface Builder, drop an Image View into the search result cell, set the cell's class to SearchCollectionViewCell, and ctrl+drag the image view into the SearchCollectionViewCell class as an @IBOutlet named imageView. Also set the cell's reuse Identifier to SearchResultCell so we can dequeue it later.

Set the Identifier to SearchResultCell

Set the class to SearchCollectionViewCell

Showing All of the Search Results

In order to show all of the results of our search, we need to implement the required methods of UICollectionViewDataSource and UICollectionViewDelegate in our ViewController. These methods read from a searchResults: [SvrfMedia] property on the ViewController, which we declare (and populate from the search callback) in the final wiring below.

    // ViewController.swift

    extension ViewController: UICollectionViewDataSource, UICollectionViewDelegate {
      func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return searchResults.count
      }

      func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {

        guard let cell = searchResultsView.dequeueReusableCell(withReuseIdentifier: "SearchResultCell", for: indexPath) as? SearchCollectionViewCell else {
          return UICollectionViewCell()
        }

        cell.setupWith(media: searchResults[indexPath.row])

        return cell
      }

      func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
        let media = searchResults[indexPath.row]

        // Result selected
        // Do something with the resulting media when it's selected
      }
    }
Enter fullscreen mode Exit fullscreen mode

Now set the searchResultsView's delegate and dataSource to be self so that the searchResultsView knows where to fetch its data from and how to draw the cells representing the data.

    // ViewController.swift

    class ViewController: UIViewController {
      // ...

      override func viewDidLoad() {
        // ...

        // This ViewController will tell the UICollectionView how to render its
        // results, and will also be the data source for the collection view
        searchResultsView.delegate = self
        searchResultsView.dataSource = self
      }
    }

Now let's fire up the app and search for "Movies". Once your search callback stores the results in searchResults and reloads the collection view (as the finished searchBarSearchButtonClicked below does), you should see several superhero Face Filters. Congrats, you're using the Svrf API!

Try searching for other Face Filters!

Using ARKit and Trying On Face Filters

For the final part of our guide, we will allow the user to select a Face Filter and try it on. This step is more complex as it requires setting up the ARKit session, creating classes to manage the Face Filter's 3D model, interacting with ARSCNView, and applying the blend shapes that ARKit detects to the model being displayed.

Allowing Access to the Front Camera

First, we need to add an NSCameraUsageDescription key to our Info.plist whose value is a short description of what our app will use the camera for, for example "We use the front camera to apply Face Filters to your face." iOS will terminate the app on its first camera access if this key is missing.

Starting the ARSession

When the ViewController's view appears, we need to start the sceneView's session and tell it that we want to do face tracking with an ARFaceTrackingConfiguration. When the view is about to disappear, we should pause the session to reduce CPU and battery usage.

    // ViewController.swift

    class ViewController: UIViewController {
      // ...
      override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Run the ARSCNView session
        let configuration = ARFaceTrackingConfiguration()
        sceneView.session.run(configuration, options: ARSession.RunOptions())

        // Keep the screen on
        UIApplication.shared.isIdleTimerDisabled = true
      }

      override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the ARSCNView session
        sceneView.session.pause()

        // Let the screen sleep
        UIApplication.shared.isIdleTimerDisabled = false
      }
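
Note that ARFaceTrackingConfiguration only works on devices with a TrueDepth camera. As a defensive check of our own (not part of the original flow), you can guard on ARFaceTrackingConfiguration.isSupported before running the session:

    // In viewWillAppear, before sceneView.session.run(...):
    guard ARFaceTrackingConfiguration.isSupported else {
        print("Face tracking is not supported on this device")
        return
    }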

Representing a Face Filter as an SCNNode

In order for our Face Filter to be used in the ARSCNView session, we need to be able to represent it as an SCNNode. Our Face Filters come in a variety of 3D model formats, but we always have glTF models. For your convenience, the SvrfSDK includes a method called +getFaceFilter which will fetch, parse, and convert the glTF model for a Face Filter into an SCNNode hierarchy. The SCNNodes also have blend shape data attached if the original glTF model has weights defined for it.

    // RemoteFaceFilter.swift
    import Foundation
    import SceneKit
    import ARKit
    import SvrfSDK

    class RemoteFaceFilter: SCNNode {
      // The currently loaded Face Filter node
      private var faceFilter: SCNNode?

      // BlendShapeAnimation
      var blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber] = [:] {
        didSet {

          // Each child node may have blend shape targets so we enumerate over all of them to make sure
          // that each blend target is expressed completely
          faceFilter?.enumerateHierarchy({ (node, _) in
            if node.morpher?.targets != nil {
              SvrfSDK.setBlendShapes(blendShapes: blendShapes, for: node)
            }
          })
        }
      }

      func loadFaceFilter(media: SvrfMedia) {

        // Generate a Face Filter SCNNode from a Media on a background thread
        DispatchQueue.global(qos: .background).async { [unowned self] in
          SvrfSDK.getFaceFilter(with: media, onSuccess: { faceFilter in
            // Remove any existing Face Filter from the scene
            self.resetFaceFilter()

            // Set the new Face Filter and add it as a child node. This must
            // happen inside onSuccess: getFaceFilter is asynchronous, so
            // adding the node right after the call would race the fetch.
            self.faceFilter = faceFilter
            self.addChildNode(faceFilter)
          }, onFailure: { error in
            print("\(error.title). \(error.description ?? "")")
          })
        }
      }

      // Detaches the current Face Filter and its children from the node tree
      func resetFaceFilter() {
        if let head = self.faceFilter {
          for child in head.childNodes {
            child.removeFromParentNode()
          }
          // Also detach the (now empty) root so stale nodes don't pile up
          head.removeFromParentNode()
          self.faceFilter = nil
        }
      }
    }

Giving the ARSCNView a Delegate

Our preferred approach to managing multiple Face Filters is to create a VirtualContentUpdater class which manages a virtual face node (a node representing your face). The VirtualContentUpdater also implements ARSCNViewDelegate and passes changes to the ARFaceAnchor (a class defining a virtual mesh of your face) on to the virtual face node. This way we can offload resource management to another class (RemoteFaceFilter), which is responsible for fetching and loading the 3D model data from a SvrfMedia. It's worth carefully reading the documentation for ARFaceAnchor, as it gives a great overview of what kind of data it provides and how you should use it. The following VirtualContentUpdater contains its own root node (faceNode) and a RemoteFaceFilter whose blend shapes it updates when the user moves their face.

    // VirtualContentUpdater.swift

    // An `ARSCNViewDelegate` which adds and updates the virtual face content in response to the ARFaceTracking session.
    import SceneKit
    import ARKit

    class VirtualContentUpdater: NSObject {

        // The virtual content that should be displayed and updated.
        var remoteFaceFilter: RemoteFaceFilter? {
            didSet {
                setupFaceNodeContent()
            }
        }

        // A reference to the node that was added by ARKit in `renderer(_:didAdd:for:)`.
        private var faceNode: SCNNode?

        // A serial queue for SceneKit node updates
        private let serialQueue = DispatchQueue(label: "com.example.FaceFilterDemo.serialSceneKitQueue")

        // MARK: - Private functions
        private func setupFaceNodeContent() {
            guard let node = faceNode else {
                return
            }

            resetFaceNode()

            if let content = remoteFaceFilter {
                node.addChildNode(content)
            }
        }

        private func resetFaceNode() {
            guard let node = faceNode else {
                return
            }

            // Remove all childNodes from the faceNode
            for child in node.childNodes {
                child.removeFromParentNode()
            }
        }
    }

    // Extension that implements the ARSCNViewDelegate protocol functions
    extension VirtualContentUpdater: ARSCNViewDelegate {

        // Called when ARKit adds a node for a newly detected (face) anchor
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

            // Hold onto the `faceNode` so that the session does not need to be restarted when switching Face Filters.
            faceNode = node

            // Setup face node content in async thread
            serialQueue.async {
                self.setupFaceNodeContent()
            }
        }

        // Called when ARKit updates the face anchor's geometry and expression
        func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {

            // FaceAnchor unwrapping
            guard let faceAnchor = anchor as? ARFaceAnchor else { return }

            // Pass the face anchor's blend shapes through to the Face Filter
            remoteFaceFilter?.blendShapes = faceAnchor.blendShapes
        }
    }

Finishing Touches

The last step in the process is to wire up all of the new behavior into our ViewController so that a user can actually try on the superhero masks they were searching for. We will create a new VirtualContentUpdater called contentUpdater, a new RemoteFaceFilter, assign the sceneView's delegate to contentUpdater, and make contentUpdater keep track of remoteFaceFilter.

    // ViewController.swift

    class ViewController: UIViewController {
      @IBOutlet weak var sceneView: ARSCNView!
      @IBOutlet weak var searchBar: UISearchBar!
      @IBOutlet weak var searchResultsView: UICollectionView!

      var searchResults: [SvrfMedia] = []
      let contentUpdater = VirtualContentUpdater()
      let remoteFaceFilter = RemoteFaceFilter()

      override func viewDidLoad() {
        super.viewDidLoad()

        // ContentUpdater will tell the ARSCNView what to draw
        sceneView.delegate = contentUpdater

        // ContentUpdater's virtual face node will dictate what Face Filter to render
        contentUpdater.remoteFaceFilter = remoteFaceFilter

        // This ViewController will handle the behavior of the searchBar
        searchBar.delegate = self

        // This ViewController will tell the UICollectionView how to render its results, and will also be the
        // data source for the collection view
        searchResultsView.delegate = self
        searchResultsView.dataSource = self
      }
    }

So now we can tell the `remoteFaceFilter` to load a new Face Filter from a tapped search result:

    // ViewController.swift

    // ...
    extension ViewController: UICollectionViewDataSource, UICollectionViewDelegate {
      // ...
      func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
        let media = searchResults[indexPath.row]

        // Result selected, load a new Face Filter
        remoteFaceFilter.loadFaceFilter(media: media)
      }
    }

And finally, let's clean up the methods the search bar calls when a user searches or taps "cancel", so that they dismiss the keyboard. We also add a small searchBarTextDidBeginEditing(_:) implementation (our addition) so the cancel button actually appears while the user is typing:

    extension ViewController: UISearchBarDelegate {
        func searchBarCancelButtonClicked(_ searchBar: UISearchBar) {
            // Hide searchCollectionView
            searchResultsView.isHidden = true

            // Reset text in searchBar
            searchBar.text = ""

            // Hide cancel button in searchBar
            searchBar.showsCancelButton = false

            // Hide keyboard
            searchBar.resignFirstResponder()
        }
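
        // Our addition (not in the original snippet): the cancel button is
        // hidden above but otherwise never shown; reveal it while editing.
        func searchBarTextDidBeginEditing(_ searchBar: UISearchBar) {
            searchBar.showsCancelButton = true
        }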

        func searchBarSearchButtonClicked(_ searchBar: UISearchBar) {
            // Hide keyboard
            searchBar.resignFirstResponder()

            // Show the searchResultsView
            searchResultsView.isHidden = false

            guard let query = searchBar.text else {
                return
            }

            print("Searching for \(query)")

            let searchOptions = SearchOptions(type: [._3d])

            SvrfSDK.search(query: query, options: searchOptions, onSuccess: { (allMedia) in
                print("Got \(allMedia.count) results")
                self.searchResults = allMedia
                self.searchResultsView.reloadData()
            }, onFailure: { (err) in
                print("Could not search for FaceFilters: \(err.title): \(err.description ?? "")")
            })
        }
    }

Now let's try on those superhero Face Filters!

Going Further

We have only scratched the surface of what you can create with the Svrf API and the Svrf SDK for iOS. Head over to the Svrf Developers Site to explore API clients in a variety of languages and to see what other content you can discover through the Svrf API. To see a more complete demo app with error handling, loading spinners, and reset buttons, check out https://github.com/SVRF/svrf-api/tree/master/examples/ARKitFaceFilterDemo.

Originally published at blog.svrf.com by Artem Titoulenko on January 17, 2019.
