
Jabberwocky Head Tracking Kit for iOS


Head Tracking Cursor for any iOS app!

[htkit-demo animation: head tracking cursor in action]

Features

  • Head Tracking Cursor
  • Blink or Dwell Click
  • Easy to use Settings
  • < 10 Line Configuration for Existing Apps
  • Compatible with the simulator (head tracking will not enable, but the app runs normally)
  • Compatible with iOS deployment targets 9.0 and above.
    • Head tracking can currently be enabled only on iOS 11.0 and above.
  • Interaction with existing UI elements:
    • UIControl
    • UICollectionViewCell
    • UITableViewCell
    • Extensible for subclasses of UIView
  • Plugin Architecture (HTFeature)
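Since the framework links against deployment targets down to iOS 9.0 but head tracking can only be enabled on iOS 11.0+, a runtime availability guard is a reasonable pattern. This is an illustrative sketch using the `configure`/`enable` calls shown later in this README, not a required setup step:

```swift
import JabberwockyHTKit
import JabberwockyARKitEngine

// Sketch: gate head tracking at runtime. On older iOS versions the
// framework still links, but head tracking stays disabled.
func enableHeadTrackingIfPossible() {
    if #available(iOS 11.0, *) {
        HeadTracking.configure(withEngine: ARKitHTEngine.self)
        HeadTracking.shared.enable()
    } else {
        NSLog("Head tracking cannot be enabled on this iOS version.")
    }
}
```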

Build JabberwockyHTKit from Source

To build and run JabberwockyHTKit from source, follow the steps below. BasicTutorial is an example application target containing a single standard UIButton that responds to taps on the screen. Configuring head tracking within an existing application requires a few setup steps. Once JabberwockyHTKit is configured and enabled, the default HTFeature singletons automatically detect and interact with UIControl, UICollectionViewCell, and UITableViewCell elements. Other custom UIView elements can be made to work with the head tracking framework by implementing the HTFocusable protocol.
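A custom view opts in by conforming to HTFocusable. The protocol's actual requirements are defined in JabberwockyHTKit and may differ from this sketch; the member names below (`htCanBeFocused`, `htHandleClick`) are hypothetical placeholders used only to illustrate the shape of such a conformance:

```swift
import UIKit

// Hypothetical sketch: the real HTFocusable requirements live in
// JabberwockyHTKit and may differ. Method names here are assumptions.
class DrawingCanvasView: UIView /*, HTFocusable */ {

    // Assumed requirement: report whether the cursor can focus this view.
    func htCanBeFocused() -> Bool {
        return true
    }

    // Assumed requirement: respond when a blink or dwell click lands on
    // this view (analogous to .touchUpInside on a UIControl).
    func htHandleClick(at point: CGPoint) {
        // Respond exactly as you would to a touch at `point`.
        print("Head tracking click at \(point)")
    }
}
```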

Step 1: Checkout

  • Check out from source:
git clone git@github.com:swiftablellc/jabberwocky-head-tracking-kit-ios.git && cd jabberwocky-head-tracking-kit-ios

Step 2: Build Framework

  • Open JabberwockyHTKit.xcodeproj in Xcode.
  • Select the JabberwockyHTKit scheme and any device or simulator, then run in Xcode.

Step 3: Run Tutorial

  • Select the BasicTutorial scheme and run on a FaceID enabled device.

Notes

  • *-LocalDev schemes are for development of JabberwockyHTKit and JabberwockyARKitEngine simultaneously. This is not a common use case, so it is safe to ignore these schemes.
  • *-PodsOnly schemes pull all dependencies from CocoaPods and are therefore not very useful for local development of JabberwockyHTKit, but they are a great way to see how CocoaPods would work in an existing application. To use these schemes, run pod install and open the project in Xcode using the .xcworkspace file.

Add Head Tracking to an Existing Application

Step 1: Install JabberwockyHTKit Frameworks

  • Create a Podfile and replace $YOUR_TARGET with the appropriate target:
source 'https://github.com/CocoaPods/Specs.git'

use_frameworks!

platform :ios, '12.0'

target '$YOUR_TARGET' do
  pod 'JabberwockyHTKit', '~> 0.8.4'
end
  • Verify CocoaPods version. Install CocoaPods if you haven't already:
    • JabberwockyHTKit installs an .xcframework using CocoaPods so it requires CocoaPods version 1.10.0 or greater.
pod --version

1.10.0
  • Install dependencies using CocoaPods:
pod install
  • WARNING: Don't forget to open your project using the *.xcworkspace file instead of the *.xcodeproj, or you will get a Pods not found error.

Step 2: Add Camera Permissions to *-Info.plist

  • Add NSCameraUsageDescription to your $PROJECT-Info.plist file
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
...
    <key>NSCameraUsageDescription</key>
    <string>Uses Camera to provide Head Tracking</string>
...
</dict>
</plist>

Step 3: Configure JabberwockyHTKit in Code

  • Head Tracking can be configured and enabled in code any time after the application delegate receives didFinishLaunchingWithOptions.

Swift

import AVFoundation
import JabberwockyARKitEngine
import JabberwockyHTKit

...

    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Override point for customization after application launch.
        
        ...
        
        AVCaptureDevice.requestAccess(for: .video) { (granted) in
            if (granted) {
                // Configure the default HTFeatures and enable Head Tracking
                DispatchQueue.main.async {
                    HeadTracking.configure(withEngine: ARKitHTEngine.self)
                    HeadTracking.shared.enable()
                }
            } else {
                NSLog("Head Tracking requires camera access.")
            }
        }
        
        ...

        return true
    }
  • WARNING: If you are building a newer Swift project (with a SceneDelegate), you will need to modify an additional file! The engine will be configured correctly, but the head tracking cursor will not show up because the UIWindowScene was never assigned. Modify SceneDelegate.swift as follows:
import JabberwockyHTKit

...

    func sceneDidBecomeActive(_ scene: UIScene) {
        // Called when the scene has moved from an inactive state to an active state.
        // Use this method to restart any tasks that were paused (or not yet started) when the scene was inactive.
        if let windowScene = scene as? UIWindowScene {
            HeadTracking.shared.windowScene = windowScene
        }
    }
  • SwiftUI is only partially supported (introspection and programmatic triggering of events in SwiftUI remain elusive at this point). An example implementation, very similar to the pre-iOS 13 Swift setup, is available in the SwiftUI Tutorial SceneDelegate. The UIWindowScene must be provided to the Jabberwocky HeadTracking singleton so that Jabberwocky can manage UIWindow stacks properly.
  • SwiftUI integration requires both the AppDelegate.swift and SceneDelegate.swift changes documented above.
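For a SwiftUI app using the UIKit lifecycle with a SceneDelegate, the wiring might look like the following sketch. `ContentView` is a placeholder for your root SwiftUI view; the `windowScene` assignment is the same one shown in the SceneDelegate snippet earlier in this step:

```swift
import SwiftUI
import JabberwockyHTKit

// Sketch of a SwiftUI-hosting SceneDelegate. ContentView is a placeholder
// name for your app's root SwiftUI view.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UIHostingController(rootView: ContentView())
        self.window = window
        window.makeKeyAndVisible()
    }

    func sceneDidBecomeActive(_ scene: UIScene) {
        // Hand the scene to Jabberwocky so it can manage its UIWindow stack.
        if let windowScene = scene as? UIWindowScene {
            HeadTracking.shared.windowScene = windowScene
        }
    }
}
```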

Objective-C

#import <AVFoundation/AVFoundation.h>
#import <JabberwockyARKitEngine/JabberwockyARKitEngine.h>
#import <JabberwockyHTKit/JabberwockyHTKit.h>

...

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        NSLog(@"Requested Camera Permission");
        if(granted){
            dispatch_async(dispatch_get_main_queue(), ^{
                [HeadTracking configureWithEngine:[ARKitHTEngine class] withFeatures:HeadTracking.DEFAULT_FEATURES withSettingsAppGroup:nil];
                [HeadTracking.shared enableWithCompletion: ^(BOOL success) {}];
            });
        } else {
            NSLog(@"Camera Permissions Missing for Head Tracking");
        }
    }];
    return YES;
}

Step 4: Run

  • If you run on a physical device that supports FaceID, you should see Xcode output similar to the following.
Basic[19446:10081868] Requested Camera Permission
...
Basic[19446:10081868] Head Tracking configured successfully.
Basic[19446:10081868] Head Tracking enabled successfully.
Basic[19446:10081868] Metal API Validation Enabled
  • If you run on a simulator or a device that does not support FaceID, you should see Xcode output similar to the following. The JabberwockyARKitEngine.xcframework binary ships with i386 and x86_64 module architectures, so running in a simulator will not crash.
Basic[2476:18033900] Requested Camera Permission
Basic[2476:18033900] Head Tracking cannot be configured. It is not supported on this device.
Basic[2476:18033900] Head Tracking is not configured. Use HeadTracking.configure() to configure.

Release Instructions (Swiftable Developers)

About

The Jabberwocky® Head Tracking Kit (JabberwockyHTKit) is an open-source iOS framework, developed by Swiftable LLC, that provides a touch-free interface for existing iOS applications. Jabberwocky enables users to interact with an application by just moving their head. Head movement translates into the movement of a mouse-like cursor on the screen. By default, blinks trigger a .touchUpInside event simulating a tap on any UIControl subclass (in fact any subclass of UIView can be extended to respond to a facial gesture trigger).
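Because clicks are delivered as ordinary .touchUpInside events, existing UIControl code needs no changes. As a plain UIKit illustration (not part of the kit's setup), a standard target-action handler fires whether the user taps the button or blinks while the cursor hovers over it:

```swift
import UIKit

class GreetingViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let button = UIButton(type: .system)
        button.setTitle("Say Hello", for: .normal)
        button.frame = CGRect(x: 40, y: 120, width: 200, height: 44)
        // A head-tracking blink on this button triggers the same
        // .touchUpInside action as a physical tap.
        button.addTarget(self, action: #selector(sayHello), for: .touchUpInside)
        view.addSubview(button)
    }

    @objc private func sayHello() {
        print("Hello!")
    }
}
```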

Jabberwocky was originally designed as an accessibility tool for users with mobility impairments such as ALS or Spinal Cord Injury (SCI) to enable effective and efficient interaction with iOS devices. Currently, Jabberwocky requires ARKit and is only supported on devices that also support FaceID. Supported devices include:

  • iPhone X and later models
  • iPad Pro models with the A12X Bionic chip

As of iOS 13, Head Tracking Accessibility was added to iOS Switch Control for the same device models supported by Jabberwocky. It is important to note that iOS Head Tracking can be configured to operate in a similar capacity to Jabberwocky Head Tracking, but is provided at the OS level. While iOS Head Tracking Accessibility works across the entire device, its tight coupling with Switch Control, complicated setup, and limited feature set make it unsuitable for many users. Jabberwocky supports in-app customization of Head Tracking and provides custom event hooks.

Applications

JabberwockyHTKit is currently being used by the following applications in the App Store:

Dependencies

JabberwockyHTKit does not require any non-Apple Frameworks other than JabberwockyARKitEngine. While JabberwockyHTKit is open-source and licensed under the Apache 2.0 License, it depends on JabberwockyARKitEngine which is closed-source and licensed under the Permissive Binary License. JabberwockyARKitEngine is free to redistribute in its binary form, without modification, provided the conditions of the license are met.

JabberwockyHTKit is available in the Jabberwocky CocoaPods Spec Repo and is also available in the CocoaPods Trunk Repo.

Trademarks

Jabberwocky® is a registered trademark of Swiftable LLC.

License

Apache 2.0 License