
swiftVtuber

Modification of Apple Tracking and Visualizing Faces Project

AR face scanning and texture mapping using an iPhone.
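The Apple sample this project modifies creates an ARSCNFaceGeometry and maps the live camera image onto the tracked face mesh. A minimal sketch of that approach (the helper function name is illustrative, not this project's actual code):

```swift
import ARKit
import SceneKit

// Build a face-mesh node textured with the live camera feed, following
// the approach in Apple's Tracking and Visualizing Faces sample.
func makeTexturedFaceNode(for sceneView: ARSCNView) -> SCNNode? {
    guard let device = sceneView.device,
          let geometry = ARSCNFaceGeometry(device: device, fillMesh: true) else {
        return nil
    }
    // Using the scene's background (the camera image) as the diffuse
    // texture projects the video onto the tracked face geometry.
    geometry.firstMaterial?.diffuse.contents = sceneView.scene.background.contents
    geometry.firstMaterial?.lightingModel = .constant
    return SCNNode(geometry: geometry)
}
```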

Screenshot

Example video when used in combination with OBS Studio: https://www.youtube.com/watch?v=x0rdXJdUZWA

Features:
- Static props
- Eye tracking and eye model
- Hidden background

- All configuration must currently be set in code.
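Because configuration lives in code, toggling features means editing constants and rebuilding. A hypothetical sketch of what such build-time settings could look like (the names below are illustrative and may differ from the project's actual identifiers):

```swift
// Hypothetical build-time settings -- edit and rebuild to change behavior.
// The actual flag names in this project's source may differ.
struct VtuberSettings {
    static let showStaticProps = true   // attach static prop models to the face anchor
    static let trackEyes = true         // drive eye models from the face anchor's eye transforms
    static let hideBackground = true    // render only the face/eye models over a solid color
}
```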

Important: Face tracking supports devices with Apple Neural Engine in iOS 14 and iPadOS 14 and requires a device with a TrueDepth camera on iOS 13 and iPadOS 13 and earlier. To run the sample app, set the run destination to an actual device; the Simulator doesn’t support augmented reality.
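The device-support requirement above can be enforced at runtime before starting the session; Apple's sample performs a check along these lines (here `sceneView` is assumed to be an `ARSCNView` outlet in the hosting view controller):

```swift
import ARKit

// Bail out early on hardware that cannot do face tracking
// (no TrueDepth camera or Apple Neural Engine, or the Simulator).
guard ARFaceTrackingConfiguration.isSupported else {
    fatalError("Face tracking is not supported on this device")
}

// Start (or restart) a fresh face-tracking session.
let configuration = ARFaceTrackingConfiguration()
sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
```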

Original Project Code: https://developer.apple.com/documentation/arkit/content_anchors/tracking_and_visualizing_faces
