Is it possible to access multiple cameras from ARFrame?

I have an ARSession running an ARWorldTrackingConfiguration. I've also enabled face tracking via:

configuration.userFaceTrackingEnabled = true

In the func session(_ session: ARSession, didUpdate frame: ARFrame) delegate method, I can successfully get the frame.capturedImage from the world-facing camera, but it doesn't seem like there's a way to access the frame from the face-facing camera.

Am I correct in this assumption?

If so, is there some other way to access the frames of both cameras when using face and world tracking together?



Solution 1:[1]

About two simultaneous ARConfigurations

As a rule, one ARSession can run only one ARConfiguration at a time. There is an exception, however: face tracking can be enabled within a world-tracking configuration. In that case ARWorldTrackingConfiguration is the "primary" configuration, and face tracking is a "supplemental" one.

Both cameras (rear and front) produce 60 ARFrames per second, each containing RGB data, depth data, anchors, axes, feature points, etc. Each camera delivers its own ARFrames, which can be used to obtain the intrinsic and extrinsic ARCamera parameters (such as the 3x3 camera intrinsics matrix or the 4x4 transform matrix).

@NSCopying var currentFrame: ARFrame? { get }
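As a sketch of how those camera parameters can be read, the session(_:didUpdate:) delegate callback exposes both matrices on frame.camera (the class and outlet names here are assumptions, not from the original answer):

```swift
import ARKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var sceneView: ARSCNView!   // assumed outlet name

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.session.delegate = self
    }

    // Called up to 60 times per second with a fresh ARFrame
    // from the rear (world-facing) camera
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let intrinsics: simd_float3x3 = frame.camera.intrinsics  // 3x3 camera matrix
        let extrinsics: simd_float4x4 = frame.camera.transform   // 4x4 transform matrix
        print(intrinsics)
        print(extrinsics.columns.3)   // camera position in world space
    }
}
```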

However, in ARKit 5.0, if you run a world-tracking configuration with the userFaceTrackingEnabled instance property set to true, you can access only the ARFrames coming from the rear camera – at the moment there is no access to the simultaneous ARFrames coming from the front camera.

let config = ARWorldTrackingConfiguration()

// Face tracking can only be enabled if the device supports it
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    config.userFaceTrackingEnabled = true
}
sceneView.session.run(config, options: [])

// currentFrame always comes from the rear (world-facing) camera
let currentFrame = sceneView.session.currentFrame
let rearCameraTransform = currentFrame?.camera.transform
let rearCameraAnchors = currentFrame?.anchors

print(rearCameraTransform?.columns.3 as Any)
print(rearCameraAnchors as Any)

You can, however, still work with all ARFaceAnchors in the world-tracking environment.

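As a sketch (the ViewController class name is an assumption), face anchors delivered by the supplemental face-tracking configuration arrive through the same session delegate callbacks as world-tracking anchors:

```swift
import ARKit

extension ViewController: ARSessionDelegate {

    // ARFaceAnchors come through the regular anchor callbacks
    // when userFaceTrackingEnabled is on
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let faceAnchor = anchor as? ARFaceAnchor {
                // The face pose is expressed in world coordinates here
                print(faceAnchor.transform.columns.3)          // face position
                print(faceAnchor.blendShapes[.jawOpen] ?? 0)   // expression coefficient
            }
        }
    }
}
```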

Tip:

In ARKit 5.0 you can run ARFaceTrackingConfiguration on the following devices:

TrueDepth sensor | iOS version       | CPU                     | Depth data
-----------------|-------------------|-------------------------|-----------
YES              | iOS 11 ... iOS 15 | A11, A12, A13, A14, A15 | true
NO               | iOS 13 ... iOS 15 | A12, A13, A14, A15      | false

So, as a developer, you need to check whether the current device supports the face-tracking configuration:

import ARKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {

        // If the device lacks face-tracking support, show a fallback screen
        if !ARFaceTrackingConfiguration.isSupported {
            let storyboard = UIStoryboard(name: "Main", bundle: nil)
            window?.rootViewController = storyboard.instantiateViewController(withIdentifier: "unsupportedDeviceMessage")
        }
        return true
    }
}

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
