Stereo ARSCNView to mix VR and AR

I want to make a mix of virtual reality and augmented reality. The goal is to have a stereo camera view (one image for each eye).

I tried to put two ARSCNViews in a view controller, but it seems ARKit allows only one ARWorldTrackingSessionConfiguration at a time. How can I do this?

I looked for a way to copy the rendered content of one view and paste it into another view, but I couldn't find anything. Please help me find a solution.

I found this link; maybe it can shed some light on the problem: ARKit with multiple users

Here's a sample of my issue:

https://www.youtube.com/watch?v=d6LOqNnYm5s

PS: before downvoting my post, please comment why!



Solution 1:[1]

The following code is basically what Hal described. I previously wrote a few lines on GitHub that might help you get started. (It's simple code: no barrel distortion, and no adjustment for the narrow field of view - yet.)

Essentially, we connect the same scene to a second ARSCNView (so both ARSCNViews render the same scene), so there's no need to get ARWorldTrackingSessionConfiguration working with two ARSCNViews. Then we offset the second view's pointOfView so it's positioned as the second eye.

https://github.com/hanleyweng/iOS-Stereoscopic-ARKit-Template

Solution 2:[2]

The ARSession documentation says that ARSession is a shared object.

Every AR experience built with ARKit requires a single ARSession object. If you use an ARSCNView or ARSKView object to easily build the visual part of your AR experience, the view object includes an ARSession instance. If you build your own renderer for AR content, you'll need to instantiate and maintain an ARSession object yourself.

So there's a clue in that last sentence. Instead of two ARSCNView instances, use SCNView and share the single ARSession between them.

I expect this is a common use case, so it's worth filing a Radar to request stereo support.

How to do it right now?

The (single) session has only one delegate. You need two different delegate instances, one for each view. You could solve that with an object that forwards the delegate messages to each view; solvable, but a bit of extra work.
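ARKit only runs on device, so here is a hedged sketch of that fan-out idea using a stand-in protocol instead of ARSessionDelegate; the names (FrameObserver, MulticastDelegate, EyeView) are hypothetical. On device, the same shape would conform to ARSessionDelegate and forward session(_:didUpdate:) to each view.

```swift
import Foundation

// Stand-in for ARSessionDelegate so the pattern can run anywhere.
protocol FrameObserver: AnyObject {
    func didUpdateFrame(_ timestamp: TimeInterval)
}

final class MulticastDelegate: FrameObserver {
    private var observers: [FrameObserver] = []
    func add(_ observer: FrameObserver) { observers.append(observer) }
    // Forward every session callback to all registered observers (both eyes).
    func didUpdateFrame(_ timestamp: TimeInterval) {
        observers.forEach { $0.didUpdateFrame(timestamp) }
    }
}

final class EyeView: FrameObserver {
    private(set) var lastTimestamp: TimeInterval = 0
    func didUpdateFrame(_ timestamp: TimeInterval) { lastTimestamp = timestamp }
}

let leftEye = EyeView()
let rightEye = EyeView()
let multicast = MulticastDelegate()
multicast.add(leftEye)
multicast.add(rightEye)
multicast.didUpdateFrame(42.0)  // one callback reaches both eyes
print(leftEye.lastTimestamp, rightEye.lastTimestamp)
```

The single ARSession would get the multicast object as its delegate; each view then redraws from the same frame data.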

There's also the problem of needing two slightly different camera locations, one for each eye, for stereo vision. ARKit uses one camera, placed at the iOS device's location, so you'll have to fuzz that.
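As a rough sketch of that "fuzzing", here is the eye-offset math in plain Swift (no ARKit types; the Quat struct and the 66 mm interpupillary distance are assumptions for illustration): rotate a local +x unit vector by the camera's orientation quaternion, then shift the second camera along it.

```swift
import Foundation

// Minimal quaternion, enough to rotate a vector: v' = v + w*t + q_vec × t,
// where t = 2 * (q_vec × v). Assumes the quaternion is normalized.
struct Quat {
    var x, y, z, w: Double
    func rotate(_ v: (x: Double, y: Double, z: Double)) -> (x: Double, y: Double, z: Double) {
        let tx = 2 * (y * v.z - z * v.y)
        let ty = 2 * (z * v.x - x * v.z)
        let tz = 2 * (x * v.y - y * v.x)
        return (v.x + w * tx + (y * tz - z * ty),
                v.y + w * ty + (z * tx - x * tz),
                v.z + w * tz + (x * ty - y * tx))
    }
}

// Camera yawed 90° about the y axis; offset the second eye ~66 mm
// along the camera's local +x axis (which now points at world -z).
let halfAngle = Double.pi / 4
let q = Quat(x: 0, y: sin(halfAngle), z: 0, w: cos(halfAngle))
let ipd = 0.066
let offset = q.rotate((x: 1, y: 0, z: 0))
let eye = (x: offset.x * ipd, y: offset.y * ipd, z: offset.z * ipd)
print(eye)  // local +x maps to world -z after a 90° yaw
```

In practice you would read the orientation from the tracked camera node each frame and add this offset to its position, as Solution 4 does with GLKit.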

Then you have to deal with the different barrel distortions for each eye.
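For the distortion step, one common approach (an assumption here, not something this answer or Apple prescribes) is a radial polynomial warp applied per eye, usually in a shader or as a mesh warp; a minimal sketch in plain Swift:

```swift
// Radial barrel-distortion model: r' = r * (1 + k1*r^2 + k2*r^4).
// (u, v) are lens-centered normalized coordinates; k1 and k2 are
// headset-specific coefficients (the values below are made up).
func barrelDistort(u: Double, v: Double, k1: Double, k2: Double) -> (Double, Double) {
    let r2 = u * u + v * v
    let scale = 1 + k1 * r2 + k2 * r2 * r2
    return (u * scale, v * scale)
}

print(barrelDistort(u: 0, v: 0, k1: 0.22, k2: 0.24))    // center is unchanged
print(barrelDistort(u: 0.5, v: 0.5, k1: 0.22, k2: 0.24)) // pushed outward
```

Each eye gets its own lens center, so the same function runs twice with mirrored coordinates.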

This, for me, adds up to writing my own custom object to intercept ARKit delegate messages, convert the coordinates to what I'd see from two different cameras, and manage the two distinct SCNViews (not ARSCNViews). Or perhaps use one ARSCNView (one eye), intercept its frame updates, and pass those frames on to an SCNView (the other eye).

File the Radar, post the number, and I'll dupe it.

Solution 3:[3]

To accomplish this, please use the following code:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet weak var sceneView: ARSCNView!
    @IBOutlet weak var sceneView2: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()

        sceneView.delegate = self
        sceneView.showsStatistics = true
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene
        sceneView.isPlaying = true

        // SceneView2 Setup
        sceneView2.scene = scene
        sceneView2.showsStatistics = sceneView.showsStatistics

        // Now sceneView2 starts receiving updates
        sceneView2.isPlaying = true     
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}

And don't forget to set the .isPlaying instance property to true for both ARSCNViews.


Solution 4:[4]

An Objective-C version of Han's GitHub code, with the scene views created programmatically and the y and z positions left un-offset - all credit to Han:

-(void)setup {

    // Use the full view bounds; each eye gets half the height.
    CGFloat w = self.view.bounds.size.width;
    CGFloat h = self.view.bounds.size.height;

    //left
    leftSceneView = [ARSCNView new];
    leftSceneView.frame = CGRectMake(0, 0, w, h/2);
    leftSceneView.delegate = self;
    leftSceneView.autoenablesDefaultLighting = true;
    [self.view addSubview:leftSceneView];

    //right
    rightSceneView = [ARSCNView new];
    rightSceneView.frame = CGRectMake(0, h/2, w, h/2);
    rightSceneView.playing = true;
    rightSceneView.autoenablesDefaultLighting = true;
    [self.view addSubview:rightSceneView];

    //scene
    SCNScene * scene = [SCNScene new];
    leftSceneView.scene = scene;
    rightSceneView.scene = scene;

    //tracking
    ARWorldTrackingConfiguration * configuration = [ARWorldTrackingConfiguration new];
    configuration.planeDetection = ARPlaneDetectionHorizontal;
    [leftSceneView.session runWithConfiguration:configuration];
}

-(void)renderer:(id<SCNSceneRenderer>)renderer updateAtTime:(NSTimeInterval)time {

    dispatch_async(dispatch_get_main_queue(), ^{

        //update right eye
        SCNNode * pov = [self->leftSceneView.pointOfView clone];

        SCNQuaternion orientation = pov.orientation;
        GLKQuaternion orientationQuaternion = GLKQuaternionMake(orientation.x, orientation.y, orientation.z, orientation.w);
        GLKVector3 eyePosition = GLKVector3Make(1, 0, 0);
        GLKVector3 rotatedEyePosition = GLKQuaternionRotateVector3(orientationQuaternion, eyePosition);
        SCNVector3 rotatedEyePositionSCNV = SCNVector3Make(rotatedEyePosition.x, rotatedEyePosition.y, rotatedEyePosition.z);

        float mag = 0.066f;
        float rotatedX = pov.position.x + rotatedEyePositionSCNV.x * mag;
        float rotatedY = pov.position.y;// + rotatedEyePositionSCNV.y * mag;
        float rotatedZ = pov.position.z;// + rotatedEyePositionSCNV.z * mag;
        [pov setPosition:SCNVector3Make(rotatedX, rotatedY, rotatedZ)];

        self->rightSceneView.pointOfView = pov;
    });

}

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Han
Solution 2: (unattributed)
Solution 3: Andy Jazz
Solution 4: Johnny Rockex