How to set a known position and orientation as a starting point in ARKit

I am starting to use ARKit and I have a use case where I want to track the motion from a known position to another one.

So I was wondering: is it possible (as in most tracking solutions) to set a known position and orientation as the starting point of the tracking in ARKit?

Regards



Solution 1

There are at least six approaches that allow you to set a starting point for a model. Note that using no ARAnchors at all in your AR scene is considered a bad AR experience (although Apple's Augmented Reality app template has no ARAnchors in its code).

First approach

This is the approach that Apple engineers propose in the Augmented Reality app template in Xcode. It doesn't use anchoring, so all you need to do is place a model in mid-air at coordinates like (x: 0, y: 0, z: -0.5); in other words, the model will sit 50 cm in front of the camera's starting position.

override func viewDidLoad() {
    super.viewDidLoad()
    
    // Load the template scene and grab the model node by name
    sceneView.scene = SCNScene(named: "art.scnassets/ship.scn")!
    let model = sceneView.scene.rootNode.childNode(withName: "ship", 
                                                recursively: true)
    // Place the model 50 cm in front of the initial camera position
    model?.position.z = -0.5

    // Start world tracking; the world origin is the camera's starting pose
    sceneView.session.run(ARWorldTrackingConfiguration())
}
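
If you already know the offset of your desired starting pose, ARKit also offers ARSession's setWorldOrigin(relativeTransform:), which shifts the session's coordinate origin to a transform you supply. A minimal sketch, assuming sceneView is your ARSCNView and knownStartTransform is a transform you obtained yourself (e.g. a measured offset from a physical marker):

import ARKit

// Assumption: `knownStartTransform` is a 4x4 transform you measured yourself.
func moveWorldOrigin(of sceneView: ARSCNView, to knownStartTransform: simd_float4x4) {
    // All subsequent world coordinates (and camera poses) are reported
    // relative to this new origin.
    sceneView.session.setWorldOrigin(relativeTransform: knownStartTransform)
}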


Second approach

The second approach is almost the same as the first one, except it uses an ARKit anchor:

guard let sceneView = self.view as? ARSCNView 
else { return }

if let currentFrame = sceneView.session.currentFrame {
        
    // Build a transform 50 cm in front of the current camera pose
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -0.5
    let transform = simd_mul(currentFrame.camera.transform, translation)
        
    // Pin that position with an ARAnchor
    let anchor = ARAnchor(transform: transform)
    sceneView.session.add(anchor: anchor)
}
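
Adding the anchor on its own doesn't render anything; with SceneKit you typically return a node for that anchor from the ARSCNViewDelegate. A minimal sketch, assuming your view controller is the sceneView's delegate (the small sphere is just placeholder content):

// ARSCNViewDelegate: provide content for the newly added anchor.
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let node = SCNNode(geometry: SCNSphere(radius: 0.05))
    return node    // SceneKit keeps this node aligned with the anchor's transform
}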


Third approach

With the third approach you can also pin a model's predefined position with an ARAnchor; here you need to import the RealityKit module as well:

func placeModel(in arView: ARView) {
            
    let model = ModelEntity(mesh: MeshResource.generateSphere(radius: 1.0))

    // ARKit's anchor at a predefined transform (here: the world origin)
    let anchor = ARAnchor(transform: matrix_identity_float4x4)
    arView.session.add(anchor: anchor)
    
    // RealityKit's anchor entity based on the position of the ARAnchor
    let anchorEntity = AnchorEntity(anchor: anchor)

    anchorEntity.addChild(model)
    arView.scene.anchors.append(anchorEntity)
}
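
If all you need is to pin content at a known world-space transform, RealityKit can also skip the intermediate ARAnchor and use an AnchorEntity initialized with a world transform. A brief sketch, assuming arView is your ARView and knownTransform is a transform you define yourself:

import RealityKit

// Assumption: `knownTransform` is a simd_float4x4 you supply
// (for example, half a metre in front of the world origin).
func placeSphere(at knownTransform: simd_float4x4, in arView: ARView) {
    let model = ModelEntity(mesh: .generateSphere(radius: 0.1))

    // AnchorEntity pinned directly to a world-space transform
    let worldAnchor = AnchorEntity(world: knownTransform)
    worldAnchor.addChild(model)
    arView.scene.addAnchor(worldAnchor)
}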


Fourth approach

If you have turned on plane detection, you can use ray-casting or hit-testing methods. As a target object you can use a small sphere (initially located at 0, 0, 0) that will be moved to wherever the ray-cast hits.

// raycastQuery(from:allowing:alignment:) returns an optional query
guard let query = arView.raycastQuery(from: screenCenter,
                                  allowing: .estimatedPlane,
                                 alignment: .any)
else { return }

// A tracked raycast keeps updating its result as ARKit refines the scene
let raycast = arView.session.trackedRaycast(query) { results in

    if let result = results.first {
        // Move the target object onto the surface that was hit
        object.transform = Transform(matrix: result.worldTransform)
    } 
}
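
The snippet above assumes that screenCenter and object already exist. One way to define them, assuming a RealityKit ARView (both names are only illustrative):

import UIKit
import RealityKit

// Assumption: `arView` is an existing ARView in your view controller.
// Raycast origin: the center of the view in screen coordinates
let screenCenter = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)

// Target object: a small sphere, initially placed at the world origin (0, 0, 0)
let object = ModelEntity(mesh: .generateSphere(radius: 0.05),
                    materials: [SimpleMaterial(color: .red, isMetallic: false)])
let originAnchor = AnchorEntity(world: .zero)
originAnchor.addChild(object)
arView.scene.addAnchor(originAnchor)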


Fifth approach

This approach focuses on saving and sharing ARKit's world maps (ARWorldMap).

func writeWorldMap(_ worldMap: ARWorldMap, to url: URL) throws {

    let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap, 
                                         requiringSecureCoding: true)
    try data.write(to: url)
}

func loadWorldMap(from url: URL) throws -> ARWorldMap {

    let mapData = try Data(contentsOf: url)
    guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, 
                                                                   from: mapData) 
    else { 
        throw ARError(.invalidWorldMap) 
    }
    return worldMap
}
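
These helpers only handle serialization. To use a saved map as a known starting point, capture it with getCurrentWorldMap and feed it back through initialWorldMap when re-running the session; a short sketch follows (the file URL and the sceneView property are assumptions):

// Capture the current world map and persist it with the helper above
func saveCurrentWorldMap(from sceneView: ARSCNView, to url: URL) {
    sceneView.session.getCurrentWorldMap { worldMap, _ in
        guard let worldMap = worldMap else { return }
        try? writeWorldMap(worldMap, to: url)
    }
}

// Relocalize against a previously saved map so that coordinates
// and anchors match the earlier session
func restoreWorldMap(from url: URL, in sceneView: ARSCNView) throws {
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = try loadWorldMap(from: url)
    sceneView.session.run(configuration,
                 options: [.resetTracking, .removeExistingAnchors])
}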


Sixth approach

In ARKit 4.0 the new ARGeoTrackingConfiguration was introduced; it combines the device's GPS data with Apple Maps data, so you can now place anchors at predefined geographic coordinates.

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
   
    for geoAnchor in anchors.compactMap({ $0 as? ARGeoAnchor }) {

        // placemarkEntity(for:) is a custom helper that builds a visual entity for the geo anchor
        arView.scene.addAnchor(Entity.placemarkEntity(for: geoAnchor))
    }
}
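
Creating the geo anchor itself, and checking whether geotracking is available first, could look like the following sketch; arView and the coordinate are only illustrative assumptions:

import ARKit
import RealityKit
import CoreLocation

func startGeoTracking(in arView: ARView) {
    // Geotracking is only supported on certain devices and in certain regions
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }

        DispatchQueue.main.async {
            arView.session.run(ARGeoTrackingConfiguration())

            // Anchor content at a predefined GPS coordinate (illustrative value)
            let coordinate = CLLocationCoordinate2D(latitude: 37.7956,
                                                   longitude: -122.3937)
            arView.session.add(anchor: ARGeoAnchor(coordinate: coordinate))
        }
    }
}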
