I want to make a mix of virtual reality and augmented reality. The goal is to produce a stereo view (one image for each eye). I tried to put two ARSCNViews in a view controller.
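A minimal sketch of one workable approach, assuming two storyboard outlets: a left ARSCNView that runs the session, and a right plain SCNView that re-renders the same scene from an offset camera. The 63 mm eye offset is an assumed placeholder, and note the right view renders only virtual content, not the camera feed:

```swift
import UIKit
import ARKit
import SceneKit

class StereoViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var leftView: ARSCNView!  // runs the ARSession and renders the left eye
    @IBOutlet var rightView: SCNView!   // re-renders the same scene for the right eye

    // Hypothetical right-eye camera, offset from ARKit's tracked point of view
    let rightEye = SCNNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        leftView.delegate = self
        rightView.scene = leftView.scene            // both views share one scene graph
        rightEye.camera = SCNCamera()
        rightEye.position = SCNVector3(0.063, 0, 0) // assumed ~63 mm interpupillary distance
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        leftView.session.run(ARWorldTrackingConfiguration())
    }

    // Each frame, keep the right-eye camera parented under the tracked camera node
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let pointOfView = leftView.pointOfView else { return }
        if rightEye.parent !== pointOfView { pointOfView.addChildNode(rightEye) }
        rightView.pointOfView = rightEye
    }
}
```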
I want to make an AR Foundation project, but I keep getting this problem. Can someone help me please? No active UnityEngine.XR.ARSubsystems.XRCameraSubsystem is available.
I have a file named matrix.usdz (it exists in the main bundle with target membership checked) and need to load it with Bundle.main.path(forResource:ofType:) inside a do/catch block.
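For reference, a minimal sketch of loading a bundled .usdz into SceneKit. A common cause of nil paths here is passing "matrix.usdz" as the resource name; the extension belongs in the second argument:

```swift
import SceneKit

func loadMatrixScene() -> SCNScene? {
    // The extension goes in the second argument; passing "matrix.usdz"
    // as the resource name would make this lookup return nil.
    guard let url = Bundle.main.url(forResource: "matrix", withExtension: "usdz") else {
        print("matrix.usdz not found in the main bundle")
        return nil
    }
    do {
        // SCNScene can read .usdz files directly on iOS 12+
        return try SCNScene(url: url, options: nil)
    } catch {
        print("Failed to load scene: \(error)")
        return nil
    }
}
```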
I have an iOS app with a deployment target of iOS 10. I need to add some features that depend on RealityKit, which should appear only for users running iOS 13 or later.
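A sketch of the usual gating pattern, assuming the RealityKit symbols are only touched behind availability checks so the app still builds and runs on iOS 10 through 12:

```swift
import UIKit
#if canImport(RealityKit)
import RealityKit
#endif

enum FeatureFlags {
    /// True when the RealityKit-backed feature can be shown to the user.
    static var arFeatureAvailable: Bool {
        if #available(iOS 13.0, *) { return true }
        return false
    }
}

func makeARView() -> UIView? {
    // Only reference RealityKit types inside the availability check
    if #available(iOS 13.0, *) {
        return ARView(frame: .zero)
    }
    return nil // hide or disable the feature entirely on iOS 10-12
}
```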
In my renderer delegate, I create a raycast query from the center of the view to track an estimated plane and display a 3D pointer that follows the raycast result.
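A sketch of that per-frame raycast, assuming an ARSCNView and a pointer node that is already in the scene (PointerController and its properties are illustrative names):

```swift
import ARKit
import SceneKit

final class PointerController: NSObject, ARSCNViewDelegate {
    let sceneView: ARSCNView   // assumed to be the view this delegate serves
    let pointerNode: SCNNode   // the 3D pointer, already added to the scene

    init(sceneView: ARSCNView, pointerNode: SCNNode) {
        self.sceneView = sceneView
        self.pointerNode = pointerNode
        super.init()
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        // Centre of the view in its own coordinate space
        let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
        guard let query = sceneView.raycastQuery(from: center,
                                                 allowing: .estimatedPlane,
                                                 alignment: .any),
              let result = sceneView.session.raycast(query).first else { return }
        // Snap the pointer to the hit pose on the estimated plane
        pointerNode.simdWorldTransform = result.worldTransform
    }
}
```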
Out of the box it's pretty clear ARKit doesn't allow for the tracking of more than 4 images at once. (You can "track" more markers than that, but only 4 will actually function at a time.)
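For context, the relevant knob, sketched under the assumption that detection (a one-shot ARImageAnchor per image) is enough for the extra markers while live tracking is reserved for four of them:

```swift
import ARKit

func runImageDetection(on session: ARSession, images: Set<ARReferenceImage>) {
    let configuration = ARWorldTrackingConfiguration()
    // Every image in this set can be *detected* and given an ARImageAnchor...
    configuration.detectionImages = images
    // ...but only this many are *tracked* live per frame; ARKit caps this at 4
    configuration.maximumNumberOfTrackedImages = 4
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```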
I am brand new to ARKit (and a novice in Swift), but I am trying to create a basic AR app. I am following a tutorial in which a simple scene is created.
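Not the tutorial's exact code, but a minimal self-contained sketch of the kind of scene such tutorials build: a 10 cm box floating half a metre in front of where the session starts:

```swift
import UIKit
import ARKit
import SceneKit

class ViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)

        // A 10 cm box placed 0.5 m in front of the initial camera position
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0, -0.5)
        sceneView.scene.rootNode.addChildNode(box)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```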
I want to know if there is a way for scenes exported from Reality Composer to deal with multiple surfaces in the real world, for example by placing a copy of the scene on each surface ARKit detects.
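One hedged approach: load the Reality Composer scene once as a template entity, then clone it onto every plane anchor ARKit reports (Coordinator, template, and arView are illustrative names, not part of the Reality Composer export):

```swift
import ARKit
import RealityKit

final class Coordinator: NSObject, ARSessionDelegate {
    weak var arView: ARView?
    var template: Entity?   // the Reality Composer scene, loaded once at startup

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard let arView = arView, let template = template else { return }
        for anchor in anchors where anchor is ARPlaneAnchor {
            // One independent copy of the scene per detected surface
            let holder = AnchorEntity(anchor: anchor)
            holder.addChild(template.clone(recursive: true))
            arView.scene.addAnchor(holder)
        }
    }
}
```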
I want to build a demo app in ARKit, and I have some questions about what is currently possible with the beta (Apple has been calling this RealityKit, or ARKit 3).
I'm currently doing some experiments with RealityKit. I've been looking at some sample code, and I'm a bit confused about the differences between ARAnchor and AnchorEntity.
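A short sketch of how the two relate in practice: ARAnchor is the session-level tracking record that ARKit refines over time, while AnchorEntity is the RealityKit scene-graph node you hang content on; AnchorEntity(anchor:) is the bridge between them:

```swift
import ARKit
import RealityKit
import simd

func place(in arView: ARView) {
    // ARAnchor: a pose that ARKit tracks as part of the session's world map
    let arAnchor = ARAnchor(name: "pin", transform: matrix_identity_float4x4)
    arView.session.add(anchor: arAnchor)

    // AnchorEntity: a RealityKit entity; this initializer ties it to the ARAnchor,
    // so RealityKit keeps the content aligned as ARKit refines the anchor's pose
    let anchorEntity = AnchorEntity(anchor: arAnchor)
    anchorEntity.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))
    arView.scene.addAnchor(anchorEntity)
}
```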
I am starting to use ARKit, and I have a use case where I want to know the motion from a known position to another one. So I was wondering if it is possible to measure that relative motion between the two poses.
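A sketch of one way to do this with world tracking, assuming you record the camera transform at the known start pose and compare later frames against it (MotionTracker and its method names are illustrative):

```swift
import ARKit
import simd

final class MotionTracker: NSObject, ARSessionDelegate {
    private var startTransform: simd_float4x4?

    /// Call this when the device is at the known starting position.
    func markStart(with frame: ARFrame) {
        startTransform = frame.camera.transform
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let start = startTransform else { return }
        // Current camera pose expressed in the start pose's coordinate frame
        let delta = simd_mul(start.inverse, frame.camera.transform)
        let translation = SIMD3<Float>(delta.columns.3.x,
                                       delta.columns.3.y,
                                       delta.columns.3.z)
        print("Moved \(simd_length(translation)) m from the start pose")
    }
}
```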