What is the real benefit of using Raycast in ARKit and RealityKit?

Simple ray-casting, like hit-testing, helps you locate a 3D point on a real-world surface by projecting an imaginary ray from a 2D screen point onto a detected plane. The Apple documentation (2019) defined ray-casting as follows:

Ray-casting is the preferred method for finding positions on surfaces in the real-world environment, but the hit-testing functions remain present for compatibility. With tracked raycasting, ARKit and RealityKit continue to refine the results to increase the position accuracy of virtual content you place with a raycast.
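
The "tracked raycasting" mentioned in that quote is exposed as ARTrackedRaycast: instead of a one-off query, you hand ARKit the query once and it keeps calling you back with refined results. Here is a minimal sketch of that idea; the placeWithTrackedRaycast function name and the world-origin anchor are my own choices, and it assumes an ARView and a loaded model entity like the ones declared in the examples below:

import ARKit
import RealityKit

// Keep a strong reference (e.g. as a property of your view controller);
// if the ARTrackedRaycast is deallocated, the updates stop.
var trackedRaycast: ARTrackedRaycast?

func placeWithTrackedRaycast(in arView: ARView, model: ModelEntity) {

    guard let query = arView.makeRaycastQuery(from: arView.center,
                                          allowing: .estimatedPlane,
                                         alignment: .horizontal)
    else { return }

    // Put the model into the scene once; the tracked raycast keeps repositioning it.
    let anchor = AnchorEntity(world: SIMD3<Float>(0, 0, 0))
    anchor.addChild(model)
    arView.scene.anchors.append(anchor)

    // ARKit calls this handler repeatedly as it refines the surface estimate.
    trackedRaycast = arView.session.trackedRaycast(query) { results in
        guard let result = results.first else { return }
        model.setTransformMatrix(result.worldTransform, relativeTo: nil)
    }
}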

When the user wants to place virtual content onto a detected surface, it's a good idea to give them a visual hint. Many AR apps draw a focus circle or square that gives the user visual confirmation of the shape and alignment of the surfaces that RealityKit or ARKit is aware of. So, to find out where to put a focus circle or square in the real world, you can use an ARRaycastQuery to ask the framework where any surfaces exist.
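
As an illustration, here is a minimal sketch of a focus entity that follows the surface under the screen center. The FocusViewController class, the flat yellow plane used as the indicator, and the per-frame raycast driven by SceneEvents.Update are all assumptions, not the only way to build a focus square:

import UIKit
import RealityKit
import Combine

class FocusViewController: UIViewController {

    @IBOutlet var arView: ARView!

    // A simple flat disc stand-in for a focus square.
    let focusEntity = ModelEntity(mesh: .generatePlane(width: 0.15, depth: 0.15),
                             materials: [SimpleMaterial(color: .yellow, isMetallic: false)])
    var updateSubscription: Cancellable?

    override func viewDidLoad() {
        super.viewDidLoad()

        let anchor = AnchorEntity(world: SIMD3<Float>(0, 0, 0))
        anchor.addChild(focusEntity)
        arView.scene.anchors.append(anchor)

        // Raycast from the screen center on every rendered frame.
        updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { [weak self] _ in
            guard let self = self,
                  let result = self.arView.raycast(from: self.arView.center,
                                               allowing: .estimatedPlane,
                                              alignment: .horizontal).first
            else { return }

            // Snap the focus entity to the surface ARKit currently sees.
            self.focusEntity.setTransformMatrix(result.worldTransform, relativeTo: nil)
        }
    }
}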

UIKit implementation

Here’s an example showing how to use the session’s raycast(_:) instance method:

import UIKit
import RealityKit
import ARKit

class ViewController: UIViewController {
    
    @IBOutlet var arView: ARView!
    let model = try! Entity.loadModel(named: "usdzModel")
    
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.raycasting()
    }

    fileprivate func raycasting() {
        // Build a query that shoots a ray from the center of the view
        // towards any estimated horizontal plane.
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                              allowing: .estimatedPlane,
                                             alignment: .horizontal)
        else { return }

        // Take the nearest intersection with a real-world surface.
        guard let result = arView.session.raycast(query).first
        else { return }

        // Anchor the model at the hit point.
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}
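
If you don't need to keep the query object around, ARView also has a raycast(from:allowing:alignment:) convenience method that builds the query and runs it against the session in one call, so the raycasting() method above could be shortened to something like this:

fileprivate func raycasting() {
    // ARView builds the ARRaycastQuery and performs the raycast in one call.
    guard let result = arView.raycast(from: arView.center,
                                  allowing: .estimatedPlane,
                                 alignment: .horizontal).first
    else { return }

    let raycastAnchor = AnchorEntity(world: result.worldTransform)
    raycastAnchor.addChild(model)
    arView.scene.anchors.append(raycastAnchor)
}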

If you want to know how to use convex ray-casting in RealityKit, read this post.

If you want to know how to use hit-testing in RealityKit, read this post.

SwiftUI implementation

Here’s sample code showing how to implement the same raycasting logic in SwiftUI:

import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    
    @State private var arView = ARView(frame: .zero)
    var model = try! Entity.loadModel(named: "robot")
    
    var body: some View {
        ARViewContainer(arView: $arView)
            .onTapGesture(count: 1) { self.raycasting() }
            .ignoresSafeArea()
    }
    
    fileprivate func raycasting() {
        // Ray from the center of the view against any estimated horizontal plane.
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                              allowing: .estimatedPlane,
                                             alignment: .horizontal)
        else { return }

        // Nearest intersection with a real-world surface.
        guard let result = arView.session.raycast(query).first
        else { return }

        // Anchor the model at the hit point.
        let raycastAnchor = AnchorEntity(world: result.worldTransform)
        raycastAnchor.addChild(model)
        arView.scene.anchors.append(raycastAnchor)
    }
}

and then…

struct ARViewContainer: UIViewRepresentable {
    
    @Binding var arView: ARView
    
    // Wrap the ARView so SwiftUI can display it.
    func makeUIView(context: Context) -> ARView { return arView }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
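
Note that both samples reuse the same model entity, so a second tap simply reparents it to the new anchor. If you'd rather drop a fresh copy on every tap, clone the entity before adding it, for example:

let raycastAnchor = AnchorEntity(world: result.worldTransform)
raycastAnchor.addChild(model.clone(recursive: true))   // independent copy of the loaded model
arView.scene.anchors.append(raycastAnchor)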

P.S.

If you’re building either of these two apps from scratch (i.e. not from the Xcode AR template), don’t forget to add the Privacy - Camera Usage Description key (NSCameraUsageDescription) in the target's Info tab.
