ARUnderstandingPlus

$20+
ARUnderstandingPlus gives you these capabilities:

  1. Ability to play back anchor data from ARKit in the visionOS Simulator (no visionOS device needed)
  2. Ability to record ARKit anchor data from a visionOS device
  3. A collection of ARKit anchor data recordings I have captured and will continue to add to (I do take requests). Work with these to try out your own hand pose or gesture recognition algorithms or other interactions (see the gesture sketch after this list).
  4. HandCapture, an app that runs on visionOS devices and in the visionOS Simulator. The two connect to each other over a local Bonjour connection and can stream ARKit data from device to Simulator in real time.
  5. Ability to create ARKit-based unit tests from recorded sessions and execute them in your CI/CD pipeline to verify your functionality with a robust, repeatable process (see the test sketch after the playback examples below).
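
Here is a minimal sketch of the kind of gesture check you might prototype against these recordings; the joint names come from ARKit's HandSkeleton, and the 3 cm pinch threshold is an illustrative value, not a calibrated one:

import ARKit
import simd

// Returns true when the thumb tip and index finger tip are close enough to read as a pinch
func isPinching(_ skeleton: HandSkeleton) -> Bool {
    let thumb = skeleton.joint(.thumbTip)
    let index = skeleton.joint(.indexFingerTip)
    guard thumb.isTracked, index.isTracked else { return false }
    // Joint transforms are expressed relative to the hand anchor,
    // so their translation columns can be compared directly
    let thumbPosition = thumb.anchorFromJointTransform.columns.3
    let indexPosition = index.anchorFromJointTransform.columns.3
    return simd_distance(thumbPosition, indexPosition) < 0.03 // ~3 cm
}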

This API drops into your existing code with minimal changes. (Please read below for some important notes about Planes and Meshes.)

To record ARKit anchor data, just call record():

  • Using ARUnderstanding
.task {
    for await update in ARUnderstanding.handUpdates.record(outputName: "handSession\(sessionNumber)") {
        guard let anchor = update.anchor,
              let skeleton = anchor.handSkeleton
        else { continue }
        // Use the skeleton
    }
}
  • On the Data Provider
.task {
    for await update in HandTrackingProvider().record() {
        // Updates pass through unchanged while being recorded
    }
}
  • On the anchor stream
.task {
    for await update in HandTrackingProvider().anchorUpdates.record() {
        // Same pass-through behavior, applied to the raw anchor stream
    }
}
  • Or contribute anchor updates manually with ARUnderstandingPlus.AnchorRecorder
let recorder = AnchorRecorder(outputName: "MyCustomSession")
for await update in HandTrackingProvider().anchorUpdates {
    guard let anchor = update.anchor,
          let skeleton = anchor.handSkeleton
    else { continue }
    
    // Record only the updates where the forearm joint is tracked
    if skeleton.joint(.forearmArm).isTracked {
        recorder.record(anchor: update)
    }
}

To play back a session, just call playback():

  • Using ARUnderstanding
.task {
    for await update in ARUnderstanding(providers: [.hands, .horizontalPlanes])
        #if targetEnvironment(simulator)
        .playback(fileName: "MyCustomSession")
        #endif
        .handUpdates {
        // Consume hand updates exactly as you would live data
    }
}
  • On the Data Provider
.task {
    for await update in HandTrackingProvider()
        #if targetEnvironment(simulator)
        .playback(fileName: "MyCustomSession")
        #endif
        .anchorUpdates {
        guard let skeleton = update.anchor.handSkeleton
        else { continue }
        // Use the skeleton exactly as you would with live data
    }
}
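
Playback is also what makes capability 5 practical: a unit test can replay a recorded session and assert on what your recognizer produced. A minimal sketch, assuming a recorded session named "PinchSession" is available to the test target, that the playback stream ends when the recording does, and reusing the illustrative isPinching() from above:

import XCTest
import ARKit

final class PinchRecognitionTests: XCTestCase {
    func testRecordedPinchIsRecognized() async throws {
        var sawPinch = false
        // Replay the recorded session through the same stream the app consumes
        for await update in ARUnderstanding(providers: [.hands])
            .playback(fileName: "PinchSession")
            .handUpdates {
            guard let anchor = update.anchor,
                  let skeleton = anchor.handSkeleton
            else { continue }
            if isPinching(skeleton) {
                sawPinch = true
                break
            }
        }
        XCTAssertTrue(sawPinch, "The recorded pinch session should trigger the recognizer")
    }
}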


Planes and Meshes

These anchors are saved and provided back by extracting the geometry they came with. If you depend on the particulars of the MeshAnchor geometry, you may need to adjust your code slightly. For example, compare manual normal extraction from a live MeshAnchor with the simpler accessors on a CapturedMeshAnchor:

// Manual Normal extraction
for await update in SceneReconstructionProvider().anchorUpdates {
    let geometry = update.anchor.geometry
    for index in 0 ..< geometry.normals.count {
        let normalPointer = geometry.normals.buffer.contents().advanced(by: geometry.normals.offset + (geometry.normals.stride * Int(index)))
        let normal = normalPointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
        let normalVector = SIMD3<Float>(normal.0, normal.1, normal.2)
        // do something with the normalVector
    }
}
// Normals from the CapturedMeshAnchor
for await update in ARUnderstanding.meshUpdates {
    let geometry = update.anchor.geometry
    for normalVector in geometry.mesh.normals {
        // do something with the normalVector
    }
}

What you get

  • ARUnderstandingPlus source code
    • ARUnderstandingPlus is a Swift Package you can include in your projects directly
  • Recorded sessions
    • 6+ recorded sessions of hand gestures to try out in the visionOS Simulator
  • HandCapture sample app project
    • HandPreviewView. Run this project to preview each session in the Simulator; each session opens a new Volume showing the hand skeleton animating in a loop through the recording.
    • Run HandCapture on a visionOS device to capture a hand performance as a new session for later playback. Share the file from HandCapture to save it and add it to your projects.
    • Run HandCapture on both a visionOS device and the visionOS Simulator, and you can cast the hand skeleton from the device to the Simulator. Each HandCapture app searches for and connects to the other via Bonjour while its Immersive Space is showing (see the discovery sketch after this list).
  • Updates for at least a year from your purchase
    • I am continuing to build on this package and will include more session recordings covering more anchor types
    • I will also be delighted to prioritize enhancements based on your feedback
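
Here is a minimal sketch of the kind of Bonjour discovery HandCapture performs, using Apple's Network framework; the service type "_handcapture._tcp" is hypothetical, as the real app defines its own:

import Network

// Browse the local network for a peer advertising the (hypothetical) service type
let browser = NWBrowser(for: .bonjour(type: "_handcapture._tcp", domain: nil), using: .tcp)
browser.browseResultsChangedHandler = { results, _ in
    guard let peer = results.first else { return }
    // Open a TCP connection to the discovered peer; streaming would start here
    let connection = NWConnection(to: peer.endpoint, using: .tcp)
    connection.start(queue: .main)
}
browser.start(queue: .main)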

ARUnderstandingPlus source code, HandCapture utility project, and updates through May 2025

Supported anchor types: HandAnchor, DeviceAnchor, PlaneAnchor, MeshAnchor, WorldAnchor, ImageAnchor
Price vs. another device: 1%
Size: 14.2 MB