Alexey Parkhomenko

Insights into developing apps for visionOS

While developing for visionOS over recent months, I've encountered a variety of questions. In this article, I'll share answers to some of them, offering insights into novel concepts such as entities, immersive spaces, collision shapes, and more.

1. Should I use a window group, an immersive space, or both?

Consider the technical differences between windows, volumes, and immersive spaces when you decide which scene type to use for a particular feature in your app.

Here are some significant technical differences that you should factor into your decision:

  1. Windows and volumes from other apps the user has open are hidden when an immersive space is open.

  2. Windows and volumes clip content that exceeds their bounds.

  3. Users have full control over the placement of windows and volumes. Apps have full control over the placement of content in an immersive space.

  4. Volumes have a fixed size; windows are resizable.

  5. ARKit only delivers data to your app if it has an open immersive space.
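A common pattern is to offer both: a window for standard UI and an immersive space the user can enter on demand. Here is a minimal sketch of that structure; the scene identifiers and view names are illustrative, not from the original article.

```swift
import SwiftUI

@main
struct MyVisionApp: App {
    var body: some Scene {
        // A resizable window for standard UI.
        WindowGroup(id: "MainWindow") {
            ContentView()
        }

        // An immersive space whose content the app positions freely.
        ImmersiveSpace(id: "ImmersiveSpace") {
            ImmersiveView()
        }
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Immersive Space") {
            Task {
                // Opening the space hides other apps' windows and volumes.
                await openImmersiveSpace(id: "ImmersiveSpace")
            }
        }
    }
}

struct ImmersiveView: View {
    var body: some View { EmptyView() }
}
```

Note that opening the immersive space is asynchronous, which is why the button wraps the call in a `Task`.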

2. What if I want to position my SwiftUI views relative to an entity in a reality view?

Use the RealityView attachments API to create a SwiftUI view and make it accessible as a ViewAttachmentEntity. This entity can be positioned, oriented, and scaled just like any other entity.

RealityView { content, attachments in
    // Fetch the attachment entity using the unique identifier.
    if let attachmentEntity = attachments.entity(for: "uniqueID") {
        // Add the attachment entity as RealityView content.
        content.add(attachmentEntity)
    }
} attachments: {
    // Declare a view that attaches to an entity.
    Attachment(id: "uniqueID") {
        Text("My Attachment")
    }
}
3. How can I light my scene in RealityKit on visionOS?

You can light your scene in RealityKit on visionOS by:

  • Using a system-provided automatic lighting environment that updates based on real-world surroundings.

  • Providing your own image-based lighting via an ImageBasedLightComponent. To see an example, create a new visionOS app, select RealityKit as the Immersive Space Renderer, and select Full as the Immersive Space.
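The second option can be sketched as follows. This assumes an environment resource named "Sunlight" exists in your app bundle; that name is a placeholder, not something defined by the article.

```swift
import RealityKit

// A sketch of custom image-based lighting, assuming a bundled
// environment resource named "Sunlight" (hypothetical).
func applyImageBasedLight(to entity: Entity) async throws {
    // Load the environment resource to use as the light source.
    let environment = try await EnvironmentResource(named: "Sunlight")

    // Attach the light source to the entity...
    entity.components.set(
        ImageBasedLightComponent(source: .single(environment))
    )

    // ...and mark the entity as a receiver of that light.
    entity.components.set(
        ImageBasedLightReceiverComponent(imageBasedLight: entity)
    )
}
```

The receiver component references the entity that carries the light component, so one light entity can illuminate several receivers.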

4. How can I position entities relative to the position of the device?

In an ImmersiveSpace, you can get the full transform of the device using the queryDeviceAnchor(atTimestamp:) method.
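A minimal sketch of this, assuming an ARKitSession with a WorldTrackingProvider is running while the immersive space is open (the function name is illustrative):

```swift
import ARKit
import RealityKit
import QuartzCore

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

// Places an entity one meter in front of the device.
func positionEntityInFrontOfDevice(_ entity: Entity) async throws {
    try await session.run([worldTracking])

    // Query the device anchor at the current time.
    guard let deviceAnchor = worldTracking.queryDeviceAnchor(
        atTimestamp: CACurrentMediaTime()
    ) else { return }

    // originFromAnchorTransform is the device's pose in world space.
    let deviceTransform = Transform(matrix: deviceAnchor.originFromAnchorTransform)

    // Offset along the device's forward direction (-Z).
    entity.transform.translation = deviceTransform.translation
        + deviceTransform.rotation.act([0, 0, -1])
}
```

Remember that, as noted above, ARKit only delivers data while your app has an open immersive space, so the query returns nil outside of one.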

5. How can I interact with an entity using gestures?

There are three important pieces to enabling gesture-based entity interaction:

  1. The entity must have an InputTargetComponent. Otherwise, it won’t receive gesture input at all.

  2. The entity must have a CollisionComponent. The shapes of the collision component define the regions that gestures can actually hit, so make sure the collision shapes are specified appropriately for interaction with your entity.

  3. The gesture that you’re using must be targeted to the entity you’re trying to interact with (or to any entity). For example:

private var tapGesture: some Gesture {
    TapGesture()
        .targetedToAnyEntity()
        .onEnded { gestureValue in
            let tappedEntity = gestureValue.entity
            print(tappedEntity.name)
        }
}
It’s also a good idea to give an interactive entity a HoverEffectComponent, which enables the system to trigger a standard highlight effect when the user looks at the entity.
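Putting the pieces together, here is a sketch of an entity configured with all three components; the sphere and its radius are illustrative choices.

```swift
import RealityKit

// Builds a simple sphere that responds to gestures and gaze.
func makeInteractiveSphere() -> Entity {
    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))

    // 1. Receive gesture input at all.
    sphere.components.set(InputTargetComponent())

    // 2. Define the region gestures can hit with a matching collision shape.
    sphere.components.set(
        CollisionComponent(shapes: [.generateSphere(radius: 0.1)])
    )

    // 3. Show a standard highlight when the user looks at the entity.
    sphere.components.set(HoverEffectComponent())

    return sphere
}
```

Keeping the collision shape in sync with the visible geometry matters: if the shape is smaller than the model, parts of the entity will look interactive but won't respond to taps.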



Feel free to follow me on my social media.

Alexey Parkhomenko © 2023. All rights reserved.

  • LinkedIn
  • Twitter
