Create Custom 3D Models For iOS/SwiftUI and visionOS Projects

[Image: Custom 3D model]
When you need to display 3D assets in your iOS and visionOS apps, you can use a 3D app like Spline. Apple recommends using professional 3D apps or its own Reality Composer (or Reality Composer Pro). The object library in Reality Composer Pro offers excellent 3D models for your Apple platform projects. However, you will eventually reach a point where the library does not contain the exact 3D shape you need. When that happens, follow the step-by-step guide in this article to create lifelike 3D models for SwiftUI, iOS, and visionOS projects. We will focus on visionOS, using Object Capture for RealityKit and Reality Composer Pro.


Get the Final Project

Download the 3D model (.usdz) you will create in this article and interact with it using the Reality Composer app for iOS. Grab the final visionOS project on GitHub and explore the sample app. 

Note: You must install Xcode 15 beta 2 or later to run the visionOS app. 


Prerequisites

To follow along and complete this tutorial or test the sample visionOS app, ensure you meet the following requirements. 

  • Reality Composer for iOS, to scan your physical object.
  • Reality Composer Pro (visionOS), to add effects and build the composition.
  • Apple Silicon Mac.
  • iOS device (iPhone or iPad).
  • Xcode 15 beta 2 or later.


Object Capture: Overview

Using the object capture feature in Reality Composer for iOS, you can generate 3D models from physical objects by scanning them at different angles. Follow the steps below to convert objects in your surroundings into well-lit 3D models for visionOS and other Apple platform projects. 


Choosing a Physical Object To Scan

When selecting physical objects to scan with Object Capture: 

  • Choose rigid, non-brittle objects that will not deform during scanning.
  • Pick non-bendable objects so their shape stays the same as you move around them.
  • Do not use objects that are very thin in one dimension.
  • Do not scan objects with reflective surfaces or those that have transparency.
  • Avoid using smooth, textureless, and single-color objects. These may not contain the required information to generate the 3D models. 


How RealityKit Object Capture Works

When you scan a physical object at different angles, top, side, and base, Object Capture generates a 3D model by analyzing your provided scans. It also generates a point cloud and a mesh during the analysis and processing. It then maps the object's texture and optimizes the 3D model to be ready for your projects. 
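Under the hood, this pipeline is exposed on macOS through RealityKit's PhotogrammetrySession API, which Reality Composer for iOS drives for you. The sketch below is a minimal, hedged example of using it directly; the "Images" folder and "FlowerPot.usdz" output path are placeholders, not assets from this article.

```swift
import Foundation
import RealityKit

// Minimal sketch (macOS): reconstruct a 3D model from a folder of photos.
// "Images" and "FlowerPot.usdz" are placeholder paths.
func reconstructModel() throws {
    let inputFolder = URL(fileURLWithPath: "Images", isDirectory: true)
    let outputFile = URL(fileURLWithPath: "FlowerPot.usdz")

    let session = try PhotogrammetrySession(input: inputFolder)

    // Observe progress and completion as the session analyzes the photos,
    // builds the point cloud and mesh, and maps the texture.
    Task {
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Processing: \(Int(fraction * 100))%")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url.path)")
            default:
                break
            }
        }
    }

    // Request a reduced-detail model, a good fit for real-time display.
    try session.process(requests: [.modelFile(url: outputFile, detail: .reduced)])
}
```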


3D Capture

[Image: Capture 3D objects]

Grab your iPhone or iPad, generate your first 3D model, and use it for a visionOS app. To start your 3D capture:

  1. Place the object you want to capture on a stand where you can easily move around it. Select objects between 8 cm (3 in) and 2 m (6 ft) in size. 
  2. Open the Reality Composer app on your iPhone or iPad and tap the plus button (+) to add a new project. If it isn't installed, you can download Reality Composer from the App Store. 
  3. Select Object Capture under the 3D CAPTURE section to launch the device camera.
  4. Place the dot you see in the camera feed at the center of the object you want to capture and continue. 

[Image: Start capturing]

5. Make sure the object sits inside the bounding box that appears by dragging the handles. Then, start the capture. 


Objects Scanning From Different Angles: Top, Side, Base

At this point, you should move around the object and capture it at different angles. 

  • Side Capturing: Capture the side view of the object by moving around it. 

[Image: Side capturing]

  • Lower Angle Capturing: Rotate around the object and capture it from the base.

[Image: Base capturing]

  • Higher Angle Capturing: Move around and capture the object from the top.

Once all the scanning phases are complete, tap Finish to start processing the 3D model. When processing completes, name the model and save it in Reality Composer, which stores it as a .usdz file. In the following sections, we will load the .usdz file in Xcode to display the 3D model.


Scanning Physical Objects: Best Practices

Avoid the following when capturing an object with Reality Composer's Object Capture on iOS.

  • Avoid using a background with a texture similar to the object you are scanning. A similar background can confuse the reconstruction algorithm and cause parts of the background to appear in the generated model.
  • Avoid solid shadows and strong highlights.
  • Remove all unnecessary elements near the object you want to capture.
  • Avoid capturing in low light, which makes it difficult to maintain focus during scanning.


Export the 3D Model as USDZ


After Object Capture finishes processing the model, tap the share button to transfer it to your Mac. The exported model has the .usdz file extension, the format recommended for displaying 3D models in SwiftUI.


Using Reality Composer Pro: Augment Your 3D Model With Effects

[Image: Reality Composer Pro]


Reality Composer Pro allows you to construct 3D compositions for visionOS apps visually. Its particle simulations feature enables you to render and emit sophisticated effects such as rain, smoke, confetti, snow, fireworks, and sparks. In this section, we will create point and cone sparkle effects and add them to our visionOS project. 
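If you prefer code over the GUI, a similar effect can also be configured at runtime on visionOS with RealityKit's ParticleEmitterComponent. This is a hedged sketch, not the approach used in this article; the birth-rate value is illustrative.

```swift
import RealityKit

// Sketch: build a sparkle emitter in code instead of authoring it
// in Reality Composer Pro. Values here are illustrative.
func makeSparkleEntity(shape: ParticleEmitterComponent.EmitterShape) -> Entity {
    var particles = ParticleEmitterComponent.Presets.sparks // built-in preset
    particles.emitterShape = shape            // .point or .cone, as in the scenes below
    particles.mainEmitter.birthRate = 150     // particles spawned per second

    let entity = Entity()
    entity.components.set(particles)
    return entity
}

// Usage inside a RealityView:
// content.add(makeSparkleEntity(shape: .cone))
```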

Reality Composer Pro comes with the Xcode 15 installation (beta 2 or later). Let's launch it and add a new scene or project, as shown in the image above. A project can contain many scenes; think of a scene as a separate Swift file containing a different section of an app. We will add the sparkles using the particle emitter.

[Image: Reality Composer Pro: A Point Sparkle Effect]

To add a sparkle animation:

  1. Click the plus (+) button at the left side of the project browser (the bottom part of the UI) and select Particle Emitter. Name it pointSparkle.
  2. Ensure the Emitter tab is selected and change the Emitter Shape to Point.
  3. Click the rectangular grid at the right side of the Particle Emitter section and choose Sparkle.
  4. Click the play button at the top-right to preview the point sparkle effect.
  5. Click File -> Export to save the scene as a .usdz file.

Let’s repeat the steps above to create another sparkle effect (coneSparkle) by setting the Emitter Shape to Cone. We will display this effect as the background of the visionOS project.

[Image: Reality Composer Pro: A Cone Sparkle Effect]

Next, click File -> Export and save it as a .usdz file.


Create a New visionOS Project

[Image: Reality Composer Package]

Launch Xcode 15 beta 2 or later and create a new visionOS project. Creating a new visionOS app generates a Reality Composer Pro package by default. To open this package, go to the Sources -> RealityKitContent folder of the project, where you will find Package.realitycomposerpro.

[Image: Package.realitycomposerpro]

Package.realitycomposerpro is the actual RealityKit file for creating your 3D composition. Use the generated RealityKitContent.swift to load and display the Reality Composer Pro scenes if you do not want to export them as .usdz files.

```swift
RealityView { content in
    // Load the model asynchronously from the app bundle.
    if let flowerPot = try? await ModelEntity(named: "flowerPot") {
        content.add(flowerPot)
    }
    Task {
        // Asynchronously perform any additional work to configure
        // the content after the system renders the view.
    }
}
```

Instead of using Package.realitycomposerpro, we will load our custom 3D model and the sparkles we made in the previous sections into the visionOS project as .usdz files.


Display Your 3D Model With SwiftUI’s Model 3D View

To display our custom 3D model and the sparkle particle effects, we can use SwiftUI's Model3D view. 

```swift
Model3D(named: "Flower-Pot") { model in
    model
        .resizable()
        .aspectRatio(contentMode: .fit)
} placeholder: {
    ProgressView()
}
```

Model3D is a view that asynchronously loads and displays a 3D model (.usdz) from your app's bundle or from a URL.
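For example, a model hosted on a server can be loaded with the URL-based initializer. This is a minimal sketch; the URL below is a placeholder, assuming a reachable .usdz file.

```swift
import SwiftUI
import RealityKit

// Sketch: load a remote .usdz with Model3D's URL initializer.
// The URL is a placeholder, not a real asset.
struct RemoteModelView: View {
    var body: some View {
        Model3D(url: URL(string: "https://example.com/FlowerPot.usdz")!) { model in
            model
                .resizable()
                .aspectRatio(contentMode: .fit)
        } placeholder: {
            ProgressView() // shown while the model downloads
        }
    }
}
```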

Let’s add our 3D model (Flower-Pot.usdz) and the sparkle effects (pointSparkle.usdz, coneSparkle.usdz) to the project bundle in Xcode. Make sure the following options in the image below are selected. 

[Image: Adding files to Xcode's project navigator]

Add a new Swift file, FlowerPotView.swift, and replace its content with the following sample code.

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct FlowerPotView: View {
    
    var body: some View {
        ZStack {
            VStack {
                Model3D(named: "pointSparkle") { model in
                    model
                        .resizable()
                        .aspectRatio(contentMode: .fit)
                        .scaleEffect(0.2)
                        .offset(x: -66, y: 200)
                } placeholder: {
                    ProgressView()
                }
                
                Model3D(named: "Flower-Pot") { model in
                    model
                        .resizable()
                        .aspectRatio(contentMode: .fit)
                } placeholder: {
                    ProgressView()
                }
            }
            
            Model3D(named: "coneSparkle") { model in
                model
                    .resizable()
                    .scaledToFit()
                
            } placeholder: {
                ProgressView()
            }
        }
        .padding()
    }
}

#Preview {
    FlowerPotView()
}
```        

The code above displays our custom 3D model and the sparkle emissions. Running the app shows the following.

[Image: Displaying a custom 3D model and particle effects in a visionOS app]

What’s Next?

In this article, I showed you how to generate custom 3D models from real-world objects using Reality Composer for iOS. We also dove into Reality Composer Pro, specifically for visionOS, and brought the models we created into a visionOS project.

The upcoming weekly articles will focus mainly on assisting developers in adopting the Human Interface Guidelines into their workflow. Stay tuned 🤗. 
