Create Custom 3D Models For iOS/SwiftUI and visionOS Projects
When you need to display 3D assets in your iOS and visionOS apps, you can use a 3D tool like Spline. Apple recommends using professional 3D apps or its own Reality Composer (or Reality Composer Pro). The object library in Reality Composer Pro offers excellent 3D models for your Apple platform projects. Sooner or later, though, you will need a specific 3D shape the library doesn't provide. If you can't find a particular 3D object in the library, follow the step-by-step guide in this article to create lifelike 3D models for SwiftUI, iOS, and visionOS projects. We will focus on visionOS, using Object Capture for RealityKit and Reality Composer Pro.
Get the Final Project
Download the 3D model (.usdz) you will create in this article and interact with it using the Reality Composer app for iOS. Grab the final visionOS project on GitHub and explore the sample app.
Note: You must install Xcode 15 beta 2 or later to run the visionOS app.
Prerequisites
To follow along and complete this tutorial or test the sample visionOS app, ensure you meet the following requirements.
Object Capture: Overview
Using the Object Capture feature in Reality Composer for iOS, you can generate 3D models from physical objects by scanning them from different angles. Follow the steps below to convert objects in your surroundings into lifelike 3D models for visionOS and other Apple platform projects.
Choosing a Physical Object To Scan
When selecting physical objects to scan with Object Capture:
How RealityKit Object Capture Works
When you scan a physical object from different angles (top, side, and base), Object Capture analyzes the scans to generate a 3D model. During analysis and processing, it builds a point cloud and a mesh, then maps the object's texture and optimizes the model so it is ready for your projects.
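The same photogrammetry pipeline is also exposed on macOS through RealityKit's PhotogrammetrySession API, which turns a folder of photos into a .usdz model. Below is a minimal sketch of that API; the folder and file paths are hypothetical placeholders, and the loop must run in an async context.
```swift
import RealityKit

// Minimal macOS sketch of the photogrammetry pipeline behind Object Capture.
// The input folder holds photos of the object taken from different angles;
// both paths are hypothetical placeholders.
let images = URL(fileURLWithPath: "/path/to/Images", isDirectory: true)
let output = URL(fileURLWithPath: "/path/to/FlowerPot.usdz")

let session = try PhotogrammetrySession(input: images)
try session.process(requests: [.modelFile(url: output, detail: .medium)])

// The session reports progress and completion as an async sequence.
for try await message in session.outputs {
    switch message {
    case .requestProgress(_, let fraction):
        print("Progress: \(Int(fraction * 100))%")
    case .processingComplete:
        print("Model written to \(output.path)")
    default:
        break
    }
}
```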
3D Capture
Grab your iPhone or iPad, generate your first 3D model, and use it for a visionOS app. To start your 3D capture:
5. Drag the handles so that the object sits inside the bounding box that appears. Then, start the capture.
Objects Scanning From Different Angles: Top, Side, Base
At this point, you should move around the object and capture it at different angles.
Once all the scanning passes are complete, tap Finish to start processing the 3D model. When processing is complete, name the model and save it in Reality Composer. Reality Composer stores the model with the .usdz extension. In the following sections, we will load the .usdz file in Xcode to display the 3D model.
Scanning Physical Objects: Best Practices
Avoid the following when you capture an object to generate a 3D model using Reality Composer's Object Capture on iOS.
Export the 3D Model as USDZ
After Object Capture finishes processing the model, you can tap the share button to transfer it to your Mac. The exported model has the .usdz file extension, the recommended format for displaying 3D models in SwiftUI.
Using Reality Composer Pro: Augment Your 3D Model With Effects
Reality Composer Pro allows you to construct 3D compositions for visionOS apps visually. Its particle simulations feature enables you to render and emit sophisticated effects such as rain, smoke, confetti, snow, fireworks, and sparks. In this section, we will create point and cone sparkle effects and add them to our visionOS project.
Let's launch Reality Composer Pro and add a new scene or project, as shown in the image above. Reality Composer Pro comes with the Xcode 15.1 installation or a later version. A project can contain many scenes; think of a scene as a separate Swift file containing a different section of an app. We will add the sparkles using the particle emitter.
To add a sparkle animation:
Let's repeat the steps above to create another sparkle effect (coneSparkle), setting the Emitter Shape to Cone. We will display this effect as the background of the visionOS project.
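As an aside, you could build the same kind of effect entirely in code: visionOS's RealityKit exposes particle systems through ParticleEmitterComponent. The sketch below is a hedged illustration that starts from a built-in preset; the function name and parameter values are assumptions, not the exact settings used in Reality Composer Pro above.
```swift
import RealityKit

// A hedged sketch: a cone-shaped sparkle built in code with
// ParticleEmitterComponent (visionOS 1.0+) instead of authoring it
// in Reality Composer Pro. Values are illustrative.
func makeConeSparkle() -> Entity {
    let sparkle = Entity()
    var emitter = ParticleEmitterComponent.Presets.sparks // start from a preset
    emitter.emitterShape = .cone
    emitter.mainEmitter.birthRate = 150   // particles emitted per second
    emitter.mainEmitter.lifeSpan = 2      // seconds each particle lives
    sparkle.components.set(emitter)
    return sparkle
}
```
The returned entity can be added to a RealityView's content the same way you add a loaded model.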
Back in Reality Composer Pro, click File -> Export and save the scene as a .usdz file.
Create a New visionOS Project
Launch Xcode 15.2 beta or later and create a new visionOS project. Creating a new visionOS app generates a Reality Composer Pro package by default. To open this package, go to the Packages -> RealityKitContent folder of the project, where you will find Package.realitycomposerpro.
Package.realitycomposerpro is the Reality Composer Pro project file for building your 3D composition. Use the RealityKitContent bundle to load and display the Reality Composer Pro scenes if you do not want to export them as .usdz files.
```swift
RealityView { content in
    if let flowerPot = try? await ModelEntity(named: "flowerPot") {
        content.add(flowerPot)
    }

    Task {
        // Asynchronously perform any additional work to configure
        // the content after the system renders the view.
    }
}
```
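To load a scene that lives in the Reality Composer Pro package rather than the app bundle, pass the package bundle explicitly. A minimal sketch, assuming the default RealityKitContent package and its template scene named "Scene":
```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct PackageSceneView: View {
    var body: some View {
        RealityView { content in
            // realityKitContentBundle is exported by the RealityKitContent
            // package; "Scene" is the template's default scene name.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```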
Instead of using Package.realitycomposerpro, we will display our custom 3D model and the sparkles we created in the previous sections as .usdz files in the visionOS project.
Display Your 3D Model With SwiftUI's Model3D View
To display our custom 3D model and the sparkle particle effects, we can use SwiftUI's Model3D view.
```swift
Model3D(named: "Flower-Pot") { model in
    model
        .resizable()
        .aspectRatio(contentMode: .fit)
} placeholder: {
    ProgressView()
}
```
Model3D is a container view that asynchronously loads and displays a 3D model (.usdz) from your Xcode project's bundle or from a URL.
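Because Model3D can also stream a model over the network, the same pattern works with a remote file. A quick sketch, using a hypothetical URL:
```swift
// Hypothetical URL, for illustration only.
Model3D(url: URL(string: "https://example.com/models/FlowerPot.usdz")!) { model in
    model
        .resizable()
        .scaledToFit()
} placeholder: {
    ProgressView()
}
```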
Let's add our 3D model (Flower-Pot.usdz) and the sparkle effects (pointSparkle.usdz and coneSparkle.usdz) to the project bundle in Xcode. Make sure the options shown in the image below are selected.
Add a new Swift file, FlowerPotView.swift, and replace its content with the following sample code.
```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct FlowerPotView: View {
    var body: some View {
        ZStack {
            VStack {
                Model3D(named: "pointSparkle") { model in
                    model
                        .resizable()
                        .aspectRatio(contentMode: .fit)
                        .scaleEffect(0.2)
                        .offset(x: -66, y: 200)
                } placeholder: {
                    ProgressView()
                }

                Model3D(named: "Flower-Pot") { model in
                    model
                        .resizable()
                        .aspectRatio(contentMode: .fit)
                } placeholder: {
                    ProgressView()
                }
            }

            Model3D(named: "coneSparkle") { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
        }
        .padding()
    }
}

#Preview {
    FlowerPotView()
}
```
The code above displays our custom 3D model and the sparkle emissions. Running the app shows the following.
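To present FlowerPotView when the app launches, point the app's main WindowGroup at it. A sketch, assuming a hypothetical app struct name; the volumetric window style and size are illustrative choices for 3D content:
```swift
import SwiftUI

@main
struct FlowerPotApp: App {  // hypothetical name for this sketch
    var body: some Scene {
        WindowGroup {
            FlowerPotView()
        }
        // A volumetric window gives the model real depth in visionOS;
        // the size is illustrative.
        .windowStyle(.volumetric)
        .defaultSize(width: 0.8, height: 0.8, depth: 0.8, in: .meters)
    }
}
```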
What’s Next?
In this article, I showed you how to generate custom 3D models from real-world objects using Reality Composer for iOS. We then dove into Reality Composer Pro, specifically for visionOS, and brought the models we created into a visionOS project.
The upcoming weekly articles will focus mainly on helping developers adopt the Human Interface Guidelines in their workflows. Stay tuned 🤗.