NOTE: If you’re looking to get up and running with the least work possible, we recommend using ScanningViewController as detailed in the iOS Quickstart Guide.
Standard Cyborg provides a Cocoa SDK for real-time 3D scanning on iOS devices. The core framework and supporting open-source API code are hosted at https://github.com/StandardCyborg/StandardCyborgCocoa
In this repository, you'll find 3 frameworks: StandardCyborgFusion, StandardCyborgNetworking, and StandardCyborgUI.

StandardCyborgFusion
Performs 3D reconstruction using the TrueDepth camera.
SCReconstructionManager
This class manages 3D reconstruction by fusing raw accelerometer, gyroscope, color camera, and TrueDepth camera data in real time.
When your scanning view controller is presented:
1. Start an AVCaptureSession configured to stream both RGB and depth frames.
2. Start streaming CMDeviceMotion data.
3. Create an SCReconstructionManager instance and set its delegate to your view controller.

Performance tip: although it's possible to stream depth frames from the output of an AR session, it's more efficient to avoid one and use the camera APIs directly, as the high computational resources used by AR are enough to slow down 3D reconstruction. The CameraManager class in StandardCyborgUI makes this easy if your app isn't already streaming depth frames.
Now that your color and depth data is coming in, you may want to visualize these single frames in 3D. To do that, reconstruct a single depth frame into a point cloud by calling reconstructSingleDepthBuffer, then render the resulting SCPointCloud to your liking.
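For example, from within a CameraManagerDelegate callback, you might reconstruct each incoming frame for a live single-frame preview. This is a minimal sketch; the exact argument labels of reconstructSingleDepthBuffer are an assumption here.

import AVFoundation
import StandardCyborgFusion

// Called with each synchronized frame; reconstructionManager is an SCReconstructionManager property
func cameraDidOutput(colorBuffer: CVPixelBuffer, depthBuffer: CVPixelBuffer, depthCalibrationData: AVCameraCalibrationData) {
    // Reconstruct only this one depth frame into a point cloud (argument labels assumed)
    let singleFrameCloud = reconstructionManager.reconstructSingleDepthBuffer(depthBuffer,
                                                                              colorBuffer: colorBuffer,
                                                                              with: depthCalibrationData,
                                                                              smoothingPoints: true)
    // Render the resulting SCPointCloud to your liking, e.g. with the SceneKit or Metal helpers
    _ = singleFrameCloud
}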
To begin reconstruction, start passing in the device motion data and synchronized color + depth frames to the respective accumulate methods.
The SCReconstructionManager will call the following delegate method with every incremental assimilated frame:
reconstructionManager(_ manager: SCReconstructionManager, didProcessWith metadata: SCAssimilatedFrameMetadata, statistics: SCReconstructionManagerStatistics)
When the user finishes scanning, stop passing in this data and call finalize, which does some post-processing cleanup and returns the final, reconstructed point cloud. The point cloud wraps a single, interleaved buffer of point positions, colors, normals, and more.
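Putting those steps together, a scanning loop might look like the sketch below. The accumulate and finalize argument labels, along with the buildPointCloud() accessor, are assumptions based on the descriptions above.

import AVFoundation
import CoreMotion
import StandardCyborgFusion

// While scanning, feed device motion and synchronized color + depth frames into the manager
func motionUpdated(_ deviceMotion: CMDeviceMotion) {
    reconstructionManager.accumulate(deviceMotion: deviceMotion)
}

func cameraDidOutput(colorBuffer: CVPixelBuffer, depthBuffer: CVPixelBuffer, depthCalibrationData: AVCameraCalibrationData) {
    reconstructionManager.accumulate(depthBuffer: depthBuffer,
                                     colorBuffer: colorBuffer,
                                     calibrationData: depthCalibrationData)
}

// When the user taps done, stop feeding frames and finalize
func finishScanning() {
    reconstructionManager.finalize { [weak self] in
        guard let self = self else { return }
        let pointCloud = self.reconstructionManager.buildPointCloud()  // assumed accessor
        // Preview, export, or upload pointCloud from here
        _ = pointCloud
    }
}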
StandardCyborgUI provides a CameraManager class, which authorizes, initiates, and manages an AVCaptureSession, providing its delegate with color and depth frames.
To visualize the reconstructed point cloud, you may use the SceneKit helper methods to build an SCNNode representing the point cloud, or the Metal helper methods to render it yourself. To save the scan, you can use the PointCloudIO helper methods to export it to PLY or USDZ.
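As an illustration, saving to PLY might be as simple as the snippet below; writeToPLY(atPath:) is an assumed spelling of the FileIO helper described later in this document.

// Export the finished scan as a PLY file (method name assumed from SCPointCloud+FileIO.h)
let plyPath = (NSTemporaryDirectory() as NSString).appendingPathComponent("scan.ply")
pointCloud.writeToPLY(atPath: plyPath)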
Please refer to the in-source documentation for the full API reference.
SCReconstructionManager exposes several tunable parameters. You may tweak any of them, but the default values work well for the vast majority of scanning applications.
let reconstructionManager = SCReconstructionManager()
reconstructionManager.maxDepth = 0.5
SCPointCloud
Contains a reconstructed point cloud, both during and at the end of a scanning session. The points are represented as a packed array, wrapped in an NSData. Each point has a position, normal, RGB color, and other attributes, e.g. [x1 y1 z1 nx1 ny1 nz1 r1 g1 b1 x2 y2 z2 nx2 ny2 nz2 r2 g2 b2 …].
This does not represent a mesh, as StandardCyborgFusion does not yet create meshes.
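To make that layout concrete, here is a sketch of walking the packed buffer from Swift. The pointCount, pointsData(), and pointStride() accessors are assumed names for illustration; check the in-source documentation for the real spellings.

import StandardCyborgFusion

let data = pointCloud.pointsData()       // the packed, interleaved point buffer (assumed accessor)
let stride = SCPointCloud.pointStride()  // bytes per point: position + normal + color + ... (assumed)

data.withUnsafeBytes { (buffer: UnsafeRawBufferPointer) in
    for i in 0..<pointCloud.pointCount {
        let base = i * stride
        // The first three floats of each point are its x, y, z position
        let x = buffer.load(fromByteOffset: base + 0, as: Float.self)
        let y = buffer.load(fromByteOffset: base + 4, as: Float.self)
        let z = buffer.load(fromByteOffset: base + 8, as: Float.self)
        _ = (x, y, z)
    }
}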
SCPointCloud+FileIO.h provides category methods for convenient serialization/deserialization to/from disk.
SCPointCloud+Metal.h provides category methods that make it more convenient to render a point cloud in Metal.
SCPointCloud+SceneKit.h provides category methods that make it more convenient to import a point cloud into SceneKit.
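For instance, dropping a finished scan into an existing SceneKit scene could look like the snippet below; buildNode() is a hypothetical name for the SceneKit category's node builder.

import SceneKit
import StandardCyborgFusion

// Build an SCNNode from the point cloud (hypothetical helper from SCPointCloud+SceneKit.h)
let pointCloudNode = pointCloud.buildNode()
sceneView.scene?.rootNode.addChildNode(pointCloudNode)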
StandardCyborgNetworking
A Swift framework that connects to the Standard Cyborg server API for scan analysis and data management. This API is documented at https://www.standardcyborg.com/docs/platform-api
Scan analysis is coming soon in a future version of the API.
StandardCyborgUI
Helpful components for driving the StandardCyborgFusion framework and for visualizing scanning and its results in iOS apps. These classes are not necessary in an app that already has its own video pipeline and rendering, but many apps may find them useful. You can use these classes independently of each other.
ScanningViewController
Shows a live depth preview with a shutter and a cancel button, and manages scanning sessions. Owns a CameraManager. Owns a default ScanningViewRenderer, which may be set to another conforming object to customize rendering.
Here’s a very simple example that presents a scanning view controller, then notifies you when a point cloud is finished scanning or when the user cancels.
import StandardCyborgFusion
import StandardCyborgUI
import UIKit

class MyViewController: UIViewController, ScanningViewControllerDelegate {
    func presentScanningVC() {
        let scanningVC = ScanningViewController()
        scanningVC.delegate = self
        present(scanningVC, animated: true)
    }

    // MARK: - ScanningViewControllerDelegate

    func scanningViewControllerDidCancel(_ controller: ScanningViewController) {
        dismiss(animated: true)
    }

    func scanningViewController(_ controller: ScanningViewController, didScan pointCloud: SCPointCloud) {
        // Do something interesting with pointCloud
    }
}
PointCloudPreviewViewController
Renders an interactive preview of an SCPointCloud. Provides two buttons which may be customized to perform your own actions.

Instantiate this view controller with an SCPointCloud instance and present it. You may customize the buttons by directly mutating its leftButton and rightButton.
import StandardCyborgFusion
import StandardCyborgUI
import UIKit

class MyViewController: UIViewController {
    func showPreview(for pointCloud: SCPointCloud) {
        let previewVC = PointCloudPreviewViewController(pointCloud: pointCloud)
        previewVC.leftButton.setTitle("Rescan", for: .normal)
        previewVC.leftButton.addTarget(self, action: #selector(previewRescanTapped(_:)), for: .touchUpInside)
        previewVC.rightButton.setTitle("Save", for: .normal)
        previewVC.rightButton.addTarget(self, action: #selector(previewSaveTapped(_:)), for: .touchUpInside)
        present(previewVC, animated: false)
    }

    @objc private func previewRescanTapped(_ sender: UIButton) {
        dismiss(animated: true)
    }

    @objc private func previewSaveTapped(_ sender: UIButton) {
        // Do your thing with the scan here!
        dismiss(animated: true)
    }
}
CameraManager
Interfaces with AVCaptureSession APIs to initiate and stream synchronized RGB + depth data. Also requests camera access and manages camera state in response to application state transitions and interruptions. ScanningViewController already owns an instance of this, but a CameraManager may be useful if you're building your own custom scanning view controller.

It is recommended to use an instance of CameraManager from within a view controller, like this:
import AVFoundation
import StandardCyborgUI
import UIKit

class MyScanningViewController: UIViewController, CameraManagerDelegate {
    private let cameraManager = CameraManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        cameraManager.delegate = self
        cameraManager.configureCaptureSession()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard CameraManager.isDepthCameraAvailable else { return }

        cameraManager.startSession { result in
            switch result {
            case .success:
                print("CameraManager is now streaming RGB + depth data")
            case .configurationFailed:
                print("Configuration failed for an unknown reason")
            case .notAuthorized:
                print("Camera access was not granted by the user")
            }
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        cameraManager.stopSession()
    }

    // MARK: - CameraManagerDelegate

    func cameraDidOutput(colorBuffer: CVPixelBuffer, depthBuffer: CVPixelBuffer, depthCalibrationData: AVCameraCalibrationData) {
        // Here, if the user is currently scanning, you can pass the data into
        // StandardCyborgFusion.SCReconstructionManager.accumulate() to continue
        // 3D reconstruction from this data.
        // You can also use ScanningViewRenderer to visualize this color and depth buffer.
    }
}
ScanningViewRenderer
To use your own rendering in a ScanningViewController, implement a class or struct conforming to this protocol, then set ScanningViewController.scanningViewRenderer to an instance of your custom implementation.
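Wiring a custom renderer in is then just an assignment; MyScanningViewRenderer below stands in for your own conforming type.

let scanningVC = ScanningViewController()
// MyScanningViewRenderer is your own class or struct conforming to ScanningViewRenderer
scanningVC.scanningViewRenderer = MyScanningViewRenderer()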