
Streaming iOS ARKit RGB-D Data


We support two different methods for streaming RGB-D on iOS:

  1. Streaming without running AR app (no Unity)

  2. Streaming while running AR app (Unity Plugin)

Problems?! (shocker)

Reach out on our Discord and we will get you going!

Streaming without running AR app

1.0 Build Sensor Stream Pipe for iOS (No Unity)

1.1 Follow the instructions: Installation iOS (Unity Plugin)

2.0 Build Sensor Stream Pipe for Mac

2.1 Follow the instructions here: Installation Mac

3.0 Run Sensor Stream Client on Mac

3.1 Navigate to /Sensor-Stream-Pipe/build/bin

3.2 Run Sensor Stream Client with OpenCV visualization

./ssp_client_opencv 9999

4.0 Run and Build Sensor Stream Server

4.1 Open ssp.xcodeproj in Sensor-Stream-Pipe/build-ios

4.2 Set up signing under Signing & Capabilities

4.3 Build and run ssp_server!

If you run into issues, update the host field in serve_ios_raw.yaml to the IP address of the computer running ssp_client_opencv.
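The edit might look like the following sketch of serve_ios_raw.yaml — the key layout is an assumption; only the host and port values follow from the steps above:

```yaml
# serve_ios_raw.yaml (sketch; exact key layout is an assumption)
host: "192.168.1.42"   # IP address of the Mac running ssp_client_opencv
port: 9999             # must match the port passed to ssp_client_opencv
```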

Streaming while running AR app (Unity Plugin)

1.0 Build Sensor Stream Pipe for iOS (No Unity)

1.1 Follow the instructions: Installation iOS (Unity Plugin)

2.0 Build Sensor Stream Pipe for Mac

2.1 Follow the instructions here: Installation Mac

3.0 Setup Unity Project

3.1 git clone our sample project

git clone https://github.com/moetsi/SSP-Unity-Plugin-AR-App.git

3.2 Move the plugins created in step 1.0 to Assets/Plugins/iOS

3.3 Update line 3 in Assets/StreamingAssets/serve_ios_raw.yaml to the correct destination IP address

4.0 Deploy the Unity Project

4.1 Go to Assets/Scenes and open the SampleScene

4.2 In Unity (2022+ works well), go to Build Settings, switch to the iOS platform, and hit "Build and Run" (this will launch Xcode). Set up your signing so you can deploy your application.

4.3 Run ssp_client_opencv on port 9999 on the device you are streaming frame data to (check out Streaming a Video if you need instructions on how to do this)

4.4 Deploy the project onto an RGB-D (LiDAR) iPhone and check out the RGB-D data streaming as the iPhone runs a Unity AR application!
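The destination-IP edit in Assets/StreamingAssets/serve_ios_raw.yaml might look like this sketch (the key names and layout are assumptions; only the destination address and port come from the steps above):

```yaml
# Assets/StreamingAssets/serve_ios_raw.yaml (sketch; key layout assumed)
host: "192.168.1.42"   # the machine running ssp_client_opencv
port: 9999             # the port ssp_client_opencv listens on
```

On a Mac, running ipconfig getifaddr en0 in a terminal prints the address to use for host.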
