Streaming iOS ARKit RGB-D Data
We support two different methods for streaming RGB-D data on iOS:
1. Streaming without running an AR app (no Unity)
2. Streaming while running an AR app (Unity plugin)
Problems?! (shocker)
3.1 Navigate to `/Sensor-Stream-Pipe/build/bin`
3.2 Run the Sensor Stream Client with OpenCV visualization:
`./ssp_client_opencv 9999`
4.1 Open `ssp.xcodeproj` in `Sensor-Stream-Pipe/build-ios`
4.2 Sign the project under Signing & Capabilities

4.3 Build and run ssp_server!
If you run into issues, update `host` in `serve_ios_raw.yaml` to the IP address of the computer running `ssp_client_opencv`.
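The exact schema of `serve_ios_raw.yaml` is not reproduced here; as a rough sketch (every field name besides `host` is an assumption, and the IP address is a placeholder), the relevant part looks something like:

```yaml
# Sketch of serve_ios_raw.yaml -- only `host` is referenced by this guide;
# the other field names and values are assumptions.
host: 192.168.1.10   # IP address of the computer running ssp_client_opencv
port: 9999           # assumed: should match the port passed to ssp_client_opencv
```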
3.1 Clone our sample project:
`git clone https://github.com/moetsi/SSP-Unity-Plugin-AR-App.git`
3.2 Update line 3 in `Assets/StreamingAssets/serve_ios_raw.yaml` to the correct destination IP address
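One way to script this edit is with `sed` (a sketch: it assumes line 3 of the file holds the `host` field, and the file contents below are a stand-in, not the real schema):

```shell
# Sketch: rewrite line 3 of serve_ios_raw.yaml to the client machine's IP.
# A temp copy is used here for illustration; point sed at the real file
# (Assets/StreamingAssets/serve_ios_raw.yaml) in your project.
cfg=$(mktemp)
printf 'type: frames\nstream: raw\nhost: 0.0.0.0\n' > "$cfg"   # stand-in contents
sed -i.bak '3s/host:.*/host: 192.168.1.10/' "$cfg"             # line 3 -> destination IP
cat "$cfg"
```

`sed -i.bak` works on both GNU and BSD (macOS) sed and leaves a `.bak` backup of the original file.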

4.1 Go to Build Settings and hit "Build and Run"
4.2 Once the project has been built and Xcode is open, we will need to add 2 linker flags:
- `-framework VideoToolbox`
- `-lbz2`
Go to "Build Settings" under the "Unity-iPhone" project, scroll down to "Other Linker Flags", and add the 2 linker flags.
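If you prefer the command line, the same flags can in principle be passed to `xcodebuild` as a build-settings override instead of editing the Xcode UI (a sketch: the project and scheme names are assumptions from a default Unity iOS export, and `echo` is used here as a dry run):

```shell
# Dry run: print the xcodebuild invocation that would add the two linker flags.
# "Unity-iPhone" project/scheme names are assumptions from a default Unity export.
FLAGS='-framework VideoToolbox -lbz2'
echo xcodebuild -project Unity-iPhone.xcodeproj -scheme Unity-iPhone \
  OTHER_LDFLAGS="$FLAGS" build
```

Remove the `echo` to actually run the build; note that command-line overrides apply per invocation and do not persist into the project file.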

- Add a script to stream data (it needs an ARSession reference)
- Add the plugin to /Assets/Plugins/iOS
- Add the linker flags once there
4.3 Run ssp_client_opencv on port 9999 on the device you are streaming frame data to (check out Streaming a Video if you need instructions on how to do this)
4.4 Deploy the project onto an RGB-D (LiDAR) iPhone and check out the RGB-D data streaming as the iPhone runs a Unity AR application!