As an augmented reality developer ([Wall Gallery Designer Lite](https://itunes.apple.com/app/id1289357200)), I’ve been waiting for the long-rumored time-of-flight depth sensing. So, of course, I ordered the 2020 iPad Pro immediately.
It arrived yesterday. I rebuilt my apps with ARKit 3.5, tested them on the new iPad, and uploaded the new versions to the App Store.
No code changes were required to gain the performance and AR object stability benefits of LiDAR. At the same time, those improvements open up opportunities for code changes that could improve the app’s user interface. I’m excited about that potential and look forward to leveraging the new capabilities in ARKit 3.5.
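Going beyond the zero-code benefits does take a small opt-in: ARKit 3.5 adds scene reconstruction, which is only supported on LiDAR-equipped devices like the 2020 iPad Pro. A minimal sketch of a configuration that opts in where available (the `makeConfiguration` helper is my own name for illustration, not code from the app):

```swift
import ARKit

/// Builds a world-tracking configuration that enables the new
/// LiDAR-backed scene reconstruction when the device supports it.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal, .vertical]

    // Scene reconstruction requires a LiDAR scanner; older devices
    // simply fall back to plane detection.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    return configuration
}
```

A session would then be started with `session.run(makeConfiguration())`; the same binary runs on older iPads, just without the mesh.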
While the rebuilt apps do not require LiDAR, they do benefit from ARKit 3.5. My first impression is that object stability and accuracy have improved on my 2017 iPad Pro 12.9 with ARKit 3.5. On the other hand, my first impression is also that selecting, moving, and editing picture frames and walls is somewhat slower when many picture frames are present than in the previous version built with ARKit 3.0, possibly because ARKit 3.5 is doing more work to achieve the increased object accuracy and stability.
The tools for building with ARKit 3.5 arrived only with the release of iOS 13.4, so we AR developers will need some time to understand and leverage all the improvements. But so far, I’m liking it.