Apple has unveiled new tools, technologies, and APIs designed to help developers create even richer experiences for their users. Widgets on the Lock Screen enable developers to surface key information from their apps in a new way, while other new APIs across Apple’s platforms help them build more distinctive features. WeatherKit gives developers the ability to integrate Apple Weather forecast data directly into their apps, and Xcode Cloud — Apple’s continuous integration and delivery service built into Xcode — is available to every Apple Developer Program member to help them create higher-quality apps, faster. Metal 3 enables game developers to create breathtaking graphics with accelerated performance, and developing for Apple’s platforms is now even more intuitive with improvements to Swift, SwiftUI, and Xcode. And with improvements to SKAdNetwork, ad networks and developers can better measure how ads perform while still preserving user privacy.
Live Home 3D is available for iOS, Mac, and Windows and offers a variety of tools that simplify the design process, handling everything from drawing floor plans to complete 3D visualization. The software enables amateur designers to create realistic, professional-looking projects and uses innovative technologies to produce the final plan.
SwiftUI Cookbook for Navigation
All app development starts with a robust navigation framework, and SwiftUI provides a proverbial kitchen of code to enhance the app experience. Its NavigationStack and NavigationSplitView APIs let you link directly to specific areas of an app and control the navigation state programmatically. These APIs scale from basic stacks on iPhone, Apple Watch, and Apple TV to multi-column presentations on iPad and Mac, and they support recipes for deep linking and persisting navigation state.
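As a rough illustration, here is a minimal sketch of those iOS 16 navigation APIs. The Recipe model and view names are hypothetical; the key idea is that NavigationStack drives navigation from a plain value-typed path, which is what makes the state easy to inspect, deep-link, and persist.

```swift
import SwiftUI

// A hypothetical model type used to drive navigation by value.
struct Recipe: Hashable {
    let name: String
}

struct RecipeListView: View {
    // The path is ordinary state, so it can be inspected for deep links
    // and saved to restore the navigation state across launches.
    @State private var path: [Recipe] = []

    private let recipes = [Recipe(name: "Apple Pie"), Recipe(name: "Pancakes")]

    var body: some View {
        NavigationStack(path: $path) {
            List(recipes, id: \.self) { recipe in
                // Tapping appends the value to the path.
                NavigationLink(recipe.name, value: recipe)
            }
            .navigationTitle("Recipes")
            // Resolve each Recipe value to its destination view.
            .navigationDestination(for: Recipe.self) { recipe in
                Text("Cooking \(recipe.name)")
            }
        }
    }
}
```

Swapping NavigationStack for NavigationSplitView is how the same value-driven approach scales up to the multi-column layouts on iPad and Mac.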
Is it intended for iPhone users?
Apple introduced this API for iPhone and iPad devices, but there are hints that it has a larger purpose. It’s almost as if Apple is working on a LiDAR-based device that needs to detect the environment around the user in real time.
With iOS 16, Apple also added 4K HDR video support to ARKit apps for the first time, and it updated the Nearby Interaction API to integrate the U1 chip with ARKit. As the company has reportedly been working on its mixed reality headset, perhaps these new APIs were created with the headset in mind.
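That Nearby Interaction change is small but telling. Here is a minimal sketch, assuming the peer’s discovery token has already been exchanged out of band: enabling camera assistance on the configuration fuses U1 ultra-wideband ranging with ARKit’s camera-based tracking.

```swift
import NearbyInteraction

// Start a ranging session with a nearby peer, fusing UWB with ARKit.
// Assumes peerToken was received from the other device out of band.
func startRangingSession(with peerToken: NIDiscoveryToken) -> NISession? {
    // Not all devices support camera assistance; check before running.
    guard NISession.deviceCapabilities.supportsCameraAssistance else { return nil }

    let session = NISession()
    let configuration = NINearbyPeerConfiguration(peerToken: peerToken)
    // New in iOS 16: combine U1 distance/direction with ARKit tracking.
    configuration.isCameraAssistanceEnabled = true
    session.run(configuration)
    return session
}
```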
For Apple, innovative hardware is the entire point. Whether that hardware is used to access Meta’s metaverse, another company’s platform, or isolated experiences crafted by small developers is almost irrelevant.
Apple’s alternative to Meta’s embodied Internet will likely be a broad offering spanning the iPhone, iPad, Mac, and a future AR/VR headset. The metaverse may simply be an app (or apps) competing with other AR/VR experiences across multiple devices.
“I wouldn’t read too much into Apple not using the word metaverse,” says George Jijiashvili, principal analyst at Omdia. “Rather, focus on what it’s doing to enable experiences which could support virtual social interactions in the future.”
Apple and AR technology
Aside from the new RoomPlan Swift API, a technology that developers can use to quickly create 3D floor plans of real-world spaces, Apple had no all-new AR/VR announcements. RealityOS, the operating system rumored to power Apple’s upcoming headset, wasn’t even teased. This seems like a vote of no confidence in all things metaverse and, perhaps, the entire AR/VR space. However, Jijiashvili warns against such pessimism. “The reality is that Apple has much strength in this space, which it continues to gradually improve,” says Jijiashvili. He points out Apple has acquired over eight AR/VR startups since 2015. ARKit, Apple’s augmented-reality development platform for iOS devices, continues to see interest from developers, including big names like Snapchat and even Instagram, which is owned by Meta.
However, Apple’s lack of news gives the rest of the industry a chance to prepare for its seemingly inevitable push into the space. Though consumer headsets are dominated by Meta, which produces nearly 80 percent of all headsets sold, the industry is rife with midsize companies like HTC, Valve, DPVR, Magic Leap, Pico, Lumus, Vuzix, Pimax, and Varjo—to name just a few. Apple’s arrival in the space could threaten these innovators.
The demonstration comes courtesy of Russ Maschmeyer of Shopify, who published a series of tweets showcasing the idea behind a new AR prototype that leans on Apple’s AR API. He describes it as being able to hit the “reset button” on a room and see what it would look like empty of furniture.
The demonstration shows how the technology scans a room using Apple’s RoomPlan API, a tool for developers that was showcased at WWDC earlier this year. The tool uses the LiDAR sensor on iPhones and iPads to scan the geometry of a room and build a 3D model of the space that developers can then use for a variety of purposes.
A plan for the RoomPlan?
Specifically, RoomPlan is an API that could enable new visually immersive use cases in real estate, design, or architecture apps. For example, interior design apps can integrate RoomPlan to let their users visualize wall colors and help choose the right shade (and amount) of paint.
The end-user experience involves scanning a room with a LiDAR-equipped iPhone. That guided experience then outputs a 3D model of the given room, and that basic framework opens the door to the use cases noted above and several others that will emerge.
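Here is a minimal sketch of that guided experience using RoomPlan’s out-of-the-box UI. RoomCaptureView renders the camera feed, a live 3D preview, and coaching prompts; its delegate receives the final model. The export path and error handling are simplified for illustration.

```swift
import UIKit
import RoomPlan

// A bare-bones view controller hosting RoomPlan's guided scanning UI.
final class RoomScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.delegate = self
        view.addSubview(captureView)

        // RoomPlan requires a LiDAR-equipped device; check before starting.
        guard RoomCaptureSession.isSupported else { return }
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Return true to let RoomPlan post-process the raw scan data.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData,
                     error: Error?) -> Bool {
        true
    }

    // The processed result is a parametric 3D model of the scanned room.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        // Export as USDZ for use in design, real estate, or AR workflows.
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("Room.usdz")
        try? processedResult.export(to: url)
    }
}
```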
As Apple puts it, “The framework inspects a device’s camera feed and LiDAR readings and identifies walls, windows, openings, and doors. It also recognizes room features, furniture, and appliances—a fireplace, bed, or refrigerator—and provides that information to the app.”
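To make that concrete, here is a sketch of reading RoomPlan’s output, including a hypothetical helper for the paint use case mentioned earlier. CapturedRoom separates surfaces (walls, windows, doors, openings) from recognized objects (furniture and appliances), each with dimensions in meters.

```swift
import RoomPlan

// Hypothetical helper: estimate paintable wall area from a CapturedRoom.
// Subtracting the full window and door area is a simplification
// for illustration; real apps would handle overlaps and trim.
func estimatePaintableArea(of room: CapturedRoom) -> Float {
    let wallArea = room.walls
        .reduce(Float(0)) { $0 + $1.dimensions.x * $1.dimensions.y }
    let openingArea = (room.windows + room.doors)
        .reduce(Float(0)) { $0 + $1.dimensions.x * $1.dimensions.y }
    return max(wallArea - openingArea, 0)
}

// The semantic side: each recognized object carries a category and size.
func listFurniture(in room: CapturedRoom) {
    for object in room.objects {
        let size = object.dimensions  // simd_float3, in meters
        print("\(object.category): \(size.x) × \(size.y) × \(size.z) m")
    }
}
```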
Apple also refers to RoomPlan as “the first step in architecture and interior design workflows to help streamline conceptual exploration and planning.” The next step is for developers to integrate the RoomPlan API and take it in other creative directions in real estate and home services.
The ultimate goal is to bring 3D scanning of physical spaces, previously reserved for deep-pocketed tech players, to startups and app developers. It could also boost the functionality of existing tools for home projects.
Dimensionally Appropriate
But one question that emerges from all of the above is why this underlying framework is necessary and what value Apple is adding here. The answer is that AR doesn’t “just work” without a few complex components in place, including dimensional maps of a given space.
Of course, there’s rudimentary AR like early versions of Pokémon Go, but those experiences are more “floating stickers” than true AR. To fulfill AR’s real promise, graphics should hide behind trees or remain at a fixed distance while real-life people walk in front of or behind them, as dimensionally appropriate.
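In ARKit terms, that behavior is occlusion, and on LiDAR-equipped devices it is largely a matter of opting in. A minimal RealityKit sketch, assuming an existing ARView:

```swift
import ARKit
import RealityKit

// Enable dimensionally correct occlusion: virtual content is hidden
// behind real-world geometry and behind people walking through the scene.
func configureOcclusion(for arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Use the LiDAR-derived scene mesh where the hardware supports it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    }

    // Let real people pass in front of virtual objects.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    arView.session.run(configuration)
}
```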
That brings us back to room scanning. The idea is that spaces are scanned for geometric and semantic understanding. The former is all about dimension, while the latter is all about context. It’s about knowing that a window is a window and a wall is a wall, as noted by Apple.