{"id":74658,"date":"2025-09-02T11:37:40","date_gmt":"2025-09-02T06:07:40","guid":{"rendered":"https:\/\/www.tothenew.com\/blog\/?p=74658"},"modified":"2025-09-09T14:49:25","modified_gmt":"2025-09-09T09:19:25","slug":"ios-arkit-with-swiftui","status":"publish","type":"post","link":"https:\/\/www.tothenew.com\/blog\/ios-arkit-with-swiftui\/","title":{"rendered":"iOS: The Power of ARKit: Face Points Capturing"},"content":{"rendered":"<p><strong>Welcome to our \u201cThe Power of ARKit\u201d blog series!<\/strong><br \/>\nIn this series, we\u2019ll explore how to integrate <strong>ARKit<\/strong> into <strong>SwiftUI<\/strong> and unlock new possibilities for building immersive and innovative iOS applications.<\/p>\n<p>This first part will focus on the <strong>initial setup and configuration<\/strong> of ARKit within a SwiftUI project. We\u2019ll also walk through how to <strong>track face points on a physical device<\/strong>, setting the foundation for more advanced AR experiences in upcoming posts.<\/p>\n<h2><strong>Why Use SwiftUI with ARKit?<\/strong><\/h2>\n<p>SwiftUI is Apple\u2019s newer way of building iPhone and iPad apps. It\u2019s popular because it\u2019s simple to use and shows live previews of your app while you design it.<\/p>\n<p>Now, when you <strong>combine SwiftUI with ARKit<\/strong>, you can take things further\u2014adding <strong>augmented reality (AR)<\/strong> to your apps. This means your phone isn\u2019t just for normal apps anymore\u2014it can turn into a tool that blends the digital world with the real one.<\/p>\n<p>With ARKit handling the <strong>AR magic<\/strong> (like tracking space, faces, and objects) and SwiftUI making the user <strong>interface easy to design<\/strong>, developers can create amazing apps where the digital and physical worlds work together.<\/p>\n<h2>What is ARKit?<\/h2>\n<p>ARKit is Apple\u2019s tool that lets developers build <strong>augmented reality (AR) apps<\/strong> for iPhones and iPads. 
It uses the device\u2019s camera and sensors to mix digital objects with the real world in a very natural way.<\/p>\n<p><strong>What ARKit Can Do:<\/strong><\/p>\n<ul>\n<li><strong>World Tracking<\/strong> \u2013 Knows how your phone is moving in 3D space.<\/li>\n<li><strong>Face Tracking<\/strong> \u2013 Detects and follows facial expressions.<\/li>\n<li><strong>Plane Detection<\/strong> \u2013 Finds flat surfaces like tables, floors, or walls.<\/li>\n<li><strong>People Occlusion<\/strong> \u2013 Lets virtual objects appear in front of or behind real people.<\/li>\n<li><strong>LiDAR Support<\/strong> \u2013 Measures depth with high accuracy (on newer iPhones\/iPads).<\/li>\n<li><strong>Motion Capture<\/strong> \u2013 Tracks body movement in real time.<\/li>\n<li><strong>SwiftUI &amp; RealityKit Integration<\/strong> \u2013 Makes it easier for developers to quickly build AR apps with modern Apple tools.<\/li>\n<\/ul>\n<p>ARKit gives developers everything they need to create <strong>realistic, interactive AR experiences<\/strong>\u2014from games and learning apps to shopping tools and design apps\u2014directly on iOS devices.<\/p>\n<p><strong>Real-World Uses of AR &amp; VR<\/strong><br \/>\nAugmented Reality (AR) and Virtual Reality (VR) are not just fancy future ideas anymore. They are being used today in many fields to solve real problems. By mixing digital visuals with real life, AR and VR are changing the way we <strong>learn, shop, work, travel, and even get medical treatment.<\/strong><\/p>\n<p><strong>1. Education &amp; Training<\/strong><\/p>\n<ul>\n<li><strong>Example:<\/strong> Medical students practicing surgery, pilots learning to fly, or workers learning to use machines.<\/li>\n<li><strong>Benefit:<\/strong> People can train safely in a virtual setup, without the risks or costs of real-life mistakes.<\/li>\n<\/ul>\n<p><strong>2. 
Healthcare<\/strong><\/p>\n<ul>\n<li><strong>Example<\/strong>: Doctors seeing a 3D model of a patient before surgery, therapists helping people overcome fears, or remote checkups using AR.<\/li>\n<li><strong>Benefit<\/strong>: Improves accuracy in treatment, makes healthcare easier to access, and helps doctors and students learn better.<\/li>\n<\/ul>\n<p><strong>3. Shopping (Retail &amp; Online Stores)<\/strong><\/p>\n<ul>\n<li><strong>Example<\/strong>: Trying furniture in your living room using AR (like IKEA apps), or checking how clothes and makeup look before buying.<\/li>\n<li><strong>Benefit<\/strong>: Reduces wrong purchases, builds trust, and makes shopping more fun and personal.<\/li>\n<\/ul>\n<p><strong>4. Property &amp; Architecture<\/strong><\/p>\n<ul>\n<li><strong>Example<\/strong>: Taking VR house tours without visiting, or architects showing 3D building designs on-site.<\/li>\n<li><strong>Benefit<\/strong>: Saves time, helps buyers make faster decisions, and avoids confusion during planning.<\/li>\n<\/ul>\n<p><strong>5. Factories &amp; Industry<\/strong><\/p>\n<ul>\n<li><strong>Example<\/strong>: Workers wearing AR glasses to get step-by-step instructions, or experts guiding teams remotely.<\/li>\n<li><strong>Benefit<\/strong>: Improves safety, speed, and accuracy while lowering training costs.<\/li>\n<\/ul>\n<p><strong>6. Fun &amp; Entertainment<\/strong><\/p>\n<ul>\n<li><strong>Example<\/strong>: Playing VR games, using AR filters on Instagram\/Snapchat, or watching concerts in VR.<\/li>\n<li><strong>Benefit<\/strong>: Makes entertainment more engaging and interactive than normal screens.<\/li>\n<\/ul>\n<p><strong>7. Travel &amp; Tourism<\/strong><\/p>\n<ul>\n<li><strong>Example<\/strong>: Virtual tours of famous places, AR guides in museums, or exploring destinations before booking a trip.<\/li>\n<li><strong>Benefit<\/strong>: Lets people experience culture and travel even if they can\u2019t physically go.<\/li>\n<\/ul>\n<p><strong>8. 
Teamwork &amp; Remote Work<\/strong><\/p>\n<ul>\n<li><strong>Example<\/strong>: Virtual meeting rooms with 3D avatars, or digital whiteboards for group ideas.<\/li>\n<li><strong>Benefit<\/strong>: Makes online meetings more natural and interactive, bringing the office feel to remote work.<\/li>\n<\/ul>\n<p>AR and VR are making life easier, safer, and more exciting. They <strong>save costs, improve learning, boost customer experience, and open up new ways to connect with the world<\/strong>. These technologies are shaping the future of how we live and work.<\/p>\n<h2>Screen Design &amp; UI<\/h2>\n<p><strong>Getting Started in Xcode<\/strong><br \/>\nNow it\u2019s time to set up our project. Open <strong>Xcode<\/strong> (Apple\u2019s app for making iOS apps) and create a <strong>new project<\/strong>. This will be the starting point where we connect <strong>ARKit<\/strong> and <strong>SwiftUI<\/strong> together.<\/p>\n<p>Follow the steps shown in the screenshot to guide you through the setup.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/1.2.png\" alt=\"Select project type\" width=\"604\" height=\"431\" \/><\/p>\n<p><strong>Create a New App Project<\/strong><br \/>\nIn the project setup window, select the <strong>App<\/strong> option under the <strong>iOS<\/strong> section. For this tutorial, we\u2019ll name the project <strong>ARKitWithSwiftUI<\/strong>.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/2.1-1.png\" alt=\"Project named\" width=\"607\" height=\"388\" \/><\/p>\n<p><strong>Auto-Generated Code<\/strong><br \/>\nOnce the project is created, <strong>Xcode automatically generates some starter code<\/strong> to help you get up and running quickly. 
Refer to the screenshot below to see the default project structure.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/3-1.png\" alt=\"Auto generated code\" width=\"615\" height=\"402\" \/><\/p>\n<p><strong>Modifying the Default ContentView<\/strong><br \/>\nNext, we\u2019ll update the <strong>pre-populated code<\/strong> <strong>in<\/strong> <span style=\"color: #ff00ff;\"><em>ContentView.swift<\/em> <\/span>to prepare it for ARKit integration. The modified code is shown below:<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/4.png\" alt=\"Modified code\" width=\"620\" height=\"506\" \/><\/p>\n<p><strong>Designing the Landing Page<\/strong><br \/>\nThe first screen of our app is the landing page. We kept it simple:<\/p>\n<ul>\n<li>A title at the top that says <strong>\u201cAR Face Points Tracking\u201d<\/strong><\/li>\n<li>A button right below it labeled<strong> \u201cView in AR\u201d<\/strong>, which the user can tap to move into the AR experience<\/li>\n<\/ul>\n<p>Below, you\u2019ll see both the <strong>screen layout<\/strong> and the <strong>SwiftUI code<\/strong> that creates this design.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/5.png\" alt=\"Landing page design-SwiftUI\" width=\"622\" height=\"447\" \/><\/p>\n<p><strong>Creating the Camera Screen with ARKit<\/strong><br \/>\nNext, we\u2019ll design the <strong>camera screen<\/strong>, where ARKit will be responsible for detecting face points in real time. 
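<\/p>\n<p>Before building the camera screen, here is a rough sketch of the landing page described above. This is only an approximation of the code shown in the screenshot (the view name ContentView matches the project, but the spacing and modifiers are assumptions):<\/p>\n<pre>\r\nimport SwiftUI\r\n\r\nstruct ContentView: View {\r\n    var body: some View {\r\n        VStack(spacing: 20) {\r\n            \/\/ Title shown at the top of the landing page\r\n            Text(\"AR Face Points Tracking\")\r\n                .font(.title)\r\n                .bold()\r\n            \/\/ Opens the AR experience; navigation is wired up in a later step\r\n            Button(\"View in AR\") {\r\n                \/\/ present FacePointsARView here (see \u201cUpdating ContentView\u201d below)\r\n            }\r\n        }\r\n    }\r\n}\r\n<\/pre>\n<p>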
To achieve this, let\u2019s create a new Swift file named <span style=\"color: #ff00ff;\"><em>FacePointsARView.swift<\/em><\/span><\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/6.1.png\" alt=\"Face point screen-SwiftUI\" width=\"630\" height=\"373\" \/><\/p>\n<p><strong>Adding a Dismiss Button<\/strong><br \/>\nAfter creating the <em><span style=\"color: #ff00ff;\">FacePointsARView.swift<\/span><\/em> file, we\u2019ll add a <strong>dismiss button<\/strong> to the screen. This button will be placed at the <strong>top-right corner<\/strong>, allowing users to easily close the AR camera view and return to the landing page.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/7.png\" alt=\"Face point code file\" width=\"631\" height=\"399\" \/><\/p>\n<p><strong>Updating ContentView to Open the Face Points Screen<\/strong><br \/>\nAfter creating the <strong>Face Points detection screen<\/strong> in SwiftUI, we need to connect it to the app\u2019s main file, ContentView.swift.<\/p>\n<p>This step makes sure that when the user taps the <strong>\u201cView in AR\u201d<\/strong> button on the landing page, the app will smoothly navigate to and show the <em><span style=\"color: #ff00ff;\">FacePointsARView<\/span><\/em> (the AR screen).<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/8.png\" alt=\"Updated content in ContentView\" width=\"636\" height=\"568\" \/><\/p>\n<p><strong>Rendering the Camera View<\/strong><br \/>\nNow we\u2019ll set up the code to show the camera feed on the screen. 
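<\/p>\n<p>Before rendering the camera, here is a hedged sketch of the two previous steps: the dismiss button and the navigation from ContentView. The exact code is in the screenshots; the property and icon names below are assumptions:<\/p>\n<pre>\r\nimport SwiftUI\r\n\r\nstruct FacePointsARView: View {\r\n    \/\/ Closes this screen and returns to the landing page (iOS 15+)\r\n    @Environment(\\.dismiss) private var dismiss\r\n\r\n    var body: some View {\r\n        ZStack(alignment: .topTrailing) {\r\n            \/\/ The ARKit camera container is added here in a later step\r\n            Color.black.ignoresSafeArea()\r\n            Button {\r\n                dismiss()\r\n            } label: {\r\n                Image(systemName: \"xmark.circle.fill\")\r\n                    .font(.title)\r\n                    .foregroundColor(.white)\r\n            }\r\n            .padding()\r\n        }\r\n    }\r\n}\r\n\r\n\/\/ In ContentView, the \u201cView in AR\u201d button can present this screen, e.g.:\r\n\/\/ .fullScreenCover(isPresented: $showARView) { FacePointsARView() }\r\n<\/pre>\n<p>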
To keep things organized, we\u2019ll place this ARKit logic in a separate file.<\/p>\n<p>Create a new Swift file called <span style=\"color: #ff00ff;\"><em>FacePointsARViewContainer.swift<\/em><\/span>. This file will handle the ARKit camera rendering and link it with SwiftUI for our project.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/9.png\" alt=\"Face point render screen-SwiftUI\" width=\"635\" height=\"540\" \/><\/p>\n<p><strong>Face Points Tracking with Coordinator<\/strong><br \/>\nIn this step, we\u2019ve implemented the code to <strong>track face points on the camera screen<\/strong>. To handle communication between SwiftUI and ARKit, we created a custom <span style=\"color: #ff00ff;\"><em>Coordinator<\/em><\/span> class. This class is responsible for:<\/p>\n<ul>\n<li>Passing data between the SwiftUI view and ARKit methods.<\/li>\n<li>Rendering and updating the <strong>face tracking points<\/strong> inside ARKit\u2019s delegate callbacks.<\/li>\n<\/ul>\n<p>With that in place, the final step is to load our <em><span style=\"color: #ff00ff;\">FacePointsARViewContainer<\/span><\/em> inside the <em><span style=\"color: #ff00ff;\">FacePointsARView<\/span><\/em> <strong>screen<\/strong>, so the camera with real-time face point tracking is displayed.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/10.png\" alt=\"Updated code in Face point detection screen-SwiftUI\" width=\"640\" height=\"385\" \/><\/p>\n<p><strong>Adding Camera Access Permission<\/strong><br \/>\nSince ARKit relies on the device camera for face tracking, we must request <strong>camera access permission<\/strong> in the app\u2019s configuration. 
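<\/p>\n<p>As a reference point, the container and its Coordinator can be sketched roughly as follows. This is an approximation, not the screenshot\u2019s exact implementation; it assumes an ARSCNView-based setup:<\/p>\n<pre>\r\nimport ARKit\r\nimport SwiftUI\r\n\r\nstruct FacePointsARViewContainer: UIViewRepresentable {\r\n    func makeUIView(context: Context) -&gt; ARSCNView {\r\n        let arView = ARSCNView(frame: .zero)\r\n        arView.delegate = context.coordinator\r\n        \/\/ Face tracking uses the TrueDepth camera, so this only runs on a device\r\n        arView.session.run(ARFaceTrackingConfiguration())\r\n        return arView\r\n    }\r\n\r\n    func updateUIView(_ uiView: ARSCNView, context: Context) {}\r\n\r\n    func makeCoordinator() -&gt; Coordinator { Coordinator() }\r\n\r\n    \/\/ Bridges ARKit delegate callbacks back to SwiftUI\r\n    class Coordinator: NSObject, ARSCNViewDelegate {\r\n        func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -&gt; SCNNode? {\r\n            guard anchor is ARFaceAnchor, let device = renderer.device,\r\n                  let geometry = ARSCNFaceGeometry(device: device) else { return nil }\r\n            \/\/ Draw the face mesh as a wireframe instead of a filled surface\r\n            geometry.firstMaterial?.fillMode = .lines\r\n            return SCNNode(geometry: geometry)\r\n        }\r\n\r\n        func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {\r\n            guard let faceAnchor = anchor as? ARFaceAnchor,\r\n                  let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }\r\n            \/\/ Keep the rendered mesh in sync with the tracked face on every frame\r\n            faceGeometry.update(from: faceAnchor.geometry)\r\n        }\r\n    }\r\n}\r\n<\/pre>\n<p>The Coordinator acts as the view\u2019s delegate: it creates a face-mesh node when a face anchor appears and updates it as the face moves. Note that the session cannot receive camera frames until the camera permission is declared.<\/p>\n<p>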
To do this, open the <strong>Info.plist<\/strong> file and add the following key-value pair:<\/p>\n<pre><span style=\"font-weight: 400;\">\r\n<strong>&lt;<span style=\"color: #ff00ff;\">key<\/span>&gt;NSCameraUsageDescription&lt;\/<span style=\"color: #ff00ff;\">key<\/span>&gt;\r\n&lt;<span style=\"color: #ff00ff;\">string<\/span>&gt;This app requires camera access to track face points using ARKit.&lt;\/<span style=\"color: #ff00ff;\">string<\/span>&gt;\r\n<\/strong><\/span>\r\n<\/pre>\n<p>This ensures the system prompts the user for permission when the app attempts to access the camera. Without it, the AR face tracking feature will not work.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/Screenshot-2025-08-30-at-3.35.47\u202fPM.png\" alt=\"Add camera access permission code\" width=\"651\" height=\"441\" \/><\/p>\n<p><strong>Running the Application<\/strong><br \/>\nWith all the setup complete, it\u2019s time to <strong>run the app on a physical iOS device<\/strong>. 
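<\/p>\n<p>As an optional safeguard (a small assumed check, not part of the screenshots), you can confirm the device supports face tracking before enabling the AR screen:<\/p>\n<pre>\r\nimport ARKit\r\n\r\n\/\/ True only on devices with a TrueDepth camera; the simulator returns false\r\nlet canTrackFaces = ARFaceTrackingConfiguration.isSupported\r\n\r\n\/\/ e.g. disable the button when unsupported:\r\n\/\/ Button(\"View in AR\") { ... }.disabled(!canTrackFaces)\r\n<\/pre>\n<p>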
Since ARKit requires access to the TrueDepth camera, this step cannot be tested on the simulator.<\/p>\n<p>When launched, the app will successfully <strong>detect and render face points<\/strong> in real time using ARKit.<\/p>\n<p>\ud83d\udcf8 Result: Below is the output showing face points being tracked on a physical device:<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/11.png\" alt=\"Final App running screen 1\" width=\"242\" height=\"524\" \/> <img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/12.1.png\" alt=\"Final App running screen 2\" width=\"242\" height=\"524\" \/> <img decoding=\"async\" loading=\"lazy\" class=\"\" style=\"border: 1px solid gray;\" src=\"https:\/\/www.tothenew.com\/blog\/wp-ttn-blog\/uploads\/2025\/08\/13.1.png\" alt=\"Final App running screen 3\" width=\"242\" height=\"524\" \/><\/p>\n<p>To conclude: in this first part of the series, we explored how to integrate <strong>ARKit with SwiftUI<\/strong> by building a sample project that demonstrates <strong>face point tracking<\/strong> on a physical device. This foundation sets the stage for more advanced AR experiences.<\/p>\n<p>In the <strong>next part of the series<\/strong>, we\u2019ll dive deeper into additional ARKit features with step-by-step examples to expand your understanding and capabilities. Stay tuned!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Welcome to our \u201cThe Power of ARKit\u201d blog series! In this series, we\u2019ll explore how to integrate ARKit into SwiftUI and unlock new possibilities for building immersive and innovative iOS applications. This first part will focus on the initial setup and configuration of ARKit within a SwiftUI project. 
We\u2019ll also walk through how to track [&hellip;]<\/p>\n","protected":false},"author":1898,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"iawp_total_views":90},"categories":[1400],"tags":[3489,7944,3488,5460],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/74658"}],"collection":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/users\/1898"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/comments?post=74658"}],"version-history":[{"count":25,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/74658\/revisions"}],"predecessor-version":[{"id":76130,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/74658\/revisions\/76130"}],"wp:attachment":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/media?parent=74658"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/categories?post=74658"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/tags?post=74658"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}