
Bridging the Digital and Physical: AR in Flutter with Unity

  • Writer: Maksim Murin
  • Aug 14
  • 8 min read

Augmented reality (AR) is no longer just a futuristic concept reserved for sci-fi movies and tech demos. It has evolved into a powerful technology that transforms the way we interact with digital content, bringing virtual objects into the real world in ways that feel natural and immersive. From retail and education to gaming, healthcare, architecture, and industrial applications, AR is reshaping how users experience information and entertainment. 📱🎮 Imagine trying on furniture virtually before buying it, exploring historical landmarks in your living room, or practicing complex medical procedures in a risk-free virtual environment - AR makes all of this possible.


[Image: AR in Flutter with Unity cover]

But here’s the challenge: building AR features that are both visually rich and cross-platform is no small task. 🤔 Flutter gives you speed, beautiful UI, and a single codebase for mobile, web, and desktop apps, making it ideal for crafting seamless interfaces. However, it’s not designed for heavy-duty 3D rendering or complex spatial interactions. Unity, on the other hand, is a leader in real-time 3D, AR, and game development 🚀 - providing advanced physics, shaders, and animation tools - but it’s not built for creating flexible, platform-consistent mobile UI. 🕹️


Why Flutter and Unity Make a Perfect Pair


Flutter, the UI toolkit from Google, has quickly become a favourite for developers who want to build beautiful, high-performance applications for multiple platforms from a single codebase. It excels at building user interfaces, managing app logic, and ensuring a seamless user experience.

However, when it comes to real-time 3D rendering and complex AR interactions, Flutter isn’t equipped out of the box. That’s where Unity comes in.


Unity is a leading platform for real-time 3D development. It’s widely used in gaming, architecture, simulation, and especially augmented reality. It offers advanced capabilities like:

  • Plane detection

  • Object tracking

  • Physics simulation

  • Realistic lighting and rendering


By combining the strengths of both platforms, developers can build applications that are both visually stunning and highly interactive - without sacrificing performance or development speed.


A Symbiotic Relationship


At its core, the synergy between Flutter and Unity relies on a simple yet powerful principle: collaboration through communication. Each platform does what it does best, and they work together to deliver a unified experience.


[Image: Flutter and Unity logos]

Imagine opening a travel app and pointing your phone at a monument. Instantly, a 3D model of the building appears on your screen, floating in perfect alignment with reality.

  • Flutter is the “narrator” — controlling the menus, transitions, and user flow

  • Unity is the “stage” — where the AR magic happens with lighting, textures, and interactions

Together, they bring the story to life.


🛠️ The AR Experience


When we set out to create this demonstration, the aim was simple: show what’s possible when Flutter’s sleek UI meets Unity’s immersive AR capabilities. We didn’t want to overwhelm the viewer with complex mechanics or overly detailed assets — instead, we focused on a clear, relatable use case: placing a virtual object into a real-world scene.


Now, let’s walk through the demonstration and see it in action:


[Image: AR demo]

For the proof of concept, I focused on a clean, functional AR scenario:

  1. Plane Detection — Unity’s AR Foundation continuously scans the environment using the device’s camera, identifying flat surfaces like floors or tables. In the demo, these detected planes are visualised as slightly transparent blue layers, making it easy to see where AR objects can be placed.

  2. QR Code Scanning — Once a surface is found, the AR camera also checks for QR codes using a lightweight detection script.

  3. Object Placement — After scanning the QR code, the app loads a predefined 3D model and places it directly onto the detected surface in the real world.


This simple flow is easy to understand for users, yet demonstrates how Flutter’s front-end flexibility and Unity’s AR capabilities can work in harmony.


Here's how this creative partnership works:

  1. A Shared Stage: When you need AR functionality, your Flutter app launches the Unity project as a dedicated view or window. It's as if Flutter opens a door to Unity’s stage.

  2. Passing the Baton: Flutter and Unity communicate by passing simple messages back and forth. For example, a user's tap on a Flutter button sends a message to Unity: "Place the model here."

  3. The Performance: Unity receives the message and performs its core task: it uses its powerful AR engine to detect surfaces, places the 3D model, and makes it look and feel real.

  4. A Status Update: Once the task is complete, Unity sends a message back to Flutter, like, "The model has been placed successfully." This allows Flutter to update the UI with a confirmation message.


This elegant abstraction ensures that the user sees a single, unified application, while under the hood, two specialized engines work in perfect harmony.


How It Works: AR in Flutter with Unity Under the Hood 🛠️


Bringing Flutter and Unity together may sound complex, but in practice, it’s a structured and achievable process. The idea is to let Flutter manage the app’s UI and navigation, while Unity handles all the 3D and AR features. Here's a practical breakdown of how it works — without getting too technical.


1. Creating the AR Scene in Unity

The AR part of the app is built inside Unity using AR Foundation, which supports both ARKit (iOS) and ARCore (Android). This scene can include:

  • Plane detection to find real-world surfaces

  • QR code scanning or image tracking

  • Placement and animation of 3D objects

  • Realistic lighting and interaction logic


[Image: Unity editor]

In our demo, for instance, the Unity scene detects a horizontal surface, waits for a QR code to be scanned, and then places a virtual object at that spot. All AR logic is handled on Unity’s side.
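To give a feel for the Unity side, here is a minimal, illustrative C# sketch of the tap-to-place part of such a scene using AR Foundation. The script, field, and prefab names are placeholders rather than the exact code from our demo:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative tap-to-place behaviour: raycast against detected planes
// and spawn the configured prefab at the hit point.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject objectPrefab;

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the touch point into the AR world, limited to detected planes
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(objectPrefab, hitPose.position, hitPose.rotation);
        }
    }
}

In a real scene, the QR detection would live in its own script, and the prefab and raycast manager references are wired up in the Unity editor.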

You can preview and test the scene directly in the Unity editor before integrating it into the mobile app.


2. Exporting Unity as a Native Library

Once the AR scene is ready, we export the Unity project as a native library:

  • On Android, Unity is exported as an .aar (Android Archive) library

  • On iOS, it’s built as a .framework


2.1. Android

To correctly build and run the Unity module on Android, you need to make changes to the following files:

  • settings.gradle: This is where you define which modules make up your project. You need to add the Unity library to this file so Gradle knows it exists

include(":app", ":unityLibrary")
project(":unityLibrary").projectDir = File("unityLibrary")
  • build.gradle (App-level): This file describes your application's dependencies. This is where you declare that your app depends on the Unity library, and it should be included in the build
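A typical declaration in the app-level build.gradle looks roughly like this (shown in the Kotlin DSL, matching the settings file above):

dependencies {
    // Pull the exported Unity module into the app build
    implementation(project(":unityLibrary"))
}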


  • AndroidManifest.xml: In this file, which is your app's manifest, you must register the Unity Activity. This allows the Android system to correctly launch and display the AR scene

<activity
    android:name="com.unity3d.player.UnityPlayerActivity"
    android:label="@string/app_name"
    android:launchMode="singleTask" />

With these changes in place, the Flutter app can build and launch Unity as a regular Gradle module inside its native Android project.


2.2. iOS

After exporting the Unity project, you'll get a folder containing the UnityFramework.framework file. This file is a native library for iOS.

Integration in Xcode:

  • Open your Flutter project's ios folder in Xcode

  • Drag and drop the Unity-iPhone.xcodeproj file into Xcode's file navigator

  • In the Xcode project settings, go to the General tab

  • In the Frameworks, Libraries, and Embedded Content section, make sure your UnityFramework.framework is added and has the Embed & Sign option selected


This step turns Unity into a "module" that can be launched from Flutter as a native view or activity, rather than a separate app.


At a high level, Unity is embedded as a native view inside the Flutter app. The architecture is straightforward:

  • Flutter Layer — Handles navigation, menus, configuration screens, and any non-AR UI

  • Native Container — A platform-specific Android or iOS view that hosts Unity’s rendering surface

  • Unity Layer — Runs the AR scene, powered by AR Foundation, with plane detection, object placement, and QR code recognition


[Image: Layer interaction scheme]

3. Launching Unity from Flutter

Once Unity is exported as a native library, the next step is to connect it with Flutter. We’ll use a MethodChannel to trigger Unity and exchange messages between the two platforms.


3.1. Android


On Android, integrating Unity into a Flutter application begins with registering UnityPlayerActivity in AndroidManifest.xml. This step is crucial because it declares the Unity activity to the Android system, allowing it to be launched from Flutter when needed. Once registered, you can start the Unity activity from Flutter via a platform channel, enabling seamless interaction between your Flutter UI and Unity’s 3D or AR content.



Next, set up a communication channel in MainActivity.kt to connect Flutter with the native Android code. Using a MethodChannel, you can send commands from Flutter to trigger Unity actions, such as loading the Unity scene or passing data.
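A minimal sketch of that handler is shown below; the channel and method names match the Flutter example later in this post, and the rest is illustrative:

import android.content.Intent
import com.unity3d.player.UnityPlayerActivity
import io.flutter.embedding.android.FlutterActivity
import io.flutter.embedding.engine.FlutterEngine
import io.flutter.plugin.common.MethodChannel

class MainActivity : FlutterActivity() {
    private val channelName = "com.example.ar-Ios/unity" // must match the Dart side

    override fun configureFlutterEngine(flutterEngine: FlutterEngine) {
        super.configureFlutterEngine(flutterEngine)
        MethodChannel(flutterEngine.dartExecutor.binaryMessenger, channelName)
            .setMethodCallHandler { call, result ->
                when (call.method) {
                    "launchUnity" -> {
                        // Start the Unity activity exported with the unityLibrary module
                        startActivity(Intent(this, UnityPlayerActivity::class.java))
                        result.success(null)
                    }
                    else -> result.notImplemented()
                }
            }
    }
}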



3.2. iOS


On iOS, integrating Unity into a Flutter application starts with creating a communication channel between Flutter and the native code in AppDelegate.swift. This channel (a FlutterMethodChannel) lets Flutter send commands to the iOS side, for example, to load Unity. Once the channel is set up, the corresponding call initializes and presents the Unity view controller, allowing the 3D scene or AR content to interact with the rest of the application.
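Here is a minimal sketch of that setup. UnityLauncher is a hypothetical helper standing in for whatever wrapper you use around UnityFramework to load and present the Unity view controller:

import UIKit
import Flutter

@UIApplicationMain
@objc class AppDelegate: FlutterAppDelegate {
    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        let controller = window?.rootViewController as! FlutterViewController
        let channel = FlutterMethodChannel(name: "com.example.ar-Ios/unity",
                                           binaryMessenger: controller.binaryMessenger)
        channel.setMethodCallHandler { call, result in
            if call.method == "launchUnity" {
                // Load UnityFramework and present the Unity view controller
                UnityLauncher.shared.show(over: controller)
                result(nil)
            } else {
                result(FlutterMethodNotImplemented)
            }
        }
        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }
}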



Finally, request camera access in Info.plist:
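This is the standard NSCameraUsageDescription entry; the description text is up to you:

<key>NSCameraUsageDescription</key>
<string>The camera is used to power the AR experience.</string>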



Under the hood, this starts the Unity scene just like navigating to another screen.

You can even configure it to return back to Flutter after the AR interaction is done — like a temporary AR "overlay" within your app flow.

3.3. Flutter


To call a native method from Flutter, we use a MethodChannel. In this example, the channel 'com.example.ar-Ios/unity' connects Flutter with the native Android or iOS code. By invoking invokeMethod('launchUnity'), we send a command to the native side to launch Unity, effectively bridging Flutter’s UI with Unity’s 3D or AR content.
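A minimal sketch of the Flutter side, wrapped in a small helper class (the class name and error handling are illustrative):

import 'package:flutter/foundation.dart';
import 'package:flutter/services.dart';

class UnityBridge {
  static const MethodChannel _channel = MethodChannel('com.example.ar-Ios/unity');

  /// Asks the native side to present the Unity AR scene.
  static Future<void> launchUnity() async {
    try {
      await _channel.invokeMethod('launchUnity');
    } on PlatformException catch (e) {
      // The native side is missing or failed to start Unity
      debugPrint('Failed to launch Unity: ${e.message}');
    }
  }
}

A button's onPressed can then simply call UnityBridge.launchUnity().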



4. Two-Way Communication Between Flutter and Unity

To make this integration dynamic, we set up a communication layer:

  • From Flutter to Unity: For example, sending a message like "placeObject" or "scanNow" based on user actions

  • From Unity to Flutter: Unity can call back into Flutter using native platform callbacks — for instance, sending the content of a scanned QR code or notifying that an object has been placed


[Image: Communication scheme]

This keeps both layers in sync, so Flutter can update the UI or trigger additional logic based on Unity’s output.
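As a concrete illustration of the Flutter-to-Unity direction on Android, the native side can forward a command into the running scene with Unity's UnitySendMessage call. The GameObject name, method name, and payload below are assumptions, not the demo's actual identifiers:

// Kotlin (Android): forward a command into the Unity scene.
// "ARController" must be a GameObject in the scene, and "PlaceObject"
// a public method on one of its attached scripts.
UnityPlayer.UnitySendMessage("ARController", "PlaceObject", "chair_model")

In the opposite direction, Unity calls back into the host activity (or the iOS view controller), which then pushes the result to Flutter over the same MethodChannel.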


In practice, these challenges are less about technical impossibility and more about careful engineering discipline. Once the pipeline is established - Unity packaged for each platform, lifecycle events properly handled, and message channels well-defined - adding more features becomes incremental rather than reinventing the wheel. This means teams can confidently iterate on AR features without worrying about breaking the integration.


Conclusion


The experiment demonstrated that combining Flutter and Unity for AR is not only feasible but also highly practical for real-world applications ✅. Flutter delivers rapid development and a polished, responsive UI, while Unity provides the AR capabilities needed for truly immersive and interactive experiences. 🌟 Whether it’s product visualization, interactive training, or creative marketing, this hybrid approach unlocks exciting possibilities for engaging AR content on mobile devices 🤝.

Ultimately, bridging the digital and physical worlds isn’t just about technology — it’s about giving users a seamless, intuitive way to interact with their surroundings. With Flutter and Unity together, that bridge is firmly within reach. 🎯 This approach also allows developers to leverage the strengths of both platforms without compromising performance or design. As AR continues to evolve, hybrid solutions like this will become increasingly vital for creating next-generation mobile experiences.


Let’s Build the Future Together


At Igniscor, we don’t just build apps — we craft complete mobile experiences. From embedded systems to sleek, high-performance mobile interfaces, we turn ideas into reality with creativity and care. Have a project in mind? Let’s make it happen — contact us today! 🚀
