INMO Air Development Guide

1. INMO Air ADB Activation

1. Introduction

Android Debug Bridge (ADB) is a versatile command-line tool that allows you to communicate with a device. ADB commands help with various device operations, such as installing and debugging applications.

Warning: Due to the openness of the Android system, using ADB to install third-party apps or to modify the system’s native settings may lead to device lag, overheating, and bugs. We will not be responsible for these issues. Once ADB is enabled, the INMO Air2 is no longer eligible for the seven-day no-questions-asked return policy.

2. Initial Preparation for the Glasses

On the settings screen shown below, long-press the right touchpad twice, about 1.5 seconds each time, then enter the password 20210108 to enable ADB mode on the glasses.

INMO Air Setting
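Once ADB mode is on, you can talk to the glasses from a computer. Here is a minimal sketch of driving adb from Python; it assumes adb is on your PATH and the glasses are connected over USB, and the APK filename is a hypothetical example:

```python
import subprocess

def adb(*args: str) -> str:
    """Run one adb command and return its output."""
    result = subprocess.run(
        ["adb", *args], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

# The glasses should appear in the device list once ADB mode is enabled.
print(adb("devices"))

# Install (or replace, with -r) a third-party APK.
print(adb("install", "-r", "MyApp.apk"))
```

Plain adb commands in a terminal work just as well; the wrapper only makes scripted installs repeatable.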

2. Unity SDK

SDK Function Introduction

This section is a development guide for the INMO AIR2 SDK on the Unity platform. It covers adapting INMO RING2 input for the INMO Air2, developing touchpad interaction for the INMO Air2 AR glasses, and the AR capability development kit for Unity.

Quick Start

Download the SDK

INMO SDK

Environment Configuration

Software Environment

  • Unity Hub and Unity 2020.3 LTS or higher, with the Android Build Support module installed.
  • Visual Studio 2019.

To install Unity Hub and Unity Editor, please visit the Unity Official Website.

How to install Android Build Support

  1. Open Unity Hub, click Installs on the left, select the version that needs Android Build Support, and click the gear icon.
  2. Click Add Modules, find Android Build Support, and click Install. After installation, the Android label will appear under that version.

It is recommended to install Visual Studio 2019 through the same Add Modules method described above.

Correspondence between Android NDK and Unity versions:

| Unity Version | NDK Version |
| ------------- | ----------- |
| 2020.3 LTS    | r19         |
| 2021.3 LTS    | r21d        |

Hardware Environment

  • INMO AIR2 Glasses
  • INMO RING2 Ring

Import the SDK

The inmo_unity_sdk package structure is as follows:
INMO Unity SDK Structure

  • Samples: Examples provided by the SDK.
  • SDK: AAR library for AR services.
  • URP: Pipeline configuration. If your project already has URP configuration or uses the Built-in pipeline, you can delete the URP folder.
  • Document: Offline documentation.
  • InmoUnitySdk: SDK assembly.

Import Guide:

  1. Create a new project or open an existing project and switch to the Android platform via File -> Build Settings.
    Unity Android Building
  2. Download the SDK sample code and import it into the project, or import the latest unitypackage provided by the SDK:
    • Drag the unitypackage directly into the Project window.
    • Via Assets -> Import Package -> Custom Package: click the Import button at the bottom right of the window and wait for the compilation to complete. (Both steps can also be scripted; see the sketch below.)
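For completeness, here is a minimal sketch of the same two steps driven from code. It assumes Unity’s Python Scripting package (covered in the next post of this collection) is installed, and the unitypackage path is a hypothetical example:

```python
import UnityEditor

# Switch the active build target to Android
# (equivalent to File -> Build Settings -> Switch Platform).
UnityEditor.EditorUserBuildSettings.SwitchActiveBuildTarget(
    UnityEditor.BuildTargetGroup.Android, UnityEditor.BuildTarget.Android
)

# Import the SDK package
# (equivalent to Assets -> Import Package -> Custom Package).
UnityEditor.AssetDatabase.ImportPackage("inmo_unity_sdk.unitypackage", True)
```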

Packaging Settings

Since the AR services are built against android-28, you need to set the Minimum API Level to 28 when packaging. The default Scripting Backend is Mono, but IL2CPP is recommended for better performance and scalability. The SDK uses the ARMv7 architecture, so you can uncheck ARM64 under Target Architectures to reduce package size. It is also recommended to uncheck the “Optimize Frame Rate” option when packaging, as shown below (a scripted version follows the screenshots):

Unity Project Setting 1

Unity Project Setting 2
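If you prefer to apply these settings from code, here is a minimal sketch, again assuming Unity’s Python Scripting package; it mirrors the screenshots above and uses only standard UnityEditor APIs:

```python
import UnityEditor

# Minimum API Level 28, since the AR services are built against android-28.
UnityEditor.PlayerSettings.Android.minSdkVersion = (
    UnityEditor.AndroidSdkVersions.AndroidApiLevel28
)

# IL2CPP scripting backend for better performance and scalability.
UnityEditor.PlayerSettings.SetScriptingBackend(
    UnityEditor.BuildTargetGroup.Android,
    UnityEditor.ScriptingImplementation.IL2CPP,
)

# ARMv7 only; dropping ARM64 reduces package size.
UnityEditor.PlayerSettings.Android.targetArchitectures = (
    UnityEditor.AndroidArchitecture.ARMv7
)
```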

3D Model Resource Requirements

  • In testing, the device can render roughly 100,000 to 200,000 faces simultaneously. Keep models lightweight and reduce the face count as much as possible.
  • Use only one Directional Light as a dynamic light source; use lightmaps when multiple lights are needed.
  • Set the Camera’s Render Shadows to off, and the Directional Light’s Shadow Type to No Shadows.
  • Set the Camera’s Field Of View to 13.4, Near to 0.1, and Far to 15 (see the sketch after this list).
  • Avoid expensive shaders; for example, a highlight outline applied to an entire model doubles the number of faces rendered.
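Here is a minimal sketch that applies the camera and shadow settings above, again assuming the Python Scripting package; the light’s object name is Unity’s default and may differ in your scene:

```python
import UnityEngine

# Recommended camera settings for the INMO Air2 display.
cam = UnityEngine.Camera.main
cam.fieldOfView = 13.4
cam.nearClipPlane = 0.1
cam.farClipPlane = 15.0

# Turn off shadows on the single directional light.
light = UnityEngine.GameObject.Find("Directional Light").GetComponent("Light")
# "None" is a Python keyword, so the enum member is fetched via getattr.
light.shadows = getattr(UnityEngine.LightShadows, "None")
```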

An Ultimate Guide to Using Python in Unity (Part I)

Python is the most popular programming language in the world. Unity is the most popular game engine in the world. To combine the two, a user guide from Unity is all you need. One very noticeable point is that the documentation for version 7 is far more abundant than that of all previous versions combined.

https://docs.unity3d.com/Packages/com.unity.scripting.python@7.0/manual/index.html

But to work on a much harder and more complex project, you need my guide.

1. Python Scripting Manager Version

The latest version of the Unity Python Scripting package is 7.0.1; however, searching for “python” in the Package Manager only surfaced 6.0.1, which has not been updated since 2022-08-04. On 2023-03-21, Unity released version 7.0 and has since updated it to 7.0.1, with more features and fewer bugs.

If you hit the same issue I had, simply go to “Package Manager” -> “+” -> “Add Package from git URL”, then type “com.unity.scripting.python@7.0.1”. Done.
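Alternatively, you can declare the dependency by hand; Unity resolves registry packages listed in Packages/manifest.json, so adding this entry achieves the same thing:

```json
{
  "dependencies": {
    "com.unity.scripting.python": "7.0.1"
  }
}
```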

2. For Anaconda Users

All the Python scripts run in a separate Python environment embedded in the Unity project, which makes for a very neat Python coding setup without worrying about conda or venv. However, if you are a conda user like I am, check whether the terminal in “Project Manager” -> “Python Scripting” activates a conda virtual environment automatically.

If it does, everything you pip install will land in the conda environment instead. To prevent this, make sure to deactivate the virtual environment first. Then all installations will end up in the Unity project’s “Library/PythonInstall/Lib/site-packages”.
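As a quick sanity check, ask the interpreter itself which environment it is using (paste this into the in-editor Python console, or run it as a script with the bundled interpreter):

```python
import site
import sys

# Should point inside the project's Library/PythonInstall directory,
# not a conda environment.
print(sys.executable)

# Should include Library/PythonInstall/Lib/site-packages.
print(site.getsitepackages())
```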

3. Run a Jupyter Notebook

Jupyter Notebook is my favorite brainstorming and demo-testing platform, due to its instant interaction and Lego-like code layout. Although Python scripts are mostly short and easy to debug, Jupyter Notebook makes the process even more efficient.

Install the Jupyter Notebook server:

Project Jupyter | Installing Jupyter

JupyterLab is recommended. If every single click opening a new tab sounds reasonable to you, the classic Jupyter Notebook can also be tested, although people always end up back in JupyterLab.

Once it is installed, simply running “jupyter lab” in the Unity Python Scripting terminal will instantly open a web page rooted at the Unity project’s exact location. All the folders and files on the left side will look familiar.

Alternatively, there is another way of using Jupyter. I personally use VS Code for everything, from machine learning to Unity, even writing with the Vim extension. VS Code has such an abundant extension library that it naturally has one for Jupyter.

Once the extension is installed, click the top right to select a kernel.

Make sure JupyterLab is still running. The extension asks for the server, the port, and a password; the password is the token shown in the terminal that runs the Jupyter backend.

In my case it is

38368ba9ca2fb4fac974672e1c0e37419d10c4baf0c1c69e

which is the string right after “http://localhost:8888/lab?token=”.
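If the token has already scrolled out of view, you can recover it without restarting the server. A minimal sketch, assuming JupyterLab’s jupyter_server backend (the CLI equivalent is “jupyter server list”):

```python
from jupyter_server import serverapp

# Each running server reports its URL and the token needed to connect.
for server in serverapp.list_running_servers():
    print(server["url"], server["token"])
```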

Create a file with the .ipynb extension, for example “test.ipynb”. Now you can enjoy my favorite Python brainstorming and demo-testing platform in Unity.

More is on the way.

A Prototype Document of an LLM Running on an AR Device

Meta’s smart glasses do not have screens, which makes voice commands the only option for users. For me, typing and reading are a far more convenient way to interact than speaking and listening.

Having an AI assistant on a VR or MR device is clearly not a wise choice either: not only because of the neck-killing weight that most people cannot bear for more than an hour, but also because of the light-speed battery drain, which is the only thing that saves users’ necks from breaking.

AR glasses therefore become the best choice among all technological devices; the answer obviously cannot be a smartphone, since staring at one for too long only causes hemiplegia.

More importantly, I got a pair of AR glasses.

And here is the plan that I got:

I will create an app that calls the APIs of existing LLMs, which can be ChatGPT or a home server running Llama, and the best part is that the chat UI will not disappear from your sight unless you take off the glasses.
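A minimal sketch of that call, assuming an OpenAI-compatible endpoint such as a local llama.cpp server; the URL and model name are placeholders:

```python
import requests

def ask(prompt: str) -> str:
    """Send one chat message and return the assistant's reply."""
    response = requests.post(
        "http://localhost:8080/v1/chat/completions",
        json={
            "model": "llama",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

print(ask("Hello from my AR glasses!"))
```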

There will be a gigantic button in the middle of the phone screen that starts a chat without the user moving their eyes away from the AR glasses. Once the button is pressed, the keyboard pops up. Nowadays everyone can type on a smartphone without looking at it, which also keeps the pressure off the user’s neck. No one still needs to look at the phone keyboard to type, right?

Then the AI assistant will put the answer right in front of the user. Elegant. To satisfy the other group of users, who prefer speaking and listening over typing and reading, there will be another button that triggers exactly that. The text of the answer will also be typed on the screen, to make sure people won’t miss certain extremely important information while the AI shouts it out in an accent they are not familiar with.

Furthermore, picture input from the phone camera will also be supported. People use AR glasses precisely because they do not want to be segregated from reality, and the camera is the only way the device can take in the information that reality carries by photons. Which means: there will be another button to activate the camera.

So,

There will be two parts to the app. One is the information display interface, which is the screen on the AR glasses. The other is the input interface, which is the phone. The AR glasses will show the following:

  1. Typing-process animation with text.
  2. The final text typed in.
  3. The result text.
  4. A listening animation.
  5. The camera capture in real time.
  6. The final camera capture.
  7. A cartoon figure that follows you all the time, like Microsoft Office’s Clippy.

The phone will provide:

  1. A button to activate the keyboard.
  2. A button to activate the microphone.
  3. A button to activate the camera.
  4. A button to calibrate the axis.
  5. A button to exit.

From now on, I will be working on this project. Hopefully I can demonstrate it to the public in my lifetime, if my procrastination does not drag my feet. One method to prevent that is to write blog posts more frequently, and that is exactly what I am going to do.

XReal Air: Transforming Reality in Three Unique Ways

The XReal Air stands as my all-time favorite augmented reality (AR) device, because it is the only one I have. Just kidding. The real reason is that, over a year of exploration, I’ve identified three optimal scenarios for deploying this innovative technology, which make it stand out from all other products.

1. Partnered with a Smartphone Featuring a Fully Functioning Type C Port

Any smartphone equipped with a fully functional Type C port can seamlessly connect to the XReal Air. Upon connection, users can engage with two distinct modes: mirroring and AR space.

Mirroring essentially allows you to view your phone’s display through the glasses. While not overly complex, this feature significantly reduces the neck strain caused by constantly looking down at your phone. Buying one for everyone would surely save tons of medical insurance budget.

In contrast, AR space offers an immersive experience by projecting a curved screen around you. This mode isn’t just for watching YouTube videos or web browsing; it also supports downloading VR apps, unlocking smartphone experiences previously unimaginable. My personal favorite, a maze VR app, dazzles with its full 3D environment and engaging gameplay: tilting the platform to drop the ball into the goal.

This setup excels in environments where mobility is key. Whether lounging on a couch (where it’s practically impossible to maintain a single position) or moving around the house, the XReal Air paired with a smartphone affords unparalleled freedom.

2. Connected to a Computer with a Type C DP Port

Although smartphones are increasingly powerful, they still fall short of the computing might and electrical output of PCs. For a more robust AR setup, a computer with a Type C DP port is ideal for connecting to the XReal Air.

Users can opt to extend their display or mirror it. Extending ensures privacy, as only the wearer can see the content displayed. Meanwhile, mirroring is convenient for those who prefer not to switch between screens frequently. Encapsulated in the glasses, a state of deep focus is attainable, making it perfect for tackling urgent tasks.

This configuration is best suited for stationary environments such as offices, libraries, or coffee shops. It offers an additional, private screen that enhances productivity, especially in serene settings like a library.

3. Utilizing an XReal Beam

Connecting directly to a PC might not offer features like smooth follow or an anchored screen. Although the company’s Nebula software aims to bridge this gap, its beta versions for Windows and Mac struggle with consistency.

Enter the XReal Beam: a hardware solution that embodies the functionality of Nebula. It supports all viewing modes and comes with pre-installed apps, promising an elevated computer AR experience. Notably, it can connect to a Nintendo Switch, overcoming the console’s casting limitations through its dual Type C ports.

My preferred use case for the Beam is lounging on the couch, playing Switch games on what feels like the largest screen imaginable, all while enjoying unmatched comfort.

Conclusion

The XReal Air adapts to various setups, each offering distinct benefits tailored to specific lifestyles. If your daily routine aligns with any of these scenarios, the XReal Air is worth considering as your step into augmented reality.
