By Alice Cai

AtomXR: the first natural language driven, in-HMD application dev tool

Updated: Sep 24, 2023


Alice, a team lead for the AtomXR project, demos AtomXR for a class at the Harvard School of Engineering and Applied Sciences


Overview

Extended reality (XR) game development suffers from three major inefficiencies and entry barriers:

1) the inaccessibility and steep learning curve of game engines and IDEs for non-technical users,

2) the mismatch between the development environment on 2D screens and the ultimate 3D user experience inside head-mounted displays (HMDs), and

3) the long deployment cycles for testing and iteration onto target HMDs.


To address these inefficiencies, we introduce AtomXR, the first natural language-driven, in-HMD application dev tool. Designed with low floors, AtomXR allows anyone to create applications through intuitive, non-technical interactions, all at runtime.

Adam, a member of the Harvard AR/VR Club and experienced game developer, learns how to use AtomXR.


Users can design 3D environments, spawn objects, and add functionality through natural language input, as well as through physical interactions that modify components. AtomXR provides out-of-the-box functionality for buttons, user interfaces, collectible items, pathfinding, and more. Beyond this, users can add game logic of arbitrary complexity: Boolean operators, loops, and conditionals.
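
To make the interaction model concrete, here is a minimal sketch (in Python) of how a spoken command might map to a structured scene action. The grammar, shape vocabulary, and field names below are illustrative assumptions for this post, not AtomXR's actual pipeline.

import re

def parse_command(utterance: str):
    """Map a simple spoken command to a structured scene action.
    Recognizes a toy vocabulary, e.g. "place a cube here" or
    "spawn a red sphere"; a real system needs a far richer parser."""
    pattern = r"(place|spawn)\s+(?:an|a)?\s*(?:(\w+)\s+)?(cube|sphere|button)\s*(here)?$"
    match = re.match(pattern, utterance.lower().strip())
    if match is None:
        return None  # command not understood
    _verb, color, shape, anchor = match.groups()
    return {
        "action": "spawn",
        "shape": shape,      # e.g. "cube"
        "color": color,      # optional modifier, may be None
        "position": "gaze_cursor" if anchor else "origin",
    }

print(parse_command("Place a cube here"))
# {'action': 'spawn', 'shape': 'cube', 'color': None, 'position': 'gaze_cursor'}

An action dictionary like this could then be handed to the runtime to instantiate the object at the user's gaze cursor.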

We envision a future where anything that can be described in natural language can be implemented with AtomXR.


Goals

Our goals for the project evolved significantly, growing more ambitious with each success. Here's what we wanted to accomplish to feel satisfied with our work:

  1. Users can create experiences entirely through natural language

  2. Users can develop experiences in the headset without needing to go to a desktop

  3. The development cycle must be near-instant: no compiling and no building

Outcomes

  1. We completed all three goals. Using natural language queries like "place this cube here" in the headset, users can create experiences very quickly.

  2. We created an interpreted programming language, AtomScript, that allows for run-time development of experiences while in-game (see the minimal interpreter sketch after this list).

  3. AtomXR has been used by dozens of Harvard engineering students to build XR experiences rapidly and easily.

  4. We are currently working with Harvard SEAS professor Dr. Elena Glassman on a paper for publication.
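
For readers curious how run-time interpretation avoids compile and build steps, below is a minimal tree-walking interpreter sketch in Python. The AST node shapes here are assumptions invented for illustration; AtomScript's actual syntax and runtime are not described in this post.

def evaluate(node, env):
    """Evaluate a tiny AST: literals, variables, assignment,
    arithmetic, comparison, conditionals, loops, and sequences."""
    kind = node[0]
    if kind == "lit":    # ("lit", value)
        return node[1]
    if kind == "var":    # ("var", name)
        return env[node[1]]
    if kind == "set":    # ("set", name, expr)
        env[node[1]] = evaluate(node[2], env)
        return env[node[1]]
    if kind == "+":      # ("+", left, right)
        return evaluate(node[1], env) + evaluate(node[2], env)
    if kind == "<":      # ("<", left, right)
        return evaluate(node[1], env) < evaluate(node[2], env)
    if kind == "if":     # ("if", cond, then_branch, else_branch)
        return evaluate(node[2] if evaluate(node[1], env) else node[3], env)
    if kind == "while":  # ("while", cond, body)
        while evaluate(node[1], env):
            evaluate(node[2], env)
        return None
    if kind == "seq":    # ("seq", stmt, stmt, ...)
        result = None
        for stmt in node[1:]:
            result = evaluate(stmt, env)
        return result
    raise ValueError(f"unknown node kind: {kind}")

# Roughly: score = 0; while (score < 3) { score = score + 1 }
env = {}
program = ("seq",
           ("set", "score", ("lit", 0)),
           ("while", ("<", ("var", "score"), ("lit", 3)),
            ("set", "score", ("+", ("var", "score"), ("lit", 1)))))
evaluate(program, env)
print(env["score"])  # 3

Because logic is evaluated on the fly against live state, an edit made by voice inside the headset can take effect immediately, with no compile or build step in between.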

Timeline

July 2022: AtomXR starts as a passion project during the summer of 2022. Given the project's novelty and potential impact, the team begins working with a Human-Computer Interaction (HCI) professor at Harvard to pursue publication.


November 2022: The team begins sandbox tests for AtomXR to gather general feedback before proceeding to more formal user studies.


December 2022 - April 2023: The team begins work on a paper for submission to a journal, running official user studies, analyzing data, and writing up results.


January 2023 - present: Two graduate students join the team to commercialize AtomXR and are building a feature-rich, production-grade system for deployment.


Aryan, a friend of the lab, tests out an emulator version of AtomXR in the Harvard AR/VR Club studio. This is part of a sandbox test, used to gather rapid feedback before launching more formal user studies.


Future Plans

We are currently building out a feature-rich, production-grade version of AtomXR for commercial deployment, which we plan to launch in the summer of 2023. We are open to partnering with, licensing to, or selling to other companies interested in this technology.


We envision AtomXR as a standard tool for novices and experts alike to develop in-headset. With our connections to the Harvard AR/VR community, we would like to onboard more of our local community onto the platform. Based on the feedback we receive from our community, we can then expand to the AR/VR community at large.
