ARFlow: A Framework for Simplifying AR Experimentation Workflow

Abstract

Recent advances in computer vision and XR hardware have ignited the community's interest in AR systems research. As in traditional systems research, evaluating AR systems involves capturing real-world data with AR hardware and iteratively evaluating candidate system designs. However, it is challenging to conduct scalable and reproducible AR experimentation for two key reasons. First, there is no integrated framework support for real-world data capture, which makes it a time-consuming process. Second, AR data often exhibits temporal and spatial variations and arrives in multi-modal formats, which makes controlled evaluations difficult. In this demo paper, we present the design and implementation of ARFlow, a framework that simplifies the evaluation workflow for AR systems researchers.

AR Experimentation Challenges

Recent advances in computer vision and XR hardware have ignited the community's interest in AR systems research.

As in traditional systems research, evaluating AR systems involves capturing real-world data with AR hardware and iteratively evaluating candidate system designs. However, it is challenging to conduct scalable and reproducible AR experimentation for two key reasons.

First, there is no integrated framework support for real-world data capture, which makes it a time-consuming process.

Second, AR data often exhibits temporal and spatial variations and arrives in multi-modal formats, which makes it difficult to conduct controlled evaluations.
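One way to tame temporal variation is to record multi-modal frames with timestamps and replay the same slice of a session against each candidate design. The sketch below illustrates that idea only; the `ARFrame` and `Session` names are hypothetical and are not part of ARFlow's actual API.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ARFrame:
    """One multi-modal AR sample: a timestamp plus per-modality payloads."""
    timestamp_ms: int
    rgb: Optional[bytes] = None          # encoded camera image
    depth: Optional[bytes] = None        # depth map, if the device provides one
    pose: Optional[Tuple[float, ...]] = None  # 6-DoF camera pose

@dataclass
class Session:
    """A recorded capture that can be replayed deterministically."""
    frames: List[ARFrame] = field(default_factory=list)

    def add(self, frame: ARFrame) -> None:
        # Keep frames ordered by time so replay is deterministic even if
        # modalities arrive out of order during capture.
        self.frames.append(frame)
        self.frames.sort(key=lambda f: f.timestamp_ms)

    def replay(self, start_ms: int = 0, end_ms: Optional[int] = None):
        """Yield the frames in a fixed time window, in timestamp order."""
        for f in self.frames:
            if f.timestamp_ms < start_ms:
                continue
            if end_ms is not None and f.timestamp_ms > end_ms:
                break
            yield f
```

Because every evaluated design sees the identical frame sequence, differences in output can be attributed to the design rather than to capture-time variation.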

ARFlow Architecture

Data from different AR devices can be streamed in real time to a researcher-operated server for online and offline experimentation.
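A device-to-server stream like this needs a wire format that carries a frame's metadata alongside its binary payload. The following is a minimal, hypothetical sketch of such framing (length-prefixed JSON header plus raw payload); it is illustrative only and does not reflect ARFlow's actual protocol.

```python
import io
import json
import struct

def pack_frame(device_id: str, timestamp_ms: int,
               modality: str, payload: bytes) -> bytes:
    """Serialize one frame as [4-byte header length][JSON header][payload]."""
    header = json.dumps({
        "device": device_id,
        "ts": timestamp_ms,
        "modality": modality,   # e.g. "rgb", "depth", "pose"
        "size": len(payload),
    }).encode("utf-8")
    return struct.pack(">I", len(header)) + header + payload

def read_frame(stream: io.BufferedIOBase):
    """Read one frame back from a byte stream; returns (header, payload)."""
    raw_len = stream.read(4)
    if len(raw_len) < 4:
        return None  # end of stream
    (header_len,) = struct.unpack(">I", raw_len)
    header = json.loads(stream.read(header_len).decode("utf-8"))
    payload = stream.read(header["size"])
    return header, payload
```

The same serialized frames can be consumed live by an online experiment or appended to a file for later offline replay, which is what makes one capture reusable across both modes.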

Related Links

A lot of excellent work was introduced around the same time as ours.

ExpAR is an AR experimentation platform that aims to provide scalable and controllable AR experimentation. It is envisioned to operate as either a standalone deployment or a federated platform.

LitAR and Xihe both bring advanced lighting estimation frameworks to mobile AR platforms.

If you are interested in privacy and security issues in AR, check out Privacy-preserving Reflection Rendering for Augmented Reality.

BibTeX

@inproceedings{zhao2024arflow,
    author = {Zhao, Yiqin and Guo, Tian},
    title = {Demo: ARFlow: A Framework for Simplifying AR Experimentation Workflow},
    year = {2024},
    isbn = {9798400704970},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    url = {https://doi.org/10.1145/3638550.3643617},
    doi = {10.1145/3638550.3643617},
    abstract = {The recent advancement in computer vision and XR hardware has ignited the community's interest in AR systems research. Similar to traditional systems research, the evaluation of AR systems involves capturing real-world data with AR hardware and iteratively evaluating the targeted system designs [1]. However, it is challenging to conduct scalable and reproducible AR experimentation [2] due to two key reasons. First, there is a lack of integrated framework support in real-world data capturing, which makes it a time-consuming process. Second, AR data often exhibits characteristics, including temporal and spatial variations, and is in a multi-modal format, which makes it difficult to conduct controlled evaluations.},
    booktitle = {Proceedings of the 25th International Workshop on Mobile Computing Systems and Applications},
    pages = {154},
    numpages = {1},
    location = {San Diego, CA, USA},
    series = {HOTMOBILE '24}
    }