Using AI to turn everyday objects into proactive assistants

This stapler knows when you need it

A stapler on top of a moveable platform that can anticipate when a person will need to use it. Credit: Carnegie Mellon University

A stapler slides across a desk to meet a waiting hand, or a knife edges out of the way just before someone leans against a countertop. It sounds like magic, but in Carnegie Mellon University’s Human-Computer Interaction Institute (HCII), researchers are combining AI and robotic mobility to give everyday objects this kind of foresight.

Using large language models (LLMs) and wheeled robotic platforms, HCII researchers have transformed ordinary items such as mugs, plates and utensils into proactive assistants that observe human behavior, anticipate when they will be needed and move across horizontal surfaces to help at just the right time.

The team’s work on unobtrusive physical AI was presented at the 2025 ACM Symposium on User Interface Software and Technology, held in Busan, Korea.

“Our goal is to create adaptive systems for physical interaction that are unobtrusive, meaning they blend into our lives while still dynamically adapting to our needs,” said Alexandra Ion, an HCII assistant professor who leads the Interactive Structures Lab.

“We classify this work as unobtrusive because the user does not ask the objects to perform any tasks. Instead, the objects sense what the user needs and perform the tasks themselves.”

The Interactive Structures Lab’s unobtrusive system uses computer vision and LLMs to reason about a person’s goals, predicting what they may do or need next.

A ceiling-mounted camera senses the environment and tracks the position of objects. The system then translates what the camera sees into a text-based description of the scene. Next, an LLM uses this translation to infer what the person’s goals may be and which actions would help them most.

Finally, the system sends the predicted action to the robotic platform carrying the item, which moves it into place. This process allows for seamless help with everyday tasks like cooking, organizing, office work and more.
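The pipeline can be pictured as a simple sense, describe, reason, act loop. Below is a minimal Python sketch of that loop under loose assumptions: the names TrackedObject, describe_scene, query_llm and MobilePlatform are illustrative placeholders rather than the team's actual code, and the language-model call is stubbed with a hard-coded response.

```python
# Hypothetical sketch of the sense -> describe -> reason -> act loop described
# above. All names are placeholders, not the researchers' implementation.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    name: str
    x: float  # tabletop coordinates from the overhead camera (metres)
    y: float

def describe_scene(objects: list[TrackedObject], activity: str) -> str:
    """Translate camera tracking output into a text description for the LLM."""
    positions = "; ".join(f"{o.name} at ({o.x:.2f}, {o.y:.2f})" for o in objects)
    return f"The person is {activity}. Objects on the desk: {positions}."

def query_llm(prompt: str) -> dict:
    """Stand-in for a call to a large language model.

    A real system would ask the model to infer the person's goal and return a
    structured action, e.g. {"object": "stapler", "target": [0.40, 0.10]}.
    Here the response is hard-coded for illustration only.
    """
    return {"object": "stapler", "target": (0.40, 0.10)}

class MobilePlatform:
    """Placeholder for the wheeled platform carrying an everyday object."""
    def __init__(self, carried_object: str):
        self.carried_object = carried_object

    def move_to(self, x: float, y: float) -> None:
        print(f"Moving {self.carried_object} to ({x:.2f}, {y:.2f})")

if __name__ == "__main__":
    scene = [TrackedObject("stapler", 0.10, 0.75),
             TrackedObject("mug", 0.55, 0.30)]
    prompt = describe_scene(scene, "aligning a stack of printed pages")
    action = query_llm(prompt)              # LLM predicts the helpful action
    platform = MobilePlatform(action["object"])
    platform.move_to(*action["target"])     # object moves to meet the user
```

In a working system, the stubbed query_llm call would prompt an actual language model with the scene description and parse its structured reply before commanding the platform.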

“We have a lot of assistance from AI in the digital realm, but we want to focus on AI assistance in the physical domain,” said Violet Han, an HCII Ph.D. student working with Ion.

“We chose to enhance everyday objects because users already trust them. By advancing the objects’ capabilities, we hope to increase that trust.”

Ion and her team have started studying ways to expand the scope of unobtrusive physical AI to other parts of homes and offices.

“Imagine, for example, you come home with a bag of groceries. A shelf automatically folds out from the wall and you can set the bag down while you’re taking off your coat,” Ion said during her episode of the School of Computer Science’s “Does Compute?” podcast.

“The idea is that we develop and study technology that seamlessly integrates into our daily lives and is so well assimilated that it becomes almost invisible, yet is consistently bringing us new functionality.”


Alexandra Ion, faculty in the Human-Computer Interaction Institute at Carnegie Mellon University’s School of Computer Science, and Violet Han are part of a team using AI to turn everyday objects into proactive personal assistants. Credit: Carnegie Mellon University

The Interactive Structures Lab aims to create intuitive physical interfaces that bring safe, reliable physical assistance into homes, hospitals, factories and other spaces.

More information:
Violet Yinuo Han et al, Towards Unobtrusive Physical AI: Augmenting Everyday Objects with Intelligence and Robotic Movement for Proactive Assistance, Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology (2025). DOI: 10.1145/3746059.3747726

Provided by Carnegie Mellon University


Citation: A stapler that knows when you need it: Using AI to turn everyday objects into proactive assistants (2025, October 15) retrieved 15 October 2025 from https://techxplore.com/news/2025-10-stapler-ai-everyday-proactive.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


