DARPA Planning to Develop Virtual Assistants
DARPA is moving to integrate new and emerging virtual systems and technologies into the toolkit of the modern warfighter as soon as possible. Among these is an assistant program that would help personnel navigate the virtual environment effectively. Dr. Bruce Draper, a program manager in DARPA's Information Innovation Office, explained:
“In the not too distant future, you can envision military personnel having a number of sensors on them at any given time – a microphone, a head-mounted camera – and displays like augmented reality (AR) headsets. These sensor platforms generate tons of data around what the user is seeing and hearing, while AR headsets provide feedback mechanisms to display and share information or instructions. What we need in the middle is an assistant that can recognize what you are doing as you start a task, has the prerequisite know-how to accomplish that task, can provide step-by-step guidance, and can alert you to any mistakes you’re making.”
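Draper's description amounts to a closed perception-and-guidance loop: sensor data comes in, the assistant infers which step of a known task the user is on, and it responds with step-by-step instructions or mistake alerts. The following Python sketch is purely illustrative; the task model, object labels, and function names are invented for this article and are not drawn from DARPA's program:

```python
from dataclasses import dataclass

@dataclass
class TaskStep:
    name: str
    required_objects: set[str]  # objects that signal this step is underway
    instruction: str

# Hypothetical task model: the steps the assistant knows how to guide.
TOURNIQUET_TASK = [
    TaskStep("expose_limb", {"limb"}, "Expose the injured limb."),
    TaskStep("place_band", {"limb", "tourniquet"}, "Place the band above the wound."),
    TaskStep("tighten", {"tourniquet", "windlass"}, "Twist the windlass until bleeding stops."),
]

def guidance_loop(task: list[TaskStep], observations: list[set[str]]) -> None:
    """Match each frame's detected objects against the current task step,
    issuing guidance and alerting when a step appears to be skipped."""
    step_idx = 0
    for frame, detected in enumerate(observations):
        if step_idx >= len(task):
            print(f"frame {frame}: task complete")
            break
        step = task[step_idx]
        if step.required_objects <= detected:
            print(f"frame {frame}: step '{step.name}' underway -> {step.instruction}")
            step_idx += 1
        elif step_idx + 1 < len(task) and task[step_idx + 1].required_objects <= detected:
            # The user appears to have jumped ahead: raise a mistake alert.
            print(f"frame {frame}: ALERT, step '{step.name}' may have been skipped")
            step_idx += 2
        else:
            print(f"frame {frame}: waiting; current instruction: {step.instruction}")

# Simulated per-frame object detections standing in for camera/microphone input.
frames = [{"limb"}, {"limb", "tourniquet"}, {"tourniquet", "windlass"}]
guidance_loop(TOURNIQUET_TASK, frames)
```

A real PTG assistant would replace the simulated detections with live perception models and a far richer task representation, but the control flow, recognizing the step, guiding, and flagging errors, is the part Draper's quote describes.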
For this purpose, DARPA is running its Perceptually-enabled Task Guidance (PTG) program, which is tasked with expediting the development of virtual assistants capable of helping users perform complex and demanding physical tasks. The new assistants are expected to help bridge the divide between man and machine, allowing for a true symbiotic “partnership”.
The agency has identified four critical problem areas for the program:
- Knowledge transfer: “Virtual task assistants will need to be able to automatically acquire task knowledge from instructions intended for humans, including checklists, illustrated manuals, and training videos.”
- Perceptual grounding: “Assistants need to be able to align their perceptual inputs – including objects, settings, actions, sounds, and words – with the terms the assistant uses to describe and model tasks for its human users, so that observations can be mapped to its task knowledge.”
- Perceptual attention: “Assistants must be able to pay attention to perceptual inputs that are relevant to current tasks, while ignoring extraneous stimuli. They also need to be able to respond to unexpected, but relevant events that may alter a user’s goals or suggest a new task.”
- User modeling: “PTG assistants must be able to determine how much information to share with a user and when to do so. This requires developing and integrating an epistemic model of what the user knows, a physical model of what the user is doing, and a model of their attentional and emotional states.”
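Two of these problem areas, perceptual grounding and perceptual attention, can be made concrete with a toy example. In the sketch below, grounding is a translation from raw detector labels into the task model's vocabulary, and attention is a filter that keeps only the terms the current task refers to. All labels and mappings here are invented for illustration; none come from PTG:

```python
# Hypothetical label-to-vocabulary map (perceptual grounding): raw detector
# outputs on the left, terms from the assistant's task model on the right.
GROUNDING = {
    "person_hand": "hand",
    "elastic_band": "tourniquet",
    "rod": "windlass",
    "coffee_mug": "mug",
}

# Terms the current task actually refers to; perceptual attention keeps
# these and ignores everything else (also invented for this example).
RELEVANT_TERMS = {"hand", "tourniquet", "windlass"}

def ground_and_filter(raw_detections: list[str]) -> set[str]:
    """Translate raw perception labels into task vocabulary, then keep
    only the terms relevant to the current task."""
    grounded = {GROUNDING[label] for label in raw_detections if label in GROUNDING}
    return grounded & RELEVANT_TERMS

# A frame containing both task-relevant and extraneous objects:
print(ground_and_filter(["elastic_band", "rod", "coffee_mug"]))
# -> {'tourniquet', 'windlass'}  (the mug is extraneous and dropped)
```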
As part of its outreach to researchers and companies that may prove valuable contributors to the project, DARPA plans to hold a PTG-related meeting over Zoom later in March.