On December 16, Xerox’s research division PARC, in collaboration with the University of California, Santa Barbara, the University of Rostock, and augmented reality company Patched Reality, was awarded a $5.8 million contract by DARPA, the Pentagon’s blue-sky projects wing. The goal is to build a program that can guide users through complex operations beyond their existing knowledge, like letting a mechanic repair a machine they’ve never seen before.

“Augmented reality, computer vision, language processing, dialogue processing and reasoning are all AI technologies that have disrupted a variety of industries individually but never in such a coordinated and synergistic fashion,” Charles Ortiz, the principal investigator for AMIGOS, said in a Xerox release. “By leveraging existing instructional materials to create new AR guidance, the AMIGOS project stands to accelerate this movement, making real-time task guidance and feedback available on-demand.”

AMIGOS falls under the broader goals of DARPA’s Perceptually-enabled Task Guidance research, which operates on the understanding that humans, with their finite capacity, cannot possibly learn everything about every physical task they’ll be asked to perform before being sent into the field. And yet, almost by the very nature of military work, those humans will have to treat novel injuries or repair unfamiliar machines.

Consider one such hypothetical scenario. At a naval base in 2033, a mechanic is tasked with repairing a new kind of underwater robot, recovered with damage. The robot’s schematics are stored on a server in the United States, and the engineers who built it are available on call once they wake up. But the mechanic has never seen the robot before, and is the only person physically present with any relevant repair experience. Donning a headset powered by AMIGOS, the mechanic could look over the robot and find the hatch that opens up its processor. As she works, the mechanic would see a visual display of next steps, powered by an AI running on the same server as the schematics. That AI automatically generates a user manual as repairs continue, giving instructions calibrated to the knowledge the mechanic already has, making the unfamiliar feel familiar.

As promised, at least, AMIGOS offers the assistance of a singularly useful friend, in the exact moment of need, to guide someone through a new physical task for the first time.

For now, the project will focus on creating AMIGOS’s two component parts. Xerox describes the first as an offline tool that can scan text from manuals and other instructional material, like videos, to create the step-by-step guide needed for a task. The second component will be online, using AI to adapt the manual’s instructions into a real-time instructional guide. In other words, the offline component ingests learning material and prepares it for the online component, which draws on those ingested manuals to generate an updated guide in real time for the user. It is, at once, an exercise in learning and a teaching tool, offering results at the speed of interaction.
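Neither Xerox nor DARPA has published implementation details, but that offline/online split suggests a familiar pipeline shape: an ingestion stage that distills instructional material into structured steps ahead of time, and a runtime stage that serves those steps, adapted to what the user already knows. The sketch below is a minimal illustration of that division of labor only; every class, field, and function name in it is a hypothetical stand-in for this article, not anything from the actual AMIGOS system.

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class TaskStep:
    """One step of a repair task, distilled from instructional material."""
    instruction: str
    visual_cue: str = ""  # e.g. an AR overlay hint like "highlight the hatch"


@dataclass
class TaskGuide:
    task_name: str
    steps: list[TaskStep] = field(default_factory=list)


class OfflineIngester:
    """Offline component: turns manuals and transcripts into step-by-step guides."""

    def ingest(self, manual_text: str, task_name: str) -> TaskGuide:
        # Stand-in for the real language processing: treat each nonempty
        # line of the manual as one step of the task.
        steps = [TaskStep(instruction=line.strip())
                 for line in manual_text.splitlines() if line.strip()]
        return TaskGuide(task_name=task_name, steps=steps)


class OnlineGuide:
    """Online component: serves the ingested guide in real time, skipping
    steps the user already knows how to perform."""

    def __init__(self, guide: TaskGuide, known_steps: set[str]):
        self.guide = guide
        self.known_steps = known_steps
        self._position = 0

    def next_step(self) -> str | None:
        """Return the next unfamiliar instruction, or None when done."""
        while self._position < len(self.guide.steps):
            step = self.guide.steps[self._position]
            self._position += 1
            if step.instruction not in self.known_steps:
                return step.instruction
        return None


# Offline: ingest a made-up two-line manual. Online: step through it,
# calibrated to a user who already knows how to open the hatch.
guide = OfflineIngester().ingest(
    "Open the access hatch.\nDisconnect the processor cable.", "robot-repair"
)
session = OnlineGuide(guide, known_steps={"Open the access hatch."})
print(session.next_step())  # -> "Disconnect the processor cable."
```

The real system would layer computer vision and dialogue on top of this skeleton, grounding each step in what the headset actually sees, but the division of labor is the same: the expensive understanding happens offline, and the lightweight adaptation happens at the moment of use.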
“Military personnel are expected to perform an increasing number of tasks and more complex tasks than ever before,” reads the DARPA description for Perceptually-enabled Task Guidance. “Mechanics, for example, are asked to repair more types of increasingly sophisticated machines and platforms, and Medics are asked to perform more procedures over extended periods of time.”

Both machine repair and medicine are specialized fields, demanding years of training and often carrying professional requirements for continuing education. No part of AMIGOS seems built to replace that. Instead, by delivering the knowledge of a manual in real time through an augmented reality headset, AMIGOS could put crucial knowledge in the hands of people who would otherwise not have it. Learning on the fly, even with guidance, isn’t ideal, but it’s far better than a medic being unable to treat a life-threatening injury in the field because they are unsure how to proceed.