Blaine Brownell explores emergent teleoperation and telerobotics technologies that could revolutionize the built environment.
Design practitioners have become familiar with an array of evolving technologies such as virtual and augmented reality (VR/AR), artificial intelligence (AI), the internet of things (IoT), building information modeling (BIM), and robotics. What we contemplate less often, however, is what happens when these technologies are combined.
Enter the world of teleoperation: the control of a machine or system from a physical distance. The concept of a remote-controlled machine is nothing new, but advances in AR and communication technologies are making teleoperation more sophisticated and commonplace. One ultimate goal of teleoperation is telepresence, a term commonly used to describe videoconferencing, a passive audiovisual experience; increasingly, however, it also pertains to remote manipulation. Telerobotics refers specifically to the remote operation of semi-autonomous robots. These approaches all involve a human-machine interface (HMI), which consists of "hardware and software that allow user inputs to be translated as signals for machines that, in turn, provide the required result to the user," according to Techopedia. As one might guess, advances in HMI technology represent significant potential transformations in building design and construction.
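To make Techopedia's definition concrete, the core of an HMI is a translation layer between operator inputs and machine signals. The following is a minimal, purely illustrative sketch of that idea; all names here (`Command`, `translate`, the joystick axes) are hypothetical, not part of any real teleoperation product.

```python
from dataclasses import dataclass

# Illustrative sketch only: a toy human-machine interface (HMI) layer that
# translates raw user inputs into command signals for a remote machine.

@dataclass
class Command:
    actuator: str   # which joint or motor to drive
    value: float    # normalized drive signal, -1.0 to 1.0

def translate(user_input: dict) -> list:
    """Map raw operator inputs (e.g., joystick axis readings) to signals
    the machine accepts, clamping anything out of range."""
    commands = []
    for axis, reading in user_input.items():
        signal = max(-1.0, min(1.0, reading))  # clamp to the valid range
        commands.append(Command(actuator=axis, value=signal))
    return commands

# Example: a joystick pushed slightly forward and hard to the left.
print(translate({"pitch": 0.2, "roll": -1.4}))
```

In a real system, the clamped signals would then travel over a network to the remote robot, and sensor feedback would return along the same channel, which is where the latency problems discussed below arise.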
Tokyo-based company SE4 has created a telerobotics system that overcomes network lag by using AI to accelerate robotic control. Combining VR and computer vision with AI and robotics, SE4's Semantic Control system can anticipate user choices relative to the robot's environment. "We've created a framework for creating physical understanding of the world around the machines," said SE4 CEO Lochlainn Wilson in a July interview with The Robot Report. "With semantic-style understanding, a robot in the environment can use its own sensors and interpret human instructions through VR."
Developed for construction applications, the system can anticipate potential collisions between physical objects, or between objects and the site, as well as how to move objects precisely into place (like the "snap" function in drawing software). Semantic Control can also accommodate collaborative robots, or "cobots," to build in a coordinated fashion. "With Semantic Control, we're making an ecosystem where robots can coordinate together," SE4 chief technology officer Pavel Savkin said in the same article. "The human says what to do, and the robot decides how."
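The "snap" behavior referenced above has a simple geometric core: rounding an object's placement to the nearest point on a regular grid. The sketch below illustrates only that general idea; the function name and grid spacing are hypothetical and do not describe SE4's actual implementation.

```python
# Illustrative sketch only: "snapping" a 3D position to the nearest point
# on a placement grid, as drawing software does. Grid spacing is arbitrary.

def snap(position, grid=0.5):
    """Round each coordinate to the nearest multiple of the grid spacing."""
    return tuple(round(coord / grid) * grid for coord in position)

print(snap((1.13, 2.71, 0.02)))  # -> (1.0, 2.5, 0.0)
```

Anticipating collisions is a harder problem, since it requires the system to model the geometry of the site and every moving object, but the underlying principle is the same: reasoning about positions in a shared spatial model rather than passing raw motion commands through.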
Eventually, machines may be let loose to construct buildings alongside humans. Despite the significant challenges robotics manufacturers have faced in creating machines that match the mobility and agility of the human body, Waltham, Mass.-based Boston Dynamics has made tremendous advances. Its Atlas humanoid robot, made of 3D-printed components for lightness, employs a compact hydraulic system with 28 independently powered joints. It can move at speeds up to 4.9 feet per second. Referring to Boston Dynamics' impressive feat, Phil Rader, University of Minnesota VR research fellow, tells ARCHITECT that "the day will come when robots can move freely around and using AI will be able to discern the real-world conditions and make real-time decisions." Rader, an architectural designer who researches VR and telerobotics technologies, imagines that future job sites will likely be populated by humans as well as humanoids, one working alongside the other. The construction robots might be fully autonomous, says Rader, or "it's possible that the robot worker is just being operated by a human from a remote location."