Technology, Construction Automation, Robotics, BIM, Virtual Reality, Augmented Reality, Artificial Intelligence, Internet of Things, IoT
Blaine Brownell explores emergent teleoperation and telerobotics technologies that could revolutionize the built environment.
Design practitioners have become familiar with an array of evolving technologies such as virtual and augmented reality (VR/AR), artificial intelligence (AI), the internet of things (IoT), building information modeling (BIM), and robotics. What we contemplate less often, however, is what happens when these technologies are combined.
Enter the world of teleoperation, which is the control of a machine or system from a physical distance. The concept of a remote-controlled machine is nothing new, but advances in AR and communication technologies are making teleoperability more sophisticated and commonplace. One ultimate goal of teleoperability is telepresence, a term commonly used to describe videoconferencing, a passive audiovisual experience. But increasingly, it also pertains to remote manipulation. Telerobotics refers specifically to the remote operation of semi-autonomous robots. These approaches all involve a human–machine interface (HMI), which consists of “hardware and software that allow user inputs to be translated as signals for machines that, in turn, provide the required result to the user,” according to Techopedia. As one might guess, advances in HMI technology represent significant potential transformations in building design and construction.
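In software terms, an HMI of this kind is essentially a translation layer between operator input and bounded machine commands. The sketch below is purely illustrative (all class and function names are hypothetical, and no real robotics API is assumed): normalized joystick axes are clamped and scaled into velocity commands a machine could act on.

```python
# Minimal sketch of an HMI translation layer: operator input in,
# bounded machine commands out. All names are illustrative only.

from dataclasses import dataclass

@dataclass
class OperatorInput:
    forward: float   # joystick axis, nominally -1.0 .. 1.0
    turn: float      # joystick axis, nominally -1.0 .. 1.0

@dataclass
class MachineCommand:
    linear_mps: float     # linear velocity, meters per second
    angular_radps: float  # angular velocity, radians per second

def translate(inp: OperatorInput,
              max_linear: float = 1.5,
              max_angular: float = 0.8) -> MachineCommand:
    """Clamp raw input to [-1, 1], then scale to the machine's limits."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return MachineCommand(
        linear_mps=clamp(inp.forward) * max_linear,
        angular_radps=clamp(inp.turn) * max_angular,
    )

cmd = translate(OperatorInput(forward=0.5, turn=-1.2))
print(cmd.linear_mps, cmd.angular_radps)  # 0.75 -0.8
```

The clamping step matters in teleoperation: a noisy or out-of-range input from the operator's side should never produce a command beyond what the machine can safely execute.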
Tokyo-based company SE4 has created a telerobotics system that overcomes network lag by using AI to accelerate robotic control. Combining VR and computer vision with AI and robotics, SE4's Semantic Control system can anticipate user choices relative to the robot's environment. “We've created a framework for creating physical understanding of the world around the machines,” said SE4 CEO Lochlainn Wilson in a July interview with The Robot Report. “With semantic-style understanding, a robot in the environment can use its own sensors and interpret human instructions through VR.”
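Why goal-level control beats raw teleoperation over a laggy link can be shown schematically. In this toy comparison (the numbers and function names are hypothetical, not SE4's actual protocol), direct teleoperation pays a network round trip for every low-level step, while semantic control pays it once, since the robot plans and executes the steps locally:

```python
# Illustrative comparison of accumulated lag: step-by-step remote
# control vs. a single goal-level command executed locally by the
# robot. Hypothetical numbers; not SE4's actual system.

LINK_LATENCY_S = 0.5  # one-way network delay, in seconds

def streaming_delay(num_steps: int) -> float:
    """Direct teleoperation: every step waits on a full round trip."""
    return num_steps * 2 * LINK_LATENCY_S

def semantic_delay() -> float:
    """Goal-level control: one round trip to deliver the goal;
    the robot decides the steps on its own."""
    return 2 * LINK_LATENCY_S

print(streaming_delay(40))  # 40.0 seconds of accumulated lag
print(semantic_delay())     # 1.0 second
```

The gap grows linearly with the number of steps, which is why interpreting human intent on the robot's side, rather than streaming every motion, makes remote operation viable over real-world networks.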
Developed for construction applications, the system can anticipate potential collisions between physical objects, or between objects and the site, as well as how to move objects precisely into place (like the “snap” function in drawing software). Semantic Control can also accommodate collaborative robots, or “cobots,” to build in a coordinated fashion. “With Semantic Control, we're making an ecosystem where robots can coordinate together,” SE4 chief technology officer Pavel Savkin said in the same article. “The human says what to do, and the robot decides how.”
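The “snap” analogy maps naturally onto code: a requested position is rounded to the nearest grid point, then checked against objects already placed before the move is committed. A toy sketch, assuming a simple 2D grid and single-cell objects (purely illustrative, not SE4's implementation):

```python
# Toy "snap" placement: round a requested position to a grid point,
# then refuse the move if that cell is already occupied.
# Illustrative only; not Semantic Control's actual logic.

GRID = 0.5  # grid spacing in meters

def snap(x: float, y: float) -> tuple:
    """Round a position to the nearest grid point."""
    return (round(x / GRID) * GRID, round(y / GRID) * GRID)

def try_place(pos, occupied: set):
    """Snap the target, then check for collision with placed objects."""
    target = snap(*pos)
    if target in occupied:
        return None  # would collide, so reject the move
    occupied.add(target)
    return target

placed = set()
print(try_place((1.23, 0.48), placed))  # (1.0, 0.5): snapped into place
print(try_place((1.10, 0.51), placed))  # None: that cell is now occupied
```

The second request snaps to the same grid cell as the first and is rejected, which is the crude version of what the article describes: the system decides where an object can precisely and safely go before any motion happens.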
Eventually, machines may be let loose to construct buildings alongside humans. Despite the significant challenges robotics manufacturers have faced in creating machines that match the mobility and agility of the human body, Waltham, Mass.–based Boston Dynamics has made tremendous advances. Its Atlas humanoid robot, made of 3D-printed components for lightness, employs a compact hydraulic system with 28 independently powered joints. It can move at speeds up to 4.9 feet per second. Referring to Boston Dynamics' impressive feat, Phil Rader, University of Minnesota VR research fellow, tells ARCHITECT that “the day will come when robots can move freely around and using AI will be able to discern the real world conditions and make real-time decisions.” Rader, an architectural designer who researches VR and telerobotics technologies, imagines that future job sites will likely be populated by humans as well as humanoids, one working alongside the other. The construction robots might be fully autonomous, says Rader, or “it's possible that the robot worker is just being operated by a human from a remote location.”