In the last decade, virtual reality (VR) has advanced considerably and become available to everyday consumers. As VR becomes more ubiquitous, there is an opportunity to use this technology to create intuitive operator interfaces for interacting with complex dynamic systems, such as humanoid robots. As evidenced in the DARPA Robotics Challenge (DRC), current interfaces for humanoids primarily rely on a standard computer setup with monitor, keyboard, and mouse, requiring operators to process 3D data through 2D devices. Although these interfaces can be very capable for operating a robot, they are often complex and demand expert operators with extensive training. VR can change this paradigm by letting operators visualize and interact with 3D data in a 3D environment, enabling a more natural interaction. In this paper, we present our work on converting a typical interface into a virtual reality interface for NASA’s humanoid robot, Valkyrie. We compare our standard computer interface and our VR interface for Valkyrie, and we describe the shared control planners and system architecture that make both interfaces possible. The goal of this work is to better understand the utility of virtual reality interfaces and how they can be employed in human-supervised robot applications, so that we may move toward more intuitive and easy-to-use interfaces for control and interaction.