Technologies
The Cybercity exhibitions use a wide range of innovative technology. Learn more about the technical side here.
Multi Camera Robot
Two types of telepresence video robots are used in the exhibitions. The first is the Multi Camera Robot, which is controlled locally and carries numerous panoramic cameras on its chassis, giving the user a wide-angle view of the miniature maquette landscape it explores. The robot uses servo motors, an Arduino motor control board, Radio Control technology, batteries, sensors and numerous 2.4 GHz video transmitters to send high-definition video imagery to the users. The Multi Camera Robot is unique in that it has a camera mounted directly above its center point that gives the user a view of the floor/ground surrounding the robot. This 'God's eye view' provides the driver with an unparalleled overview of the environment and allows for a more natural interaction for people exploring the exhibition.
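The sketch below is purely illustrative and not the exhibition's actual software: assuming a handful of OpenCV-readable cameras (hypothetical device indices 0-3), it shows how several panoramic feeds plus the overhead camera could be tiled into a single wide driver view.

```python
# Illustrative sketch only: tiling several camera feeds (side cameras plus the
# overhead "God's eye view") into one wide view for the driver. Camera indices
# and layout are assumptions, not the exhibition's actual setup.
import cv2
import numpy as np

SIDE_CAMERAS = [0, 1, 2]   # assumed device indices for the panoramic cameras
OVERHEAD_CAMERA = 3        # assumed index for the center-mounted overhead camera

captures = {i: cv2.VideoCapture(i) for i in SIDE_CAMERAS + [OVERHEAD_CAMERA]}

def grab(index, size=(320, 240)):
    """Read one frame from a camera, falling back to a black frame on failure."""
    ok, frame = captures[index].read()
    if not ok:
        frame = np.zeros((size[1], size[0], 3), dtype=np.uint8)
    return cv2.resize(frame, size)

while True:
    panorama = np.hstack([grab(i) for i in SIDE_CAMERAS])  # wide-angle strip
    overhead = grab(OVERHEAD_CAMERA, size=(960, 240))      # ground around the robot
    cv2.imshow("driver view", np.vstack([panorama, overhead]))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```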
Internet Controlled Robot
The second type of robot used in the exhibition is the Internet Controlled Robot, which is linked to the internet and can be controlled over Skype videoconferencing from any location in the world. These robots use mini computers with Intel Core 2 Duo processors, wide-angle cameras with autofocus, Arduino motor control boards, high-performance lithium-ion batteries, RFID readers, 12 VDC gear motors and IR sensors. The robots use custom software that allows them to be controlled over the internet via Skype by any user who downloads the control software, which is available on the project website. The RFID reader is linked to the Intelligent Floor, giving users a GPS-type navigation tool that allows people to easily navigate through the exhibition and find specific projects.
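As a hedged illustration of how the onboard mini computer might tie these parts together, the sketch below relays drive commands to an Arduino motor board over a serial link and polls an RFID reader for the robot's floor position. It is not the project's actual control software; the pyserial library, the port names and the one-letter command protocol are all assumptions.

```python
# Minimal sketch (not the project's real software): relay drive commands to the
# Arduino motor board over serial and report the RFID tag currently under the
# robot. Ports, baud rates and the command bytes are assumptions.
from typing import Optional
import serial

motor_board = serial.Serial("/dev/ttyUSB0", 9600, timeout=0.1)  # assumed port
rfid_reader = serial.Serial("/dev/ttyUSB1", 9600, timeout=0.1)  # assumed port

COMMANDS = {"forward": b"F", "back": b"B", "left": b"L", "right": b"R", "stop": b"S"}

def drive(direction: str) -> None:
    """Send a single-byte drive command to the Arduino motor controller."""
    motor_board.write(COMMANDS[direction])

def read_position() -> Optional[str]:
    """Return the last RFID tag ID seen under the robot, if any."""
    line = rfid_reader.readline().decode("ascii", errors="ignore").strip()
    return line or None

if __name__ == "__main__":
    drive("forward")
    print("current floor tag:", read_position())
    drive("stop")
```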
Online Controlled Robot Interface
To control the video robots from home, online visitors can download the robot driver application. Together with Skype, this application makes it possible to steer the video robot over the internet using only the arrow keys on the keyboard. The picture shows the Skype interface on the left and the RobotDriver application on the right.
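The actual RobotDriver application works alongside Skype, whose transport is not documented here; the sketch below only illustrates the same idea, turning arrow-key presses into steering commands sent over a plain TCP socket to a hypothetical robot address, using an assumed one-letter protocol.

```python
# Illustrative only: arrow keys on the keyboard become steering commands,
# sent here over a plain TCP socket rather than the real Skype-based setup.
import curses
import socket

ROBOT_ADDRESS = ("robot.example.org", 9000)   # hypothetical host and port

KEY_TO_COMMAND = {
    curses.KEY_UP: b"F",     # forward
    curses.KEY_DOWN: b"B",   # back
    curses.KEY_LEFT: b"L",   # turn left
    curses.KEY_RIGHT: b"R",  # turn right
}

def main(screen) -> None:
    screen.addstr(0, 0, "Arrow keys steer the robot, q quits.")
    with socket.create_connection(ROBOT_ADDRESS) as link:
        while True:
            key = screen.getch()
            if key == ord("q"):
                link.sendall(b"S")            # stop before quitting
                break
            if key in KEY_TO_COMMAND:
                link.sendall(KEY_TO_COMMAND[key])

if __name__ == "__main__":
    curses.wrapper(main)
```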
Intelligent Flooring |
The floor that the robots maneuver over has hundreds of Radio Frequency Identification (RFID) tags embedded in its surface, which provide the robots with their own 'Google Maps'-style overview of their position within the exhibition. The tags are also positioned around each model in the exhibit, allowing users to call up detailed project descriptions about specific models and the artists/designers who created them. The RFID tags create a unique environment that reacts to the position of the robot guests, providing information, navigation and intelligence to the static structures.
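Conceptually, the intelligent floor is a lookup from tag IDs to positions and nearby projects. The sketch below shows that idea only; the tag IDs, coordinates and project entries are invented, not the exhibition's real data.

```python
# Sketch of the idea behind the intelligent floor, not its real data model:
# each embedded RFID tag ID maps to a floor coordinate and, near the models,
# to a project description. All values here are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FloorTag:
    x: int                           # grid position of the tag in the floor
    y: int
    project: Optional[str] = None    # set for tags placed around a model

FLOOR = {
    "04A1B2": FloorTag(0, 0),
    "04A1B3": FloorTag(1, 0),
    "04C9D4": FloorTag(5, 3, project="Harbour maquette (hypothetical entry)"),
}

def locate(tag_id: str) -> Optional[FloorTag]:
    """Turn the tag ID reported by the robot's RFID reader into a map position."""
    return FLOOR.get(tag_id)

tag = locate("04C9D4")
if tag:
    print(f"robot is at ({tag.x}, {tag.y})")
    if tag.project:
        print("nearby project:", tag.project)
```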
Body Controlled Robot Interface (Step Pad)
To control the robots, users operate a device called the Step Pad Interface, which uses body motion as the control input. The Step Pad Interface gives the audience the ability to steer two of the robots with the movement of their bodies, in a natural way that mimics the feeling of walking. The first Step Pad Interface is linked to the Multi Camera Robot, which is controlled locally from the exhibit, and incorporates an overhead video projector that surrounds the user with video imagery of the ground around them. The second Step Pad Interface is located at the virtual exhibition site and is linked to one of the Internet Controlled Robots. These intuitive interface devices allow the audience to interact within the exhibit as if they were strolling through a real environment and offer a next-generation experience for people interacting within a model environment.
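As a hedged sketch of the general idea, not the exhibit's implementation, the snippet below reduces four assumed pressure readings (front, back, left, right of the pad) to a single drive command, so stepping or leaning steers the robot.

```python
# Hedged sketch of the Step Pad idea: four pressure readings are reduced to one
# drive command. The sensor layout and the reading range are assumptions.
def step_pad_command(front: int, back: int, left: int, right: int,
                     threshold: int = 300) -> str:
    """Map raw pad sensor readings to a drive command for the robot."""
    readings = {"forward": front, "back": back, "left": left, "right": right}
    direction, value = max(readings.items(), key=lambda item: item[1])
    return direction if value > threshold else "stop"

# Example: the visitor shifts their weight onto the front of the pad.
print(step_pad_command(front=812, back=40, left=120, right=95))  # -> "forward"
```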
Telepresence Furniture / Web Booth
People at both exhibition sites will be able to interact and talk with each other using Telepresence Furniture that links the two audiences via videoconferencing devices, projecting full-size images of users and their tabletops into both linked environments to create a shared sense of presence for audiences in both countries. The Telepresence Furniture breaks down barriers, giving people direct eye contact, three-dimensional imagery and shared table space in an easy-to-use piece of furniture that invites people from both cultures to sit down and share a meal, as if they were a meter away instead of 10,000 kilometers (www.webchair.com).
Developed by...
The various forms of advanced technology were developed by Graham Smith in cooperation with Webchair VOF, The Hague and:
- Internet Controlled Robots: Mohammad Yosuf Haydary (NL, Faculty of Robotics and Interaction, University of Applied Sciences of Rotterdam), Jeff Mann, Electronic Artist (DE, CAN). Pre-study by the Engineering Faculty Chair of Dynamics and Control, University Duisburg-Essen, Prof. Dr.-Ing. Dirk Söffker, Dipl.-Ing. Dennis Gamrad (DE).
- Online Steering Interface: Annelies Wisse, Master Design for Interaction (NL), Faculty of Industrial Design Engineering, Technical University Delft.
- Intelligent Flooring & Design of the Robots: Jorrit van der Zee (NL), Nicolo Giacomello (NL), Faculty of Intelligent Product Design, HKU Utrecht School of the Arts.
- Telepresence Furniture, The Web Booth: Nicolo Giacomello (NL, Faculty of Intelligent Product Design, HKU Utrecht School of the Arts). Pre-study by Sam Nemeth, Technical University Eindhoven.
- Step Pad: Jeff Mann, Electronic Artist, (DE, CAN)
Also special thanks to Clara Kerkstra, Rick Storm and Jens Oliver Robbers for designing the visual identity.