Visual/tactile interaction will be implemented by means of a common touch-screen monitor anchored to the robotic platform and a purpose-designed graphical interface developed with C#-based tools. In addition, the user could be provided with a small remote control, with no more than three or four buttons, to send immediate commands to the robot over the ZigBee network.
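
By way of illustration only, the following C# sketch shows how the remote's commands might be received on the robot side, under the assumption that the ZigBee transceiver is bridged to an ordinary serial port (as with an XBee module in transparent mode) and that each button maps to a single command byte; the port name, baud rate and command codes below are placeholders, not project specifications.

    using System;
    using System.IO.Ports;

    // Hypothetical one-byte commands for the three or four remote buttons.
    enum RemoteCommand : byte { Stop = 0x01, Come = 0x02, GoHome = 0x03, Help = 0x04 }

    class ZigBeeRemoteListener
    {
        private readonly SerialPort _port;

        public ZigBeeRemoteListener(string portName = "COM3", int baudRate = 9600)
        {
            // Assumes the ZigBee coordinator appears as a plain serial device.
            _port = new SerialPort(portName, baudRate);
            _port.DataReceived += OnDataReceived;
        }

        public void Start() => _port.Open();

        private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
        {
            while (_port.BytesToRead > 0)
            {
                var cmd = (RemoteCommand)_port.ReadByte();
                Console.WriteLine($"Remote command received: {cmd}");
                // The robot controller would dispatch the matching action here.
            }
        }
    }

Keeping the commands to single bytes matches the small button count and keeps the over-the-air protocol trivial to parse.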
The Simon open-source speech recognition program, produced by a partner of the project, will be used to implement natural-language interaction. In particular, the following actions will be performed:
1) studying the mobile platforms;
2) definition of a client/server architecture and programming of a version of Simon that runs within the available resources (a schematic sketch of this split follows the list);
3) preparing and testing the prototype software and the microphone hardware to establish communication between the robot and the speaker;
4) joint definition of practicable test scenarios;
5) preparing the scenarios, defining the command vocabulary and the forms of human-computer interaction, such as command and control, dialogue control and possibly simple forms of artificial intelligence;
6) creation of the needed speech models in English, Italian and German.
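
To make the client/server split of action 2 concrete, the sketch below is a schematic stand-in rather than Simon's actual wire protocol (the Simon suite ships with its own recognition server): a thin C# client on the robot reads newline-delimited recognized phrases over TCP and maps them onto command-and-control actions drawn from the vocabulary of action 5. The host, port and phrase-to-action table are illustrative assumptions.

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Net.Sockets;

    class SpeechCommandClient
    {
        // Illustrative command-and-control vocabulary; the real one would be
        // defined jointly for English, Italian and German (actions 5 and 6).
        private static readonly Dictionary<string, Action> Actions = new()
        {
            ["stop"]      = () => Console.WriteLine("Robot: stopping"),
            ["come here"] = () => Console.WriteLine("Robot: approaching the user"),
            ["go home"]   = () => Console.WriteLine("Robot: returning to base"),
        };

        static void Main()
        {
            // Assumed endpoint of the recognition server, not Simon's real port.
            using var client = new TcpClient("localhost", 4444);
            using var reader = new StreamReader(client.GetStream());

            string? phrase;
            while ((phrase = reader.ReadLine()) != null)
            {
                if (Actions.TryGetValue(phrase.Trim().ToLowerInvariant(), out var act))
                    act();
                else
                    Console.WriteLine($"Unrecognized phrase: {phrase}");
            }
        }
    }

In such a split the heavy recognition work (acoustic and language models) stays on the server, while the robot-side client remains light enough to run within the resources available on the mobile platform.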