1996

  1. Aucoin, N., Sandbekkhaug, O., and Jenkin, M., An immersive 3D user interface for mobile robot control, Proc. IASTED Int. Conf. on Applications of Control and Robotics, Orlando, FL, 1-4, 1996.
    Very few autonomous mobile robots are truly fully autonomous, and most require some form of operator intervention. For example, in the ARK (Autonomous Robot for a Known environment) project, the ARK robot requires an operator to provide high-level mission commands. These high-level commands can be provided via a 2.5D map-based user interface as well as through an immersive 3D interface. The 3D interface provides the operator with a virtual reality-like control interface. It allows the operator to move through a simulation of the robot's environment, to examine the environment through an immersive display, and to access high-level mission commands in a more informative and natural way than is possible with the standard 2.5D map-based user interface.
  2. Dudek, G., Jenkin, M., Milios, E., and Wilkes, D., A taxonomy for multi-agent robotics, Autonomous Robots, 3:375-397, 1996.
    A key difficulty in the design of multi-agent robotic systems is the size and complexity of the space of possible designs. In order to make principled design decisions, an understanding of this design space is needed. To this end, we present a taxonomy that classifies multi-agent systems according to communication and other capabilities, and we survey existing multi-agent efforts according to their positions in the taxonomy. We also present additional results concerning multi-agent systems, with the dual purposes of illustrating the usefulness of the taxonomy in simplifying discourse about robot collective properties and demonstrating that a collective can be demonstrably more powerful than a single unit of the collective. A toy illustration of this kind of capability-based classification is sketched after this list.
  3. Jenkin, M. R. M., Stereopsis near the horopter, Proc. 4th ICARCV, Singapore, 1996.
    Many stereopsis algorithms attempt to recover depth over a wide range of disparities at every pixel location. If the disparity recovery process is limited to a smaller range of disparities, and if disparity measurements are only required where they can be extracted reliably, then the task of recovering "local" disparities can be simplified considerably. This paper presents an algorithm for extracting these local disparities robustly using a unique, local affine disparity model. Preliminary experiments demonstrate the promise of the approach.
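
The taxonomy in entry 2 classifies multi-agent systems according to communication and other capabilities. Below is a minimal, hypothetical Python sketch of how such a capability-based classification could be expressed as a data structure; the axis names and values are illustrative assumptions, not the axes defined in the paper.

    # Hypothetical sketch only: the axis names and values are illustrative
    # assumptions, not the taxonomy axes defined by Dudek et al.
    from dataclasses import dataclass
    from enum import Enum

    class CommunicationRange(Enum):
        NONE = "no direct communication"
        NEAR = "communication with nearby units only"
        GLOBAL = "communication with any unit in the collective"

    class Composition(Enum):
        HOMOGENEOUS = "identical units"
        HETEROGENEOUS = "units with differing capabilities"

    @dataclass(frozen=True)
    class CollectiveDescriptor:
        """Position of a multi-robot collective along a few taxonomy-style axes."""
        size: int
        communication_range: CommunicationRange
        composition: Composition

    # Two hypothetical collectives occupying different positions in the taxonomy.
    silent_swarm = CollectiveDescriptor(50, CommunicationRange.NONE,
                                        Composition.HOMOGENEOUS)
    networked_team = CollectiveDescriptor(3, CommunicationRange.GLOBAL,
                                          Composition.HETEROGENEOUS)
    print(silent_swarm)
    print(networked_team)

Comparing descriptors like these is one way to make precise which kinds of collectives can accomplish tasks that a single unit cannot.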
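
Entry 3 argues that disparity recovery simplifies when the search is limited to a small disparity range and measurements are reported only where they can be extracted reliably. The sketch below illustrates that general idea with plain window-based matching and a ratio test; it is a hypothetical stand-in and does not implement the paper's local affine disparity model.

    import numpy as np

    def local_disparities(left, right, max_disp=4, win=5, ratio=0.8):
        """Search only a small disparity range and keep only reliable matches.

        Illustrative SSD block matching over win x win windows; pixels whose
        best match is not clearly better than the second best are left as NaN.
        """
        h, w = left.shape
        half = win // 2
        disp = np.full((h, w), np.nan)
        for y in range(half, h - half):
            for x in range(half + max_disp, w - half - max_disp):
                patch = left[y - half:y + half + 1, x - half:x + half + 1]
                costs = np.array([
                    np.sum((patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]) ** 2)
                    for d in range(-max_disp, max_disp + 1)])
                order = np.argsort(costs)
                best, second = costs[order[0]], costs[order[1]]
                if best < ratio * second:      # reliability (ratio) test
                    disp[y, x] = order[0] - max_disp
        return disp

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        left = rng.random((40, 60))
        right = np.roll(left, -2, axis=1)      # synthetic 2-pixel disparity
        print(np.nanmedian(local_disparities(left, right)))  # close to 2

Restricting the search range and rejecting ambiguous matches keeps the recovered disparities sparse but dependable, which is the trade-off the abstract describes.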