Brent Gillespie 

Publication Abstracts

[Journal] [Conference]


Journal Publications


Kinematic Creep in a Continuously Variable Transmission: Traction Drive Mechanics for Cobots

B. Gillespie, C. Moore, M. Peshkin and J.E. Colgate

ASME Journal of Mechanical Design (to appear)

Two continuously variable transmissions are examined, one that relates a pair of linear speeds and another that relates a pair of angular speeds. These devices are elemental in the design of cobots, a new class of robot that creates virtual guiding surfaces to aid a human operator in assembly tasks. Both of these transmissions are traction drive mechanisms that rely on the support of either lateral or longitudinal forces across rolling contacts with spin. When a rolling contact between elastic bodies or even between rigid bodies in spin is called upon to transmit a tractive force, kinematic creep develops, expressing a departure from the intended rolling constraint. Creep in turn gives rise to non-ideal properties in a cobot's virtual guiding surfaces. This paper develops simple models of these two transmissions by expressing the relative velocity field in the contact patch between rolling bodies in terms of creep and spin. Coulomb friction laws are applied in a quasi-static analysis to produce complete force-motion models. These models may be used to evaluate a cobot's ability to support forces against its virtual guiding surfaces.

Please e-mail me for a pre-print at: brentg@umich.edu
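
As a rough numerical illustration of the ingredients named in the abstract (a sketch with hypothetical parameter values, not the paper's model), the slip velocity at each point of a circular contact patch can be written in terms of longitudinal creep, lateral creep, and spin, and a Coulomb traction directed opposite the local slip can be integrated over the patch to recover a quasi-static force-motion relation:

import numpy as np

# Hypothetical creep and spin values; a is the patch radius (m).
creep_x, creep_y, spin = 0.002, 0.001, 0.5   # creep (dimensionless), spin (rad/m)
a, mu, p0 = 0.003, 0.3, 1.0e6                # radius, friction coeff., pressure (Pa)

xs, ys = np.meshgrid(np.linspace(-a, a, 41), np.linspace(-a, a, 41))
inside = xs**2 + ys**2 <= a**2               # points lying in the contact patch

# Slip velocity field: uniform creep plus rotation (spin) about the patch center.
vx = creep_x - spin * ys
vy = creep_y + spin * xs
speed = np.hypot(vx, vy) + 1e-12

# Coulomb friction: traction of magnitude mu*p0 opposing the local slip direction.
tx = -mu * p0 * vx / speed * inside
ty = -mu * p0 * vy / speed * inside

dA = (2 * a / 40) ** 2                       # area of one grid cell
print("net tractive force (N):", tx.sum() * dA, ty.sum() * dA)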


A General Framework for Cobot Control

B. Gillespie, J.E. Colgate, M. Peshkin, and W. Wannasuphoprasit

IEEE Transactions on Robotics and Automation, Vol. 17, No. 4, pp. 391-401, August 2001.

A general framework is presented for the design and analysis of cobot controllers. Cobots are inherently passive robots intended for direct collaborative work with a human operator. While a human applies forces and movements, the controller guides motion by tuning the cobot's set of continuously variable transmissions. In this paper, a path following controller is developed that steers the cobot so as to asymptotically approach and follow a pre-planned path. The controller is based on feedback linearization. Generality across cobot architectures is assured by designing the controller in task space and developing transformations between each of four spaces: task space, joint space, a set of coupling spaces, and steering space.
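
As a loose single-vehicle analogue of the path-following idea (a sketch of feedback linearization for a unicycle-like machine; the gains, speed, and clipping are hypothetical choices, not the controller derived in the paper), steering can be chosen so that lateral deviation from the path y = 0 decays asymptotically while the operator supplies the speed:

import math

k_y, k_theta, v, dt = 2.0, 8.0, 0.5, 0.001   # gains, operator speed, time step
x, y, theta = 0.0, 0.3, 0.0                  # position (m) and heading (rad)

for _ in range(10000):
    # Choose the heading that makes the lateral error obey y_dot = -k_y * y.
    theta_des = math.asin(max(-1.0, min(1.0, -k_y * y / v)))
    theta += k_theta * (theta_des - theta) * dt   # steer toward it
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt

print(f"lateral error after 10 s: {y:.2e} m")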


Cobot Architecture

M. Peshkin, J.E. Colgate, W. Wannasuphoprasit, C. Moore and B. Gillespie

IEEE Transactions on Robotics and Automation, Vol. 17, No. 4, pp. 377-390, August 2001.

We describe a new robot architecture: the collaborative robot, or cobot. Cobots are intended for direct physical interaction with a human operator. The cobot can create smooth, strong virtual surfaces and other haptic effects within a shared human/cobot workspace. The kinematics of cobots differs markedly from that of robots. Most significantly, cobots have only one mechanical degree of freedom, regardless of their taskspace dimensionality. The instantaneous direction of motion associated with this single degree of freedom is actively servo-controlled, or steered, within the higher dimensional taskspace. This paper explains the kinematics of cobots, and the continuously variable transmissions (CVTs) which are essential to them. Powered cobots are introduced, made possible by a parallel interconnection of the CVTs. We discuss the relation of cobots to conventionally actuated robots and to nonholonomic robots. Several cobots in design, prototype, or industrial testbed settings illustrate the concepts discussed.
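
The kinematic core, a single mechanical degree of freedom whose direction is steered within the task space, fits in a few lines; a hedged sketch (the function name and values are hypothetical):

import numpy as np

def cobot_velocity(s_dot, phi):
    """Planar cobot: the operator supplies the scalar speed s_dot along the
    one mechanical DOF; the controller only steers its direction phi."""
    return s_dot * np.array([np.cos(phi), np.sin(phi)])

print(cobot_velocity(0.4, np.pi / 6))   # task-space velocity, m/s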


Conference Publications


Extremal Distance Maintenance for Parametric Curves and Surfaces

V. Patoglu and B. Gillespie

IEEE International Conference on Robotics and Automation, Washington, D.C., May 11-15, 2002.

A new extremal distance tracking algorithm is presented for parametric curves and surfaces undergoing rigid body motion. The essentially geometric extremization problem is transformed into a dynamical control problem by differentiating with respect to time. Extremization is then solved with the design of a stabilizing controller. We use a feedback linearizing controller. The controller simultaneously accounts for the surface shape and motion while asymptotically achieving (and maintaining) the extremal pair. Thus collision detection takes place in a framework fully analogous to the framework used for the simulation of dynamical response.

(8 pages PDF)
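
To make the "differentiate, then stabilize" recipe concrete, here is a toy version (our construction for a point against a fixed ellipse, not the paper's implementation): the extremality condition g(u, t) = c'(u) . (c(u) - p(t)) = 0 is differentiated in time, and a feedback-linearizing update enforces g_dot = -k g:

import numpy as np

a, b, k, dt = 2.0, 1.0, 50.0, 1e-3            # ellipse semi-axes, gain, step

def c(u):    return np.array([a * np.cos(u), b * np.sin(u)])
def cu(u):   return np.array([-a * np.sin(u), b * np.cos(u)])
def cuu(u):  return np.array([-a * np.cos(u), -b * np.sin(u)])
def p(t):    return np.array([3.0 + 0.5 * t, 0.8])   # rigidly moving point
def pdot(t): return np.array([0.5, 0.0])

u = 0.5                                        # curve parameter being tracked
for i in range(4000):
    t = i * dt
    g   = cu(u) @ (c(u) - p(t))                # extremality (perpendicularity)
    g_u = cuu(u) @ (c(u) - p(t)) + cu(u) @ cu(u)
    g_t = -cu(u) @ pdot(t)
    u  += (-k * g - g_t) / g_u * dt            # makes g decay exponentially

print("extremal point:", c(u), " distance:", np.linalg.norm(c(u) - p(4.0)))

Because the extremal pair is integrated alongside the motion, collision checking reduces to monitoring the tracked distance, which is the analogy to dynamical simulation the abstract draws.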


Haptic Feedback and Human Performance in a Dynamic Task

F. Huang, B. Gillespie, and A. Kuo.  

IEEE Virtual Reality Conference, Orlando, FL, March 24-28, 2002.

This study explores the effects of haptic feedback on performance and learning by human subjects executing a dynamic task.  We present the results of experiments involving the control of a ball and beam.  Human subjects perform position targeting of the ball through hand operation of the beam angle.  In our dynamic analysis we discuss how this prototype task may be used to test the efficacy of various haptic feedback conditions.  We obtain results for two conditions of haptic feedback, produced using two ball sizes, and apply various metrics to analyze performance.  We also examine ordering effects that occur in the presentation of these haptic conditions.  Our analysis and experimental findings indicate that the performance of a dynamic task is governed by the complexity of system dynamics and the magnitude of haptic feedback. Our results provide modest support to recommend exposure to a more complex, higher force-feedback task prior to the execution of a simpler lower feedback task.

(8 pages PDF)
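
For reference, the plant the subjects controlled reduces, for a solid ball rolling without slip on a beam, to the standard simplified model r_ddot = (5/7)(r*theta_dot^2 - g*sin(theta)); a minimal simulation sketch (the beam-angle trajectory and values are hypothetical):

import math

g, dt = 9.81, 0.001
r, r_dot = 0.1, 0.0                  # ball position along the beam (m), velocity

def theta(t):                        # hypothetical hand-driven beam angle (rad)
    return 0.05 * math.sin(2.0 * t)

def theta_dot(t):
    return 0.10 * math.cos(2.0 * t)

for i in range(5000):
    t = i * dt
    r_ddot = (5.0 / 7.0) * (r * theta_dot(t) ** 2 - g * math.sin(theta(t)))
    r_dot += r_ddot * dt
    r += r_dot * dt

print(f"ball position after {5000 * dt:.1f} s: {r:.3f} m")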


Shared Control Between Human and Machine

M. Steele and B. Gillespie

Human Factors and Ergonomics Society 45th Annual Meeting, Minneapolis, MN, October 2001.

When humans interface with machines, the control interface is usually passive and its response contains little information pertinent to the state of the environment.  Usually, information flows through the interface from human to machine but not so often in the reverse direction.  This work proposes a control architecture in which bi-directional information transfer occurs across the control interface, allowing the human to use the interface to simultaneously exert control and extract information.  In this alternative control architecture, which we call shared control, the human utilizes the haptic sensory modality to share control of the machine interface with an automatic controller.  We present a fixed-base driving simulator experiment in which subjects take advantage of a haptic steering wheel, which aids them in a path following task.  Results indicate that the haptic steering wheel allows a significant reduction in visual demand while improving path following performance.

(8 pages PDF)
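
A minimal sketch of the architecture (our illustration, with hypothetical parameters): the wheel carries both the driver's torque and an assistance torque proportional to path error, so command and information flow through the same physical interface:

J, b_w, k_assist, dt = 0.05, 0.2, 1.5, 0.001   # wheel inertia, damping, gain, step

def step(theta, omega, tau_driver, path_error):
    tau_assist = -k_assist * path_error         # controller's haptic suggestion
    tau = tau_driver + tau_assist - b_w * omega
    omega += tau / J * dt
    theta += omega * dt
    return theta, omega

theta, omega = 0.0, 0.0
for _ in range(1000):                # relaxed driver, constant 0.1 rad path error
    theta, omega = step(theta, omega, tau_driver=0.0, path_error=0.1)
print(f"assistance has turned the wheel to {theta:.3f} rad")

A driver gripping the wheel feels tau_assist directly and can override it at any time, which is the sense in which control is shared.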


Cancellation of Feedthrough Dynamics Using a Force Reflecting Joystick

 B. Gillespie, P. Tang, and C. Hasser

Presented at: ASME IMECE, Nashville, TN, November 1999.

This paper reports on a set of theoretical and experimental investigations into the use of force reflection to enhance the dynamic behavior of human piloted vehicles, especially joystick-controlled vehicles such as fly-by-wire jets, bulldozers, and powered wheelchairs. Briefly, force reflection is used in the manual controls to cancel the effects of feedthrough dynamics. Feedthrough dynamics refers to the generation of inadvertent steering or speed command inputs due to the action of inertia forces between the pilot's hand and the manual controls. These inertia forces arise in the pilot's body in response to vehicle accelerations. To eliminate feedthrough dynamics, a canceling force is produced on the force-reflecting joystick using a model of the pilot dynamics and the known vehicle accelerations. A custom motion platform and a commercial force-reflecting joystick are used in a set of experiments to test the idea. Parameter values for an assumed model are estimated by observing the response of the pilot's body to known platform accelerations. Cancellation of feedthrough dynamics is demonstrated for a human subject.

(8 pages PDF)
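
A schematic of the cancellation law (a sketch under a one-mass pilot model with hypothetical parameters, not the paper's identified model): predict the inertia force the pilot's hand feeds into the stick from the known vehicle acceleration, and command the joystick motor to apply its negative:

import math

m_hat = 1.2                          # effective hand/arm mass (kg), hypothetical

def a_vehicle(t):                    # known platform/vehicle acceleration (m/s^2)
    return 2.0 * math.sin(3.0 * t)

def predicted_feedthrough(t):
    return -m_hat * a_vehicle(t)     # inertia force transmitted to the stick

def cancel_command(t):
    return -predicted_feedthrough(t) # motor force that nulls the disturbance

for t in (0.0, 0.5, 1.0):
    print(f"t = {t:.1f} s: cancel force = {cancel_command(t):+.2f} N")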

The Virtual Teacher

B. Gillespie, S. O'Modhrain, P. Tang, C. Pham, and D. Zaretsky

Proceedings of the ASME IMECE, DSC-Vol. 64, Anaheim, CA, Nov 15-20, 1998, pp. 171-178.

This paper introduces the virtual teacher, a device or agent that supplements an environment in order to facilitate a human user's acquisition of a manual skill. Like the virtual fixture, a virtual teacher generally acts as an aide or facilitator to task execution, but unlike the virtual fixture, the virtual teacher is present only during training periods. During eventual task performance the teacher is absent. The virtual teacher's objective, implicitly understood by the user, is to promote independent mastery over the task. We review and organize common paradigms for the teaching of manual skills in real-world settings and use these as inspiration for the design of virtual teachers. In particular, we are interested in the ways in which a teacher, real or virtual, can demonstrate a strategy or impart a 'feel' for a task by guiding movement of the pupil's hand. A pilot study involving 24 participants was used to test the virtual teacher concept with a simulated crane moving task. The present virtual teacher implementation did not significantly improve learning curves. However, further analysis of performance indicates that the lack of positive effect can be remedied with modifications to the virtual teacher that address component skills and ensure suitability to various initial skill levels.

(8 pages PDF)

A Survey of Multibody Dynamics for Virtual Environments

 B. Gillespie and J. E. Colgate

Proceedings of the ASME IMECE, DSC-Vol. 61, Dallas, TX, Nov 15-20, 1997, pp. 45-54.

The field of computational dynamics is surveyed, focusing on issues relevant to the construction of a general purpose simulator with haptic display.   The various formalisms available for generating dynamical models will be examined with regard to the form of the equations of motion which they produce.   To render the effects of intermittent contact and other transient phenomena as driven by a human haptic explorer, a model is needed which is computationally efficient yet can handle changing kinematic constraints neatly.  Models in dependent and independent coordinates produced with the Newton-Euler, Lagrange and Kane formalisms will each be examined in order to provide a vantage point from which an informed selection may be made from among the many tools now available in computational dynamics. 


(10 pages PDF)
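
For orientation, the two families of model forms weighed in the survey can be set side by side in standard multibody notation (with W(q) mapping independent speeds u to coordinate rates):

M(q)\,\ddot{q} = f(q,\dot{q},t) + G(q)^{\mathsf T}\lambda, \qquad g(q) = 0
    (dependent coordinates: a differential-algebraic system with constraint forces G^{\mathsf T}\lambda)

\tilde{M}(q)\,\dot{u} = \tilde{f}(q,u,t), \qquad \dot{q} = W(q)\,u
    (independent coordinates: a smaller ordinary differential equation)

A changing kinematic constraint amounts to rows appearing in or vanishing from g(q), which is why the choice between these forms matters for haptic rendering of intermittent contact.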


Stable User-Specific Rendering of the Virtual Wall

B. Gillespie and M. Cutkosky

Proceedings of the ASME IMECE, DSC-Vol. 58, Atlanta, GA, Nov 17-22, 1996. pp. 397-406.

Efficient control algorithms are developed to implement stiff virtual walls without chatter. An analysis of the complete coupled system comprising controller, interface device, and user's finger underlies the design of a wall algorithm; thus each virtual wall is tailored to a specific user impedance. The finger is modeled as a static second order impedance with justification drawn from empirical studies of limb dynamics available in the literature and from observations of the disparity in time scales between contact instability and volitional control. Compensation is incorporated for the destabilizing effects of the zero order hold using either model based prediction or design in the digital domain. The destabilizing effects of asynchronous wall on/off switching times and sampling times are tracked by a special watchdog while deadbeat control is used to periodically eliminate these effects. Extensions are discussed, including on-the-fly system identification of the user impedance.
 
(10 pages PDF)
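
A minimal sampled-data sketch of the coupled system the analysis treats (hypothetical values; the paper's prediction, watchdog, and deadbeat schemes are omitted): a second-order finger impedance presses into a wall whose force is computed once per servo period and held through a zero-order hold:

m, b, k = 0.05, 2.0, 50.0        # finger impedance: mass, damping, stiffness
K_wall = 1000.0                  # virtual wall stiffness (N/m) at x < 0
T, substeps = 0.001, 10          # servo period (s) and intra-sample substeps
dt = T / substeps

x, v, F_hold = 0.005, 0.0, 0.0   # finger starts outside the wall
for _ in range(2000):            # 2 s of simulated time
    F_hold = -K_wall * x if x < 0.0 else 0.0          # sample, compute, hold (ZOH)
    for _ in range(substeps):                         # plant evolves between samples
        acc = (F_hold - b * v - k * (x + 0.002)) / m  # finger set-point at -2 mm
        v += acc * dt
        x += v * dt

print(f"steady-state wall penetration: {-x * 1e3:.3f} mm")

With these values the physical damping absorbs the energy leaked by the zero-order hold and the contact settles; shrink b or grow K_wall*T and the same loop begins to chatter, the instability the paper's compensation targets.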


The Moose: A Haptic User Interface for Blind Persons

S. O’Modhrain and B. Gillespie

Proceedings of the Sixth International World Wide Web Conference (WWW6), Santa Clara, CA, April 1997.

This paper presents our work to date on a haptic interface whose immediate aim is to provide access for blind computer users to graphical user interfaces. In this presentation, we describe the hardware and supporting software which together reinterpret a Microsoft Windows screen for the haptic senses. Screen objects such as windows, buttons, sliders, and pull-down menus are mapped to the workspace of a two-axis haptic interface called the Moose, where they emerge as patches and lines of varying resistance. The Moose operates much like a mouse except that it is able to move under its own power and can thereby make virtual objects apparent to the touch. Thus presented to the hand, interface objects may be located, identified, and even manipulated or activated. Using Microsoft Windows as a test bench, we have proven the feasibility and usefulness of the haptic interface approach for non-visual computer access. Extensions to haptic browsing of the Web are discussed.
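
As a toy of the screen-to-force mapping (hypothetical geometry and gains, not the Moose's rendering code), a window border can appear as a narrow strip of high viscous resistance that the handle must push through:

def drag_force(x, y, vx, vy):
    """Viscous drag rises sharply over a hypothetical border at x = 0.05 m."""
    b = 40.0 if abs(x - 0.05) < 0.0015 else 0.5   # N*s/m on vs. off the border
    return -b * vx, -b * vy

print(drag_force(0.050, 0.02, 0.1, 0.0))   # on the border: strong resistance
print(drag_force(0.030, 0.02, 0.1, 0.0))   # open workspace: light damping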
 

Examining the Influence of Audio and Visual Stimuli on a Haptic Display

N. Miner, B. Gillespie, and T. Caudell

Proceedings of the 1996 IMAGE Conference, Phoenix, AZ, June 23-25, 1996.

This paper presents the results of a cross-modal study that examines the perceptual influence of visual and auditory stimuli on a virtual object presented through a haptic interface device. The experiment yields a characterization of the effectiveness of visual and auditory stimuli at reinforcing a haptic percept. Results also indicate that the visual stimuli, and the combination of the visual and auditory stimuli, influenced a subject's haptic perception. The auditory stimuli alone did not significantly influence the haptic perception. This finding may be due to the specific auditory stimuli used, thus highlighting the importance of carefully selecting and characterizing the perceptual qualities of sensory stimuli. These results provide an impetus for further investigation into cross-modal interfaces, especially as a means for creating more convincing virtual objects for the haptic senses.
 
(http document)


The Virtual Piano Action: Design and Implementation

B. Gillespie

Proceedings of the International Computer Music Conference, Aarhus, Denmark, Sept 12-17, 1994, pp. 167-170.

The design of a virtual piano action composed of a keyboard of motorized keys and a real-time mechanical system simulation is presented. Using this simulator, we have re-created certain aspects of the feel of the grand piano by numerically integrating the equations of motion of a simplified piano action in real time in a human-in-the-loop simulation scheme. In this paper, the simulation of the release and catch of the hammer is used to introduce the simulator architecture. The structure of a software module which manages the simulation of models with changing kinematic constraints is discussed, including a finite state machine driver that allows rigid-body systems to take on various constraint conditions in a sequence dependent on run-time interaction.
 
(4 pages PDF)
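
A skeletal version of such a state-machine driver (our sketch; the guards and the key trajectory are simplified stand-ins for the real escapement conditions):

import math

g, dt = 9.81, 0.0005
LET_OFF = 0.008                        # travel at which the escapement releases (m)
state, h, h_dot = "CARRIED", 0.0, 0.0  # hammer rides the mechanism or flies free

def key_height(t):                     # hypothetical key-driven trajectory (m)
    return 0.02 * math.sin(6.0 * t) if t < 0.26 else 0.0

for i in range(2000):
    t = i * dt
    if state == "CARRIED":             # constrained: hammer moves with the key
        h = key_height(t)
        h_dot = (key_height(t + dt) - h) / dt
        if h >= LET_OFF:               # guard: let-off, constraint removed
            state = "FREE"
    else:                              # unconstrained: ballistic flight
        h_dot -= g * dt
        h += h_dot * dt
        if h <= key_height(t) < LET_OFF:   # guard: falls back and is caught
            state = "CARRIED"

print("final state:", state)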


The Moose: A Haptic User Interface for Blind Persons with Application to the Digital Sound Studio

B. Gillespie and S. O’Modhrain

Stanford University Department of Music Technical Report STAN-M-95, October, 1995.

This project involves the development of a new computer interface based on haptics.  Information will be transferred from computer to user mechanically (by touch) instead of visually or aurally.  The immediate aim is to provide access for blind sound engineers to the graphics-based computer interfaces currently found in digital sound studios.

Access for blind persons to computer applications with graphical user interfaces is virtually nonexistent at present. A mouse is of no use without the user's visual tracking of the cursor on the screen. With a haptic interface, screen objects such as buttons, sliders and pull-down menus will be presented mechanically to the user's haptic senses (kinesthetic and tactile), where they can be felt, located, identified, and, through the use of the same device for input, activated. We have built a prototype two-axis device which operates much like a mouse, except that it is also able to move under its own power. By producing forces on the user's hand which are a function of both the user's motions and the buttons or windows under the cursor, touchable representations of the screen objects are created. Using this prototype device we have already proven the feasibility and usefulness of the haptic interface approach for non-visual computer access. We expect that haptic interface devices will become standard computer interface tools, supplementing the visual presentation with haptic presentation for all users. Less attention to visual tracking would be needed, and a more complete and holistic presentation of information could be made by the computer. This approach is, we believe, particularly valuable in the design of application interfaces for digital audio editing.


(8 pages PDF)


Design of High-fidelity Haptic Display for One-dimensional Force Reflection Applications

B. Gillespie and L. Rosenberg

Telemanipulator and Telepresence Technologies, Proceedings of the SPIE (The International Society for Optical Engineering) East Coast Conference, Boston, MA, Oct 31-Nov 1, 1994, pp. 44-54.

This paper discusses the development of a virtual reality platform for the simulation of medical procedures which involve needle insertion into human tissue. The paper's focus is the hardware and software requirements for haptic display of a particular medical procedure known as epidural analgesia. To perform this delicate manual procedure, an anesthesiologist must carefully guide a needle through various layers of tissue using only haptic cues for guidance. A simplifying aspect for the simulator design is that all motions and forces involved in the task occur along a fixed line once insertion begins. To create a haptic representation of this procedure, we have explored both physical modeling and perceptual modeling techniques. A preliminary physical model was built based on CT-scan data of the operative site. A preliminary perceptual model was built based on current training techniques for the procedure provided by a skilled instructor. We compare and contrast these two modeling methods and discuss the implications of each. We select and defend the perceptual model as a superior approach for the epidural analgesia simulator.
 
(11 pages PDF)
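
To illustrate what a perceptual model can look like (a toy profile; the layers, depths, and forces are hypothetical, not the simulator's tuned values), insertion resistance can be encoded as a piecewise-linear force-depth curve whose abrupt drop renders the loss of resistance felt on entering the epidural space:

import bisect

DEPTHS = [0.0, 4.0, 15.0, 30.0, 40.0, 42.0]   # breakpoints (mm), hypothetical
FORCES = [0.0, 2.5, 1.0, 3.5, 6.0, 0.3]       # resisting force (N) at each depth
LAYERS = ["skin", "fat", "muscle", "ligament", "epidural space"]

def needle_force(depth_mm):
    # Piecewise-linear interpolation of resisting force along the insertion line.
    i = bisect.bisect_right(DEPTHS, depth_mm) - 1
    i = max(0, min(i, len(DEPTHS) - 2))
    frac = (depth_mm - DEPTHS[i]) / (DEPTHS[i + 1] - DEPTHS[i])
    return FORCES[i] + frac * (FORCES[i + 1] - FORCES[i]), LAYERS[i]

for d in (2.0, 20.0, 41.0):
    f, layer = needle_force(d)
    print(f"depth {d:4.1f} mm ({layer}): {f:.2f} N")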


Interactive Dynamics with Haptic Display

B. Gillespie and M. Cutkosky.

Proceedings of the ASME Winter Annual Meeting, New Orleans, LA, Nov 28-Dec 3, 1993. DSC-Vol. 49 pp. 65-72.

The simulation of multi-degree-of-freedom virtual environments presents unique challenges to the designer of control software for haptic display devices. Once successfully rendered stiffness, damping, and mass elements are interconnected to form multibody systems, a host of controller implementation choices arise. For one, the control law must manifest the dynamics of the simulated multibody system. Further extensions are required if the interconnection topology is allowed to change, as in the case of changing kinematic constraints; the control law must be able to undergo transformations to reflect these changes. We want to include such capabilities in our simulator repertoire, as these are among the most interesting to explore haptically. This paper describes a combined simulation and experimental apparatus for exploring issues in the haptic display of dynamical models. In particular, we address problems in the simulation of changing kinematic constraints with numerical integration methods which have been specialized for real-time mechanical system simulation. Issues in the software design of a haptic display are addressed. A simplified model of the piano action is used as an illustrative example.
 
(8 pages PDF)
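
One generic way to handle a changing constraint inside a fixed-step real-time loop (a sketch, not the specialized integrators the paper discusses) is to watch a gap function for a sign change and switch equation sets at the event:

g, e, dt = 9.81, 0.8, 0.001      # gravity, restitution, fixed step
x, v = 0.5, 0.0                  # gap (height above the surface) and velocity

for _ in range(3000):            # 3 s of simulated time
    x_new, v_new = x + v * dt, v - g * dt
    if x_new < 0.0 <= x:         # event: the gap function crossed zero
        v_new, x_new = -e * v, 0.0   # switch: impact law replaces free flight
    x, v = x_new, v_new

print(f"height {x:.3f} m, velocity {v:+.3f} m/s after 3 s")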


The Touchback Keyboard

B. Gillespie

Proceedings of the International Computer Music Conference, San Jose, CA, Oct 14-18, 1992, pp. 447-448.

A digital control system capable of simulating multi-degree-of-freedom dynamical systems in real time with visual, audio, and haptic displays is presented.  A set of software and hardware tools form a testbed in which a dynamical system can be modeled, reduced to equations of motion, and simulated.  The user interacts with a powered key to influence the behavior of the dynamical system and feel the computed interaction forces being fed back in real time.  Of primary interest for simulation with haptic display are the grand piano action and other keyboard instrument controllers.  Various keyboard actions are demonstrated.
 
(2 pages PDF)

Dynamical Modeling of the Grand Piano Action

B. Gillespie and M. Cutkosky.

Proceedings of the International Computer Music Conference, San Jose, CA, Oct 14-18, 1992, pp. 77-80.

The grand piano action is modeled as a set of four rigid bodies using Kane's method.  Computerized symbol manipulation is utilized to streamline the formulation of the equations of motion so that several models can be considered, each of increasing detail.  Various methods for checking the dynamical model thus derived are explored.  A computer animation driven by simulation of the equations of motion is compared to a high-speed video recording of the piano action moving under a known force at the key.  For quantitative evaluation, the velocities and angular velocities of each of the bodies are extracted from the video recording by means of digitization techniques.  The aspects of the model of particular interest for emulation by a controlled system, namely, the mechanical impedance at the key and the velocity with which the hammer strikes the string, can be studied in the equations of motion and compared to empirical data.
 
(4 pages PDF)
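
For context, Kane's method assembles the equations of motion by projecting forces onto partial velocities, one equation per generalized speed (standard form, not a piano-specific derivation):

F_r + F_r^{*} = 0, \qquad r = 1,\dots,n

F_r = \sum_i \mathbf{v}_r^{P_i} \cdot \mathbf{R}_i + \boldsymbol{\omega}_r^{B_i} \cdot \mathbf{T}_i
    (generalized active forces)

F_r^{*} = \sum_i \mathbf{v}_r^{P_i} \cdot (-m_i\,\mathbf{a}^{P_i}) + \boldsymbol{\omega}_r^{B_i} \cdot (-\dot{\mathbf{H}}^{B_i})
    (generalized inertia forces)

Here v_r and omega_r are the partial velocities and partial angular velocities of each body's mass center P_i and frame B_i, R_i and T_i are the resultant force and torque on body i, and H is its central angular momentum; for the four-body action this yields n coupled equations amenable to the symbolic streamlining described above.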


Brent Gillespie, brentg@umich.edu
4 July, 2002