Qualitative Motion Understanding

Price: 160.49 €

Mobile robots operating in real-world, outdoor scenarios depend on dynamic scene understanding to detect and avoid obstacles, recognize landmarks, acquire models, and detect and track moving objects. Motion understanding has been an active research area for more than a decade in search of solutions to these problems; however, it remains one of the most difficult and challenging areas of computer vision research.
Qualitative Motion Understanding describes a qualitative approach to dynamic scene and motion analysis, called DRIVE (Dynamic Reasoning from Integrated Visual Evidence). The DRIVE system addresses the problems of (a) estimating the robot's egomotion, (b) reconstructing the observed 3-D scene structure, and (c) evaluating the motion of individual objects from a sequence of monocular images. The approach is based on the FOE (focus of expansion) concept, but it takes a somewhat unconventional route. The DRIVE system uses a qualitative scene model and a fuzzy focus of expansion to estimate robot motion from visual cues, to detect and track moving objects, and to construct and maintain a global dynamic reference model.
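To make the focus-of-expansion idea concrete: under pure camera translation, the optical-flow vectors in the image all radiate from a single point, the FOE, and a rough point estimate can be obtained as the least-squares intersection of the measured flow lines. The Python sketch below illustrates only that classical construction, not DRIVE itself, which replaces the exact point with a fuzzy FOE region and qualitative reasoning; the function name and the synthetic data are assumptions made for this example.

import numpy as np

def estimate_foe(points, flows):
    """Least-squares estimate of the focus of expansion (FOE).

    Under pure camera translation, every flow vector (u, v) measured at an
    image point (x, y) is parallel to the line joining (x, y) and the FOE
    (fx, fy), giving one linear constraint per vector:
        v * fx - u * fy = v * x - u * y
    Stacking these constraints yields an overdetermined system that is
    solved here in the least-squares sense.
    """
    points = np.asarray(points, dtype=float)
    flows = np.asarray(flows, dtype=float)
    x, y = points[:, 0], points[:, 1]
    u, v = flows[:, 0], flows[:, 1]
    A = np.column_stack([v, -u])   # coefficient matrix, one row per flow vector
    b = v * x - u * y              # right-hand side
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe                     # (fx, fy) in image coordinates

if __name__ == "__main__":
    # Synthetic check: noisy flow radiating from a known FOE at (320, 240).
    rng = np.random.default_rng(0)
    true_foe = np.array([320.0, 240.0])
    pts = rng.uniform(0.0, 640.0, size=(50, 2))
    flw = 0.05 * (pts - true_foe) + rng.normal(0.0, 0.2, size=(50, 2))
    print("estimated FOE:", estimate_foe(pts, flw))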

Series: The Springer International Series in Engineering and Computer Science

Author: Wilhelm Burger


ISBN: 9780792392514

Language: English

Publication date: 30.06.1992

Number of pages: 210
