Intelligent Robot: Implementation and Applications


Today, the development of robots is making steady advances. In particular, the Robot Operating System (ROS) offers a unified platform that greatly facilitates robot development and has become a new focus of learning and application in robotics research. This book introduces readers to the key technologies and development methods for ROS-based intelligent robots. Covering both the history of robot development and the various aspects of programming robots, it offers effective support for beginners.

The book is divided into three parts. The first introduces the basics of robots, including their definition, development, composition, and key technologies. The second covers the hardware and software components of a robot and the implementation of functions such as vision, speech, grasping, and autonomous navigation; these functions must work together to provide friendlier and more intelligent service. The third shows how to develop robots with different functions for different application scenarios.

Combining theoretical and practical aspects, with a strong focus on application, this work can be used as a reference book for robotics-related courses. It will also benefit all readers interested in intelligent robot development, sharing essential insights into developing service robots based on ROS.

Author(s): Feng Duan, Wenyu Li, Ying Tan
Publisher: Springer
Year: 2023

Language: English
Pages: 292
City: Singapore

Preface
Contents
Part I: Basic Knowledge
Chapter 1: Overview of Robots
1.1 Definition and Classification of Robots
1.1.1 Definition of Robots
1.1.2 Classification of Service Robots
1.2 The Evolution of Modern Robots
1.2.1 Early Stages of Modern Robot Research
1.2.2 The 1970s
1.2.3 The 1980s
1.2.4 The 1990s
1.2.5 The Twenty-First Century
1.3 Composition of Robots
1.3.1 Actuators for the Robot
1.3.2 Drive Units
1.3.3 Sensing Devices
1.3.4 Control Systems
1.3.5 Intelligent Systems
1.3.6 Intelligent Human-Machine Interface Systems
1.4 Key Technologies for Robots
1.4.1 Autonomous Mobile Technologies
1.4.2 Perceptual Technologies
1.4.3 Intelligent Decision-Making and Control Technologies
1.4.4 Communication Technologies
1.5 Trends in Robots
1.5.1 Hierarchical and Humanized Human-Computer Interaction
1.5.2 Interaction Intelligence with the Environment
1.5.3 Networking of Resource Utilization
1.5.4 Standardization, Modularization and Systematization of Design and Production
Further Reading
Chapter 2: Getting Started with ROS
2.1 Introduction to ROS
2.1.1 Why Use ROS
2.1.2 What Is ROS
2.1.3 Differences Between ROS and Computer Operating Systems
2.1.4 Key Features of ROS
2.2 Installation and Uninstallation of ROS
2.2.1 Versions of ROS
2.2.2 Installing and Configuring ROS Indigo
2.2.3 Installing and Configuring ROS Melodic
2.2.4 Uninstallation of ROS
2.3 Resources for Further Learning
Further Reading
Chapter 3: The Framework and Fundamental Use of ROS
3.1 ROS Framework
3.1.1 File System Level
3.1.2 Computation Graph Level
3.1.3 Community Level
3.2 Basics of ROS Use
3.2.1 Overview of Catkin
3.2.2 Workspaces and Their Creation Methods
3.2.3 Creating ROS Project Packages
3.2.4 Compiling the ROS Project Package
3.2.5 Creating ROS Nodes
3.2.6 Compiling and Running ROS Nodes
3.2.7 Use of Roslaunch
3.2.8 Creating ROS Messages and Services
3.2.9 Writing a Simple Message Publisher and Subscriber (C++ Implementation)
3.2.10 Writing a Simple Message Publisher and Subscriber (Python Implementation)
3.2.11 Testing Message Publisher and Subscriber
3.2.12 Writing a Simple Server and Client (C++ Implementation)
3.2.13 Writing a Simple Server and Client (Python Implementation)
3.2.14 Testing a Simple Server and Client
Further Reading
Chapter 4: ROS Debugging
4.1 Common Commands for ROS Debugging
4.2 Common Tools for ROS Debugging
4.2.1 Using rqt_console to Modify the Debug Level at Runtime
4.2.2 Using roswtf to Detect Potential Problems in the Configuration
4.2.3 Displaying Node State Graphs Using rqt_graph
4.2.4 Plotting Scalar Data Using rqt_plot
4.2.5 Displaying 2D Images Using image_view
4.2.6 3D Data Visualization Using rqt_rviz (rviz)
4.2.7 Recording and Playing Back Data Using rosbag and rqt_bag
4.2.8 rqt Plugins and rx Applications
4.3 Summary of Basic ROS Commands
4.3.1 Creating a ROS Workspace
4.3.2 Package-Related Operations
4.3.3 Node-Related Operations
4.3.4 Topic-Related Operations
4.3.5 Service-Related Operations
4.3.6 rosparam-Related Operations
4.3.7 Bag-Related Operations
4.3.8 rosmsg-Related Operations
4.3.9 rossrv-Related Operations
4.3.10 Other ROS Commands
Part II: Implementation of Core Robot Functions
Chapter 5: Installation and Initial Use of the Robot
5.1 Introduction to the Turtlebot Robot
5.2 Composition and Configuration of the Turtlebot Robot Hardware
5.3 Installation and Testing of the Turtlebot Robot Software
5.3.1 Installing from Source
5.3.2 Deb Installation Method
5.3.3 Configuration According to Kobuki Base
5.4 Launching Turtlebot
5.5 Manual Control of Turtlebot via Keyboard
5.6 Controlling Turtlebot Through Scripting
5.7 Monitoring the Battery Status of the Kobuki
5.8 Extensions to the Turtlebot Robot
Further Reading
Chapter 6: Implementation of Robot Vision Functions
6.1 Vision Sensors
6.1.1 Kinect Vision Sensor
6.1.2 Primesense Vision Sensor
6.2 Driver Installation and Testing
6.3 Running Two Kinects at the Same Time
6.4 Running Kinect and Primesense at the Same Time
6.5 RGB Image Processing With OpenCV in ROS
6.5.1 Installing OpenCV in ROS
6.5.2 Using OpenCV in ROS Code
6.5.3 Understanding the ROS-OpenCV Conversion Architecture
6.5.4 ROS Node Example
6.6 Point Cloud Library and Its Use
6.6.1 Introduction to Point Clouds and Point Cloud Libraries
6.6.2 Data Types of PCL
6.6.3 Publish and Subscribe to Point Cloud Messages
6.6.4 Tutorial on How to Use PCL in ROS
6.6.5 A Simple Application of PCL - Detecting the Opening and Closing State of a Door
Further Reading
Chapter 7: Advanced Implementation of Robot Vision Functions
7.1 Implementation of the Robot Follow Function
7.1.1 Theoretical Foundations
7.1.2 Operational Testing of Follow Functions
7.2 Implementation of the Robot Waving Recognition Function
7.2.1 Implementation Framework and Difficulty Analysis of Robot Hand-Waving Recognition
7.2.2 Face Detection Based on AdaBoost and Cascade Algorithms
7.2.3 Identifying Human Hands With the Template Matching Algorithm
7.2.4 Skin Tone Segmentation Based on YCrCb Color Space
7.2.5 Operational Testing of the Wave Recognition Function
7.3 Implementation of Object Recognition and Localization Functions of the Robot
7.3.1 Sliding Window Template Matching Method Based on Hue Histogram
7.3.2 Object Localization Methods Based on Spatial Point Cloud Data
7.3.3 Implementation and Testing of Object Recognition and Localization
7.4 Implementation of Face and Gender Recognition for Service Robots
7.4.1 Traditional Face and Gender Recognition Methods Based on OpenCV
7.4.2 Operational Testing of the OpenCV-Based Face and Gender Recognition Function
7.4.3 Face Recognition Method Based on Dlib Library
7.4.4 Operational Testing of the Face Recognition Function Based on the Dlib Library
7.5 Using TensorFlow to Recognize Handwritten Numbers
7.5.1 Introduction to TensorFlow
7.5.2 Installing TensorFlow
7.5.3 Basic Concepts of TensorFlow
7.5.4 Handwritten Digit Recognition Using TensorFlow
Further Reading
Chapter 8: Autonomous Robot Navigation Function
8.1 Key Technologies for Autonomous Robot Navigation
8.1.1 Robot Localization and Map Building
8.1.1.1 Method of Robot Localization
8.1.1.2 Map Building Methods
8.1.1.3 Simultaneous Localization and Mapping
8.1.2 Path Planning
8.1.2.1 Global Path Planning
8.1.2.2 Local Path Planning: Robot Obstacle Avoidance
8.2 Kinematic Analysis of the Kobuki Base Model
8.3 Navigation Package Set
8.3.1 Overview of the Navigation Package Set
8.3.2 Hardware Requirements
8.4 Basics of Using the Navigation Project Package Set
8.4.1 Installation and Configuration of the Navigation Project Package Set on the Robot
8.4.1.1 Construction of the Robot
8.4.1.2 Configure the Navigation Project Package
8.4.1.3 Run the Navigation Package
8.4.2 Robot tf Configuration
8.4.2.1 The Concept of tf Transforms
8.4.2.2 Writing the Code
8.4.2.3 Broadcast a Transform
8.4.2.4 Use a Transform
8.4.2.5 Compile the Code
8.4.2.6 Run the Code
8.4.3 Basic Navigation Debugging Guide
8.4.3.1 Robot Navigation Preparation
8.4.3.2 Cost Map
8.4.3.3 Local Planner
8.4.4 Publishing Odometry Measurements via ROS
8.4.4.1 nav_msgs/Odometry Messages
8.4.4.2 Use tf to Publish the Odometry Transform
8.4.4.3 Writing the Code
8.4.4.4 Compile and Run
8.4.5 Publishing Sensor Data Streams Via ROS
8.4.5.1 ROS Message Headers
8.4.5.2 Publish LaserScan Messages on ROS
LaserScan Message
Write Code to Publish a LaserScan Message
Run to Display LaserScans
Publish Point Clouds on ROS
8.5 Configuring and Using the Navigation Project Package Set on Turtlebot
8.5.1 Creating SLAM Maps Via Turtlebot
8.5.2 Autonomous Navigation Via Turtlebot’s Known Maps
Further Reading
Chapter 9: Basic Theory of Robot Voice Interaction Functions
9.1 Speech Recognition
9.1.1 Acoustic Models
9.1.1.1 Hidden Markov Models
9.1.1.2 Gaussian Mixture Model
9.1.1.3 Deep Neural Networks
9.1.2 Language Models
9.1.2.1 N-Gram
9.1.2.2 NNLM
9.1.2.3 Word2Vec
9.2 Semantic Understanding
9.2.1 Seq2Seq
9.3 Speech Synthesis
Further Reading
Chapter 10: Implementation of Robot Voice Interaction Functionality: PocketSphinx
10.1 Hardware
10.2 Introduction of PocketSphinx Speech Recognition System
10.3 Installing and Testing PocketSphinx on Indigo
10.3.1 Installation of PocketSphinx
10.3.2 Installing a Sound Library for Speech Synthesis
10.3.3 Language Modeling with Online Tools
10.4 Installing and Testing PocketSphinx on Kinetic
10.4.1 Installing PocketSphinx on Kinetic
10.4.2 Testing of PocketSphinx Speech Recognition
Chapter 11: Implementation of Robot Arm Grasping Function
11.1 Components of the Robot Arm Hardware
11.2 Kinematic Analysis of the Robot Arm
11.3 Setting the Servo ID of the Robot Arm
11.4 Controlling Turtlebot-Arm with USB2Dynamixel
11.4.1 Installing and Testing the dynamixel_motor Package
11.4.2 Implementation of the Robotic Arm Gripping Function
Part III: Applications of Robots
Chapter 12: Integrated Robot Application Case 1: Long-Command Recognition and Multitask Execution
12.1 Case Objectives
12.2 Voice Recognition Tasks
12.3 Autonomous Navigation in the Home Environment
12.4 Object Recognition and Grasping
Chapter 13: Integrated Robot Application Case 2: Following and Assisting the User
13.1 Case Objectives
13.2 Voice Recognition Commands
13.3 Following and Autonomous Navigation
13.4 Detecting and Recognizing Faces
Chapter 14: Integrated Robot Application Case 3: Customers Wave to Call the Robot for Ordering
14.1 Case Objectives
14.2 Robot Real-Time Mapping
14.3 Robot Recognizes the Waving Customer and Moves to Them
14.4 Voice Recognition of the Menu Order
14.5 Autonomous Navigation Back to the Bar