Voice-Controlled 3+1 DOF Robotic Arm using ROS 2, MoveIt 2, and Amazon Alexa

Overview

This project implements a 3+1 DOF robotic arm (3 revolute arm joints plus 1 gripper actuator driving both fingers) that can be controlled by voice commands through the Amazon Alexa Developer Console's test interface, as well as through ROS 2 actions. The system is built on ROS 2 and MoveIt 2, and supports both the Python and C++ MoveIt APIs, depending on the ROS 2 distribution.

The project is designed to be modular and extensible, making it suitable for learning, experimentation, and further research in robot manipulation, motion planning, and voice-based human–robot interaction.


Key Features

  • 3+1 DOF robotic arm with synchronized gripper actuation
  • Motion planning and execution using MoveIt2
  • Voice-based control using Amazon Alexa
  • ROS 2 Action–based task execution via a dedicated Task Server
  • Supports both simulation and extension to real hardware

Technologies Used

  • ROS 2

    • Humble: C++ MoveIt APIs
    • Iron and newer: Python MoveIt APIs
  • MoveIt 2

  • Python and C++

  • URDF / SRDF for robot modeling and semantic description

  • Amazon Alexa Developer Console (custom skill)

  • ROS 2 Actions for task execution


Control Interfaces

1. Voice Control (Amazon Alexa)

Voice commands are processed via a custom Alexa skill. The skill backend communicates with ROS 2 and triggers predefined manipulation tasks.

Note: The Alexa Skill ID is intentionally replaced with a placeholder in the code. You must create your own skill using the Amazon Alexa Developer Console and insert your Skill ID manually.
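At its core, the skill backend has to translate a recognized Alexa intent into one of the Task Server's task numbers. As a rough sketch of that translation step (the intent names below are hypothetical, not taken from the repository's interaction model):

```python
# Hypothetical mapping from Alexa intent names to Task Server task numbers.
# The real intent names are defined in the custom skill's interaction model.
INTENT_TO_TASK = {
    "WakeUpIntent": 0,
    "PickIntent": 1,
    "SleepIntent": 2,
}

def intent_to_task_number(intent_name: str) -> int:
    """Resolve an Alexa intent name to a task number.

    Raises KeyError for unsupported intents, so the skill can reply
    with a "command not recognized" prompt instead of sending a goal.
    """
    return INTENT_TO_TASK[intent_name]
```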


2. ROS 2 Action Interface (Task Server)

A dedicated Task Server exposes a ROS 2 action interface that allows commanding the arm directly using ROS tools.

Example tasks are mapped using an integer-based interface:

  • task_number: 0
  • task_number: 1
  • task_number: 2

Each task corresponds to a predefined arm or gripper motion.
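Conceptually, the Task Server holds a lookup table from task number to a predefined joint-space goal. A minimal sketch of that idea, with made-up placeholder joint values (the real targets live inside the Task Server code):

```python
# Illustrative only: the joint values below are placeholders, not the
# actual targets defined in the repository's Task Server.
TASKS = {
    0: {"arm": [0.0, 0.0, 0.0], "gripper": 0.0},    # e.g. home pose
    1: {"arm": [-1.1, -0.6, -0.1], "gripper": 0.0}, # e.g. reach pose
    2: {"arm": [0.0, 0.0, 0.0], "gripper": -0.7},   # e.g. gripper motion
}

def is_valid_task(task_number: int) -> bool:
    """Return True if the Task Server knows this task number."""
    return task_number in TASKS
```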


Setup and Build Instructions

1️⃣ Clone the repository into your ROS 2 workspace

cd ~/<your_ws>/
git clone https://github.com/naitikpahwa18/moveit_arm.git

2️⃣ Add your Alexa Skill ID

Create a custom skill using the Amazon Alexa Developer Console, then edit the following file:

👉 alexa_interface.py

Replace the placeholder value with your own Skill ID:

skill_id = "YOUR_ALEXA_SKILL_ID_HERE"
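Hardcoded credentials are easy to forget at commit time. One alternative, not used by the repository as-is, is to read the Skill ID from an environment variable and fall back to the placeholder:

```python
import os

# Fall back to the placeholder so the node still starts without the
# variable set; skill verification will then simply fail until it is.
skill_id = os.environ.get("ALEXA_SKILL_ID", "YOUR_ALEXA_SKILL_ID_HERE")
```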

3️⃣ Go to your workspace root and build the workspace

cd ~/<your_ws>
colcon build

4️⃣ Source the workspace

. install/setup.bash

5️⃣ Launch the simulated robot

ros2 launch manipulator_bringup simulated_robot.launch.py

Controlling the Robot

Method 1: Voice Control via Amazon Alexa

  • Use the Alexa Developer Console test interface
  • Invoke your skill and issue supported voice commands
  • The commands are translated into ROS 2 tasks and executed by MoveIt

Method 2: ROS 2 Action Command

Send a task goal directly to the Task Server:

ros2 action send_goal /task_server manipulator_msgs/action/ManipulatorTask "task_number: 0"

Replace 0 with 1 or 2 to trigger different tasks.
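If you script these calls, it can help to validate the task number and assemble the command in one place. A minimal sketch in plain Python, with no ROS dependencies (the valid-task set is assumed from the examples above):

```python
import subprocess

VALID_TASKS = (0, 1, 2)

def build_send_goal_command(task_number: int) -> list:
    """Build the argv for `ros2 action send_goal`, rejecting unknown tasks."""
    if task_number not in VALID_TASKS:
        raise ValueError("unknown task_number: %d" % task_number)
    return [
        "ros2", "action", "send_goal",
        "/task_server",
        "manipulator_msgs/action/ManipulatorTask",
        "task_number: %d" % task_number,
    ]

# Example (requires a sourced workspace and a running Task Server):
# subprocess.run(build_send_goal_command(1), check=True)
```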


🎥 Demo Videos

MoveitArm.mp4

Project Structure (High-Level)

manipulator_ws/
├── manipulator_description/   # URDF, SRDF, meshes
├── manipulator_bringup/       # Launch files
├── manipulator_controller/    # Controllers and configs
├── manipulator_remote/        # Alexa interface and remote control
├── manipulator_task_server/   # ROS 2 action server
└── manipulator_msgs/          # Custom messages and actions
