The Mobile Pick and Place project involves a robotic arm mounted on an Autonomous Mobile Robot (AMR). The system performs pick-and-place operations autonomously by navigating to a target location, detecting objects and estimating their pose with a YOLO-based object detection module, and executing the robotic arm motions. The navigation stack and the robotic arm software are implemented using the ROS 2 Humble framework.
- Navigation: The AMR navigates from its current location to the designated pick and drop location using the Nav2 stack.
- Object Detection and 3D/6D Pose Estimation (`pose_estimation_pkg`): A YOLO-based AI module detects the object and estimates its pose relative to the camera, using the HyCo compiler for inference.
- Pose Transformation: The `rcar_communication` package transforms the pose from the camera frame to the base frame of the robotic arm (a minimal sketch of this step follows the list).
- Pick-and-Place Operation: Joint angles are calculated, and the robotic arm executes the pick-and-place task.
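For orientation, here is a minimal sketch of the camera-to-arm-base transform step, assuming a TF tree that connects the camera frame to the arm base; the frame name `arm_base_link` is a placeholder, not necessarily the one used by `rcar_communication`:

```python
import rclpy
from rclpy.duration import Duration
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener
import tf2_geometry_msgs  # noqa: F401  (side-effect import: registers PoseStamped with tf2)


class PoseToArmBase(Node):
    """Transforms an object pose from the camera frame into the arm's base frame."""

    def __init__(self):
        super().__init__('pose_to_arm_base')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)

    def to_arm_base(self, pose_in_camera: PoseStamped) -> PoseStamped:
        # 'arm_base_link' is a placeholder; use the base frame your TF tree actually publishes.
        return self.tf_buffer.transform(
            pose_in_camera, 'arm_base_link', timeout=Duration(seconds=1.0))
```

The actual package also derives joint angles from the transformed pose; that step is hardware-specific and omitted here.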
- ROS 2 Humble installed on your system.
- Correct hardware connections for the AMR and robotic arm.
- A YOLOv8-based object detection model (an export sketch follows this list).
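The repository appears to ship a pre-built detection app for the R-Car board (`yolo_v5s_ign-app`, used later in the HyCo inference step), so the following is only needed if you train your own model. Assuming the `ultralytics` package, a YOLOv8 checkpoint can be exported to ONNX for the onnxruntime-based deployment; the weight file name is a placeholder:

```python
from ultralytics import YOLO

# Placeholder weights: substitute your own trained checkpoint.
model = YOLO("yolov8n.pt")

# Export to ONNX; the resulting .onnx file can then be compiled/packaged
# for the target runtime (onnxruntime is bundled on the R-Car board).
model.export(format="onnx", opset=12)
```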
```bash
git clone https://github.com/Ignitarium-Renesas/R-Car_Mobile_Pick_Place_V2.git
cd R-Car_Mobile_Pick_Place_V2/pick_n_place_ws/
pip install -r src/pose_estimation_pkg/requirements.txt
pip install pymycobot --upgrade
```
```bash
cd R-Car_Mobile_Pick_Place_V2/pick_n_place_ws/src/Project-Rover-Robot
npm i
```
```bash
cd R-Car_Mobile_Pick_Place_V2/pick_n_place_ws
colcon build
```
Flash the code provided in this folder onto the Arduino board.
Ensure all hardware connections are correctly set up.
- Turn on the Wi-Fi Router.
- Turn on the robotic arm & mobile base.
- Connect to the Wi-Fi access point:
  - SSID: IGN-Robo_5G
  - Password: IGN_Robo
Log in to the Mecharm using SSH:
```bash
ssh er@192.168.0.146
# Password: Elephant (default password for Elephant Robotics hardware)
```
Open a terminal on the Mecharm and launch the camera node:

```bash
ros2 launch realsense2_camera rs_launch.py
```
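To confirm that the camera node is publishing, a quick rclpy subscriber can be run alongside it; the topic name below assumes the default `realsense2_camera` namespace and may differ depending on the driver version and launch parameters (check `ros2 topic list`):

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image

# Assumed default topic; adjust if your launch uses a different namespace.
COLOR_TOPIC = '/camera/color/image_raw'


class CameraCheck(Node):
    def __init__(self):
        super().__init__('camera_check')
        self.create_subscription(Image, COLOR_TOPIC, self.on_image, 10)

    def on_image(self, msg: Image):
        self.get_logger().info(f'Got {msg.width}x{msg.height} color frame')


def main():
    rclpy.init()
    rclpy.spin(CameraCheck())


if __name__ == '__main__':
    main()
```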
Open another terminal on the Mecharm and start the robot arm server:

```bash
python3 Server_280.py
```
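With `Server_280.py` running on the arm, a remote client can drive the joints over the network through `pymycobot`'s socket interface. A minimal connectivity check is sketched below; the IP is the arm's address from the SSH step, and port 9000 is pymycobot's usual default (both the port and the exact client API can vary with the pymycobot version):

```python
from pymycobot import MyCobotSocket

# Arm IP from the SSH step; 9000 is assumed to be the server's listening port.
mc = MyCobotSocket("192.168.0.146", 9000)

print("Current joint angles:", mc.get_angles())

# Move all joints to zero at 50% speed as a simple connectivity check.
mc.send_angles([0, 0, 0, 0, 0, 0], 50)
```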
```bash
# Log in to the R-Car board and launch the demo inside the rcar container
ssh root@192.168.0.217
docker start rcar
docker exec -it rcar bash
ros2 launch rcar_demo run_demo.launch.py
```
```bash
# Copy app_temp onto the R-Car board
scp -r HyCo_Infer_App/app_temp/ root@192.168.0.217:~/
ssh root@192.168.0.217
cd app_temp
export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:/lib:/home/root/app_temp/onnxruntime-linux-aarch64-1.17.1/lib:/home/root/app_temp
./rcar_app_ws yolo_v5s_ign-app/exec_config.json
```
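As a quick sanity check that an ONNX detection model loads under onnxruntime (the same runtime bundled in `app_temp`), the sketch below inspects the model's inputs/outputs and runs a dummy forward pass. The model path is a placeholder, the dummy input assumes a 4-D image tensor, and the real pre/post-processing is configured by the HyCo app's `exec_config.json`:

```python
import numpy as np
import onnxruntime as ort

# Placeholder path: point this at the .onnx model used by the HyCo app.
MODEL_PATH = "model.onnx"

sess = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])

inp = sess.get_inputs()[0]
print("Input :", inp.name, inp.shape, inp.type)
for out in sess.get_outputs():
    print("Output:", out.name, out.shape)

# Dummy forward pass; dynamic dimensions are filled with typical YOLO defaults (NCHW, 640x640).
shape = [d if isinstance(d, int) and d > 0 else fallback
         for d, fallback in zip(inp.shape, (1, 3, 640, 640))]
dummy = np.zeros(shape, dtype=np.float32)
outputs = sess.run(None, {inp.name: dummy})
print("Inference OK; first output shape:", outputs[0].shape)
```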
```bash
ssh root@192.168.0.217
docker start rcar
docker exec -it rcar bash
# Run navigation modules: Bringup
ros2 launch rcar_robot base_bringup.launch.py
```
```bash
ssh root@192.168.0.217
docker start rcar
docker exec -it rcar bash
# Run navigation modules: Navigation
ros2 launch rcar_robot rcar_navigation.launch.py
```
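Once bringup and navigation are running, goals can also be sent to Nav2 programmatically (for example, to test navigation independently of the rest of the demo) using the `nav2_simple_commander` helper shipped with ROS 2 Humble; the pose values below are placeholders in the `map` frame:

```python
import time

import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult


def main():
    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()

    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = navigator.get_clock().now().to_msg()
    goal.pose.position.x = 1.0   # placeholder pick-location coordinates
    goal.pose.position.y = 0.0
    goal.pose.orientation.w = 1.0

    navigator.goToPose(goal)
    while not navigator.isTaskComplete():
        time.sleep(0.2)  # optionally inspect navigator.getFeedback() here

    if navigator.getResult() == TaskResult.SUCCEEDED:
        print('Reached the goal')


if __name__ == '__main__':
    main()
```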
```bash
cd R-Car_Mobile_Pick_Place_V2/pick_n_place_ws/src/mycobot280/Project-Rover-Robot/
node index.js
```

Open a browser and go to http://localhost:5000/
- Refer to the PCB_Files folder for design-related files.
- Refer to this README for further instructions related to pose estimation.
- Refer to this README for further instructions related to HyCo Inference.
- Refer to this README for further instructions related to Pose Transformation and services.