Category Archives: ugworks

ArUco Marker Tracking for Collaborative Robot

Sieun Lee (KAIST ME) has developed a program for a collaborative robot to align with an object using ArUco markers. The marker positions, measured by a camera attached to the robot, will be used for autonomous robot operation with machine tools: locating the machine's door and moving a workpiece without collisions. She implemented the application after learning robot control with ROS (Robot Operating System) and image processing with OpenCV during her individual research in the summer of 2024.

Multiple 3D Shots for Autonomous Robotic Bin Picking

In his individual study and undergraduate thesis in 2023, Ahn Ho Tung (KAIST ME) conducted research on using multiple 3D shots to locate objects for autonomous robotic bin picking. 3D depth sensors have measurement errors due to their optical limitations. He developed a method in which multiple shots from different angles, taken by an affordable depth sensor mounted on a robot, minimize the position error of an object. Based on this method, the robot improved its success rate in picking up objects placed at random positions. In this study, ROS Melodic was used to control the robot, and a YOLO model implemented in PyTorch was used to find the topmost object in a cluttered bin.
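The multi-shot idea above can be sketched as follows: each shot yields a noisy object position in the camera frame, which is transformed into the robot base frame using the known robot pose and then fused. The function names and the simple averaging scheme are illustrative assumptions, not the thesis's exact method:

```python
import numpy as np

def to_base_frame(p_cam, R, t):
    """Transform a point from the camera frame to the robot base frame,
    given the camera's rotation R and translation t at that shot."""
    return R @ p_cam + t

def fuse_positions(measurements):
    """Fuse object positions from several shots.

    measurements: list of (p_cam, R, t) tuples, one per shot.
    Averaging N independent noisy estimates reduces the standard
    error roughly by a factor of 1/sqrt(N).
    """
    pts = np.array([to_base_frame(p, R, t) for p, R, t in measurements])
    return pts.mean(axis=0)

# Simulate 10 shots of an object at a known position with 5 mm sensor noise.
rng = np.random.default_rng(0)
true_p = np.array([0.4, 0.1, 0.05])
shots = [(true_p + rng.normal(0.0, 0.005, 3), np.eye(3), np.zeros(3))
         for _ in range(10)]
fused = fuse_positions(shots)
```

Here the fused estimate is typically several times closer to the true position than any single shot, which is the effect the study exploits to raise the pick success rate.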

VR Interface for Collaborative Robot

In his individual study, Minseo Jang (장민서, KAIST ME) developed a virtual reality-based interface for a collaborative robot (Doosan Robotics). It is a digital twin that can be manipulated freely in virtual space, aimed at becoming a new interface for virtual manufacturing and human-AI collaboration. Unity (a game engine) and ROS Melodic were used.
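One common way to connect Unity to ROS Melodic is over the rosbridge websocket protocol, where messages travel as JSON envelopes. The sketch below builds such an envelope for a `sensor_msgs/JointState` message that a Unity client could consume to mirror the robot's joints in the digital twin; the topic name and use of rosbridge are assumptions about the setup, not details from the project:

```python
import json
import time

def joint_state_msg(names, positions, stamp=None):
    """Build a rosbridge 'publish' envelope carrying a
    sensor_msgs/JointState message (topic name is an assumption)."""
    stamp = time.time() if stamp is None else stamp
    secs = int(stamp)
    return {
        "op": "publish",
        "topic": "/joint_states",
        "msg": {
            "header": {"stamp": {"secs": secs,
                                 "nsecs": int((stamp - secs) * 1e9)}},
            "name": list(names),
            "position": list(positions),
        },
    }

# Serialize one message as it would appear on the websocket.
wire = json.dumps(joint_state_msg(["joint1", "joint2"], [0.0, 1.57],
                                  stamp=100.5))
```

On the Unity side, a websocket client (e.g. via the ROS# library) would parse this JSON and drive the corresponding joints of the virtual robot model.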