NOTICE
We would like to inform you that we have released the new SentiBotics mobile robotics development kit.
The SentiBotics kit enables the rapid development and testing of mobile robots. It includes software, sample programs, a tracked platform with a grasping robotic arm, and 3D vision, object recognition and autonomous navigation capabilities.
Based on years of dedicated robotics research and algorithm development, the kit provides researchers, academic institutions, robotics developers and hobbyists with a ready-to-go mobile robotic platform that can dramatically reduce the time and effort needed to build the infrastructure, hardware, component tuning and software functionality required for robotics research and development.
SentiBotics robot hardware includes the following components, which can be easily obtained from manufacturers and suppliers worldwide, so developers can use SentiBotics as reference hardware to build their own units or to incorporate different platforms and materials:
• Tracked platform, capable of carrying a payload of up to 10 kg.
• Modular robotic arm (8 degrees of freedom, force feedback support, capable of lifting objects up to 0.5 kg).
• Two 3D cameras that allow the robot to "see" and recognize objects at a range of 0.15 to 3.5 meters.
• Onboard computer (Intel NUC i5 computer with 1.8 GHz CPU, 8 GB of RAM, 64 GB SSD drive, 802.11N wireless network interface).
• 20 Ah lithium battery with charger.
• Control pad.
The SentiBotics Development Kit also includes:
• Details of all algorithms used, including C++ source code and full documentation.
• ROS-based infrastructure that provides a unified framework for robotic algorithm development and allows users to rapidly integrate third-party robotics algorithms, migrate to other hardware or modify the existing hardware (a minimal integration sketch appears after the list below).
• Components and programming samples that can be used for testing or demonstration of the robot's capabilities:
o Manual robot control via a control pad: driving, rotating the joints of the robotic arm, object grasping.
o Manual on-line environment map building while driving the robot.
o Pre-programmed or autonomous environment exploration with obstacle avoidance.
o Object learning and recognition.
o Autonomous navigation to a specified, previously visited waypoint (see the navigation example after this list).
o Basic grasping of an object within reach.
o Seeking and grasping an object located at a previously visited waypoint.
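As a rough illustration of how the ROS-based infrastructure mentioned above can host a third-party algorithm, the minimal roscpp sketch below subscribes to a 3D camera topic and publishes velocity commands. The node name and the topic names ("/camera/points", "/cmd_vel") are illustrative assumptions, not confirmed SentiBotics identifiers.

    // Minimal sketch of a third-party algorithm node plugged into a ROS graph.
    // Topic names below are assumptions; the actual SentiBotics topics may differ.
    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>
    #include <geometry_msgs/Twist.h>

    ros::Publisher cmd_pub;

    // Callback: receive 3D camera data and publish a (placeholder) velocity command.
    void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& cloud)
    {
        geometry_msgs::Twist cmd;
        // A real algorithm would compute motion from the point cloud here.
        cmd.linear.x = 0.0;
        cmd.angular.z = 0.0;
        cmd_pub.publish(cmd);
    }

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "third_party_algorithm_node");
        ros::NodeHandle nh;
        cmd_pub = nh.advertise<geometry_msgs::Twist>("/cmd_vel", 1);
        ros::Subscriber sub = nh.subscribe("/camera/points", 1, cloudCallback);
        ros::spin();  // process incoming messages until shutdown
        return 0;
    }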
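Similarly, autonomous navigation to a previously visited waypoint can be triggered programmatically. The sketch below assumes a standard move_base action interface is exposed by the navigation stack; the action name, map frame and waypoint coordinates are assumptions used only for illustration.

    // Sketch: sending a navigation goal to a previously visited waypoint,
    // assuming a standard move_base action server is running.
    #include <ros/ros.h>
    #include <actionlib/client/simple_action_client.h>
    #include <move_base_msgs/MoveBaseAction.h>

    typedef actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> MoveBaseClient;

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "waypoint_navigation_example");

        MoveBaseClient client("move_base", true);  // true: spin a thread for the client
        client.waitForServer();

        move_base_msgs::MoveBaseGoal goal;
        goal.target_pose.header.frame_id = "map";        // assumed map frame
        goal.target_pose.header.stamp = ros::Time::now();
        goal.target_pose.pose.position.x = 2.0;          // example waypoint coordinates
        goal.target_pose.pose.position.y = 1.5;
        goal.target_pose.pose.orientation.w = 1.0;       // face forward

        client.sendGoal(goal);
        client.waitForResult();

        if (client.getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
            ROS_INFO("Reached the waypoint");
        else
            ROS_WARN("Failed to reach the waypoint");

        return 0;
    }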
For more information, please visit the following webpage:
http://brucenbrian.com/product5-2.html