【Driven by AI, powered by Jetson Nano】JetArm is a desktop-class AI vision robot arm developed for ROS education scenarios. It supports Python programming, leverages mainstream deep learning frameworks, and integrates MediaPipe to implement creative AI projects such as object and image recognition, somatosensory control, and voice interaction.
【High-performance Vision Robot Arm】JetArm includes a 6DOF vision robot arm, featuring intelligent serial bus servos with a 35 kg torque rating. An HD camera is positioned at the end of the robot arm, providing a first-person perspective for object-gripping tasks.
【Depth Point Cloud, Flexible Grasping in 3D Scenes】JetArm is equipped with a high-performance 3D depth camera. Using the target's RGB data, position coordinates, and depth information, combined with RGB+D fusion detection, it can perform free grasping in 3D scenes and other AI projects.
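How that fusion step works in practice: once a target is located in the color image, its pixel is back-projected through the measured depth to a 3D coordinate the arm can reach for. Below is a minimal Python sketch of that back-projection, assuming hypothetical pinhole intrinsics rather than JetArm's actual camera calibration:

```python
import numpy as np

# Hypothetical pinhole intrinsics of the depth camera (fx, fy, cx, cy in pixels).
FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0

def pixel_to_point(u, v, depth_m):
    """Back-project pixel (u, v) with its depth (meters) to a 3D point in the camera frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

# Example: an object detected in the RGB image at pixel (350, 260), measured 0.42 m away.
target_xyz = pixel_to_point(350, 260, 0.42)
print(target_xyz)  # 3D coordinates the arm could be commanded to reach
```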
【Far-field Voice Interaction】The JetArm ultimate kit incorporates a circular microphone array and a speaker for human-robot interaction applications, including text-to-speech conversion, 360° sound source localization, voice-controlled mapping and navigation, etc. Integrated with the vision robot arm, JetArm can implement voice-controlled gripping and transporting.
【Robot Control Across Platforms】JetArm provides multiple control methods, such as the WonderAi app (compatible with iOS and Android), a wireless handle, the Robot Operating System (ROS), and a keyboard, allowing you to control the robot at will.
【AI Vision Robot Powered by Raspberry Pi】ArmPi mini is a smart vision robot arm powered by Raspberry Pi. It adopts high-performance intelligent servos and an HD camera, and can be programmed in Python. With these capabilities, ArmPi mini unlocks diverse AI applications, such as smart vision-guided recognition and grasping.
【AI Vision Recognition and Tracking】ArmPi mini combines an HD camera with the OpenCV library to recognize and locate target objects, enabling AI applications like color sorting, target tracking, and intelligent stacking.
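A minimal sketch of this kind of OpenCV color detection, assuming a hypothetical HSV threshold for a red block and a placeholder image path; the shipped ArmPi mini lessons may structure the pipeline differently:

```python
import cv2

frame = cv2.imread("scene.jpg")            # one frame from the HD camera (placeholder path)
if frame is not None:
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Hypothetical HSV range for a red block; real thresholds depend on lighting and white balance.
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(largest)
        print("target center:", x + w // 2, y + h // 2)   # pixel the arm should move toward
```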
【Inverse Kinematics Algorithm】ArmPi mini employs an inverse kinematics algorithm, enabling precise target tracking and gripping. It also provides the complete source code for the inverse kinematics function to assist your study of AI.
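For intuition, inverse kinematics answers the question "which joint angles place the gripper at a desired point?". Here is a minimal two-link planar sketch with hypothetical link lengths; it is not ArmPi mini's real geometry or its shipped source code:

```python
import math

# Hypothetical link lengths in mm; the real arm geometry differs.
L1, L2 = 100.0, 100.0

def two_link_ik(x, y):
    """Return shoulder and elbow angles (radians) that place the wrist at (x, y) in the arm plane."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if abs(cos_elbow) > 1:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)                      # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

print(two_link_ik(120.0, 80.0))   # joint angles for a reachable test point
```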
【App Remote Control】ArmPi mini offers remote control through a dedicated app (iOS/Android) and PC software. In addition, you can access a first-person-view perspective in the app for an immersive user experience.
【Driven by AI, Powered by NVIDIA Jetson Nano】JetMax is an open-source AI robot arm based on the Robot Operating System and powered by a Jetson Nano control system. It supports Python programming, leverages mainstream deep learning frameworks, incorporates MediaPipe development, enables YOLO model training, and utilizes TensorRT acceleration. This combination delivers a diverse range of AI applications, including object recognition, object sorting, target tracking, and somatosensory control.
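As an illustration of the YOLO workflow mentioned above, here is a minimal sketch assuming the ultralytics package and a hypothetical dataset config; Hiwonder's own training scripts and TensorRT deployment path may differ:

```python
# Sketch only: assumes the ultralytics YOLO package, a hypothetical dataset, and a placeholder image.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                       # start from pretrained weights
model.train(data="my_objects.yaml", epochs=50)   # hypothetical dataset config for custom objects
results = model("workspace.jpg")                 # run detection on a camera frame
for box in results[0].boxes:
    print(box.cls, box.xyxy)                     # class id and bounding box per detection
# model.export(format="engine")                  # optional TensorRT export for accelerated inference
```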
【AI Vision, Deep Learning】An HD camera is positioned at the end of JetMax, which enables real-time First-Person View (FPV) transmission and can recognize colors, faces, gestures, etc. Combined with the computing capabilities of Jetson Nano and deep learning, JetMax can train models for various interesting applications, including image, number, and letter recognition, as well as object gripping and transportation.
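A minimal sketch of the MediaPipe side of such applications, assuming the MediaPipe Hands solution and the end-of-arm camera at video index 0; the JetMax lessons may wrap this differently:

```python
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)
cap = cv2.VideoCapture(0)                          # end-of-arm camera, assumed at index 0

ok, frame = cap.read()
if ok:
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        wrist = result.multi_hand_landmarks[0].landmark[0]   # landmark 0 is the wrist
        print("wrist at", wrist.x, wrist.y)        # normalized coordinates usable for gesture control
cap.release()
```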
【Inverse Kinematics Algorithm】JetMax employs an inverse kinematics algorithm, enabling precise target tracking, gripping, sorting, and stacking. It also provides detailed analysis of inverse kinematics and the DH model, and offers the source code for the inverse kinematics function.
【Robot Control Across Platforms】JetMax provides multiple control methods, such as the WonderAi app (compatible with iOS and Android), a wireless handle, PC software, the Robot Operating System, and a mouse, allowing you to control the robot at will. By importing the corresponding code, you can command JetMax to perform specific actions.
【ROS Robot Arm Powered by Raspberry Pi】ArmPi FPV is an open-source AI robot arm based on the Robot Operating System and powered by Raspberry Pi. Loaded with high-performance intelligent servos and an AI camera, and programmable in Python, it is capable of vision recognition and gripping.
【AI Vision Recognition and Tracking】An HD wide-angle camera is positioned at the end of ArmPi FPV, providing real-time First-Person View (FPV) transmission at a 1-megapixel resolution. By processing images with OpenCV, it can recognize colors, tags, and human faces, opening up a wide range of AI applications such as color sorting, target tracking, intelligent stacking, and face detection.
【Inverse Kinematics Algorithm】ArmPi FPV employs an inverse kinematics algorithm, enabling precise target tracking and gripping. It also provides detailed analysis of inverse kinematics and the DH model, and offers the source code for the inverse kinematics function.
【Robot Control Across Platforms】ArmPi FPV provides multiple control methods, such as the WonderPi app (compatible with iOS and Android), a wireless handle, a mouse, PC software, and the Robot Operating System, allowing you to control the robot at will.
【Abundant AI Applications】Guided by intelligent vision, ArmPi FPV excels at warehouse-style functions such as stock-in, stock-out, and stock transfer, demonstrating integration into Industry 4.0 environments.