Robotics
Hiwonder
Open-Source Robotic Hand AiHand Powered by micro:bit V2 Programming Educational Robot, Support WonderCam AI Vision Module
- 【micro:bit Programming, AI Vision Robotic Hand】AiHand is an open-source robotic hand powered by micro:bit. AiHand features a high-performance WonderCam AI vision module with an integrated HD camera. This module enables various AI applications, including color recognition, tag recognition, and more.
- 【Powerful Hardware, Support Secondary Development】AiHand is equipped with an AI vision module and a glowy ultrasonic sensor, and reserves multiple expansion interfaces to facilitate secondary development of the robotic hand.
- 【One-click Training and Learning】AiHand's AI vision module is integrated with some learning algorithms, allowing it to complete diverse AI vision projects such as waste sorting and tag recognition.
- 【Various Control Methods & MakeCode Programming】AiHand supports Android/iOS app control and micro:bit control. AiHand also supports MakeCode programming: drag-and-drop block-style modules make programming easy to learn.
Hiwonder
ROSPug Quadruped Bionic Robot Dog Powered by Jetson Nano ROS Open Source Python Programming
【Driven by Jetson Nano and high-voltage intelligent serial bus servos】 ROSPug is a smart quadruped robot dog driven by Jetson Nano and built on the Robot Operating System (ROS). It is equipped with 12 high-voltage strong-magnetic intelligent serial bus servos, delivering high-precision performance, rapid rotation speed, and robust torque.
【AI Vision, Unlimited Creativity】 ROSPug is equipped with an HD wide-angle camera. It utilizes OpenCV library for efficient image processing, enabling a diverse range of AI applications, including target recognition and localization, line following, obstacle avoidance, face detection, ball shooting, color tracking and tag recognition.
【Various Control Methods and Live Camera Feed】 You can conveniently control ROSPug through WonderROS app available for Android and iOS devices, PC software, or a wireless PS2 handle. Additionally, ROSPug provides a first-person perspective experience by transmitting the live camera feed to the app.
【Gait Planning, Free Adjustment】 ROSPug incorporates an inverse kinematics algorithm, offering precise control over the touch time, lift time, and lift height of each leg. You can easily adjust these parameters to achieve different gaits, including ripple and trot. Additionally, ROSPug provides a detailed analysis of inverse kinematics, along with the source code for the inverse kinematics function.
- The ROSPug's color was upgraded from black to gray in May 2025.
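The per-leg foot placement described in the gait-planning bullet is typically solved with two-link planar inverse kinematics. A minimal sketch of that calculation, assuming equal 0.10 m link lengths and a hip-relative coordinate frame (a generic illustration, not ROSPug's actual source code):

```python
import math

def leg_ik(x, y, l1=0.10, l2=0.10):
    """Solve joint angles (hip, knee) for a two-link planar leg.

    x, y: foot position relative to the hip joint (meters).
    l1, l2: upper- and lower-leg link lengths (assumed values).
    Returns (hip, knee) in radians.
    """
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Law of cosines gives the knee angle from the hip-to-foot distance
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle = direction to the foot minus the offset introduced
    # by the bent knee
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# Example: place the foot 0.15 m straight below the hip
hip, knee = leg_ik(0.0, -0.15)
```

Sweeping the target foot position along a cycloid-like path, while offsetting the phase of each leg, is what produces the ripple and trot gaits mentioned above.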
Hiwonder
uHand UNO Open Source AI Bionic Robot Hand Support Somatosensory Control, Arduino Programming
- 【Arduino Programming, Open Source】uHand UNO is built on the Atmega328 platform and is compatible with Arduino programming. The programs for uHand UNO are open-source, and learning tutorials and secondary development examples are available, making it easier for you to develop your robotic hand.
- 【High Performance, Support Sensor Expansion】 uHand UNO is equipped with a 6-channel knob controller, Bluetooth module, 6 anti-blocking servos, and other high-performance hardware. Moreover, it provides multiple expansion ports for sensor integration, including ESP32 Cam, accelerometer, touch sensor, glowy ultrasonic sensor, etc., empowering users to engage in secondary development for sonic ranging and pose control capabilities.
- 【Versatile Control Options】uHand UNO supports both app control and wireless glove control. Users can utilize knob potentiometers for real-time knob control and offline action editing.
Hiwonder
AiArm Smart Vision Robot Arm Powered by CoreX Controller, Support Scratch & Python Programming
- 【High-quality AI Education Demonstration System】AiArm is a smart vision robot arm powered by a self-developed controller, CoreX. It adopts high-performance intelligent servos and vision modules and can be programmed in Scratch and Python. With its impressive capabilities, AiArm unlocks diverse AI applications, such as smart vision-guided recognition and grasping.
- 【AI Vision Recognition and Tracking】AiArm combines the WonderCam AI vision module to recognize and locate target objects, enabling the implementation of AI applications like color sorting and waste sorting.
- 【Different Motion Control Methods】Whether it's PC software control, offline control, or inverse kinematics control, you have the versatility to design and edit various actions, unlocking the full potential of the robot arm's capabilities.
- 【Multiple Sensor Expansion】Integrating with sensors and modules, AiArm can execute various interesting functions, including an AI face-tracking fan and ultrasonic vision hunting.
Hiwonder
Hiwonder JetArm ROS1/ROS2 3D Vision Robot Arm, with Multimodal AI Model (ChatGPT), AI Voice Interaction and Vision Recognition, Tracking & Sorting
【AI-Driven and Jetson-Powered】 JetArm is a high-performance 3D vision robot arm developed for ROS education scenarios. It is equipped with the Jetson Nano, Orin Nano, or Orin NX as the main controller, and is fully compatible with both ROS1 and ROS2. With Python and deep learning frameworks integrated, JetArm is ideal for developing sophisticated AI projects.
【High-Performance AI Robotics】JetArm features six intelligent serial bus servos with 35 kg·cm of torque. The robot is equipped with a 3D depth camera, a built-in 6-microphone array, and multimodal large AI models, enabling a wide variety of applications, such as 3D spatial grabbing, target tracking, object sorting, scene understanding, and voice control.
【Depth Point Cloud, Flexible 3D Scene Grabbing】 JetArm is equipped with a high-performance 3D depth camera. Using the target's RGB data, position coordinates, and depth information, combined with RGB-D fusion detection, it can perform free grabbing in 3D scenes and other AI projects.
【Enhanced Human-Robot Interaction Powered by AI】 JetArm leverages Multimodal Large AI Models to create an interactive system centered around ChatGPT. Paired with its 3D vision capabilities, JetArm boasts outstanding perception, reasoning, and action abilities, enabling more advanced embodied AI applications and delivering a natural, intuitive human-robot interaction experience.
【Advanced Technologies & Comprehensive Tutorials】 With JetArm, you will master a broad range of cutting-edge technologies, including ROS development, 3D depth vision, OpenCV, YOLOv8, MediaPipe, AI models, robotic inverse kinematics, MoveIt, Gazebo simulation, and voice interaction. We provide in-depth learning materials and video tutorials to guide you step by step, ensuring you can confidently develop your own AI-powered robotic arm.
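The RGB-D grabbing pipeline above boils down to mapping a detected pixel plus its depth reading into a 3D point the arm can reach. A minimal pinhole-camera deprojection sketch; the intrinsic values below are assumed placeholders, not the actual camera's calibration:

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Map pixel (u, v) with a depth reading (meters) to camera-frame
    XYZ via the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

# Assumed intrinsics for a 640x480 depth stream (placeholder values)
FX = FY = 570.0
CX, CY = 319.5, 239.5

# A detection centered at pixel (400, 300) with a 0.50 m depth reading
point = deproject(400, 300, 0.50, FX, FY, CX, CY)
```

The resulting camera-frame point would then be transformed into the arm's base frame before the inverse kinematics solver computes joint angles for the grab.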
Hiwonder
AI Vision Robot Nexbit, micro:bit Programming Educational Robot, Support WonderCam Smart Vision Module
- 【micro:bit Programming, AI Vision Robot】Nexbit is an AI smart robot car powered by micro:bit. Nexbit features a high-performance WonderCam AI vision module with an integrated HD camera. This module enables various AI applications, including color recognition, vision line following, target tracking, and more.
- 【Powerful Hardware】Nexbit boasts an impressive array of features within its compact body, including a high-precision 4-channel line follower, an AI vision module, a glowy ultrasonic sensor, an infrared receiver, RGB lights, and other electronic components.
- 【One-click Training and Learning】Nexbit's AI vision module is integrated with some learning algorithms, allowing it to complete diverse AI vision projects such as waste sorting and tag tracking.
- 【Various Control Methods & MakeCode Programming】Nexbit supports Android/iOS app control and Handlebit remote control. Nexbit also supports MakeCode programming: drag-and-drop block-style modules make programming easy to learn.
Hiwonder
Track Chassis/ Suspension Shock Absorption Full-Metal Tank Robot Encoder Motor/ Smart Car Chassis
- 【High-quality Vibration Reduction】The chassis incorporates 8-channel high-elasticity carbon-steel tension springs and micro bearings, ensuring agile maneuverability across diverse terrains.
- 【Strong Robot Tank Bracket】The main body is crafted from aluminum alloy with an anodized surface treatment, resulting in an exquisite appearance. The top layer can be easily removed, facilitating DIY development.
- 【More Extended Functions】The bracket contains multiple expansion ports and is fully compatible with popular controllers on the market such as Jetson Nano, Raspberry Pi, and Arduino. You can also add multiple sensors and servos to create your own robot.
- 【Application】It is perfect for hobbyists, education, competitions, and research projects. Many schools and education departments choose this chassis for students learning AI robotics.
- 【Version Difference】The standard version is a single-layer track chassis; the advanced version is a double-layer track chassis.
Hiwonder
AiNex ROS Education AI Vision Humanoid Robot Powered by Raspberry Pi Biped Inverse Kinematics Algorithm Learning Teaching Kit
- 【High-performance Hardware Configurations】AiNex is developed upon the Robot Operating System (ROS) and features a Raspberry Pi, 24 intelligent serial bus servos, an HD camera, and movable mechanical hands. It is a professional AI humanoid robot capable of lively mimicking human actions.
- 【Advanced Inverse Kinematics Gait】AiNex integrates inverse kinematics algorithm for flexible pose control as well as gait planning for omnidirectional movement.
- 【Outstanding AI Vision Recognition and Tracking】Leveraging technologies like machine vision and OpenCV, AiNex excels in precise object recognition, enabling it to accomplish target recognition and tracking tasks.
- 【Robot Control Across Platforms】AiNex provides multiple control methods, like WonderROS app (compatible with iOS and Android system), wireless handle, and PC software.
- 【Detailed Tutorials and Professional After-sales Service】We offer an extensive collection of tutorials covering up to 18 topics.
Hiwonder
Hiwonder JetAcker AI Robot Kit – NVIDIA Jetson-Powered ROS1/ROS2 Educational Coding Robot with multimodal AI model (ChatGPT), Voice Control, AI Vision Interaction & SLAM
【Driven by AI, Powered by Jetson】 JetAcker is a high-performance educational robot developed for ROS learning scenarios. Equipped with Jetson Nano/Orin Nano/Orin NX controllers and compatible with both ROS1 and ROS2, it integrates deep learning frameworks with TensorRT acceleration, making it ideal for advanced AI applications such as SLAM and vision recognition.
【SLAM Development and Diverse Configuration】JetAcker is equipped with a powerful combination of a 3D depth camera and Lidar. It utilizes a wide range of advanced algorithms including gmapping, hector, karto, cartographer and RRT, enabling precise multi-point navigation, TEB path planning, and dynamic obstacle avoidance. Using 3D vision, it can capture point cloud images of the environment to achieve RTAB 3D mapping navigation.
【Empowered by Large AI Model, Human-Robot Interaction Redefined】 JetAcker deploys multimodal models with ChatGPT at its core, integrating 3D vision and a 6-microphone array. This synergy enhances its perception, reasoning, and actuation capabilities, enabling advanced embodied AI applications and delivering natural, context-aware human-robot interaction.
【Classical Ackermann Steering Mechanism】 The Ackermann chassis combines maneuverability and steering precision, facilitating the learning and validation of real-world vehicle steering principles. This design enables realistic simulation of autonomous driving scenarios for enhanced educational experiences.
【Comprehensive Learning Tutorials】 Through JetAcker's structured curriculum, master cutting-edge technologies including ROS development, SLAM mapping and navigation, 3D depth vision, OpenCV, YOLOv8, MediaPipe, large AI model integration, MoveIt and Gazebo simulation, and voice interaction.
Supported by extensive documentation and video tutorials, our progressive learning system breaks down complex concepts into digestible modules, guiding you from fundamentals to advanced implementations, empowering you to build your own intelligent robotic systems.
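Grid-based 2D SLAM packages like those listed for JetAcker (gmapping, karto, cartographer) all maintain an occupancy grid, updating each cell's log-odds of being occupied as laser hits and misses are observed. A toy single-cell update, assuming a simple inverse sensor model (an illustration of the core idea, not any specific package's implementation):

```python
import math

# Log-odds increments for an assumed inverse sensor model:
# a hit suggests p(occupied) = 0.7, a miss suggests 0.3
L_OCC = math.log(0.7 / 0.3)
L_FREE = math.log(0.3 / 0.7)

def update(logodds, hit):
    """Bayesian update of one cell's log-odds given a hit/miss observation."""
    return logodds + (L_OCC if hit else L_FREE)

def probability(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

# A cell starting unknown (p = 0.5) is observed occupied three times
l = 0.0
for _ in range(3):
    l = update(l, hit=True)
p = probability(l)
```

Because the update is additive in log-odds space, repeated consistent observations drive the cell confidently toward occupied or free, while conflicting observations cancel out, which is what makes the maps robust to sensor noise.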
Hiwonder
Hiwonder TurboPi Raspberry Pi ROS2 Robot Car with Mecanum Wheels, AI Vision & Tracking, Integrated Multimodal Large AI Model ChatGPT, and Voice Interaction
【Raspberry Pi 5 & ROS2 Platform】 TurboPi is powered by Raspberry Pi 5 and runs ROS2, leveraging Python and OpenCV to deliver efficient AI processing and a wide range of robotic applications.
【Multimodal large AI model ChatGPT & Voice Interaction】With an integrated multimodal large AI model and AI voice interaction capabilities, TurboPi supports smart conversations, environment awareness, and flexible task execution for richer human-machine interactions.
【AI Vision & Autonomous Driving】 Equipped with a 2-DOF HD camera, TurboPi offers FPV video feedback, object and color recognition, line following, and autonomous driving features, perfect for creative AI projects.
【360° Omnidirectional Movement】 Featuring a robust metal chassis and Mecanum wheels, TurboPi can move in any direction and rotate on the spot, adapting smoothly to various scenarios.
【Comprehensive Code & Learning Resources】 We provide full Python source code, diverse experiment examples, and detailed course materials to support your journey in mastering AI and programming while inspiring endless innovation.
Hiwonder
Hiwonder JetTank ROS Robot Tank Powered by Jetson Nano with Lidar Depth Camera Touch Screen, Support SLAM Mapping and Navigation
- 【Smart ROS Robots Driven by AI】 JetTank supports the Robot Operating System (ROS). It leverages mainstream deep learning frameworks, incorporates MediaPipe development, and enables YOLO model training. This combination delivers 3D machine vision applications, including autonomous driving, somatosensory interaction, and KCF target tracking.
- 【SLAM Development and Diverse Configuration】JetTank is equipped with a 3D depth camera and Lidar. It utilizes a wide range of advanced algorithms including gmapping, hector, karto and cartographer, enabling precise multi-point navigation, TEB path planning, and dynamic obstacle avoidance.
- 【High-performance Hardware Configurations】JetTank is made of aluminum alloy and employs various hardware components, including reinforced nylon continuous track, 520 Hall encoder gear motors, metal drive wheel, Lidar, Astra Pro Plus depth camera, 6-microphone array, speaker, etc.
- 【Far-field Voice Interaction】The JetTank advanced kit incorporates a 6-microphone array and speaker, allowing for human-robot interaction applications, including text-to-speech conversion, voice wake-up, 360° sound source localization, voice-controlled mapping navigation, etc.
- 【Robot Control Across Platforms】JetTank provides multiple control methods, like WonderAi app (iOS&Android), wireless handle, Robot Operating System (ROS) and keyboard, allowing you to control the robot at will.
Hiwonder
Hiwonder JetMax Pro Jetson Nano Robot Arm with Mecanum Wheel Chassis/ Electric Sliding Rail Support ROS Python
- Powered by Jetson Nano (included)
- Open source and based on ROS
- Deep learning, model training, inverse kinematics
- Abundant sensors for function expansion
- Changeable robot models with mecanum wheel chassis or sliding rail
Hiwonder
JetAuto Pro ROS1 ROS2 Robot Car with Vision Robotic Arm Powered by Jetson Nano Support SLAM Mapping/ Navigation/ Python
- 【Smart ROS Robots Driven by AI】 JetAuto Pro is a professional robotic platform for ROS learning and development, powered by NVIDIA Jetson Nano and supporting the Robot Operating System (ROS). It leverages mainstream deep learning frameworks, incorporates MediaPipe development, and enables YOLO model training.
- 【SLAM Development and Diverse Configuration】JetAuto Pro is equipped with a powerful combination of a 3D depth camera and Lidar. It utilizes a wide range of advanced algorithms including gmapping, hector, karto and cartographer, enabling precise multi-point navigation, TEB path planning, and dynamic obstacle avoidance.
- 【High-performance Vision Robot Arm】JetAuto Pro includes a 6-DOF vision robot arm, featuring intelligent serial bus servos with 35 kg·cm of torque. An HD camera is positioned at the end of the robot arm, providing a first-person perspective for object-gripping tasks.
- 【Far-field Voice Interaction】The JetAuto Pro advanced kit incorporates a 6-microphone array and speaker, allowing for human-robot interaction applications, including text-to-speech conversion, 360° sound source localization, voice-controlled mapping navigation, etc. Integrated with the vision robot arm, JetAuto Pro can implement voice-controlled gripping and transporting.
- 【Robot Control Across Platforms】JetAuto Pro provides multiple control methods, like WonderAi app (compatible with iOS and Android system), wireless handle, Robot Operating System (ROS) and keyboard, allowing you to control the robot at will.
Hiwonder
Hiwonder AiNova Pro 16-in-1 Programmable Building Robotic Kit Toys Support Scratch & Python for Kids Ages 12+
- Buy 1 get 16. One robot car turns into 16 different models
- Includes more than 200 parts. Electronic modules, gripper, servos, sensors, 100+ Lego blocks
- Support Scratch and Python programming
- Encoder motor for precise performance
Hiwonder
Hiwonder MaxArm Open Source Robot Arm Powered by ESP32 Support Python and Arduino Programming Inverse Kinematics Learning
- Powered by ESP32 microcontroller
- Linkage mechanism for better inverse kinematics learning
- Compatible with Hiwonder sensors for implementing different tasks
- Works with the sliding rail to simulate industrial scenarios
- Supports Arduino and Python
- Supports app, PC software, wireless handle, and mouse control
Hiwonder
Hiwonder ArmPi mini 5DOF Vision Robotic Arm Powered by Raspberry Pi 5 Support Python, OpenCV Target Tracking for Beginners
- 【AI Vision Robot Powered by Raspberry Pi】ArmPi mini is a smart vision robot arm powered by Raspberry Pi. It adopts high-performance intelligent servos and an HD camera, and can be programmed in Python. With its impressive capabilities, ArmPi mini unlocks diverse AI applications, such as smart vision-guided recognition and grasping.
- 【AI Vision Recognition and Tracking】ArmPi mini combines an HD camera and the OpenCV library to recognize and locate target objects, enabling the implementation of AI applications like color sorting, target tracking, and intelligent stacking.
- 【Inverse Kinematics Algorithm】ArmPi mini employs an inverse kinematics algorithm, enabling precise target tracking and gripping. It also provides complete source code for the inverse kinematics function assisting you in learning AI.
- 【App Remote Control】 ArmPi mini offers convenient remote control via a dedicated app (iOS/Android) and PC software. In addition, you can access a first-person-view perspective in the app, giving you an immersive user experience.
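The color-recognition step used in projects like color sorting is typically an HSV threshold followed by locating the largest matching blob. A pure-Python sketch of the thresholding stage; a real pipeline would use OpenCV's cv2.cvtColor and cv2.inRange on whole frames, and the hue/saturation ranges below are assumed, not ArmPi mini's actual tuning:

```python
import colorsys

def is_red(r, g, b):
    """Classify one RGB pixel (0-255 channels) as 'red' using an
    assumed HSV range; red hue wraps around 0 degrees."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    # Require a near-0/360 hue plus strong saturation and brightness
    return (hue_deg < 15.0 or hue_deg > 345.0) and s > 0.5 and v > 0.3

def red_mask(pixels):
    """Binary mask over an iterable of (r, g, b) pixels."""
    return [1 if is_red(*p) else 0 for p in pixels]

# Tiny example "image": pure red, green, dark red, white
mask = red_mask([(255, 0, 0), (0, 255, 0), (120, 10, 10), (255, 255, 255)])
```

On a full camera frame, the resulting mask would be cleaned with morphological operations and the largest contour's centroid passed to the inverse kinematics solver as the grip target.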