
Robotics

  • 【Classic Ackermann Steering Structure】JetAcker is built upon an Ackermann chassis providing an opportunity to learn and validate robots based on the Ackermann steering structure.
  • 【Dive into AI Algorithms and Smart Robotics】JetAcker runs on NVIDIA Jetson Nano B01, supports ROS, and utilizes deep learning frameworks, MediaPipe, YOLO training, and TensorRT acceleration for diverse 3D machine vision applications.
  • 【SLAM Development and Diverse Configuration】JetAcker is equipped with a 3D depth camera and Lidar, enabling remote communication, precise 2D mapping navigation, TEB path planning, and dynamic obstacle avoidance.
  • 【Robot Control Across Platforms】JetAcker provides multiple control methods, such as the WonderAi app (compatible with iOS and Android), wireless handle, Robot Operating System (ROS) and keyboard; a minimal ROS velocity-command sketch follows this list.
  • 【Detailed Tutorials and Professional After-sales Service】We offer an extensive collection of tutorials covering a wide range of topics, including getting started, the Linux operating system, ROS, OpenCV, depth camera and Lidar mapping navigation, autonomous driving, 3D vision interaction and voice interaction courses.
  • 【High-performance Hardware Configurations】JetAcker features an aluminum alloy bracket, CNC steering, and a range of hardware: 100mm rubber wheels, 520 Hall encoder motors, Lidar, ORBBEC Astra Pro Plus camera, 240° pan-tilt, multi-functional expansion board, motor driver, and 11.1V 6000mAh Lipo battery.
  • 【Support Secondary Development】You can utilize the existing learning tutorials to carry out secondary development. Please note that our technical support is limited to the existing learning tutorials.
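The JetAcker bullets above mention ROS-based control. As a rough illustration, a minimal velocity-command sketch for a ROS chassis might look like the following; the /cmd_vel topic name, geometry_msgs/Twist message type, and speed values are assumptions for illustration, not details taken from the product or its tutorials.

```python
#!/usr/bin/env python
# Minimal sketch: publish a velocity command to a ROS-controlled chassis.
# Assumptions: ROS 1 with rospy installed, and a robot that subscribes to
# /cmd_vel (geometry_msgs/Twist). Neither detail is confirmed by this listing.
import rospy
from geometry_msgs.msg import Twist

def drive_forward(speed=0.2, duration=2.0):
    """Drive straight ahead at `speed` m/s for `duration` seconds, then stop."""
    rospy.init_node('jetacker_demo', anonymous=True)
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)  # 10 Hz command stream

    cmd = Twist()
    cmd.linear.x = speed   # forward velocity
    cmd.angular.z = 0.0    # no steering input

    end_time = rospy.Time.now() + rospy.Duration(duration)
    while not rospy.is_shutdown() and rospy.Time.now() < end_time:
        pub.publish(cmd)
        rate.sleep()

    pub.publish(Twist())   # zero command to stop the chassis

if __name__ == '__main__':
    try:
        drive_forward()
    except rospy.ROSInterruptException:
        pass
```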
  • Powered by Raspberry Pi 4B/ CM4 and open source
  • Support Python programming
  • 360-degree omnidirectional movement and field of view
  • Loaded with a glowing ultrasonic sensor for obstacle avoidance
  • Combine camera and OpenCV to recognize and track items (a minimal tracking sketch follows this list)
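Several of the kits above combine a camera with OpenCV for recognition and tracking. A minimal color-tracking sketch under assumed settings is shown below; the camera index and HSV color range are placeholders, not values from the kit's tutorials.

```python
# Minimal sketch: track a colored object with OpenCV.
# Assumptions: a camera at index 0 and a rough HSV range for a red target;
# both are placeholders, not values taken from the kit's tutorials.
import cv2
import numpy as np

LOWER_HSV = np.array([0, 120, 70])    # assumed lower bound for the target color
UPPER_HSV = np.array([10, 255, 255])  # assumed upper bound

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        (x, y), radius = cv2.minEnclosingCircle(largest)
        if radius > 10:  # ignore small noise blobs
            cv2.circle(frame, (int(x), int(y)), int(radius), (0, 255, 0), 2)
    cv2.imshow('tracking', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```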
  • Powered by NVIDIA Jetson Nano and based on ROS
  • Support depth camera and Lidar for mapping and navigation
  • Optional 7-inch touch screen for parameter monitoring and debugging
  • Optional 6-microphone array for voice interaction
  • Open-source, and ample PDF materials and tutorials are provided
  • Powered by Jetson Nano(included)
  • Open source and based on ROS
  • Deep learning, model training, inverse kinematics
  • Abundant sensors for function expansion
  • Changeable robot models with mecanum wheel chassis or sliding rail
  • Powered by NVIDIA Jetson Nano(included) and based on ROS
  • FPV robotic arm for picking, sorting and transporting
  • Depth camera and Lidar for mapping and navigation
  • 7-inch touch screen to monitor and debug parameters
  • Optional 6-microphone array for voice interaction
  • Open-source, ample PDF materials and tutorials are provided
  • Powered by ESP32 microcontroller
  • Linkage mechanism for better inverse kinematics learning (a worked 2-link example follows this list)
  • Compatible with Hiwonder sensors for implementing different tasks
  • Works with the sliding rail to simulate industrial scenarios
  • Support Arduino and Python
  • Support APP, PC software, wireless handle and mouse controls
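The linkage bullet above refers to inverse kinematics learning. A worked planar 2-link example is sketched below; the link lengths and elbow-down solution are illustrative assumptions, not the geometry or firmware of any specific arm.

```python
# Minimal sketch: closed-form inverse kinematics for a planar 2-link arm.
# Link lengths l1/l2 and the elbow-down branch are illustrative assumptions,
# not the actual geometry of any particular robotic arm.
import math

def two_link_ik(x, y, l1=0.10, l2=0.10):
    """Return joint angles (theta1, theta2) in radians that place the
    end effector of a planar 2-link arm at (x, y), elbow-down solution."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    cos_t2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= cos_t2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(cos_t2)  # elbow-down branch
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Example: reach a point 15 cm ahead and 5 cm up.
print(two_link_ik(0.15, 0.05))
```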
  • Powered by Raspberry Pi 4B and open source
  • Mini portable size with HD wide-angle camera
  • Integrate OpenCV and FPV vision for recognition and tracking
  • Built-in inverse kinematics for precise servo control
  • Support PC software and App control
  • Abundant learning materials are provided
  • Powered by NVIDIA Jetson Nano and based on ROS
  • Support depth camera and Lidar for mapping and navigation
  • Optional 7-inch touch screen for parameter monitoring and debugging
  • Optional 6-microphone array for voice interaction
  • Open-source, and ample PDF materials and tutorials are provided
  • Powered by Raspberry Pi 4B/ CM4 (included in the kit) and based on ROS
  • Equipped with HD wide-angle camera
  • Support FPV transmitted image
  • Powerful vehicle chassis for omni-directional movement
  • Open Source and abundant tutorial materials
  • Powered by NVIDIA Jetson Nano and based on ROS
  • Support depth camera and Lidar for mapping and navigation
  • Upgraded inverse kinematics algorithm
  • Capable of deep learning and model training
  • Open source and provide ample PDF materials and tutorials
  • Compatible with ArmPi FPV robotic arm, various sensors, modules and controllers
  • Nice damping effect with 8V encoder geared motors
  • Encoder geared motors and wheels contribute to flexible movement 
  • Strong and durable high-strength aluminum alloy chassis
$149.99

Hiwonder

  • Self-driving traffic system learning
  • Loaded with WonderCam vision module
  • Support Scratch graphical programming
  • Programmable mini autonomous driving education kit
$269.99

Hiwonder

  • Powered by Raspberry Pi 4B/ CM4
  • Tiny desktop robot car with 5DOF robot arm
  • Learn AI machine vision by OpenCV
  • Realize omni-directional movement with mecanum wheels (a wheel-speed mapping sketch follows this list)
  • FPV vision for intelligent picking, target tracking, object sorting, etc.
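The mecanum-wheel bullet above advertises omni-directional movement. A common way to map a desired body velocity to the four wheel speeds is sketched below; the wheel radius, chassis dimensions, and sign convention are assumptions for illustration, not the kit's actual parameters.

```python
# Minimal sketch: inverse kinematics for a mecanum-wheel chassis, mapping a
# desired body velocity (vx forward, vy left, wz yaw rate) to the angular
# speeds of the four wheels. Radius, geometry, and sign convention are
# assumed for illustration, not taken from the kit.

WHEEL_RADIUS = 0.03   # meters (assumed)
HALF_LENGTH = 0.08    # half the front-rear wheel separation (assumed)
HALF_WIDTH = 0.07     # half the left-right wheel separation (assumed)

def mecanum_wheel_speeds(vx, vy, wz):
    """Return (front_left, front_right, rear_left, rear_right) wheel speeds
    in rad/s for the requested chassis velocity, using the common X-roller
    convention."""
    k = HALF_LENGTH + HALF_WIDTH
    fl = (vx - vy - k * wz) / WHEEL_RADIUS
    fr = (vx + vy + k * wz) / WHEEL_RADIUS
    rl = (vx + vy - k * wz) / WHEEL_RADIUS
    rr = (vx - vy + k * wz) / WHEEL_RADIUS
    return fl, fr, rl, rr

# Example: translate straight left at 0.2 m/s with no rotation.
print(mecanum_wheel_speeds(0.0, 0.2, 0.0))
```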
  • Powered by Raspberry Pi 4B/ CM4 and based on ROS
  • Capable of gait planning, adopting linkage kinematics
  • Possess machine vision and work with OpenCV
  • Support Gazebo simulation
  • Ample tutorials and open-source codes are provided
  • Loaded with robotic arm and HD camera
  • First person view for various AI creative games
  • Built-in inverse kinematics algorithm and works with OpenCV
  • Powered by Raspberry Pi 4B/ CM4 and support Python programming
  • Powered by Raspberry Pi 4B and support Python programming
  • Work with OpenCV to realize AI vision recognition and tracking
  • Right-hand and left-hand versions are available
  • Open-source code and detailed tutorials are provided