Omnipresent Perception | ArmPi Ultra Unlocks New Desktop Embodied Intelligence
As embodied intelligence technology moves toward lighter, more context-aware applications, desktop robot arms are evolving from execution-only devices into fully autonomous systems capable of perception, decision-making, and execution.
Hiwonder ArmPi Ultra could completely reshape the way you think about these devices. Beneath its compact exterior lies an intelligent robotic system integrating 3D vision, AI, and proprietary inverse kinematics algorithms. More than just a machine that sees, it perceives, understands, and acts — achieving true harmony between its hand, eye, and brain.
Dual-Brain Architecture: The Perfect Balance of Power and Precision
ArmPi Ultra features a distributed computing architecture built on a dual-brain design: the Raspberry Pi 5 serves as the high-level decision unit, while the STM32 acts as the low-level control core. The Raspberry Pi 5 handles intelligent decision-making and visual processing, deciding what to grab and how to grab it; the STM32 handles real-time motion control, precisely governing how far and how fast each joint moves. Like a research strategist working seamlessly with an operations expert, this dual-brain synergy bridges the gap between thinking and doing, achieving a seamless blend of cognitive intelligence and motion precision.
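To make the division of labor concrete, here is a minimal sketch of how a high-level decision on the Raspberry Pi 5 could be handed off to the low-level controller over a serial link. The packet layout, port name, and checksum are illustrative assumptions, not Hiwonder's actual bus-servo protocol.

```python
# Minimal sketch of the high-level/low-level split, assuming the Raspberry Pi 5
# talks to the STM32 servo controller over a UART link (e.g. /dev/ttyAMA0).
# The frame format below is hypothetical, not Hiwonder's real bus-servo protocol.
import struct
import serial

HEADER = 0x55  # hypothetical frame header byte

def move_joint(port: serial.Serial, servo_id: int, position: int, duration_ms: int) -> None:
    """Ask the low-level controller to move one servo to `position` over `duration_ms`."""
    payload = struct.pack("<BHH", servo_id, position, duration_ms)
    checksum = (~sum(payload)) & 0xFF  # simple one-byte checksum (illustrative)
    frame = bytes([HEADER, HEADER, len(payload) + 1]) + payload + bytes([checksum])
    port.write(frame)

if __name__ == "__main__":
    # The Raspberry Pi 5 decides *what* to do (target joint positions);
    # the STM32 side handles *how* (real-time interpolation and servo timing).
    with serial.Serial("/dev/ttyAMA0", baudrate=115200, timeout=0.1) as link:
        move_joint(link, servo_id=1, position=500, duration_ms=800)
```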
3D Vision — Seeing the World in Depth
Hiwonder ArmPi Ultra is equipped with its own “eyes”: a 3D structured-light depth camera. Unlike a regular camera, it projects laser patterns and analyzes the reflections to build a real-time 3D map of its surroundings. Put simply, it does more than recognize objects; it understands their exact position, distance, shape, and orientation, enabling precise grasping even at the millimeter scale.
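As a rough illustration of what a depth camera adds, the sketch below turns a pixel plus its measured depth into a 3D grasp point using the standard pinhole model. The intrinsic values (fx, fy, cx, cy) are placeholders, not the actual calibration of ArmPi Ultra's camera.

```python
# Sketch: converting a depth-camera pixel into a 3D point with the pinhole model.
# The intrinsics below are placeholder values for a 640x480 image, not the real
# calibration of ArmPi Ultra's structured-light camera.
import numpy as np

FX, FY = 615.0, 615.0  # focal lengths in pixels (placeholder)
CX, CY = 320.0, 240.0  # principal point (placeholder)

def deproject(u: int, v: int, depth_m: float) -> np.ndarray:
    """Convert pixel (u, v) with depth in meters to an (x, y, z) camera-frame point."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

# Example: an object centered at pixel (400, 260), measured 0.35 m away
point_cam = deproject(400, 260, 0.35)
print("grasp target in camera frame (m):", point_cam)
```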

Inverse Kinematics for Precision Hand-Eye Coordination
ArmPi Ultra is powered by Hiwonder's self-developed inverse kinematics algorithm, which intelligently converts target positions into precise joint movements. Much as the human brain works out the angle of each joint when reaching for something, the algorithm models the robotic arm and calculates optimal movements in real time while keeping the arm stable throughout the motion. Unlike traditional robotic arms locked into fixed motion routines, ArmPi Ultra sees and reacts in real time, picking up objects exactly as it perceives them.
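To give a feel for what inverse kinematics does, here is a classic two-link planar solver: given a target point for the wrist, it computes the shoulder and elbow angles analytically. This is a textbook illustration only, not Hiwonder's proprietary algorithm, and the link lengths are example values.

```python
# Illustrative 2-link planar inverse kinematics (not Hiwonder's actual solver):
# given a target (x, y) for the wrist, compute the two joint angles analytically.
import math

L1, L2 = 0.10, 0.10  # link lengths in meters (example values)

def two_link_ik(x: float, y: float) -> tuple[float, float]:
    """Return (shoulder, elbow) angles in radians for a reachable planar target."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

print(two_link_ik(0.15, 0.05))  # joint angles needed to reach 15 cm out, 5 cm up
```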

Powered by 25kg high-torque smart bus servos and leveraging MoveIt’s RRT and PRM algorithms, ArmPi Ultra can autonomously plan optimal paths even on a cluttered desktop. This enables truly flexible obstacle avoidance and precise execution, delivering intelligent “hand–eye–brain” coordinated operation.
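The snippet below sketches what sampling-based planning looks like through MoveIt's Python interface (MoveIt 1, moveit_commander). The move group name "arm" and the planner ID are assumptions; the names in ArmPi Ultra's own MoveIt configuration may differ, and its software stack may use a different ROS version.

```python
# Sketch of sampling-based planning via MoveIt 1's moveit_commander interface.
# Group name and planner ID are assumptions for illustration.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("armpi_ultra_plan_demo", anonymous=True)

arm = moveit_commander.MoveGroupCommander("arm")   # assumed planning group name
arm.set_planner_id("RRTConnectkConfigDefault")     # an RRT-family planner from OMPL

target = Pose()
target.position.x, target.position.y, target.position.z = 0.15, 0.0, 0.10
target.orientation.w = 1.0

arm.set_pose_target(target)
success = arm.go(wait=True)  # plan around known obstacles, then execute
arm.stop()
arm.clear_pose_targets()
print("planning and execution succeeded:", success)
```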
AI Interaction — Mastering Embodied Intelligence
Hiwonder ArmPi Ultra integrates the ChatGPT large AI model, so it does more than follow preset commands: it listens to your voice instructions and works out what to do. Say something like “Clear the hazardous items from the desktop” or “Pick up the rectangular block on the desk”, and it will comprehend your instruction, recognize the objects, plan its movements, and execute the task. Controlling the robot arm feels as smooth and natural as working alongside a human partner.
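As a rough sketch of this kind of language-to-action pipeline, the example below asks a ChatGPT-style model to turn a spoken command into a structured action the arm could act on. The prompt, model name, and JSON schema are illustrative assumptions, not Hiwonder's actual integration.

```python
# Sketch: mapping a natural-language command to a structured robot action with a
# ChatGPT-style model. Prompt, model name, and schema are illustrative only.
import json
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

SYSTEM = ("You control a desktop robot arm. Reply only with JSON like "
          '{"action": "pick_and_place", "object": "...", "destination": "..."}.')

def parse_command(text: str) -> dict:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model would do here
        messages=[{"role": "system", "content": SYSTEM},
                  {"role": "user", "content": text}],
    )
    return json.loads(reply.choices[0].message.content)

plan = parse_command("Pick up the rectangular block on the desk")
print(plan)  # e.g. {"action": "pick_and_place", "object": "rectangular block", ...}
```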
💡Note: Get ArmPi Ultra tutorials here, or visit the Hiwonder GitHub for open-source code.

Simulation & Expansion — Creativity Without Limits
ArmPi Ultra supports multi-modal expansion. Combine it with a Mecanum-wheel chassis to form a mobile robotic arm with full-range operational capability. Add a motorized sliding rail and the arm can reach farther and move sideways with ease, sliding left and right to grab items as your ultimate right-hand helper. Pair it with a conveyor belt to build your own mini automated factory, watch the arm sort items with pinpoint accuracy, and explore the possibilities of intelligent automation.

Beyond versatile hardware expansion, Hiwonder ArmPi Ultra also opens up a high-precision virtual simulation playground. Using Gazebo and MoveIt, you can create a digital twin of the robotic arm on your computer, then plan motions, avoid collisions, and test algorithms without the physical robot, rehearsing real-world tasks safely and efficiently. Whether in virtual simulations or real-world projects, ArmPi Ultra empowers you to experiment, innovate, and push the limits of what's possible.
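One common digital-twin workflow is to describe the desktop obstacles in MoveIt's planning scene so the planner can route around them before anything moves on the real arm. The sketch below uses MoveIt 1's moveit_commander; the frame name, object name, and dimensions are assumptions for illustration.

```python
# Sketch: populating the MoveIt planning scene so the digital twin plans around
# desktop obstacles before running on the real arm. Names are assumed, not taken
# from ArmPi Ultra's actual configuration.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("armpi_ultra_scene_demo", anonymous=True)

scene = moveit_commander.PlanningSceneInterface()
rospy.sleep(1.0)  # give the scene interface time to connect

box = PoseStamped()
box.header.frame_id = "base_link"  # assumed robot base frame
box.pose.position.x, box.pose.position.y, box.pose.position.z = 0.20, 0.10, 0.025
box.pose.orientation.w = 1.0

# A 5 cm cube standing in for a desktop obstacle; planners will route around it.
scene.add_box("desk_obstacle", box, size=(0.05, 0.05, 0.05))
```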

Developer-Friendly, Learn with Ease
ArmPi Ultra delivers a complete robotics learning experience. Whether you’re just starting out or already skilled in robotics, it makes developing intelligent robots easy and engaging. From low-level motor control and sensor data handling to motion planning, visual recognition, and advanced human–robot interaction, every feature comes with step-by-step tutorials and supporting materials, giving you a full-stack, hands-on learning journey where you learn while building and immediately put your skills into practice. You can explore all the open-source code on our Hiwonder GitHub.

ArmPi Ultra: Perceive, Decide, Act. Everything at your fingertips. Share the tricks you’d like to see in the comments, and your idea may be selected for development. We also appreciate your support through follows and shares.