Part 3/11:
The robot comes in multiple editions tailored to specific applications: Light, Edu, and Super, covering everything from model training to full-scale interaction deployments. Its Open SDK exposes full body states, sensor data, joint control, and task orchestration, making it a versatile platform for both teleoperation and autonomous AI training pipelines. Thanks to cloud integration, developers can remotely push motion libraries or custom control logic, and the system supports onboard language models for natural language interaction.
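To make the description above concrete, here is a minimal Python sketch of what working against such an SDK might look like: reading body state, commanding a joint, pushing a motion library through a cloud endpoint, and queuing a high-level task. Every name in it (RobotClient, get_body_state, set_joint_target, push_motion_library, enqueue_task, the joint names, and the host address) is hypothetical; the source does not document the actual SDK API, so this is purely an illustrative stub, not the vendor's interface.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class BodyState:
    """Snapshot of the kind of full-body state such an SDK might expose."""
    joint_positions: Dict[str, float]          # radians, keyed by joint name
    imu_orientation: tuple = (0.0, 0.0, 0.0)   # roll, pitch, yaw in radians
    battery_level: float = 1.0                 # 0.0-1.0


class RobotClient:
    """Hypothetical stand-in for an open robot SDK client.

    All method names are illustrative; the real SDK will differ.
    """

    def __init__(self, host: str):
        self.host = host
        self._task_queue: List[str] = []

    def get_body_state(self) -> BodyState:
        # A real client would read this over the network; here we return a stub.
        return BodyState(joint_positions={"left_knee": 0.35, "right_knee": 0.34})

    def set_joint_target(self, joint: str, position_rad: float) -> None:
        # A real client would stream this command to the onboard controller.
        print(f"commanding {joint} -> {position_rad:.2f} rad")

    def push_motion_library(self, name: str) -> None:
        # Stands in for pushing a prebuilt motion set via the cloud backend.
        print(f"uploading motion library '{name}' to {self.host}")

    def enqueue_task(self, task: str) -> None:
        # Task orchestration: queue a high-level behavior by name.
        self._task_queue.append(task)


if __name__ == "__main__":
    robot = RobotClient(host="192.168.1.42")   # placeholder address
    state = robot.get_body_state()
    print("knee angles:", state.joint_positions)
    robot.set_joint_target("left_knee", 0.50)
    robot.push_motion_library("wave_hello_v1")
    robot.enqueue_task("greet_visitor")
```

The point of the sketch is the shape of the workflow the article describes (state readout, joint-level control, cloud-pushed motion libraries, task orchestration), not any specific call signature.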
This level of sophistication hints at exciting possibilities for integration across industries, especially as the robot is set to make its public debut at the World Robot Conference in Beijing, where live demonstrations are expected shortly.