Watch awkward Chinese humanoid robot lay it all down on the dance floor

The humanoid machine starts to dance — not just shuffling from side to side, but tearing through a fast, tightly choreographed routine that looks halfway between a TikTok challenge and a club performance, raising new questions about how close robots are getting to moving like us.

A dancing robot that almost looks human

The star of the new viral clip is "Adam-U Ultra," a full-size humanoid robot built by Chinese company PNDbotics. In the video, the bot launches into what the firm labels a Charleston, though the moves feel much closer to hip-hop: sharp arm pops, twisting hips, quick footwork and syncopated steps that stay perfectly on beat.

The performance feels slightly off, in that oddly stiff way only a robot can manage, yet the coordination is undeniably impressive. Arms, waist and ankles snap through complex angles in sync with the music, with no obvious wobbling or near-falls. For a machine bolted together from motors and metal, Adam-U Ultra looks surprisingly comfortable on the dance floor.

Adam-U Ultra completes a fast, multi-step routine without a slip, a stumble or a missed beat, showcasing unusual control for a humanoid robot.

Forty-one joints, one awkward robo-dancer

The secret behind those moves lies in Adam's mechanical design. PNDbotics says the platform has 41 independently controlled joints, each driven by its own motor, known as an actuator, spread across its limbs and torso. Together they let the robot bend, twist and pivot much like a human skeleton.

That number matters. More joints mean more freedom of movement, so Adam can tilt its torso while rotating its hips, bend its knees while rolling its ankles, and swing its arms in complex arcs instead of simple straight lines. For something like dancing, where timing and posture change every fraction of a second, that flexibility is crucial.
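
The payoff of extra joints can be pictured with a toy model (an illustration, not PNDbotics code): in a chain of rotating joints, each joint's angle adds to the heading of every link after it, so more joints mean far more ways to place the same fingertip.

```python
import math

def fingertip(joint_angles, link_length=0.3):
    """Forward kinematics for a simple planar arm: chain each
    joint's rotation to find where the arm's tip ends up."""
    x, y, heading = 0.0, 0.0, 0.0
    for angle in joint_angles:
        heading += angle  # each joint adds its own rotation to the chain
        x += link_length * math.cos(heading)
        y += link_length * math.sin(heading)
    return x, y

# One joint pins the tip to a circle; with two or more joints the
# same tip can reach a whole region of positions at many angles.
single = fingertip([math.pi / 2])
double = fingertip([math.pi / 2, -math.pi / 2])
```

Scale the same idea up to 41 joints across arms, legs and a torso, and the space of reachable poses explodes, which is exactly what a fast dance routine exploits.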

  • 41 actuated joints for arms, legs, hands and torso
  • Height: around 1.6 metres (about 5 feet 3 inches)
  • Weight range: 60–63 kilograms (132–139 pounds)
  • Built for both lab work and real-world tasks

Even with all those moving parts, the different Adam models remain relatively light. The heaviest version, Adam Pro, weighs about as much as a small adult at 63 kilograms, while the lightest comes in at 60 kilograms. Keeping weight down helps with stability and allows the motors to shift the robot’s centre of gravity quickly, which is vital for any rapid stepping or spinning on two legs.

An AI “brain” built for motion

Under the shell, Adam-U Ultra is driven by a dedicated AI computing platform built around Nvidia’s Jetson Orin module. That compact board houses a CPU, GPU and other chips in a single system, acting as the robot’s “brain” and handling everything from balance to vision.

PNDbotics relies on advanced control software that combines whole-body control with model predictive control. Put simply, the robot constantly runs simulations of what will happen to its body if it moves a foot here or swings an arm there. It then picks the action that keeps it upright while still hitting the next dance beat.
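
That "simulate, then pick" loop can be sketched in a few lines of Python (a toy one-dimensional illustration, not PNDbotics' controller). A tilting body tries several candidate corrections, predicts where each one leaves it a few steps ahead, and commits only to the best:

```python
def predict(tilt, velocity, action, dt=0.02):
    """Toy physics: an inverted-pendulum-like body that falls
    further the more it tilts, nudged back by the chosen action."""
    velocity += (tilt - action) * dt  # gravity pulls over, action pushes back
    tilt += velocity * dt
    return tilt, velocity

def mpc_step(tilt, velocity, candidates=(-1.0, -0.5, 0.0, 0.5, 1.0), horizon=10):
    """Model predictive control in miniature: simulate each candidate
    action over a short horizon, pick the one ending closest to upright."""
    def cost(action):
        t, v = tilt, velocity
        for _ in range(horizon):
            t, v = predict(t, v, action)
        return abs(t)  # upright means predicted tilt near zero
    return min(candidates, key=cost)

best = mpc_step(tilt=0.3, velocity=0.0)  # → 1.0, the strongest corrective push
```

A real controller does this for dozens of joints at once, hundreds of times per second, with a dance pose as part of the cost rather than just staying upright.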

The robot is not just replaying a script; it is constantly adjusting for balance and stability based on simulated scenarios and sensor feedback.

Much of this behaviour is trained in large-scale virtual environments before it ever reaches the physical robot. Neural networks practise motions in simulation, working through thousands of variations of walking, turning, bending or dancing, learning which sequences lead to success and which result in a robotic face-plant.

Vision, language and action in one body

Adam-U Ultra is more than a blind, pre-programmed dancer. The platform includes what PNDbotics describes as a “vision-language-action” model, often shortened to VLA. This is a type of embodied AI system that combines sight, understanding and physical control.

What the VLA system actually does

VLA ties three capabilities together:

  • Vision: sensors build a 3D map of the surroundings
  • Language: the robot parses spoken or written instructions
  • Action: AI converts goals into precise joint movements

In practice, that means you could tell the robot, “Walk to the table and wave,” and its software would identify the table, plan a path and move its arm at the right moment. For dance, a human operator might describe the routine or set high-level cues, and the VLA system would translate those into steps and poses.
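
That "walk to the table and wave" hand-off can be pictured as three stages, sketched below with made-up helper names (a hypothetical illustration, not PNDbotics' software; a real VLA model uses learned networks rather than keyword matching):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    position: tuple  # (x, y) in metres, from the robot's 3D map

def vision(scene):
    """Vision: turn raw sensor data into labelled objects with positions."""
    return [Detection(label, pos) for label, pos in scene.items()]

def language(instruction):
    """Language: reduce an instruction to a target object and a gesture."""
    words = instruction.lower().split()
    target = "table" if "table" in words else None
    gesture = "wave" if "wave" in words else None
    return target, gesture

def action(detections, target, gesture):
    """Action: convert the goal into a crude, ordered motion plan."""
    plan = [("walk_to", d.position) for d in detections if d.label == target]
    if gesture:
        plan.append(("gesture", gesture))
    return plan

scene = {"table": (2.0, 1.5), "chair": (0.5, -1.0)}
plan = action(vision(scene), *language("Walk to the table and wave"))
# plan: [("walk_to", (2.0, 1.5)), ("gesture", "wave")]
```

The point of the single pipeline is that each stage feeds the next automatically, so a spoken sentence ends up as joint trajectories without a human writing the intermediate steps.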

Adam’s vision stack leans on an Intel RealSense D455 depth camera, which gives it distance data for each point in its field of view. That sensor, along with lidar units and standard cameras, helps the robot understand where the floor, walls and obstacles sit in three-dimensional space. Real-time awareness is what stops the bot from kicking a chair mid-routine.

A family of robots behind the viral clip

The dancing figure seen online is part of a broader Adam platform. PNDbotics is developing multiple full humanoid versions alongside a stationary unit called Adam-U. That fixed model acts as a research and data collection tool, letting engineers test control algorithms, sensor setups and training methods without worrying about the robot crashing to the ground.

On the company’s roadmap are four fully mobile humanoid variants, each with different levels of movement, sensory gear and computing power. The dancing Adam-U Ultra appears pitched as a showcase model, demonstrating what the highest-spec version can achieve when pushed.

Adam line                       | Primary focus
Adam-U (stationary)             | Data collection and research platform
Mobile humanoid versions        | Locomotion, manipulation, interaction with people
Adam-U Ultra (showcase dancer)  | Demonstration of balance, agility and control

Beyond TikTok: what these robots could actually do

PNDbotics pitches the Adam line as more than a novelty. The company says these robots could slot into a range of roles that require a human-like body but don’t always have a human available.

In research and lab settings, an Adam unit could assist with repetitive experiments, precise handling of equipment or monitoring instruments around the clock. Because the robot can mimic human motion, tasks designed for people—such as turning knobs, pushing trolleys or opening fridges—need less redesign.

The firm also highlights medical and rehabilitation applications. Think of a robot demonstrating exercises to stroke patients, tracking whether they perform each movement correctly, or helping physiotherapists by handling basic, supervised routines. In training environments, a humanoid bot might act as a stand-in patient, allowing medical staff to practise lifting, repositioning or support techniques repeatedly.

PNDbotics suggests Adam could assist in rehabilitation, monitor patient progress or even collaborate with clinicians in specific, highly controlled surgical tasks.

There is also interest in more traditional industrial jobs. A robot shaped roughly like a person can work on existing manufacturing lines, use standard tools and move around spaces built for humans. Outside factories, Adam could act as a concierge, receptionist or guided-tour assistant, greeting visitors, giving directions and, occasionally, breaking into that now-famous dance routine for marketing flair.

Why the dancing looks awkward – and why that matters

For all the hype, Adam-U Ultra does not move exactly like a human dancer, and that awkwardness is part of the story. Slight delays in arm swings, rigid shoulders and the overly clean timing signal that this is a machine following precise trajectories, not a person improvising to the music.

Those visible imperfections can actually make the robot more relatable. People instinctively notice the gap between human and machine, which keeps expectations in check. Right now, Adam is following pre-designed patterns and carefully tuned AI controllers; it is not about to freestyle in a nightclub or read a crowd’s mood.

The gap between this kind of robot and a fully autonomous, general-purpose humanoid is still wide. Each new video—whether dancing, running or doing parkour—is usually the result of many failed trials, simulation runs and limited test conditions. Real workplaces remain messy, unpredictable and socially complex.

Key concepts behind the tech

Several terms crop up often around this project and are worth unpacking briefly:

  • Actuator: A motorised joint that controls movement. The more actuators, the more nuanced poses a robot can achieve.
  • Model predictive control: A method where software simulates future states of the robot’s body before choosing the next move.
  • Embodied AI: Artificial intelligence running in a physical body, which has to deal with gravity, friction and real-world uncertainty.
  • Vision-language-action model: An AI system that combines perception, understanding of instructions and motion planning in a single pipeline.

Put together, these ingredients are turning what once looked like clunky theme-park animatronics into robots that can walk, lift, and, yes, dance with something approaching personality. Today it is an awkward Charleston-by-way-of-hip-hop video. Next, the same control systems may be quietly guiding machines in hospitals, warehouses and factories.

Originally posted 2026-03-08 18:15:51.
