PaXini's 'Tactile Infrastructure' Gives Robots a Human-Like Touch
- 15 dimensions measured by the PX-6AX-GEN3 tactile sensor, including force, texture, and elastic response.
- 1,140 tactile processing units in the DexH13 robotic hand for fine motor control.
- CNY 1 billion ($139 million) in funding secured in 2025 from investors like JD.com and BYD.
The takeaway: PaXini's 'Tactile Infrastructure' marks a significant advance in robotic dexterity, positioning the company as a key player in embodied AI by tackling two core challenges at once — touch-based physical interaction and large-scale training-data acquisition.
LAS VEGAS, NV – January 07, 2026 – Amid the dazzling displays of CES 2026, a crowd gathered in the ENTERPRISE AI Zone to watch a humanoid robot make ice cream. The robot, TORA-ONE, wasn't just following a pre-programmed routine; it manipulated levers, handled ingredients, and carefully passed a finished cup to a person, showcasing a level of dexterity that felt new. This demonstration was the public face of PaXini Tech's ambitious new strategy: to build the "Tactile Infrastructure" for the next generation of embodied AI.
The Shenzhen-based robotics firm unveiled a complete, full-stack product matrix, arguing that for robots to become truly useful in human environments, they must not only see and think but also feel. The company's presentation went far beyond a single impressive demo, detailing a closed-loop ecosystem of advanced sensors, dexterous hands, humanoid platforms, and a novel data acquisition system designed to solve some of the biggest challenges facing the robotics industry.
The Revolution of Robotic Touch
At the heart of PaXini’s vision is the ability to endow machines with a nuanced sense of touch. The company asserts that this is the missing link for moving robots from caged, repetitive tasks to dynamic, real-world applications. The TORA-ONE's ability to handle an ice cream scoop and a delicate cup relies on a sophisticated, full-body sensing ecosystem.
A key component is the PX-6AX-GEN3 multidimensional tactile sensor, which functions as the robot's fingertips. According to the company, this sensor can measure 15 different dimensions simultaneously, including six-axis force, material texture, and even elastic response, all with a repeatability of less than 0.5% full-scale. This allows the robot to differentiate between a hard lever and a soft cup, adjusting its grip and force in real time.
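To make the "15 dimensions" claim concrete, here is a minimal sketch of what a single sample from such a sensor might look like in software. PaXini has not published a data format; the field names and the split of channels beyond six-axis force, texture, and elastic response are hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class TactileReading:
    """One hypothetical sample from a multidimensional tactile sensor.

    Only the six-axis force, texture, and elastic-response channels are
    named in PaXini's materials; the remaining channels are placeholders.
    """
    force: tuple       # six-axis force/torque: (fx, fy, fz, tx, ty, tz)
    texture: float     # material-texture estimate
    elasticity: float  # elastic response of the contacted surface
    extra: tuple       # remaining, unnamed channels

    def dimensions(self) -> int:
        # total measured dimensions in this sample
        return len(self.force) + 2 + len(self.extra)

# A 15-dimensional sample: 6 force axes + texture + elasticity + 7 others
sample = TactileReading(
    force=(0.0, 0.0, 1.2, 0.0, 0.0, 0.0),
    texture=0.3,
    elasticity=0.8,
    extra=(0.0,) * 7,
)
```

Structuring readings this way lets downstream grasp-control code consume named channels rather than an opaque 15-float vector.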
This granular perception at the fingertips is complemented by force control throughout the robot's body. PaXini showcased the PX6D/PXTS, which it claims is the world's first commercial Hall-effect 6D force/torque sensor designed specifically for embodied AI. Integrated into the robot's wrists and joints, these lightweight sensors provide the precise whole-body force perception needed for stable movement and interaction, preventing the robot from fumbling objects or exerting unsafe pressure.
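The safety role of joint-level 6D force/torque sensing can be illustrated with a simple limit check on a measured wrench. The thresholds and function below are illustrative assumptions, not PaXini specifications.

```python
import math

def is_safe_wrench(wrench, force_limit=20.0, torque_limit=5.0):
    """Check a 6D force/torque sample (fx, fy, fz, tx, ty, tz) against
    magnitude limits, as a whole-body controller might before allowing
    a motion to continue. Limits here are illustrative placeholders."""
    fx, fy, fz, tx, ty, tz = wrench
    force_mag = math.sqrt(fx * fx + fy * fy + fz * fz)    # net force, N
    torque_mag = math.sqrt(tx * tx + ty * ty + tz * tz)   # net torque, N*m
    return force_mag <= force_limit and torque_mag <= torque_limit
```

A controller polling wrist sensors at high rate could gate each motion step on a check like this, backing off before the robot exerts unsafe pressure on an object or a person.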
Translating this rich sensory data into action is the job of the DexH13, a multidimensional tactile dexterous hand. With a four-finger bionic structure and 16 degrees of freedom, the hand is equipped with over 1,140 tactile processing units. At CES, it demonstrated its ability to mirror human hand gestures, stably grasp irregular objects from test tubes to cubes, and perform delicate tasks like turning a knob—feats that require a deep integration of perception and fine motor control.
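The perception-to-action loop that such a hand closes can be sketched as a one-step proportional grip controller: tighten when measured contact force falls below target (the object is slipping), loosen when it rises above (risk of crushing). The gains and step limit are illustrative, not DexH13 parameters.

```python
def adjust_grip(current_force, target_force, gain=0.5, max_step=0.2):
    """One step of a hypothetical proportional grip-force controller.

    Returns the next grip-force command, with the per-step change
    clamped to max_step so corrections stay gentle."""
    error = target_force - current_force
    step = max(-max_step, min(max_step, gain * error))  # clamp the correction
    return current_force + step
```

Run at sensor rate with tactile feedback from the fingertips, a loop like this is what lets one controller stably hold both a rigid test tube and a deformable cup.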
An Ecosystem for Industrial-Scale AI
While a dexterous, ice-cream-scooping robot captures the imagination, PaXini’s strategy is deeply pragmatic and aimed squarely at industrial adoption. The company is positioning itself not just as a hardware manufacturer but as a provider of a complete embodied AI infrastructure. This full-stack approach, encompassing sensors, robotic platforms, and data, is designed to lower the barrier to entry for businesses looking to deploy advanced automation.
This strategy has attracted significant financial backing from major industry players. In 2025, PaXini secured over CNY 1 billion (approx. $139 million) in funding, with investment rounds led by e-commerce giant JD.com and including a major contribution from electric vehicle manufacturer BYD. These partnerships are more than just financial; they represent strategic alignments with companies that have massive logistics and manufacturing operations—prime use cases for humanoid robots. PaXini has already begun validating its platforms in large-scale logistics warehouses and automotive manufacturing facilities.
The competitive landscape for humanoid robots is heating up, with companies like Boston Dynamics, Agility Robotics, and Sanctuary AI all making significant strides. Boston Dynamics' all-electric Atlas is slated for deployment in Hyundai factories, while Agility Robotics' Digit is already moving totes in Amazon warehouses. However, PaXini is carving out a distinct niche by focusing intensely on tactile intelligence as the core enabler. While competitors often highlight locomotion or general cognitive abilities, PaXini bets that a robot’s ability to physically interact with its environment with human-like sensitivity will be the key differentiator for widespread, practical use in complex industrial settings.
Solving AI's 'Data Anxiety' Problem
Perhaps the most forward-looking part of PaXini's announcement was its solution to one of embodied AI's most critical bottlenecks: the scarcity of high-quality training data. AI models learn from data, and for an AI that must interact with the physical world, that data needs to capture the complex interplay of vision, force, touch, and movement. This type of data is notoriously difficult and expensive to collect.
PaXini unveiled what it calls the world's first Omni-Modality Embodied AI Data Acquisition System. The company showcased a replica of the system at its booth, explaining its "human-centric" approach. Rather than relying solely on slow and limited teleoperation (where a human remotely controls a robot), this system is designed to capture comprehensive data from human actions, offering higher reusability and long-term value.
The company has already established the capacity to produce nearly 200 million omni-modality data entries annually. In a move to accelerate industry-wide progress, PaXini announced plans to open this resource globally through a cloud store. This initiative directly addresses the "data anxiety" felt by many developers and researchers, who lack the vast, multimodal datasets needed to train robust physical AI models. While other firms, such as RealMan Robotics, have also begun open-sourcing real-world datasets, PaXini's planned scale and commercial platform represent a major step toward creating a sustainable data infrastructure for the entire embodied AI field. By providing the fuel for learning, PaXini aims to enable AI to truly understand the physical world and make intelligent touch accessible everywhere.
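What an "omni-modality data entry" contains has not been published; a plausible sketch is a record that bundles synchronized streams from several modalities, with a validator a training pipeline might use to reject incomplete entries. All field names below are assumptions for illustration.

```python
# Hypothetical set of modalities one training entry must carry.
REQUIRED_MODALITIES = {"vision", "tactile", "force_torque", "pose"}

def validate_entry(entry: dict) -> bool:
    """Return True if a data entry is timestamped and carries every
    required modality. The schema is a hypothetical illustration."""
    return "timestamp" in entry and REQUIRED_MODALITIES.issubset(entry)

# Example entry captured from a human demonstration
demo = {
    "timestamp": 1736208000.0,
    "vision": "frame_000421.jpg",        # camera frame reference
    "tactile": [0.0] * 15,               # multidimensional touch sample
    "force_torque": (0.0,) * 6,          # 6D wrench at the wrist
    "pose": {"x": 0.1, "y": 0.0, "z": 0.4},
}
```

Checks like this matter at the scale the company describes: with hundreds of millions of entries per year, malformed or modality-incomplete records would otherwise silently degrade trained models.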
