On September 12, the 26th China International Optoelectronic Exposition (CIOE 2025) concluded at the Shenzhen World Exhibition and Convention Center. As a global benchmark for the optoelectronics industry, this year's event attracted over 3,000 exhibitors from more than 30 countries and regions, covering optical communications, lasers, infrared, sensing, intelligent manufacturing, and other frontier fields, and set a new record for scale.
As a leading enterprise in robotic perception and localization, Slamtec was invited to participate, presenting its core sensors and robotic mobility solutions. Slamtec showcased how high-performance perception and motion capabilities accelerate the implementation of embodied intelligence.
Spotlight on Perception and Motion
Slamtec’s booth drew significant attention from industry experts and visitors alike.
Our featured products included:
- Aurora Integrated Perception Sensor
  - Combines LiDAR, IMU, and vision for multimodal sensing
  - Enables geometric mapping plus semantic understanding, allowing robots not only to "see" but also to "understand"
  - Applicable to humanoid robots, lawn mowers, AMRs, and inspection robots
  - Provides a complete solution from spatial perception to data output

CEO Insights: The Future of Multimodal Perception
At the 2025 (2nd) Humanoid Robot Perception & Control Summit and Embodied Intelligence Data Collection & Training Forum, Slamtec founder & CEO Chen Shikai delivered a keynote speech titled "Aurora: A Multimodal Bionic Sensor for Spatial Perception in Embodied Intelligence."
Chen Shikai highlighted three key aspects:
- Industry Trend: Robotics is evolving from geometric perception to semantic understanding, moving beyond "walking and seeing" to "understanding and interacting."
- Technical Challenges: Real-time data synchronization, stability, and generalization remain major hurdles. Single-modality sensors are insufficient for complex environments, requiring multimodal sensor collaboration. Algorithms must also balance precision with efficiency on edge-computing platforms.
- Aurora's Approach: By integrating LiDAR, IMU, and vision, Aurora achieves millisecond-level synchronization, outputting comprehensive geometric and semantic data. Its bionic design, inspired by human perception, enables environmental sensing, dynamic obstacle detection, and scene segmentation in one solution.

- In humanoid robots, it enables precise gait navigation and interaction safety.
- In logistics and inspection, it enhances dynamic obstacle prediction, reducing false stops and misdetections.
- In emerging fields such as autonomous driving and security monitoring, it serves as a spatial perception "neural hub," improving decision-making accuracy.
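The millisecond-level synchronization described above can be illustrated with nearest-timestamp alignment, a common building block in multimodal fusion. This is a minimal sketch under assumed conditions, not Slamtec's actual implementation; the function name, timestamps, and sample rates below are hypothetical.

```python
from bisect import bisect_left

def align_nearest(target_ts, source_ts, tol_ms=5.0):
    """For each target timestamp (e.g. a LiDAR scan), find the index of the
    nearest source timestamp (e.g. a camera frame) within tol_ms.
    Both lists must be sorted ascending. Returns (target_idx, source_idx) pairs."""
    pairs = []
    for i, t in enumerate(target_ts):
        j = bisect_left(source_ts, t)
        # Candidates: the source sample just before and just after t
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(source_ts):
                if best is None or abs(source_ts[k] - t) < abs(source_ts[best] - t):
                    best = k
        if best is not None and abs(source_ts[best] - t) <= tol_ms:
            pairs.append((i, best))
    return pairs

# Hypothetical timestamps in milliseconds: LiDAR at ~10 Hz, camera at ~30 Hz
lidar_ts = [0.0, 100.0, 200.0]
camera_ts = [1.2, 34.5, 67.8, 101.1, 134.4, 167.7, 201.0]
print(align_nearest(lidar_ts, camera_ts))  # [(0, 0), (1, 3), (2, 6)]
```

In practice a tight tolerance like the 5 ms used here is what distinguishes millisecond-level fusion from loose frame pairing: samples with no close-enough partner are dropped rather than fused with stale data.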
“Multimodal perception is not just stacking sensors—it’s the key step for robots to gain true cognitive ability. With Aurora, we aim to provide a stable and scalable foundation for spatial perception, accelerating the development and deployment of embodied intelligence.” — Chen Shikai
Looking Ahead
From LiDAR to integrated sensing to robotic chassis, Slamtec continues to focus on its two core strengths: spatial perception and motion capability.
We are committed to:
- Continuously optimizing multimodal perception algorithms to enhance spatial understanding
- Driving deeper integration of perception, decision-making, and motion, enabling robots to truly become human assistants
- Building an open embodied intelligence ecosystem with partners across the industry to achieve large-scale deployment
Slamtec firmly believes in the future of embodied intelligence, where every robot can “see clearly, move steadily, and think intelligently.”