Designing couture around LiDAR, cameras, capacitive touch, force/torque sensors, IMUs, and microphone arrays. A deep technical guide to garments that never compromise robot perception.
A modern humanoid robot is, first and foremost, a sensor platform. Tesla Optimus carries over 40 individual sensors across its body. Boston Dynamics Atlas integrates LiDAR, stereo vision, IMUs, and force/torque sensors at nearly every joint. Figure 03 distributes cameras and proximity sensors across its head, torso, and limbs. Every one of these sensors was designed, calibrated, and positioned to operate with the robot's original exterior surfaces exposed directly to the environment.
Placing fabric, film, or any material over these sensors introduces variables that the robot's perception stack was never designed to accommodate. A layer of cotton over a LiDAR unit scatters the return pulse beyond recognition. A loose fabric draped near a camera creates dynamic occlusion that confuses vision algorithms. A thick garment layer over force/torque sensors dampens the force feedback that enables safe human interaction. These are not theoretical concerns. They are engineering constraints that define the boundaries of what robot fashion can and cannot do.
We treat sensor compatibility as the foundational discipline of robot fashion engineering. Before any aesthetic decision is made, before fabric selection or color palette or silhouette design, we map every sensor on the target platform and define the compatibility envelope within which the garment must operate. This page documents the technical principles, material science, and testing protocols that underpin every sensor-compatible garment we produce.
LiDAR (Light Detection and Ranging) is the primary spatial perception system on most advanced robots. It operates by emitting laser pulses, typically at 905nm or 1550nm wavelengths in the near-infrared spectrum, and measuring the time of flight of reflected returns. Any material placed between the LiDAR emitter/receiver and the environment must transmit these wavelengths with minimal attenuation, scattering, or phase distortion.
Most conventional textiles are effectively opaque at LiDAR wavelengths. Cotton, polyester, nylon, and wool all scatter near-infrared light to varying degrees, reducing the usable return signal below the noise floor of most LiDAR receivers. Even materials that appear translucent to visible light may be opaque at 905nm or 1550nm due to absorption by dyes, coatings, or fiber structure.
Our materials laboratory has developed a family of textiles specifically engineered for near-infrared transparency. These fabrics use synthetic fiber compositions with molecular structures that do not absorb in the 850nm to 1600nm range, covering both common LiDAR wavelengths. The fibers are woven in open mesh patterns that maximize near-infrared transmittance while maintaining sufficient fabric density for opacity at visible wavelengths. The result is a material that looks like solid fabric to the human eye but is largely transparent to LiDAR.
Transmittance varies by weave density and fiber diameter. Our standard LiDAR-compatible fabric achieves 87% transmittance at 905nm and 91% at 1550nm, with a visible-light opacity sufficient for solid color presentation. Premium grades achieve 93% and 96% transmittance respectively, suitable for applications where the garment covers a LiDAR unit with tight signal margin budgets.
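As a rough illustration of how these figures translate into a signal budget: the LiDAR pulse crosses the covering fabric twice, once outbound and once on return, so the usable return signal scales approximately with the square of the one-way transmittance. The sketch below (Python, with a hypothetical budget threshold) makes that arithmetic explicit; it is an illustration, not part of our test tooling.

```python
# Illustrative sketch: estimate the round-trip signal fraction a LiDAR receiver sees
# when a fabric panel covers the emitter/receiver. Assumes the pulse crosses the
# fabric twice (outbound and return), so usable signal ~ one-way transmittance squared.

def round_trip_signal_fraction(one_way_transmittance: float) -> float:
    """Fraction of the return signal surviving two passes through the fabric."""
    return one_way_transmittance ** 2

def within_signal_budget(one_way_transmittance: float, max_loss_fraction: float) -> bool:
    """Check the two-pass loss against a margin budget.

    max_loss_fraction is a hypothetical placeholder -- each platform defines its own margin.
    """
    loss = 1.0 - round_trip_signal_fraction(one_way_transmittance)
    return loss <= max_loss_fraction

# Standard-grade fabric at 905nm (87% one-way) keeps roughly 76% of the return signal;
# premium grade at 1550nm (96%) keeps roughly 92%.
for grade, t in [("standard @905nm", 0.87), ("standard @1550nm", 0.91),
                 ("premium @905nm", 0.93), ("premium @1550nm", 0.96)]:
    print(f"{grade}: {round_trip_signal_fraction(t):.0%} of return signal retained")
```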
LiDAR transparency is wavelength-specific. A fabric that transmits 905nm light may absorb 1550nm light, and vice versa. We test transmittance at the exact wavelength used by each platform's LiDAR hardware. See our Smart Textiles guide for broader material science details.
Many conventional textile dyes absorb strongly in the near-infrared. A fabric that tests as LiDAR-transparent in its undyed state may become opaque once dyed black, navy, or certain reds. We maintain a validated dye library of colorants that have been tested for near-infrared neutrality. These dyes produce a full visible color range without affecting transmittance at LiDAR wavelengths. Surface finishes, coatings, and treatments are similarly validated. Anti-static treatments, water-repellent coatings, and flame retardants can all affect near-infrared performance and are tested individually before specification.
Camera systems on humanoid robots range from single wide-angle lenses to multi-camera stereo arrays, depth cameras, and thermal imagers. Each has a defined field of view (FOV) that must remain completely unobstructed for the robot's vision system to function correctly. Even partial occlusion of a camera's FOV can degrade object detection, depth estimation, and visual SLAM (Simultaneous Localization and Mapping) performance.
The challenge for robot fashion is that cameras are often positioned at points where garments naturally drape, fold, or shift during movement. A chest-mounted camera may be occluded by a jacket lapel. A shoulder camera may be blocked by a raised arm's sleeve. A wrist camera may be covered by a cuff. Each scenario requires a different engineering approach.
For cameras in relatively static positions (head-mounted, torso-mounted), we employ rigid bezel systems. These are precision-molded frames, typically 3D-printed in matte black PA12 nylon, that create a fixed clearance zone around the camera lens. The garment attaches to the outer edge of the bezel, and the bezel's geometry prevents fabric from entering the camera's FOV regardless of garment movement, wind, or static charge. Bezels are designed with anti-reflection surfaces to prevent stray light from garment-adjacent surfaces from entering the optical path.
Cameras positioned near joints require dynamic clearance solutions that maintain an unobstructed FOV across the full range of joint motion. We use articulated fabric management systems: spring-loaded fabric tensioners, guided channels, and elastomeric retractors that keep fabric clear of camera apertures throughout the robot's movement envelope. These systems are designed using motion-capture data from the target platform, ensuring clearance is validated at every point in the kinematic chain.
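To make the validation step concrete, the following sketch shows one simple way a clearance check could be run against motion-capture data: for each captured pose, every tracked fabric-edge point must lie outside the camera's field-of-view cone plus a safety margin. The data layout, FOV value, and margin are illustrative assumptions, not a description of our production toolchain.

```python
# Minimal geometric sketch of a dynamic clearance check. For every captured pose,
# confirm that sampled fabric-edge points stay outside the camera's FOV cone.
# The 5-degree safety margin and the pose data structure are illustrative assumptions.
import numpy as np

def fabric_clear_of_fov(camera_pos, boresight, fov_half_angle_deg,
                        fabric_points, margin_deg=5.0) -> bool:
    """True if every fabric point lies outside the FOV cone plus a safety margin."""
    camera_pos = np.asarray(camera_pos, dtype=float)
    boresight = np.asarray(boresight, dtype=float)
    boresight = boresight / np.linalg.norm(boresight)
    limit = np.radians(fov_half_angle_deg + margin_deg)
    for p in fabric_points:
        v = np.asarray(p, dtype=float) - camera_pos
        angle = np.arccos(np.clip(np.dot(v / np.linalg.norm(v), boresight), -1.0, 1.0))
        if angle < limit:  # point falls inside (or too close to) the cone
            return False
    return True

def validate_clearance(poses, fov_half_angle_deg=60.0) -> bool:
    """poses: sequence of dicts with 'camera_pos', 'boresight', and 'fabric_points'
    captured per motion-capture frame (hypothetical data format)."""
    return all(
        fabric_clear_of_fov(pose["camera_pos"], pose["boresight"],
                            fov_half_angle_deg, pose["fabric_points"])
        for pose in poses
    )
```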
Capacitive touch sensors detect changes in electrical capacitance caused by the proximity or contact of a conductive object, typically a human hand. These sensors are increasingly common on collaborative robots, where touch-based interaction and emergency stop functionality depend on reliable capacitive detection through any covering material.
Standard fabrics are capacitive insulators. A layer of cotton, polyester, or nylon between a human hand and a capacitive sensor will attenuate the capacitance change below the sensor's detection threshold, effectively disabling touch interaction. This is not merely an inconvenience; on platforms where capacitive sensors serve as emergency stop triggers, covering them with insulating material creates a safety hazard.
Our solution is a conductive relay layer: a fabric zone directly over each capacitive sensor that contains conductive fibers or coatings capable of transmitting capacitive signals from the garment's exterior surface to the sensor beneath. The relay layer is calibrated to maintain the sensor's original sensitivity threshold, ensuring that a human touch on the garment surface triggers the same response as a touch on the bare sensor.
Conductive relay zones are constructed using silver-coated nylon fibers woven into the garment at capacitive sensor locations. The silver coating provides excellent electrical conductivity while maintaining fabric flexibility and compatibility with standard garment construction techniques. Each relay zone is individually tested against the platform's capacitive sensor specifications, with sensitivity verified across humidity ranges from 20% to 90% RH, as humidity significantly affects capacitive coupling through textile layers.
Tesla Optimus Gen 2 features capacitive touch sensors on both hands and across the torso for human-robot interaction and safety shutdown. Our Optimus garment templates include pre-mapped conductive relay zones at all 14 capacitive sensor positions, each calibrated to Optimus's specific sensor hardware. Garments for Optimus undergo mandatory capacitive response testing at every sensor location before delivery. Learn more about our Tesla Optimus platform support.
Force/torque (F/T) sensors at robot joints measure the forces and torques applied during physical interaction with the environment. These sensors are critical for collaborative safety: they enable the robot to detect unexpected contact (such as a collision with a person) and respond by stopping or yielding. They also enable precise manipulation, force-controlled assembly, and compliant movement.
Garments add mass and stiffness to the kinematic chain, altering the forces and torques that reach F/T sensors. A heavy garment on a robot arm increases the baseline force reading at the wrist F/T sensor, reducing the available dynamic range for detecting external contact. A stiff garment that resists joint movement adds torque that the F/T sensor must distinguish from external interaction forces.
We define mass and stiffness budgets for every garment zone based on the F/T sensor specifications of the target platform. For a robot arm with a wrist F/T sensor rated for 100N full scale, a sleeve weighing 200g adds approximately 2N of gravitational force at the sensor, consuming 2% of the sensor's dynamic range. This is acceptable. A sleeve weighing 2kg would consume 20%, which may not be. Stiffness budgets are similarly computed: the torque required to flex the garment through the joint's range of motion must remain well below the F/T sensor's collision detection threshold.
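The arithmetic behind these budgets is simple enough to show directly. The short Python sketch below reproduces the 100N wrist-sensor example from above; the 5% budget threshold is a hypothetical placeholder, since the actual limit is set per platform and per zone.

```python
# Back-of-envelope check: what fraction of a wrist F/T sensor's dynamic range the
# garment's own weight consumes. The 100 N full scale and example masses come from
# the text; the 5% budget threshold is a hypothetical value.
G = 9.81  # gravitational acceleration, m/s^2

def range_consumed(garment_mass_kg: float, sensor_full_scale_n: float) -> float:
    """Fraction of the sensor's full-scale range taken up by the garment's weight."""
    return (garment_mass_kg * G) / sensor_full_scale_n

def within_mass_budget(garment_mass_kg: float, sensor_full_scale_n: float,
                       budget_fraction: float = 0.05) -> bool:
    return range_consumed(garment_mass_kg, sensor_full_scale_n) <= budget_fraction

print(f"200 g sleeve: {range_consumed(0.2, 100):.1%} of range")  # ~2.0%
print(f"2 kg sleeve:  {range_consumed(2.0, 100):.1%} of range")  # ~19.6%
```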
These budgets drive material selection and construction techniques. Lightweight, low-stiffness fabrics are specified for zones near F/T sensors. Seams, closures, and reinforcements that add stiffness are routed away from joint articulation paths. The result is a garment that the robot's control system barely notices, allowing F/T-based safety and interaction functionality to operate within its designed parameters.
Inertial Measurement Units (IMUs) combine accelerometers, gyroscopes, and sometimes magnetometers to measure a robot's orientation, angular velocity, and linear acceleration. IMUs are essential for balance, locomotion control, and pose estimation. They are sensitive instruments, and certain garment materials and construction elements can interfere with their operation.
The primary concern is magnetometer interference. Magnetometers measure the Earth's magnetic field to determine heading. Ferromagnetic materials (iron, nickel, cobalt, and alloys containing these elements) near a magnetometer distort the local magnetic field, causing heading errors. Some garment hardware, including certain zippers, snaps, buckles, and magnetic closures, contains ferromagnetic materials that can produce measurable magnetometer distortion at distances of several centimeters.
We specify non-ferromagnetic materials for all garment hardware within the magnetometer exclusion zone, which varies by platform but typically extends 10 to 15cm from each IMU location. Zippers use aluminum, brass, or polymer elements. Snaps and fasteners are brass, titanium, or engineered polymer. Magnetic closures are prohibited within the exclusion zone. Thread and fiber selections are verified to be free of ferromagnetic content, including metallic-effect threads that may contain steel or nickel cores.
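In practice this becomes a placement check during garment design: is any ferromagnetic hardware item inside an IMU's exclusion radius? The sketch below shows the idea in Python; the positions, hardware list, and 12cm radius are illustrative values, not platform data.

```python
# Illustrative placement check: flag any garment hardware item that is ferromagnetic
# and sits inside an IMU's magnetometer exclusion zone. All coordinates and the
# hardware list are hypothetical; the 12 cm radius is within the 10-15 cm range above.
import math

FERROMAGNETIC = {"steel", "iron", "nickel", "cobalt"}

def violates_exclusion_zone(item, imu_positions, exclusion_radius_m=0.12) -> bool:
    """An item violates the zone if it is ferromagnetic and within radius of any IMU."""
    if item["material"].lower() not in FERROMAGNETIC:
        return False
    return any(math.dist(item["position"], imu) < exclusion_radius_m
               for imu in imu_positions)

# Hypothetical hardware list for a torso garment with IMUs at torso and hip.
imus = [(0.0, 0.0, 0.35), (0.0, 0.0, 0.05)]
hardware = [
    {"name": "front zipper pull", "material": "aluminum", "position": (0.02, 0.10, 0.30)},
    {"name": "side snap",         "material": "steel",    "position": (0.08, 0.05, 0.33)},
]
for item in hardware:
    if violates_exclusion_zone(item, imus):
        print(f"REJECT: {item['name']} ({item['material']}) inside magnetometer exclusion zone")
```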
Accelerometer and gyroscope interference is less common but can occur if garment elements create vibrations at frequencies that overlap with the IMU's measurement bandwidth. Loose buckles, flapping fabric edges, or resonant structural elements that vibrate during robot locomotion can introduce noise into acceleration and angular rate measurements. Our garment designs eliminate these sources through secure fastening, tensioned fabric management, and vibration damping at potential resonance points.
Voice interaction is a primary interface modality for social and service robots. Microphone arrays on robots like 1X NEO and Figure 03 are carefully positioned and calibrated for beamforming, noise cancellation, and speaker localization. Covering these microphone arrays with fabric introduces acoustic attenuation, frequency-dependent filtering, and disruption of the spatial relationships between array elements that beamforming algorithms depend on.
We address microphone compatibility through acoustically transparent fabrics. These are open-weave textiles with acoustic impedance characteristics that allow sound waves to pass through with minimal attenuation across the speech frequency range (300Hz to 8kHz). The fabrics are selected to provide less than 3dB of attenuation at any frequency within this range, preserving the signal-to-noise ratio required for reliable speech recognition.
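For readers who want to see how the 3dB criterion is applied, the following sketch converts measured amplitude ratios (fabric-covered versus bare microphone) to attenuation in decibels and checks the speech band. The measurement values are hypothetical.

```python
# Quick sketch of the acoustic pass check: convert measured amplitude ratios to dB
# and confirm attenuation stays under 3 dB across the 300 Hz - 8 kHz speech band.
# Measurement data is hypothetical.
import math

def attenuation_db(amplitude_with_fabric: float, amplitude_bare: float) -> float:
    """Attenuation in dB introduced by the fabric layer (positive = signal loss)."""
    return -20.0 * math.log10(amplitude_with_fabric / amplitude_bare)

def passes_speech_band(measurements, limit_db=3.0) -> bool:
    """measurements: iterable of (frequency_hz, amplitude_with_fabric, amplitude_bare)."""
    return all(
        attenuation_db(a_fab, a_bare) < limit_db
        for freq, a_fab, a_bare in measurements
        if 300 <= freq <= 8000
    )

# Example: a 1 kHz tone measured at 0.85x the bare-microphone amplitude -> ~1.4 dB loss.
print(f"{attenuation_db(0.85, 1.0):.1f} dB")
```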
Microphone array beamforming depends on precise time-of-arrival differences between array elements. If fabric thickness or density varies across the array, it introduces differential delays that distort the beamforming pattern. We ensure uniform acoustic properties across the entire area of each microphone array by using a single, continuous fabric panel over all array elements, avoiding seams, folds, or overlapping layers within the array zone. Fabric tension is controlled to prevent dynamic changes in acoustic transmission as the robot moves.
Every material used in a MaisonRoboto garment undergoes sensor compatibility testing before it is approved for production. Testing follows a five-stage protocol that evaluates each relevant sensor modality.
Stage 1 (near-infrared transmittance): Material samples are tested in a spectrophotometer across the 850nm to 1600nm range to determine near-infrared transmittance for LiDAR compatibility. Samples are tested in both dry and wet conditions, as moisture content significantly affects NIR absorption. Pass criteria: greater than 85% transmittance at the target platform's LiDAR wavelength in both dry and wet states.
Stage 2 (capacitive response): Conductive fabric zones are tested on a capacitive sensor test bench that replicates the target platform's sensor hardware. A standardized test probe simulates human touch at various pressures and positions. Pass criteria: detection within 5ms of the bare-sensor baseline at all probe positions, across 20% to 90% relative humidity.
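As an illustration of how this pass criterion might be scripted, the sketch below compares detection latency through the relay zone against the bare-sensor baseline for each probe trial within the specified humidity range. The data format, field names, and numbers are assumptions for the example only.

```python
# Sketch of the capacitive pass check: every touch applied through the garment within
# the 20-90% RH range must be detected, and its latency must stay within 5 ms of the
# bare-sensor baseline. Trial data layout is an illustrative assumption.
def capacitive_zone_passes(trials, baseline_latency_ms: float, tolerance_ms: float = 5.0) -> bool:
    """trials: list of dicts with 'humidity_pct', 'probe_position', 'detected', 'latency_ms'."""
    for t in trials:
        if not (20 <= t["humidity_pct"] <= 90):
            continue  # outside the specified humidity range, not scored
        if not t["detected"]:
            return False
        if t["latency_ms"] - baseline_latency_ms > tolerance_ms:
            return False
    return True

# Example trial measured at 45% RH (hypothetical numbers).
trial = {"humidity_pct": 45, "probe_position": "palm_center", "detected": True, "latency_ms": 12.0}
print(capacitive_zone_passes([trial], baseline_latency_ms=9.0))  # True: within 5 ms of baseline
```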
Stage 3 (force transmission): Garment sections covering F/T sensor zones are tested on a force transmission bench. Known forces are applied to the garment exterior, and the force measured at the sensor position beneath the garment is compared to bare-surface readings. Pass criteria: force transmission fidelity greater than 95% for collision detection forces, and garment self-weight below the defined mass budget for each zone.
Stage 4 (magnetic signature): All garment hardware and materials are tested for ferromagnetic content using a gaussmeter at positions corresponding to IMU locations on the target platform. Pass criteria: magnetic field distortion below 0.5 µT at the closest IMU position when the garment is installed.
Stage 5 (acoustic transmission): Fabric samples are tested in an anechoic chamber with calibrated speaker and microphone arrays replicating the target platform's configuration. Frequency response, attenuation, and phase characteristics are measured across 100Hz to 12kHz. Pass criteria: less than 3dB attenuation and less than 10 degrees phase shift at any frequency from 300Hz to 8kHz.
Each robot platform presents a unique sensor compatibility profile based on its sensor types, positions, and specifications. We maintain a detailed compatibility matrix for every supported platform, documenting each sensor's location, type, exclusion zone dimensions, and garment design constraints. The following summarizes the key sensor challenges by platform.
Tesla Optimus Gen 2. Primary challenges: Capacitive touch sensors on hands and torso require conductive relay zones. Multiple cameras on head and torso require rigid bezel clearance systems. IMUs at torso and hip require ferromagnetic-free hardware within 12cm. Medium overall complexity. Full platform details on our Tesla Optimus Gen 2 page.
Boston Dynamics Atlas. Primary challenges: Dense LiDAR and stereo camera coverage requires extensive LiDAR-transparent panels and camera clearance. F/T sensors at all major joints impose strict mass and stiffness budgets. Highest overall sensor density of any platform. Very high design complexity. Visit our Boston Dynamics Atlas page for platform specifications.
Figure 03. Primary challenges: Extensive camera coverage requiring precise aperture management across head, torso, and limbs. Microphone arrays for voice interaction require acoustic transparency zones. Medium-high overall complexity. See our Figure 03 platform page for details.
1X NEO. Primary challenges: Soft-body construction with integrated tactile sensing across the entire surface requires the garment to relay tactile information across the full body rather than at discrete sensor points. Camera and microphone arrays require standard clearance and acoustic transparency. High overall complexity due to whole-body tactile sensing.
The sensor landscape for humanoid robots is evolving rapidly. MaisonRoboto's materials research lab continuously evaluates emerging sensor technologies and develops compatible textile solutions in advance of commercial deployment. Current areas of active development include electronic skin (e-skin) compatibility, where robots with full-body distributed pressure sensing require garments that transmit pressure information across their entire surface area. We are also developing compatibility solutions for event cameras (dynamic vision sensors), which operate on different optical principles than conventional cameras and may have different material transparency requirements.
Thermal imaging sensors, increasingly used for human detection and safety, require materials with controlled thermal emissivity. Radar-based gesture detection, used on some industrial platforms, requires radar-transparent fabric zones similar in principle to LiDAR-transparent materials but operating at microwave frequencies. Each emerging technology adds a new dimension to the sensor compatibility challenge and a new discipline to MaisonRoboto's smart textiles research program.
Sensor compatibility is the technical foundation that enables everything else in robot fashion. Without it, a garment is an obstruction. With it, a garment becomes a transparent layer that enhances the robot's appearance without compromising its capabilities. Explore our comprehensive Robot Fashion Guide for the full picture of how technical engineering meets design artistry.
Every sensor on your robot was engineered for a reason. MaisonRoboto garments respect that engineering with materials science that makes fashion and function inseparable.
Commission Sensor-Compatible Couture