Accurate perception of a legged robot’s body states is crucial for effective motion control. However, some of these states cannot be measured directly and must be estimated by fusing the available sensor data. Existing estimators, whether they model or neglect the interaction between the legs and the ground, have achieved considerable success, yet they pay little attention to absolute position and yaw. In this study, a novel multi-sensor fusion approach for legged robots is proposed that addresses this deficiency by combining Ultra-Wideband (UWB) ranging, Inertial Measurement Unit (IMU) data, and leg kinematics. These measurements are fused in an Error-State Kalman Filter (ESKF) to achieve unbiased state estimation. The proposed approach is validated through experiments, demonstrating its effectiveness in long-term, long-distance, and uneven-terrain scenarios. The contributions of this study include unbiased estimation of all state dimensions, a modeling approach for the prediction and update steps that improves efficiency, theoretical confirmation through an observability analysis, and deployment on a hexapod robot in real-world scenarios. This research provides a viable solution to the global localization problem of legged robots, paving the way for the development of intelligent and stable legged robots.
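To make the fusion idea concrete, the following is a minimal 2D sketch of an error-state Kalman filter that predicts with IMU acceleration and corrects with UWB ranges to known anchors. It is an illustrative assumption, not the paper's actual filter: the state (position and velocity only), the anchor positions, and the noise values are all hypothetical simplifications, and the real system additionally fuses leg kinematics and estimates orientation.

```python
import numpy as np

def eskf_predict(p, v, P, a_meas, dt, q_acc):
    """Propagate the nominal state with measured acceleration and
    propagate the 4x4 error-state covariance P (errors in p and v)."""
    p = p + v * dt + 0.5 * a_meas * dt**2
    v = v + a_meas * dt
    F = np.block([[np.eye(2), dt * np.eye(2)],
                  [np.zeros((2, 2)), np.eye(2)]])           # error-state transition
    G = np.vstack([0.5 * dt**2 * np.eye(2), dt * np.eye(2)])  # noise input matrix
    P = F @ P @ F.T + q_acc * (G @ G.T)
    return p, v, P

def eskf_update_range(p, v, P, anchor, z_range, r_uwb):
    """Correct the error state with one UWB range to a known anchor,
    then inject the correction back into the nominal state."""
    diff = p - anchor
    pred = np.linalg.norm(diff)                 # predicted range h(p)
    H = np.hstack([diff / pred, np.zeros(2)]).reshape(1, 4)  # range Jacobian
    S = H @ P @ H.T + r_uwb                     # innovation covariance (1x1)
    K = P @ H.T / S                             # 4x1 Kalman gain
    dx = (K * (z_range - pred)).ravel()         # error-state correction
    p = p + dx[:2]                              # inject position error
    v = v + dx[2:]                              # inject velocity error
    P = (np.eye(4) - K @ H) @ P
    return p, v, P

# Hypothetical scenario: stationary robot, three anchors, noiseless ranges.
true_p = np.array([3.0, 4.0])
anchors = [np.array([0.0, 0.0]), np.array([10.0, 0.0]), np.array([0.0, 10.0])]
p, v, P = np.array([2.0, 5.0]), np.zeros(2), np.eye(4)
for _ in range(20):
    p, v, P = eskf_predict(p, v, P, np.zeros(2), 0.1, 1e-4)
    for a in anchors:
        p, v, P = eskf_update_range(p, v, P, a, np.linalg.norm(true_p - a), 0.01)
```

With ranges to multiple anchors, the absolute position becomes observable, which is the deficiency the abstract points out for estimators that rely on IMU and leg odometry alone.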