The urban air mobility (UAM) industry has recently attracted considerable attention. To guarantee the high reliability of industrial-grade autonomous flight in urban regions, flight safety and survivability must be top-priority concerns. To satisfy these industrial needs, this paper proposes a vision-based safe landing position estimation framework built on semantic segmentation. This framework is a core component of the contingency plan that should be invoked in the event of an in-flight emergency for UAM and drone transportation. More precisely, the proposed framework first identifies coarse candidate regions for safe landing using deep neural network (DNN)-based semantic segmentation, and then determines the refined safe landing zone and its estimated position using contour detection together with surface flatness and inclination filtering. To verify the performance of the proposed framework, simulations were conducted and analyzed, and the results validate the approach in both quantitative and qualitative terms. Consequently, the proposed framework not only copes with the inherent difficulty of real-world image processing but also significantly improves emergency response performance during flight.
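The coarse-to-fine pipeline described above can be sketched in simplified form. This is a minimal illustration, not the paper's implementation: it assumes a binary safe-class mask (standing in for the DNN segmentation output) and an elevation map, replaces contour detection with 4-connected region extraction, and uses elevation variance and a maximum neighbor height difference as hypothetical flatness and inclination criteria. All thresholds (`min_area`, `max_var`, `max_slope`) are illustrative.

```python
from collections import deque
from statistics import pvariance

def connected_regions(mask):
    """Extract 4-connected regions of truthy cells (a stand-in for
    contour detection on the segmentation mask)."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not seen[sy][sx]:
                queue, cells = deque([(sy, sx)]), []
                seen[sy][sx] = True
                while queue:
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(cells)
    return regions

def max_slope(cells, elev):
    """Largest elevation step between adjacent cells inside a region
    (a crude proxy for surface inclination)."""
    cellset, worst = set(cells), 0.0
    for y, x in cells:
        for dy, dx in ((1, 0), (0, 1)):
            if (y + dy, x + dx) in cellset:
                worst = max(worst, abs(elev[y][x] - elev[y + dy][x + dx]))
    return worst

def estimate_landing_position(mask, elev, min_area=4,
                              max_var=0.05, slope_limit=0.2):
    """Coarse candidates -> flatness/inclination filtering -> centroid
    of the largest surviving region as the estimated landing position."""
    best = None
    for cells in connected_regions(mask):
        if len(cells) < min_area:
            continue  # too small to land on
        heights = [elev[y][x] for y, x in cells]
        if pvariance(heights) > max_var:
            continue  # not flat enough
        if max_slope(cells, elev) > slope_limit:
            continue  # too steep
        if best is None or len(cells) > len(best):
            best = cells
    if best is None:
        return None  # no safe landing zone found
    cy = sum(y for y, _ in best) / len(best)
    cx = sum(x for _, x in best) / len(best)
    return (cy, cx)
```

For example, given a mask with a sloped strip and a flat 3x3 patch, the estimator rejects the sloped region and returns the centroid of the flat one; a real system would run this on per-pixel DNN output and metric depth.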