Edge-preserving smoothing is crucial in image processing for removing noise and fine textures while maintaining significant structures. This paper focuses on filter-based methods because of their computational efficiency and ease of implementation. Edges carry essential information that defines object boundaries and texture details, and they are characterized by both intensity and scale. Traditional filters, such as the bilateral filter, the domain transform, and the guided filter, rely primarily on edge intensity and offer no way to adjust scale, which prevents them from smoothing small-scale textures while preserving significant structures. To address this, we propose an edge-preserving smoothing filter that enables real-time control of both edge intensity and scale. Our method introduces a novel metric based on the variance of pixel values within patches to quantitatively assess regional flatness at a given scale. The fundamental idea is to smooth patches at that scale, removing smaller-scale details while preserving larger-scale structures. Each pixel is assigned a weighted average of the smoothed results from the overlapping patches that cover it, with weights given by the inverse of the patch variances. This yields an adaptive filter that effectively smooths textures while preserving significant edges. Experimental comparisons with conventional methods demonstrate that the proposed filter efficiently removes textures and noise while preserving significant edges. By providing immediate visual feedback, the method allows rapid adjustment of both scale and intensity, making it suitable for real-time applications. Future work will focus on adaptive scale control to develop a texture-suppression filter adaptable to diverse image structures and textures.
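The variance-weighted patch aggregation described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function names, the naive box-filter, and the `eps` regularizer (which prevents division by zero in perfectly flat patches) are our own assumptions. Patch means and variances are computed at a chosen scale (patch radius), and each pixel then averages the means of all patches covering it, weighted by the inverse of those patches' variances, so flat patches dominate and edge-straddling patches contribute little:

```python
import numpy as np

def box_filter(a, radius):
    """Mean over square (2*radius+1)^2 windows, with edge replication at borders."""
    h, w = a.shape
    pad = np.pad(a, radius, mode="edge")
    out = np.zeros((h, w), dtype=np.float64)
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += pad[dy:dy + h, dx:dx + w]
    return out / (2 * radius + 1) ** 2

def variance_weighted_smooth(img, radius=2, eps=1e-4):
    """Sketch of inverse-variance patch aggregation at one scale (assumed parameters)."""
    img = img.astype(np.float64)
    mean = box_filter(img, radius)                 # per-patch mean (smoothed patch value)
    sq_mean = box_filter(img * img, radius)        # per-patch mean of squares
    var = np.maximum(sq_mean - mean * mean, 0.0)   # per-patch variance = flatness metric
    w = 1.0 / (var + eps)                          # flat patches get large weight
    # Each pixel averages the means of all patches that cover it,
    # weighted by those patches' inverse variances.
    return box_filter(w * mean, radius) / box_filter(w, radius)
```

On a noisy step image, the flat sides are smoothed toward their patch means while the step itself keeps nearly full contrast, since patches straddling the edge have large variance and thus negligible weight.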