Remote sensing image fusion enables the spectral, spatial and temporal enhancement of images. New fusion techniques are constantly emerging, shifting the focus from pan-sharpening to the spatiotemporal fusion of data originating from different sensors and platforms. However, the application of image fusion in the field of Earth observation remains limited. The number and complexity of the techniques available today can be overwhelming, preventing users from fully exploiting the potential of fusion. The aim of this study is to make fusion products more accessible to users by providing them with a simple tool for spatiotemporal fusion in Python. The tool contributes to a better exploitation of data from available sensors, making it possible to bring images to the spectral, spatial and temporal resolution required by the user. The fusion algorithm implemented in the tool is based on the spatial and temporal adaptive reflectance fusion model (STARFM), a well-established fusion technique in remote sensing that is often used as a benchmark by other algorithms. The capabilities of the tool are demonstrated in three case studies using Sentinel-2 and simulated Sentinel-3 data. The first case study addresses deforestation in the Amazon rainforest; the other two concentrate on change detection at an agricultural site in Southern Germany and on urban flooding caused by Hurricane Harvey.
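
To make the underlying idea concrete, the sketch below illustrates only the central STARFM relation: the fine-resolution prediction at the target date is the fine-resolution observation at the base date plus the change observed in the coarse-resolution imagery between the two dates. It is a conceptual, minimal Python/NumPy illustration, not the tool's actual implementation or API; the full algorithm additionally weights candidate pixels in a moving window by spectral, temporal and spatial distance. The function name starfm_basic and the toy arrays are illustrative assumptions.

import numpy as np

def starfm_basic(fine_t0, coarse_t0, coarse_t1):
    # Core STARFM relation (without moving-window weighting):
    # add the coarse-resolution temporal change to the known
    # fine-resolution observation. All inputs are assumed to be
    # co-registered and resampled to the same pixel grid.
    return fine_t0 + (coarse_t1.astype(float) - coarse_t0.astype(float))

# Toy example with synthetic reflectance values in [0, 1].
rng = np.random.default_rng(0)
fine_t0 = rng.random((100, 100))
coarse_t0 = rng.random((100, 100))
coarse_t1 = coarse_t0 + 0.05          # small uniform change between dates
fine_t1_pred = starfm_basic(fine_t0, coarse_t0, coarse_t1)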