Essays are widely used for learning assessment in educational contexts. Commercial solutions for automated essay scoring have shown promising results, but their vulnerability to fraud remains a point of criticism in the scientific community. An off-topic essay detection tool can increase the reliability of automated essay scoring systems and generate feedback to students. In this context, this paper presents a systematic review of the literature on automatic detection of off-topic essays. We describe the techniques and resources, the corpora, and the performance of existing approaches. Our findings indicate gaps and deficiencies in the existing literature, including the need to reduce error rates and to use validation sets based on real examples of off-topic essays.