Abstract

Managing and referencing design knowledge is a critical activity in the design process. However, reliably retrieving useful knowledge can be a frustrating experience for users of knowledge management systems due to inherent limitations of standard keyword-based search. In this research, we consider the task of retrieving relevant lessons learned from the NASA Lessons Learned Information System (LLIS). To this end, we apply a state-of-the-art natural language processing (NLP) technique for information retrieval (IR): semantic search with sentence-BERT (sBERT), a modification of the Bidirectional Encoder Representations from Transformers (BERT) model that uses siamese and triplet network architectures to obtain semantically meaningful sentence embeddings. While the pre-trained sBERT model performs well out of the box, we further fine-tune it on data from the LLIS so that it learns design-engineering-relevant vocabulary. We quantify the improvement in query results obtained with both standard sBERT and fine-tuned sBERT over a keyword search. Our use case throughout the paper is a set of queries derived from specific requirements of a real NASA project, on which the fine-tuned sBERT model achieves a mean average precision (MAP) of 0.807. These results indicate that applying state-of-the-art NLP techniques to design information retrieval tasks, especially when models are fine-tuned on engineering data, shows significant promise for modernizing design knowledge management systems.
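For readers unfamiliar with semantic search, the sketch below illustrates the general retrieval pipeline the abstract describes, using the open-source sentence-transformers library. It is a minimal illustration under stated assumptions, not the paper's actual implementation: the model checkpoint, the example lessons, and the query are hypothetical placeholders, and a model fine-tuned on LLIS data would simply be loaded in place of the pre-trained one.

```python
# Minimal sketch of semantic search with a sentence-BERT model, assuming the
# open-source `sentence-transformers` library. Model name, lessons, and query
# are illustrative placeholders, not the paper's data or configuration.
from sentence_transformers import SentenceTransformer, util

# Any pre-trained sBERT checkpoint can be used here; a model fine-tuned on
# LLIS text would be loaded the same way from its saved path.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical corpus of lessons-learned summaries.
lessons = [
    "Thermal vacuum testing revealed inadequate insulation on the harness.",
    "Late requirement changes caused schedule slips in integration and test.",
    "Battery cells failed acceptance testing due to electrolyte leakage.",
]

# A query phrased as an information need rather than exact keywords.
query = "power storage hardware problems found during testing"

# Encode the corpus and the query into dense sentence embeddings.
corpus_embeddings = model.encode(lessons, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank lessons by cosine similarity to the query and print the results.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=3)[0]
for hit in hits:
    print(f"{hit['score']:.3f}  {lessons[hit['corpus_id']]}")
```

Note that the query and the relevant lesson share no keywords ("power storage" versus "battery"), so a keyword search would likely miss it, whereas embedding similarity can still rank it highly; this is the gap between keyword and semantic retrieval that the abstract refers to.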