Federated Learning (FL) is a popular distributed machine learning paradigm that enables devices to collaboratively train a centralized model without transmitting raw data. However, when the model becomes complex, the communication overhead for mobile devices in traditional FL methods can be unacceptably large. To address this problem, Federated Distillation (FD) has been proposed as a federated variant of knowledge distillation. Most recent FD methods compute each client's model outputs (logits) on a public proxy dataset as the local knowledge and perform distillation on the server side using the average of the clients' logits. Nevertheless, these FD methods are not robust and perform poorly in non-IID (non-independent and identically distributed data) scenarios such as Federated Recommendation (FR). To tackle the non-IID problem and apply FD to FR, in this paper we propose a novel method named FedDyn that constructs a proxy dataset and extracts local knowledge dynamically. In this method, we replace the average strategy with focus distillation to strengthen reliable knowledge, which addresses the non-IID problem that local models hold biased knowledge. The average strategy dilutes and perturbs knowledge because it treats reliable and unreliable knowledge as equally important. In addition, to prevent private user information from being inferred from the local knowledge, we apply a technique similar to local differential privacy to protect this knowledge on the client side. Experimental results show that our method achieves faster convergence and lower communication overhead than the baselines on three datasets: MovieLens-100K, MovieLens-1M and Pinterest.
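
The sketch below is only an illustration of the general ideas summarized above, not the authors' FedDyn implementation. It contrasts the standard FD aggregation (an unweighted average of client logits on a proxy dataset) with a hypothetical confidence-weighted aggregation in the spirit of focus distillation, and adds Laplace noise to client logits as an LDP-style protection. All function names, the confidence-based weighting rule, and the noise scale are assumptions introduced here for illustration.

```python
# Illustrative sketch only: NOT the paper's FedDyn algorithm.
import numpy as np

rng = np.random.default_rng(0)

def add_ldp_noise(logits, epsilon=1.0, sensitivity=1.0):
    """Perturb client logits with Laplace noise before upload (LDP-style, assumed)."""
    scale = sensitivity / epsilon
    return logits + rng.laplace(loc=0.0, scale=scale, size=logits.shape)

def average_aggregate(client_logits):
    """Standard FD baseline: unweighted average of client logits on the proxy set."""
    return np.mean(client_logits, axis=0)

def confidence_weighted_aggregate(client_logits):
    """Hypothetical focus-style aggregation: weight each client's logits by its
    prediction confidence (max softmax probability) per proxy sample, so that
    reliable knowledge dominates and biased knowledge is down-weighted."""
    probs = np.exp(client_logits - client_logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    conf = probs.max(axis=-1, keepdims=True)           # (clients, samples, 1)
    weights = conf / conf.sum(axis=0, keepdims=True)   # normalize over clients
    return (weights * client_logits).sum(axis=0)

# Toy run: 3 clients, 5 proxy samples, 4 output classes.
client_logits = rng.normal(size=(3, 5, 4))
noisy = np.stack([add_ldp_noise(l) for l in client_logits])
print("average aggregation:       ", average_aggregate(noisy)[0])
print("confidence-weighted (focus):", confidence_weighted_aggregate(noisy)[0])
```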