Federated learning is gaining popularity, driven by privacy concerns and the growth of mobile computing, yet it still faces challenges arising from heterogeneous and distributed data. In this article, we propose FedAND, a unified federated learning optimization algorithm that tackles both client drift and server drift under partial client participation. FedAND builds on the consensus alternating direction method of multipliers (ADMM) and resolves the server drift caused by the server state in the global update. Under partial participation, we prove that FedAND preserves the strong convergence properties of ADMM while suppressing the server drift, which in turn reduces the client drift and thus achieves better convergence. Our empirical results demonstrate superior performance over existing methods, including FedProx, FedADMM, FedPD, and FedDyn, across diverse scenarios of statistical and system heterogeneity under partial client participation.
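For orientation, the following is a minimal sketch of the standard consensus ADMM iteration on which such methods build, written in our own notation ($f_i$: local loss of client $i$; $x_i$: local model; $z$: server model; $\lambda_i$: dual variable; $\rho > 0$: penalty parameter); the paper's actual update rules, in particular how the server state enters the global step under partial participation, may differ from this baseline:
\begin{align}
x_i^{k+1} &= \operatorname*{arg\,min}_{x_i} \; f_i(x_i) + \langle \lambda_i^{k},\, x_i - z^{k} \rangle + \tfrac{\rho}{2}\,\lVert x_i - z^{k} \rVert^2 && \text{(local client update)} \\
z^{k+1} &= \frac{1}{N} \sum_{i=1}^{N} \Bigl( x_i^{k+1} + \tfrac{1}{\rho}\,\lambda_i^{k} \Bigr) && \text{(global server aggregation)} \\
\lambda_i^{k+1} &= \lambda_i^{k} + \rho \bigl( x_i^{k+1} - z^{k+1} \bigr) && \text{(dual update)}
\end{align}
In this baseline, all $N$ clients participate in every round; the drift phenomena addressed by FedAND arise when only a subset of clients performs the local update in each round.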