Artificial neural networks (NNs) are machine learning algorithms that have been used as a convenient alternative to conventional statistical models, such as regression, for prediction and classification because they can model complex relationships between dependent and independent variables without a priori assumptions about the model form or the variable distributions. However, traditional NNs cannot accommodate dependencies among observations with a clustered or nested structure, as arise in longitudinal studies and cluster sampling. This research fills this gap in the literature by integrating a random-effects structure into NNs to account for within-cluster correlations. The proposed NN method incorporating random effects (NNRE) is trained by minimizing the cost function using the backpropagation algorithm combined with quasi-Newton and gradient descent algorithms. Model overfitting is controlled with L2 regularization. The trained NNRE model is evaluated for prediction accuracy using leave-one-out cross-validation on both simulated and real data. Prediction accuracy is compared among NNRE and two existing models: the conventional generalized linear mixed model (GLIMMIX) and the generalized neural network mixed model (GNMM). Results show that NNRE achieves higher accuracy than both GLIMMIX and GNMM.
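To illustrate the core idea sketched above, the following is a minimal, hypothetical example (not the authors' implementation) of a feed-forward network augmented with cluster-specific random intercepts. It uses plain gradient descent rather than the paper's quasi-Newton combination, an L2 penalty on the network weights for overfitting control, and a ridge penalty on the random effects standing in for their Gaussian distribution; all function names, hyperparameters, and the simulated clustered data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated clustered data: 20 clusters of 15 observations, each cluster
# sharing a random intercept (hypothetical data-generating process).
n_clusters, n_per = 20, 15
cluster = np.repeat(np.arange(n_clusters), n_per)
X = rng.normal(size=(n_clusters * n_per, 2))
b_true = rng.normal(scale=1.0, size=n_clusters)          # true random intercepts
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1] + b_true[cluster]
     + rng.normal(scale=0.1, size=len(cluster)))

def init_params(n_in, n_hidden, n_clusters, rng):
    return {"W1": rng.normal(scale=0.5, size=(n_in, n_hidden)),
            "b1": np.zeros(n_hidden),
            "W2": rng.normal(scale=0.5, size=n_hidden),
            "b2": 0.0,
            "u": np.zeros(n_clusters)}                   # random effects

def forward(p, X, cluster):
    H = np.tanh(X @ p["W1"] + p["b1"])                   # hidden layer
    return H @ p["W2"] + p["b2"] + p["u"][cluster], H    # fixed part + random intercept

def loss(p, X, y, cluster, lam_w=1e-3, lam_u=1e-1):
    yhat, _ = forward(p, X, cluster)
    mse = np.mean((y - yhat) ** 2)
    # L2 penalty on weights (overfitting control) and on the random
    # effects (ridge term standing in for the Gaussian random-effects prior).
    pen = (lam_w * (np.sum(p["W1"] ** 2) + np.sum(p["W2"] ** 2))
           + lam_u * np.sum(p["u"] ** 2))
    return mse + pen

def grad_step(p, X, y, cluster, lr=0.05, lam_w=1e-3, lam_u=1e-1):
    yhat, H = forward(p, X, cluster)
    n = len(y)
    r = 2 * (yhat - y) / n                               # dL/dyhat
    dW2 = H.T @ r + 2 * lam_w * p["W2"]
    db2 = r.sum()
    dH = np.outer(r, p["W2"]) * (1 - H ** 2)             # backprop through tanh
    dW1 = X.T @ dH + 2 * lam_w * p["W1"]
    db1 = dH.sum(axis=0)
    # Each random intercept only receives gradient from its own cluster.
    du = np.bincount(cluster, weights=r, minlength=len(p["u"])) + 2 * lam_u * p["u"]
    for k, g in zip(("W1", "b1", "W2", "b2", "u"), (dW1, db1, dW2, db2, du)):
        p[k] = p[k] - lr * g
    return p

p = init_params(2, 8, n_clusters, rng)
losses = [loss(p, X, y, cluster)]
for _ in range(500):
    p = grad_step(p, X, y, cluster)
    losses.append(loss(p, X, y, cluster))
```

Treating the random intercepts `u` as penalized parameters updated jointly with the weights is one simple way to realize the random-effects structure; the penalty weight `lam_u` plays the role of the ratio of residual to random-effect variance.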