Abstract Brain-computer interface (BCI) is a cutting-edge technology that enables interaction with external devices by decoding human intentions, and it is highly valuable in the fields of medical rehabilitation and human-robot collaboration. Decoding motor intent during motor execution (ME) from electroencephalographic (EEG) signals is still at the feasibility-study stage, and studies on the accuracy of motor-execution EEG recognition in between-subjects classification remain insufficient for realistic applications. This paper investigates EEG-driven hand movement recognition by analyzing low-frequency time-domain (LFTD) information. Experiments comprising four types of hand movements, two force-parameter tasks (extraction and pushing), and a four-target directional displacement task were designed and executed, and EEG data were collected from thirteen healthy volunteers. A sliding window approach is used to expand the dataset and thereby mitigate overfitting on the limited EEG data. Subsequently, a CNN-BiLSTM model, an end-to-end serial combination of a Convolutional Neural Network (CNN) and a Bidirectional Long Short-Term Memory network (BiLSTM), is constructed to classify the raw EEG data and recognize the hand movements. Experimental results show that the model classifies the four hand-movement types, the extraction task, the pushing task, and the four-target directional displacement task with accuracies of 99.14%±0.49%, 99.29%±0.11%, 99.23%±0.60%, and 98.11%±0.23%, respectively. Furthermore, comparative tests with alternative deep learning models (LSTM, CNN, EEGNet, CNN-LSTM) demonstrate that the CNN-BiLSTM model achieves practicable accuracy for EEG-based hand movement recognition and movement-parameter decoding.
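To illustrate the sliding-window dataset expansion mentioned above, the following is a minimal sketch, not the authors' exact preprocessing code; the window length, stride, channel count, and sampling rate are placeholder assumptions. Each overlapping window of a trial becomes a new training example, which is the standard way this augmentation counters overfitting on small EEG corpora.

```python
import numpy as np

def sliding_window_augment(trial, window_len, stride):
    """Slice one EEG trial (channels x samples) into overlapping windows.

    Each window becomes an additional training example, expanding the
    dataset and reducing overfitting on limited EEG recordings.
    """
    n_channels, n_samples = trial.shape
    windows = [
        trial[:, start:start + window_len]
        for start in range(0, n_samples - window_len + 1, stride)
    ]
    return np.stack(windows)  # shape: (n_windows, channels, window_len)

# Hypothetical example: a 4 s trial at 250 Hz, 1 s windows, ~75% overlap.
trial = np.random.randn(32, 1000)            # 32 channels, 1000 samples
augmented = sliding_window_augment(trial, window_len=250, stride=62)
print(augmented.shape)                        # (13, 32, 250)
```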
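The abstract describes the CNN-BiLSTM as an end-to-end serial combination: a CNN extracts local temporal features from the raw EEG, and a BiLSTM then models longer-range dependencies in both time directions before classification. The sketch below shows this general architecture in PyTorch under assumed layer sizes (64 convolutional filters, hidden size 64, 32 channels, four classes); it is an illustrative reconstruction, not the paper's reported configuration.

```python
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    """Illustrative end-to-end CNN-BiLSTM classifier for raw EEG windows."""

    def __init__(self, n_channels=32, n_classes=4, hidden=64):
        super().__init__()
        # Temporal CNN: local feature extraction along the time axis.
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Bidirectional LSTM: long-range dependencies in both directions.
        self.bilstm = nn.LSTM(input_size=64, hidden_size=hidden,
                              batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                     # x: (batch, channels, time)
        feats = self.cnn(x)                   # (batch, 64, time // 2)
        feats = feats.permute(0, 2, 1)        # (batch, time // 2, 64)
        out, _ = self.bilstm(feats)           # (batch, time // 2, 2*hidden)
        return self.head(out[:, -1, :])       # last step -> class logits

model = CNNBiLSTM()
logits = model(torch.randn(8, 32, 250))      # batch of eight 1 s windows
print(logits.shape)                           # torch.Size([8, 4])
```

Feeding the CNN output into the BiLSTM (rather than using either network alone) lets the model combine local spectral-temporal patterns with trial-level temporal context, which is the rationale behind the comparison against plain LSTM, CNN, EEGNet, and CNN-LSTM baselines.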