Graphs have long been a widely used tool for modeling relational data in real-world applications. To extract meaningful information from graphs, many graph neural networks (GNNs) have been proposed. These models typically adopt ReLU as their activation function because of its nonlinearity, effectiveness, and efficiency. However, ReLU sets many of the generated feature elements to zero, which can discard significant features. To address this problem, we propose an adaptive weight vector to tune the features. By constraining its elements, the weight vector serves as a better substitute for ReLU. Moreover, it adaptively measures the importance of each feature element and thus acts as a feature selection operator. To demonstrate its effectiveness, we evaluate a GCN equipped with the weight vector on node classification, where it achieves consistent improvements on three well-known citation networks.
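To make the idea concrete, the following is a minimal PyTorch sketch of how such an adaptive weight vector might replace ReLU in a GCN layer. The class names, the sigmoid constraint that keeps the vector's elements in (0, 1), and the layer wiring are illustrative assumptions, not the exact formulation used in the paper.

```python
import torch
import torch.nn as nn


class AdaptiveGate(nn.Module):
    """Sketch of an adaptive weight vector used in place of ReLU.

    Each feature dimension gets one learnable scalar; a sigmoid keeps the
    scalars in (0, 1), so features are rescaled by their learned importance
    rather than hard-zeroed as ReLU would do. (The constraint form here is
    an assumption for illustration.)
    """

    def __init__(self, num_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.zeros(num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Element-wise gating: important features keep weights near 1,
        # less important ones are attenuated but not discarded.
        return torch.sigmoid(self.weight) * x


class GatedGCNLayer(nn.Module):
    """One GCN layer with the adaptive gate substituted for ReLU (sketch)."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features, bias=False)
        self.gate = AdaptiveGate(out_features)

    def forward(self, adj_norm: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # adj_norm: normalized adjacency matrix; x: node feature matrix.
        return self.gate(adj_norm @ self.linear(x))
```

Because the gate's parameters are learned jointly with the rest of the network, the magnitude of each gated dimension can be read as a per-feature importance score, which is what allows the same vector to double as a feature selection operator.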