In some applications, the mean or median response is linearly related to some variables, but the relation to additional variables is not easily parameterized. Partly linear models arise naturally in such circumstances. Suppose that a random sample $\{(T_i, X_i, Y_i),\ i = 1, 2, \ldots, n\}$ is modeled by $Y_i = X_i^T s_0 + g_0(T_i) + \mathrm{error}_i$, where $Y_i$ is a real-valued response, $X_i \in \mathbb{R}^p$, $T_i$ ranges over a unit square, and $g_0$ is an unknown function with a certain degree of smoothness. We make use of bivariate tensor-product B-splines as an approximation of the function $g_0$ and consider M-type regression splines obtained by minimizing $\sum_{i=1}^{n} \rho\left(Y_i - X_i^T s - g_n(T_i)\right)$ for some convex function $\rho$. Mean, median, and quantile regressions are included in this class. We show under appropriate conditions that the parameter estimate of $s$ achieves its information bound asymptotically and that the function estimate of $g_0$ attains the optimal rate of convergence in mean squared error. Our asymptotic results generalize directly to higher dimensions (for the variable $T$), provided that the function $g_0$ is sufficiently smooth. Such smoothness conditions have often been assumed in the literature, but they impose practical limitations on the application of multivariate tensor-product splines in function estimation. We also discuss the implementation of B-spline approximations based on commonly used knot selection criteria, together with a simulation study of both mean and median regressions for partly linear models.
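
To make the construction concrete, the sketch below illustrates the general idea under stated assumptions; it is not the authors' code. It builds a bivariate tensor-product B-spline basis on the unit square and fits the median case ($\rho(u) = |u|$) by quantile regression on the combined design matrix $[X, B(T)]$; the sample size, knot placement, degree, and use of scipy/statsmodels are all assumptions made for illustration.

```python
# Illustrative sketch (not from the paper): partly linear model
# Y_i = X_i^T s + g(T_i) + error_i, with g approximated by bivariate
# tensor-product B-splines and the median fit obtained via QuantReg.
import numpy as np
from scipy.interpolate import BSpline
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p, degree, n_interior = 400, 2, 3, 4   # assumed simulation settings

# Simulated data: T_i on the unit square, X_i in R^p, heavy-tailed errors.
T = rng.uniform(0.0, 1.0, size=(n, 2))
X = rng.normal(size=(n, p))
s_true = np.array([1.0, -0.5])
g_vals = np.sin(2 * np.pi * T[:, 0]) * np.cos(np.pi * T[:, 1])
y = X @ s_true + g_vals + rng.standard_t(df=3, size=n)

def bspline_basis(x, degree, n_interior):
    """Univariate B-spline design matrix on [0, 1] with equally spaced knots."""
    interior = np.linspace(0.0, 1.0, n_interior + 2)[1:-1]
    knots = np.concatenate((np.zeros(degree + 1), interior, np.ones(degree + 1)))
    n_basis = len(knots) - degree - 1
    # Evaluate each basis function by using an identity coefficient matrix.
    return BSpline(knots, np.eye(n_basis), degree)(x)   # shape (len(x), n_basis)

# Bivariate tensor-product basis: row-wise products of the two univariate bases.
B1 = bspline_basis(T[:, 0], degree, n_interior)
B2 = bspline_basis(T[:, 1], degree, n_interior)
B = (B1[:, :, None] * B2[:, None, :]).reshape(n, -1)

# Median regression (rho(u) = |u|) on the combined design [X, B];
# swapping QuantReg for sm.OLS gives the mean-regression (rho(u) = u^2) fit.
design = np.column_stack([X, B])
fit = sm.QuantReg(y, design).fit(q=0.5)
s_hat = fit.params[:p]        # estimate of the linear coefficients s
g_hat = B @ fit.params[p:]    # fitted values of g at the observed T_i
print("estimated s:", s_hat)
```

In this sketch the spline coefficients are simply treated as extra regression coefficients, so any convex $\rho$ for which a fitting routine is available (least squares, least absolute deviations, or a quantile loss) can be plugged in without changing the construction of the tensor-product design matrix.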