This paper explores the impact of error cross-sectional dependence, modelled as a factor structure, on a number of widely used IV and generalized method of moments (GMM) estimators in the context of a linear dynamic panel data model. It is shown that, under such circumstances, the standard moment conditions used by these estimators are invalid, a result that holds for any lag length of the instruments used. Transforming the data into deviations from time-specific averages helps to reduce the asymptotic bias of the estimators, unless the factor loadings have mean zero. The finite-sample behaviour of the IV and GMM estimators is investigated by means of Monte Carlo experiments. The results suggest that the bias of these estimators can be so severe that the standard fixed effects estimator is no longer generally inferior in terms of root median square error. Time-specific demeaning alleviates the problem, although the effectiveness of this transformation decreases as the variance of the factor loadings grows.
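The invalidity of the standard moment conditions, and the mitigating effect of time-specific demeaning, can be illustrated with a small simulation. The sketch below is not taken from the paper: the data-generating process (an AR(1) panel with a single common factor, loadings with mean one, and all parameter values and sample sizes) is assumed purely for illustration. It evaluates the sample analogue of an Arellano-Bond-type moment condition E[y_{i,t-2} Δu_{it}], which must be zero for the lagged level to be a valid instrument in the first-differenced equation.

```python
import numpy as np

# Hypothetical DGP: y_it = rho*y_{i,t-1} + eta_i + u_it, with a one-factor
# error u_it = lambda_i * f_t + eps_it (all parameter values are assumptions).
rng = np.random.default_rng(0)
N, T, rho = 50_000, 8, 0.5
f = np.array([1.0, -0.8, 0.6, 1.2, -0.4, 0.9, -1.1, 0.7])  # common factor path
lam = 1.0 + 0.2 * rng.standard_normal(N)   # loadings with non-zero mean
eta = rng.standard_normal(N)               # individual effects

y = np.zeros((N, T))
u = np.zeros((N, T))
yprev = np.zeros(N)
for t in range(T):
    u[:, t] = lam * f[t] + rng.standard_normal(N)
    y[:, t] = rho * yprev + eta + u[:, t]
    yprev = y[:, t]

t = T - 1
# Sample analogue of E[y_{i,t-2} * (u_it - u_{i,t-1})]: this should be zero
# for y_{i,t-2} to be a valid instrument, but the factor structure makes the
# lagged level correlated with the differenced error through lambda_i.
m_raw = np.mean(y[:, t - 2] * (u[:, t] - u[:, t - 1]))

# Time-specific demeaning: subtract cross-sectional averages each period.
yd = y - y.mean(axis=0)
ud = u - u.mean(axis=0)
m_dm = np.mean(yd[:, t - 2] * (ud[:, t] - ud[:, t - 1]))

# With mean-one loadings, demeaning removes the dominant bias component:
# the surviving moment is proportional to Var(lambda) rather than E[lambda^2].
print(f"raw moment:      {m_raw:.3f}")   # far from zero
print(f"demeaned moment: {m_dm:.3f}")    # much closer to zero
```

Consistent with the caveat above, if the loadings were drawn with mean zero the demeaned moment would be essentially the same as the raw one, since the term removed by demeaning is proportional to the mean of the loadings.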