A number of authors have explored the computational power of dynamical systems with a finite number of continuous degrees of freedom. We review this work from the perspectives of both physics and recurrent neural networks. We state several conjectures bounding the dimensionality, smoothness, and robustness to noise that such systems can have while remaining computationally powerful. We also explore resource-bounded and real-time analog computers, review how techniques such as VC dimension can be used to prove lower bounds, and suggest several directions for further research.