How does the cost of optimal methods increase with the dimension?
We answer this question for integration of very smooth functions
that are defined on the d-dimensional unit cube.
For suitable classes of smooth functions
the order of convergence of optimal methods does not depend on d.
Nevertheless, we prove lower bounds that are exponential in d for quadrature formulas
with positive weights.
Hence the optimal order of convergence says little
about the tractability or intractability of a problem.
We consider ``very smooth'' functions.
We use the norm
|| f ||^2 = \sum_\alpha || D^\alpha f ||_2^2,
where the (infinite) sum ranges over all partial derivatives D^\alpha f.
Our class of functions is the unit ball with respect to this norm.
We prove a lower bound that is exponential in d: the error
of any quadrature formula with positive weights and n sample points is at least
(1 - n c^d)^{1/2},
where c = 0.995.
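To see what the bound implies, note that an error of at most eps forces
1 - n c^d <= eps^2, i.e. n >= (1 - eps^2) c^{-d}, which grows exponentially
in d. A minimal numerical sketch of this consequence (the function name and
the chosen values of eps and d are illustrative, not from the paper):

```python
import math

def min_points(d, eps=0.5, c=0.995):
    """Smallest n compatible with error <= eps under the stated
    lower bound error >= sqrt(1 - n * c**d) with c = 0.995."""
    return math.ceil((1 - eps**2) * c**(-d))

# The required number of sample points grows exponentially with d,
# even though the base 1/c is close to 1.
for d in (10, 100, 1000, 5000):
    print(d, min_points(d))
```

The base c^{-1} is only about 1.005, so the growth is slow at first, but for large d the required n still becomes astronomically large, which is the intractability asserted in the abstract.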
Submitted for publication.