I shouldn't have blandly stated "very slow". It's relative: if you're not doing it a lot, it's not noticeable. However, the effect exists and is measurable.
As an example, I wrote a loop which cycles a million times, generating a random number, incrementing a pointer and checking the UBound() of an array: it ran in roughly one-tenth of a second.
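In outline, that baseline loop is just this (the declarations and array size are incidental - any reasonable choices will do):
Code:
Dim arr(1 To 1000) As Single   ' assumed: a fixed-size array, never resized here
Dim iLoop As Long, iLimit As Long, arrPtr As Long
Dim iValue As Single

iLimit = 1000000
For iLoop = 1 To iLimit
    iValue = Rnd()                 ' generate a random number
    arrPtr = arrPtr + 1            ' increment the pointer
    If arrPtr > UBound(arr) Then   ' check the UBound - and do nothing more
    End If
Next iLoop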
Then I added code to ReDim the array and store a value in it, thus:-
Code:
For iLoop = 1 To iLimit
    iValue = Rnd()
    arrPtr = arrPtr + 1
    If arrPtr > UBound(arr) Then
        ReDim Preserve arr(UBound(arr) + 1)   ' grow by a single element each time
    End If
    arr(arrPtr) = iValue
Next iLoop
For 200k loops it took less than a second; for 400k loops, 6 secs; 600k loops, 14 secs; 800k loops, 24 secs; 1M loops, 38 secs; 2M loops, 156 secs; 3M loops, 347 secs. ReDim takes longer and longer as the program progresses and as the array gets bigger - obviously, because VBA has to make a copy of the entire array every time it ReDims it, and each copy takes longer than the one before. The increase is roughly quadratic, not linear: doubling the loop count quadruples the run time (38 secs for 1M loops versus 156 for 2M).
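To put a number on it: growing by one element per ReDim means copying roughly 1 + 2 + 3 + ... + n = n(n+1)/2, i.e. about n^2/2 elements in total to store n values. For n = 1,000,000 that's on the order of 5 x 10^11 element copies, which is exactly the quadratic behaviour the timings above show.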
I then tried increasing the size of the array by 1000 each time: this brought the run time for 1M loops down to 9 secs; for 2M loops, 40 secs; for 3M loops, 89 secs. Much better, although strictly it is still quadratic rather than linear - the times still roughly quadruple when the loop count doubles - because a fixed chunk only divides the number of copies by 1000; it doesn't change the shape of the curve.
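For genuinely linear behaviour, the usual trick (a sketch, not something I timed above) is to grow the array geometrically - double its size whenever it fills - and trim the spare space once at the end:
Code:
Dim arr() As Single
Dim iLoop As Long, iLimit As Long, arrPtr As Long
Dim iValue As Single

iLimit = 1000000
ReDim arr(1 To 1000)                               ' modest starting size
For iLoop = 1 To iLimit
    iValue = Rnd()
    arrPtr = arrPtr + 1
    If arrPtr > UBound(arr) Then
        ReDim Preserve arr(1 To UBound(arr) * 2)   ' double instead of adding a fixed chunk
    End If
    arr(arrPtr) = iValue
Next iLoop
ReDim Preserve arr(1 To arrPtr)                    ' one final trim to the size actually used
With doubling, the total number of elements ever copied is at most about twice the final size, so the cost really does grow linearly with the loop count.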
Okay, they're big numbers, but if you're doing a lot of ReDimming, especially on slower machines, there is a real performance downside. And it doesn't take long to rack up big numbers if you're storing data from large workbooks. For small applications I wouldn't worry too much about it, but the code optimiser in me is always inclined to keep unnecessary operations to a minimum.
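Better still, avoid Preserve altogether when the final size is knowable up front - e.g. count the rows you're about to read and allocate once. A sketch (the sheet name and column are hypothetical):
Code:
Dim ws As Worksheet, lRows As Long, i As Long
Dim arr() As Double

Set ws = Worksheets("Sheet1")                        ' hypothetical sheet name
lRows = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row   ' find the last used row first
ReDim arr(1 To lRows)                                ' one allocation, no Preserve needed
For i = 1 To lRows
    arr(i) = ws.Cells(i, "A").Value
Next i
(Reading the whole column into a Variant in one go with arr = ws.Range("A1:A" & lRows).Value is faster still, but that's a separate optimisation.)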
Lots of info at http://www.google.co.uk/#q=redim+preserve+performance