As I already mentioned, it depends on the type of processing. If you run several programs that constantly analyze large amounts of data in memory, then memory speed, memory size, bus bandwidth, and the CPU largely decide performance. In music work, however, it is very common to record and play back at the same time while running several real-time effects in a multithreaded environment. In that case, if the disk's read and write access times are slow, it doesn't matter how much RAM you have or how fast it is: what sits in memory depends on data streaming from the disk while the CPU processes it all. You might still see a fast user interface thanks to plenty of RAM, but the result can be high latency and audible clipping.
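To make that concrete, here is a minimal sketch (with hypothetical timings, not measurements from any real system): an audio engine must consume one block of samples every fixed period, and the disk delivers blocks at its own rate into a playback buffer. If the disk is slower per block than the audio period, the buffer eventually underruns no matter how large it is, which is exactly the dropout/clipping effect described above.

```python
def count_underruns(disk_ms_per_block, audio_ms_per_block,
                    buffer_blocks, total_periods):
    """Simulate streaming playback.

    Each audio period the engine consumes one block; meanwhile the disk
    delivers blocks at its own speed into a bounded buffer. Returns how
    many periods had no block ready (audible dropouts).
    """
    buffered = buffer_blocks   # buffer starts pre-filled
    credit = 0.0               # fractional blocks delivered by the disk
    underruns = 0
    for _ in range(total_periods):
        # During one audio period the disk delivers this many blocks:
        credit += audio_ms_per_block / disk_ms_per_block
        while credit >= 1.0:
            credit -= 1.0
            if buffered < buffer_blocks:   # disk idles once buffer is full
                buffered += 1
        # The audio engine needs one block right now:
        if buffered > 0:
            buffered -= 1
        else:
            underruns += 1                 # dropout: nothing to play
    return underruns

# Disk faster than the audio period: no dropouts, however small the buffer.
print(count_underruns(2, 5, 16, 1000))
# Disk slower than the audio period: dropouts appear even with a big buffer.
print(count_underruns(8, 5, 16, 1000))
```

The point of the sketch is that a bigger buffer (more RAM) only delays the first underrun; once the disk falls behind the real-time rate, dropouts are inevitable.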

It is true that swapping to disk is a huge performance eater, but once you start recording many channels at once and running real-time effects on them, you will see a lot of swapping even with plenty of RAM. This is especially noticeable when recording to a slow hard drive with little free disk space left.

A computer is only as fast as its slowest part, and right now that is the hard drive. Again, playback and recording are so central to music work that how quickly the computer reads and writes data on disk largely decides its performance in that environment. The disk matters so much because several processes need to access different parts of it at the same time, so the more tracks you run, the more a slow disk hurts performance, no matter how much RAM you have. Solving this takes two things: intelligent data access (caching and read-ahead) and a fast disk mechanism.
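A rough back-of-the-envelope calculation shows why it is the simultaneous access pattern, not raw bandwidth, that hurts (the sample rate and bit depth below are common values I chose for illustration, not figures from the original discussion):

```python
def stream_mb_per_s(tracks, sample_rate=48_000, bits=24):
    """Sustained sequential throughput (MB/s) needed to record
    `tracks` mono audio tracks at the given sample rate and bit depth."""
    return tracks * sample_rate * (bits // 8) / 1_000_000

# 24 mono tracks of 24-bit / 48 kHz audio:
print(stream_mb_per_s(24))   # about 3.5 MB/s sequential
```

Even a modest drive can sustain a few MB/s sequentially. The trouble is that 24 record streams plus playback streams plus swap are not sequential: the drive head seeks between them constantly, and seek time, not bandwidth, becomes the bottleneck. That is why the "intelligent data access" part (larger contiguous writes, read-ahead, keeping recording on a dedicated, mostly empty disk) matters as much as the raw speed of the mechanism.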