A statistic you won't see on Box Office Mojo is how many CPU cycles were spent making a given film. An exact count would be impossible, but as far as I know no one has even attempted a rough estimate.
I saw Avatar in 3D the other week, and I would guess it is the most computationally expensive movie ever made. The complexity of the scenes and the sheer number of screen minutes that are 100% virtual are stunning.
Avatar's main characters are synthetic, but so is their whole world, with rich, detailed plant and animal life and stunning landscapes. If you watch the "making of" footage, you see the actors "filming" on an empty sound stage: essentially everything else is 3D graphics added later. But it's all believable; it's so well done that it's not hard to suspend disbelief and just sit back and enjoy the movie.
Most of the effects for Avatar were created by Weta Digital, and this article about their server farm says they have 40,000 processors. But for a full accounting of CPU cycles, let's go all the way back to James Cameron typing up the screenplay. Even an average laptop crunches away at a few billion cycles per second. If he spent 8 hours writing, he alone burned through tens of trillions of cycles.
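The screenwriting figure works out as a quick back-of-envelope calculation. A minimal sketch, where the clock speed and writing time are illustrative assumptions rather than known facts:

```python
# Back-of-envelope: cycles burned by one laptop during screenwriting.
# Both figures below are assumptions for illustration, not measurements.
laptop_hz = 3e9                # assume "a few billion cycles per second"
writing_seconds = 8 * 3600     # assume one 8-hour writing session

cycles = laptop_hz * writing_seconds
print(f"{cycles:.2e} cycles")  # on the order of tens of trillions
```

Of course the laptop isn't spending every cycle on the screenplay, but for this accounting we're charging everything running in support of the production.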
When pre-production started, thousands of computer hours probably went into proving out the technology and developing the characters. Once in production, an army of 3D modelers, animators, and lighters cranked away for years. Add the laptops and desktops of everyone else involved, from the producers to the schedulers, plus everyone's iPhones and iPods and anything else used on set or in support of the production. All of these tick away millions or billions of times per second.
Now let's cast the net even wider. What about cycles spent long before production started? All of this high-tech hardware was designed on computers. All the software used in the production was written somewhere, slaved over line by line for months or years. Software like Windows or Photoshop has been around for decades, so the total computer time spent developing it is staggering. Those CPU cycles benefited many users, not just Avatar's staff, but they provided the foundation, the context, in which such a technically advanced movie could be made.
Now you can see what it takes to become the most computationally expensive movie to date. Any big modern production can vie for the title, because with each passing year the computational mountain we all stand on grows a little higher. In Avatar's case, take all that legacy, add a very big production team and a 40,000-processor server farm, and I bet they have set a new record. For now.
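To get a feel for why the render farm dwarfs everything else in this tally, here is a rough comparison against the screenwriting estimate. The clock speed and render duration are purely assumed; only the 40,000-processor count comes from the article mentioned above:

```python
# Rough comparison: a 40,000-processor farm versus one writer's laptop.
# Clock speed and render duration are assumptions, not reported figures.
cores = 40_000
core_hz = 2.5e9                        # assumed per-processor clock speed
render_seconds = 12 * 30 * 24 * 3600   # assume ~a year of sustained rendering

farm_cycles = cores * core_hz * render_seconds
laptop_cycles = 3e9 * 8 * 3600         # the screenwriting estimate from earlier

print(f"farm: {farm_cycles:.2e} cycles")
print(f"ratio: {farm_cycles / laptop_cycles:.1e}x")
```

Under these assumptions the farm alone accounts for cycles on the order of 10^21, tens of millions of times the screenwriting figure, which is why the farm is the headline number even in a "full accounting."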