The 'remote mad scientist' who volunteered to send us results for a 'late 2012' iMac equipped with the GeForce GTX 675MX was running OS X 10.8.3 beta. His iMac was running a newer version of the NVIDIA driver with performance optimizations lacking in our version. Our lab's iMac with the 680MX is running 10.8.2. But we are desperate to get you some performance numbers, so we are publishing them anyway.
The 680MX has almost twice the Texture Fillrate of the 675MX. That implies twice the potential when running a graphics intensive app with complex textures being rendered and stored by the GPU - though it's often true that graphics intensive apps don't saturate the full bandwidth. The 680MX also has twice the video memory of the 675MX. We emphasize this because the use of VRAM will not diminish in the future. In our testing, we used OpenGL Driver Monitor to confirm that the Heaven Benchmark (best settings, 8x AA, 16x Aniso, 2560x1440 rez) was using more than 1G of VRAM. Ditto for Apple's Aperture pro app when running the Noise Ninja plugin to remove noise from 50 RAW images.

(*NOTE: One owner of a 27" iMac with the 675MX reported a 720MHz core clock rating when he ran the CUDA-Z benchmark. However, LuxMark rated the same iMac at 324MHz. Since we can't be sure if those apps are reporting correctly, we used NVIDIA's rating for the 675MX's core clock in the table above.)