Leang Posted December 1, 2012

I had a question relating to the new iMac too: does anyone know whether the Nvidia Quadro cards offer any improved performance in Resolve over the non-pro (GTX) cards, other than clock speed and core count? Does the same apply to the mobile (M) cards as well? On paper, the Nvidia 680MX in the new top-end iMac kills the Quadro 4000, which is approved for DaVinci Resolve. The 680MX is also cheaper. What's the downside? I know they lock out some 3D developer features, but is there any downside if you are just doing video editing?

I don't know about the new GTX cards, but I sold my GTX 590, with change to spare, for a Quadro 2000, and the improvement was roughly 4x. I don't know much about Nvidia's marketing schemes, but I couldn't care less about gaming performance. Just last year, running CS5.5, I noticed a significant difference in realtime plugin performance between the 590 and the Quadro 2000. I'm still running a Quadro 2000 atm. My rig is an Intel i7 990X (Extreme) with 24GB of RAM on Win8. No need to upgrade atm, as everything still runs smoothly, even with R3D files. Either I didn't get the settings right for the GTX 590, or it's simply not meant for editing. I trust the Quadros from experience. Not sure how the new generation of cards compares for gaming vs editing, but I prefer a non-gaming card tbh, and I'm pretty sure Apple is trying to target a new generation of gamers as well and follow the PC boat.
tomekk Posted December 1, 2012

Leang said: "I sold my GTX 590 ... for a Quadro 2000, and the improvement was roughly 4x."

That's pretty weird. One option is that your work needs extreme amounts of RAM; one advantage of Quadro over GeForce is the RAM amount. I can't see how a GeForce can be slower than a Quadro in other cases. Rendering depends on CUDA core count and core frequency, and GeForce beats Quadro there by quite a bit. I've done some very quick research, so here is one of the first links to a CS6 AE benchmark I found: http://forums.creativecow.net/thread/2/1019643 (cliffs: the GTX 680 beats the Quadro 6000). And here is a discussion about Quadro vs GeForce (didn't have time to read it all, but I'm assuming GeForce is better; I would be shocked if not): http://forums.adobe.com/message/4809602
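To put rough numbers on that cores-and-clock argument, here is a crude comparison sketch. The core counts and shader clocks below are recalled from the period's spec sheets and should be treated as assumptions; the proxy also ignores architecture differences (Fermi vs Kepler), memory bandwidth, and driver optimizations, so it is only a ballpark:

```c
#include <stdio.h>

/* Crude compute proxy: CUDA cores x shader clock.
   Fermi parts (Quadro 2000/6000, GTX 590) run shaders at a "hot clock"
   roughly double the core clock; Kepler (GTX 680) does not. */
struct card { const char *name; int cores; int shader_mhz; };

int main(void) {
    struct card cards[] = {
        { "Quadro 2000",        192, 1251 },
        { "Quadro 6000",        448, 1147 },
        { "GTX 590 (per GPU)",  512, 1215 },
        { "GTX 680",           1536, 1006 },
    };
    for (int i = 0; i < 4; ++i)
        printf("%-18s %4d cores x %4d MHz = %6.1f (arbitrary units)\n",
               cards[i].name, cards[i].cores, cards[i].shader_mhz,
               cards[i].cores * cards[i].shader_mhz / 1.0e5);
    return 0;
}
```

Even with generous error bars, the GeForce cards come out far ahead on raw shader throughput, which is consistent with the benchmark linked above.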
MattH Posted December 1, 2012

WTF? The GTX 680MX is a mobile GPU. These iMacs aren't desktops; they are high-end laptops on a fancy stand.
Andrew Reid (Administrator) Posted December 1, 2012

I've heard from Tom Lowe (Timescapes) that 3GB of video RAM on the card makes a big difference over 1.5GB or 2GB when editing Red raw with CUDA. That could be a reason some consumer GTX cards are outperformed by a Quadro card. Just make sure you get a 3GB card; a 580 or better will be fine for the Blackmagic Cinema Camera. A budget option is the 560 Ti 1.5GB.
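As a rough sanity check on why frame size drives VRAM use, here is a back-of-the-envelope sketch. The 32-bit-float RGB working format is a common assumption for grading pipelines, and the buffer counts are purely illustrative, not Resolve's actual internals:

```c
#include <stdio.h>

/* Approximate VRAM for uncompressed frames held on the GPU.
   Debayered raw is typically processed as RGB; 4 bytes/channel
   assumes a 32-bit float working format. */
static double frame_mb(int w, int h, int channels, int bytes_per_chan) {
    return (double)w * h * channels * bytes_per_chan / (1024.0 * 1024.0);
}

int main(void) {
    double red4k = frame_mb(4096, 2160, 3, 4); /* 4K Red raw, debayered */
    double bmcc  = frame_mb(2432, 1366, 3, 4); /* BMCC 2.5K raw */
    /* Illustrative buffer counts: each grading node and cache
       multiplies the per-frame cost, so deep node trees add up fast. */
    for (int buffers = 4; buffers <= 16; buffers *= 2)
        printf("%2d buffers: 4K ~%4.0f MB, 2.5K ~%4.0f MB\n",
               buffers, red4k * buffers, bmcc * buffers);
    return 0;
}
```

A 4K float frame is roughly 100MB before any node caches, so it is easy to see how a 4K Red workflow climbs past 1.5GB or 2GB of VRAM while 2.5K BMCC footage stays comfortable.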
Andrew Reid (Administrator) Posted December 1, 2012

MattH said: "These iMacs aren't desktops, they are high-end laptops on a fancy stand."

True, it is all mobile hardware, not desktop hardware like a Mac Pro or a PC. The GTX 680MX is still a nice card, though, and will run Resolve well enough for basic editing and grading with Blackmagic Cinema Camera footage.
tomekk Posted December 1, 2012

Andrew Reid said: "3GB video RAM on the card makes a large difference over 1.5GB or 2GB in editing Red raw with CUDA."

Yeah, that's pretty much what I stated before. Similarly, 4GB of video RAM makes a large difference over 3GB, and so on (assuming you need more than 3GB of video RAM). His case is weird because the GTX 590 has more RAM than the Quadro 2000, which I checked and which has 1GB. Also, it's probably better to get two cards in SLI to get twice the amount of RAM.
tomekk Posted December 1, 2012

Andrew Reid said: "True it is all mobile hardware, not desktop like a Mac Pro or a PC."

That's a brilliant idea! Apple, I'm proud of you one more time! Make a MacBook Pro which is worse (spec-wise) and more expensive (I'm assuming) than the iMac, which is also a laptop (hardware-wise) and could be a laptop but officially is not a laptop (portability-wise), and which is still more expensive and slower than the most expensive PC laptop. Genius. Not to mention PCs, which are going to kill the iMac like the BMCC is killing Canon in the video department (yeah, yeah, I know they don't look as COOOOOOOL as iMacs, and it's 2012 FOR GODS SAKE, everything should be 0.00001 microns thick!!!!! ;))
galenb Posted December 1, 2012

tomekk, do you get paid to try and convince people not to buy Apple products? Because judging by the amount of energy you expend in doing so, you should be. ;-)
tomekk Posted December 1, 2012

"Tomekk -- If you had $3000 to spend, what CUDA-accelerated video editing workstation and monitor combination would you get?"

This requires a bit of research because I haven't been following the workstation market for quite a while (only laptops). If I have spare time, or more ppl would like to see this, I'll look around. There are trade-offs (6-core vs 4-core, SLI/more memory on an older card vs a faster card with less memory, etc.).
tomekk Posted December 1, 2012

galenb said: "Do you get paid to try and convince people to not buy Apple products?"

I wish =). Looks like, as of now, I'm just doing it for nothing. Not the most EV+ approach for me, lol. Oh, I'm not trying to convince ppl, just saying what I think, seriously. Ppl can do WHATEVER they want. Including buying Apple's overpriced PC hardware ;).
powderbanks Posted December 1, 2012

"Yes, I saw. Not sure why they are doing that. Definitely the 27"."

Something to do with keeping the size down, iirc. And I think it's safe to say the average person buying a 21.5" iMac probably wouldn't be replacing the RAM at any point anyway, so why not solder it in and save a few millimeters. I'm pricing out a Windows workstation right now. I had thought about the iMacs, but for the price I'd rather build my own rig and have more options for later upgrades.
MattH Posted December 1, 2012

tomekk said: "Also probably better to get 2 cards for SLI to get twice the amount of RAM."

I've just been looking into this, and apparently running SLI does not increase the amount of RAM; the memory is instead mirrored. In a game it boosts performance so much because the cards render frames alternately, so both cards are running the same game and therefore hold the same data in their RAM. So it looks like SLI is not worth it for video editing.
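That matches how CUDA exposes the hardware: an SLI pair, or a dual-GPU card like the GTX 590, appears as separate devices, each with only its own memory, and nothing pools it automatically. A minimal query sketch (assumes the CUDA toolkit is installed; compile with nvcc):

```c
#include <stdio.h>
#include <cuda_runtime.h>

/* Enumerate CUDA devices. A GTX 590 shows up as two devices with
   ~1.5GB each, not one device with 3GB, which is why SLI doesn't
   raise the usable VRAM for a CUDA app. */
int main(void) {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s, %zu MB VRAM, %d multiprocessors\n",
               i, prop.name, prop.totalGlobalMem / (1024 * 1024),
               prop.multiProcessorCount);
    }
    return 0;
}
```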
tomekk Posted December 1, 2012

MattH said: "Running SLI does not increase the amount of RAM. The memory is instead mirrored."

Well, OK, but the rendering speed is still increased significantly, so it's worth it for the performance gain. It's just that when you exceed the available RAM, performance drops. Hmm, it looks like the only advantage of the Quadros is their RAM amount, and that's exactly what Nvidia limits on the gamer cards. The GTX 590 has 3GB; it should easily beat the Quadro 5000 with its 2GB of video RAM, IMHO. The Quadro 6000 has 6GB, so it'll be the clear winner once someone needs more than 3GB of video RAM; below that it's probably a loser. For anyone interested: you should be checking how much VRAM you need for your own work.
Andrew Reid (Administrator) Posted December 1, 2012

tomekk said: "Also probably better to get 2 cards for SLI to get twice the amount of RAM."

Resolve doesn't yet use two cards in SLI for CUDA, I'm afraid.
tomekk Posted December 1, 2012

^bummer
Leang Posted December 1, 2012

I think high-performance video cards are more dynamic in structure than to just theorize that memory makes one better than the other. For instance, a high-class 1GB video card supplied with chips not found on commercial 2GB+ gaming cards can play a significant role in editing across codecs, resolutions, modeling, etc. I can say that I have a workhorse station and Adobe recommended a Quadro card over the GTX. I don't care much for the science behind it; I just want to drive and test. Which I did, so I ditched the GTX 590 for the Quadro 2000, and everything was much sweeter. CS6 runs beautifully as always. I can have Colorista II and Magic Bullet (two plugins) on one clip at full-resolution playback, running smoothly, on the Quadro 2000. I couldn't get that with the 590. Aside from the plugins, the 590 was a sweet card, for sure.
QuickHitRecord (Author) Posted December 1, 2012

MattH said: "The GTX 680MX is a mobile GPU. These iMacs aren't desktops, they are high-end laptops on a fancy stand."

I've heard this before, but I'm not sure I understand it. So this is a piece of hardware that has been scaled down in size (and thus performance) to fit into a smaller device? That's disconcerting. How much of a performance hit could I expect?
HurtinMinorKey Posted December 1, 2012

QuickHitRecord said: "How much of a performance hit could I expect?"

This "mobile card" is only slightly slower than their high-end desktop cards. The 680MX absolutely destroys a 560 Ti 1.5GB, and 2GB will be plenty to edit BMCC footage. Remember, there's a big difference between 2K raw and 4K Red raw. Nvidia locks out some "pro" features on their desktop and mobile cards; some of these features are used for 3D rendering, and the non-Quadro cards will send those tasks to the CPU. I don't think any of those tasks have anything to do with editing video, but they might.
tomekk Posted December 1, 2012

HurtinMinorKey said: "This 'mobile card' is only slightly slower than their high end desktop cards."

Leang said: "I can say that I have a workhorse station and Adobe had recommended a Quadro card over the GTX."

Here is a little bit of discussion about it: http://www.reduser.net/forum/archive/index.php/t-77102.html. For the best choice you need to know what's happening behind the scenes; on average, though, most people are probably better off buying GeForce cards. As for the mobile and desktop versions, according to http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-680mx/specifications and http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-680/specifications, it looks like the 680MX is an underclocked version of the desktop card, by about 25%. Obviously, that doesn't mean you'll notice a 25% difference in performance between the two.
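For a rough sense of that gap, here is a quick worked comparison using the clocks from the spec pages linked above (both parts list 1536 CUDA cores; treat the exact figures as assumptions, and note that memory bandwidth and thermal throttling also matter in practice):

```c
#include <stdio.h>

int main(void) {
    /* Listed specs: GTX 680 desktop ~1006 MHz base clock,
       GTX 680MX ~720 MHz, both with 1536 CUDA cores. */
    double desktop = 1536 * 1006.0;
    double mobile  = 1536 * 720.0;
    printf("680MX ~= %.0f%% of the desktop GTX 680's core throughput\n",
           100.0 * mobile / desktop);  /* prints ~72% */
    return 0;
}
```

Since the core counts match, the ratio reduces to the clocks alone: roughly a quarter to a third less theoretical throughput for the mobile part, in line with the "about 25%" estimate.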