Tech Update for 2020 | Hardware and Software for Rendering
The hardware gloves are off this new decade, with AMD continuing to shove its fist down Intel's throat in a bid to take its lunch out of its guts... in a way, at least. Or, put more technically, AMD is about to release its 3990X CPU, and it's a whopper! Nvidia is unveiling its new 3000 series cards, and Intel is releasing a GPU? Wait, what? All three companies are now 'in the 7s', with new hardware being manufactured at 7nm, so less power draw, more performance; you know the story by now.
Software-wise, Blender continues to add features on top of the major update that took us from 2.79b to 2.80 (in my opinion this was a MAJOR update, but anyway, we enjoy the benefits, so props to the Blender team!). We'll be seeing 2.82 release in a matter of weeks, and 2.83 is already in alpha. 2.82 promises speedups for BVH building on Windows.
Octane released its 2020 preview with support for RTX cards, now claiming to be the fastest unbiased GPU rendering engine out there. Performance gains were roughly a 2-3x speedup, according to the press release.
Now to the good stuff, new hardware, yes!
AMD Zen 3, RDNA2
Team Red has been running a bit rampant of late: since the release of the latest series of Threadripper chips, Intel has not really hit back with a new series of CPUs that challenges Threadripper. In 2020 we can look forward to another whopping big CPU landing soon, the 3990X, with an insane 64 physical cores at a 2.9GHz base clock, boosting to 4.3GHz. It's power-hungry, as you'd expect for a chip with that many cores, at 280W TDP.
Which, oddly enough, is the same power draw as the 3970X, which has half the cores (though it does have a smidge more base and boost clock). They both use the 7nm process, so either it's a typo, or the 3990X is going to run double the cores at the same power! That would definitely be worth using in a render farm!
You can read the specs here, and by the way, the retail price on this is rumoured to be $4,000 US. So it's not 'cheap', as one might say. But apparently it beats chips from Intel that cost $20,000 at rendering, so probably worth it... relatively? It might be worth pricing up multiple systems to get you to 64 physical cores and seeing if that's cheaper, but my guess is it probably won't be: think two 2990WX chips, plus all the extra components!
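To put numbers on that "price up multiple systems" idea, here's a back-of-the-envelope sketch. Every price below except the rumoured $4,000 CPU is an illustrative placeholder, not a real quote, so substitute current prices before drawing conclusions:

```python
# Back-of-the-envelope check on reaching 64 physical cores two ways.
# All prices except the rumoured $4000 3990X are placeholders, NOT quotes.

def build_cost(cpu_price, other_components):
    """Total cost of one system: CPU plus board, RAM, PSU, case, etc."""
    return cpu_price + other_components

# Option A: one 64-core 3990X system.
single_3990x = build_cost(cpu_price=4000, other_components=1500)   # $5500

# Option B: two 32-core 2990WX systems, doubling up on all the
# supporting components (boards, RAM, PSUs, cases).
dual_2990wx = 2 * build_cost(cpu_price=1800, other_components=1500)  # $6600

print(f"Single 3990X build: ${single_3990x}")
print(f"Two 2990WX builds:  ${dual_2990wx}")
```

With these placeholder figures the single system comes out ahead, mostly because the second build duplicates everything that isn't the CPU.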
AMD is also now working on Zen 3, due for release this year, with lower power and more performance. Crazy.
RDNA2
So, AMD have had their RX 5000 series cards out since July last year, so what can we look forward to this year? There are rumours aplenty about RDNA2, the next architecture for AMD's GPUs. Said rumours cite an AMD GPU showing up on an OpenVR benchmark site and handing an RTX 2080 Ti its ass back. Well, it beat the 2080 Ti by 17%, and to be honest we know nothing of the setup; were they using liquid nitrogen?
That's not the only rumour, though; others have cited double the performance of the current 5700 XT cards, which would be very nice indeed, particularly for rendering. The cards are also said to be more efficient, with less power draw thanks to the enhanced 7nm process.
The AMD cards are also said to include hardware support for ray tracing and to use GDDR6 RAM, or maybe HBM2; who knows at this stage, it's all rumour.
NVIDIA
Ah, team green! What does 2020 hold? A new architecture for the new 3000 series cards, called Ampere, is coming, to be manufactured on a 7nm process with improvements in speed and power efficiency.
Also, more ray tracing, or rather more efficient ray tracing, again thanks to that 7nm process. That sounds great for rendering, especially now that rendering software like Cycles in Blender and Octane can use RTX cards to speed up their ray tracing engines.
Intel
In some ways saving the best for last, and being a bit cheeky about it. Some might be wondering what Intel have been up to whilst AMD has been thread-ripping them to bits. Well, they've apparently poached a lot of GPU talent from AMD; vengeance, maybe? The big talk for 2020 regarding Intel is their Xe DG1 GPU. Intel have been somewhat vague about their GPU plans, saying they'll focus on everything from laptops up to data centres, maybe starting with mid-range cards?
The first showing of the new GPU outside of Intel, well, didn't exactly impress some people. However, that was a gaming context, and we know that Intel is actually planning to use their new GPU tech in a supercomputer called Aurora at Argonne National Laboratory in the US. That is far more interesting to folk who render, since we're far more excited by compute performance than games. Unless you're into real-time render engines (like Blender's EEVEE), in which case Intel didn't get off to a great start, but this is by no means the time to write them off.
In summary
So, as far as rendering is concerned, things are looking pretty good: we've got a serious fight over the CPU market between AMD and Intel, generating innovation, better value for money, etc. Now we have Intel entering the GPU market and AMD producing, in theory, a contender for the top-end cards from Nvidia (let's see how that eventuates, though).
More competition is generally a good thing for the consumer, it lowers prices and forces companies to compete based on either price or innovation.
There is a lot to look forward to, that is certain; the only thing that may be painful is deciding whether to buy now or later. With new and more efficient CPUs and GPUs coming, one may be tempted to wait out the new tech. After all, efficiency in render farms is crucial, as power draw is pretty much the main cost after you purchase your gear (not counting software, if you pay for software, that is).
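To see why power draw matters so much, here's a quick sketch of a rig's annual electricity bill. The $/kWh rate is an assumed figure; swap in your local one:

```python
# Rough annual electricity cost for a render rig running part-time.
# The electricity rate is an ASSUMED figure -- it varies a lot by region.

RATE_PER_KWH = 0.15   # assumed $/kWh
TDP_W = 280           # e.g. the 3990X's rated TDP
HOURS_PER_DAY = 8     # assumed daily render time
DAYS_PER_YEAR = 365

kwh_per_year = TDP_W / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
annual_cost = kwh_per_year * RATE_PER_KWH

print(f"{kwh_per_year:.0f} kWh/year, roughly ${annual_cost:.2f} in power")
```

Even at a modest duty cycle the bill adds up year after year, which is why a more efficient chip can pay for part of its price premium over its lifetime.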
I see it like this: waiting shows patience, which is a virtue, one that may be rewarded this year, either with shiny new gear (with a price to match) or really decent gear on sale :)
Let me know what you think in the comments section!
Regardless, I love that you're thinking outside the PC box; Pis are also cool and I want one :D. It's also one to watch as the Pi continues to add more RAM, or if someone manages to bridge their buses together somehow (though that might be either hard or not worth it for speed reasons).
(EDIT) Also, I nearly forgot: power draw! The tech specs for the Pi show it draws 15W total. Also, the Broadcom chip is not hyper-threaded, so you'd need 128/4 Pis, so 32 of them! Which means your total power draw is now 480W vs the 3990X's 280W!
Finally, the Pi's chip is clocked at just 1.5GHz vs the 3990X, which has a base of just under…
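The node-count and power arithmetic above can be checked in a few lines (assuming 15W per board and four non-hyper-threaded cores per Pi, as per the spec sheets):

```python
# How many 4-core Pis match the 3990X's 128 hardware threads,
# and what does that farm draw vs the single CPU's 280W TDP?

THREADRIPPER_THREADS = 128  # 64 cores x 2 threads each (SMT)
PI_CORES = 4                # Broadcom chip: 4 cores, no hyper-threading
PI_POWER_W = 15             # per-board draw from the Pi's tech specs
THREADRIPPER_TDP_W = 280

pis_needed = THREADRIPPER_THREADS // PI_CORES  # 32 boards
farm_power = pis_needed * PI_POWER_W           # 480W total

print(f"{pis_needed} Pis at {farm_power}W vs one 3990X at {THREADRIPPER_TDP_W}W")
```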
Ah yes, I forgot that each instance has to load the whole project at once, not just its own individual share of the render. Valid point, sir.
Hi Nick, nice! Though how much RAM will those Pis support? Can you join the RAM together somehow? Last time I checked you could get maybe 4GB. There are a few scenes I can think of that would not fit in that, which might limit the usefulness of the Pi farm. Having said that, the price can't be argued with! If you can manage your scenes to be under 4GB, then yahoo!
Wowzers! Though 64 cores could be obtained across 16 RPis for only ~$800 USD. Lower clocks and some more overhead, but in my experience more cores is better than fast cores. Quadruple your core count to 256 and still be under $4,000 USD (and that's a fully operational system, not just a CPU). I'd wager your 256 cores will outperform the 64-core AMD rig over the course of an animation. A single frame probably won't compete, with more overhead to kick off a render across that many individual nodes, but after the initial load, I think the Pis could hold their own.
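The cost arithmetic in that comment works out like this. The ~$50-per-board price is an assumption inferred from the 16-boards-for-$800 figure, and the $4,000 CPU price is the rumoured retail, not a quote:

```python
# Cost-per-core comparison: a Pi farm vs the 3990X.
# PI_PRICE is inferred from "16 RPis for ~$800"; AMD_PRICE is the rumour.

PI_PRICE = 50     # assumed per-board price (16 x $50 = $800)
PI_CORES = 4      # physical cores per Pi
AMD_PRICE = 4000  # rumoured 3990X retail price (CPU only)
AMD_CORES = 64    # physical cores

pi_cost_per_core = PI_PRICE / PI_CORES       # $12.50 per core
amd_cost_per_core = AMD_PRICE / AMD_CORES    # $62.50 per core

# "Quadruple your core count to 256 and still be under $4000":
pis_for_256_cores = 256 // PI_CORES          # 64 boards
cost_256_cores = pis_for_256_cores * PI_PRICE  # $3200, under the CPU alone

print(f"Pi: ${pi_cost_per_core}/core, 3990X: ${amd_cost_per_core}/core")
print(f"256 Pi cores: ${cost_256_cores} vs ${AMD_PRICE} for the CPU alone")
```

Of course this counts silicon only on the AMD side and whole systems on the Pi side, and ignores RAM limits, networking, and per-node scene loading, so treat it as a starting point, not a verdict.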