True path-traced raytracing (the technique is called path tracing) will not be viable until a full generation of cards and technology comes about. They might find some ways to make it run smoother and be less of a performance hog, but I doubt it. What it's doing is genuinely remarkable. Like I said, it's emulating, for the first time ever, a feature of reality almost perfectly.
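To give a sense of why it's such a performance hog, here's a rough back-of-the-envelope sketch. The 4K resolution is real; the sample and bounce counts are just illustrative path-tracing settings I picked, not figures from any actual game:

```python
def rays_per_frame(width, height, samples_per_pixel, bounces):
    """Every pixel fires several sample rays, and each sample can
    bounce several times -- the cost multiplies at every step."""
    return width * height * samples_per_pixel * bounces

# A 4K frame with a modest 2 samples per pixel and 3 bounces each:
rays = rays_per_frame(3840, 2160, 2, 3)
print(f"{rays:,} rays for ONE frame")          # 49,766,400
print(f"{rays * 60:,} rays per second at 60fps")
```

Roughly 50 million rays for a single frame, and about 3 billion per second at 60fps. Real engines use tricks (denoising, reusing samples across frames) to cut that down, which is part of why "limited" raytracing exists at all.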
I don't know if you know the website Giant Bomb, but a few years ago they played the raytraced version of Minecraft, the one with FULL raytracing... They did an experiment where they lined up some glass blocks: one was yellow colored glass, the other was blue colored glass, and they shined a light through them so the beams would intersect. When the light beams met, they fucking turned GREEN.
Real raytracing has the fucking color wheel built into it; it's astounding what they've programmed. But it's just out of our grasp right now.
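If you're wondering how an engine pulls that off: tinted glass in a renderer is usually modeled as a per-channel multiply of the light's RGB by the glass's transmission color. The exact colors below are my own illustration, not pulled from Minecraft. I'm assuming the "blue" glass leans cyan, meaning it also passes some green; a mathematically pure blue filter would block everything the yellow one passed:

```python
def filter_light(light_rgb, tint_rgb):
    """Tinted glass transmits each color channel in proportion to
    its own color: a per-channel multiply (subtractive mixing)."""
    return tuple(l * t for l, t in zip(light_rgb, tint_rgb))

white = (1.0, 1.0, 1.0)
yellow_glass = (1.0, 1.0, 0.0)      # passes red + green
blue_glass = (0.0, 1.0, 1.0)        # a blue leaning cyan: passes green + blue

# White light through the yellow glass, then through the blue glass:
through_both = filter_light(filter_light(white, yellow_glass), blue_glass)
print(through_both)  # (0.0, 1.0, 0.0) -- only green survives both tints
```

Green is the only channel both tints let through, which is exactly the "color wheel built in" effect: nobody hard-coded yellow + blue = green, it just falls out of simulating the light.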
Get a 3080 for $750-850, or a 3080 Ti, 3090, or 3090 Ti, and you'll be fine.
I also wanted to point out why this Witcher 3 raytracing update on consoles needs to run at FSR-upscaled 4K at 30fps with limited raytracing... The consoles have 12 teraflops, and they use AMD graphics, which by every metric we've checked perform worse than Nvidia cards at any form of raytracing.
So the console is doing upscaled 4K at 30fps with limited raytracing because it only has 12 teraflops to work with.
The 3080 is 32 teraflops, so almost 3x the console's GPU, and the 3080 Ti, 3090, and 3090 Ti only get stronger from there.
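The math on that, using the teraflop numbers quoted above (keep in mind raw FP32 teraflops are only a rough proxy and don't compare cleanly across AMD and Nvidia architectures):

```python
# TFLOP figures as quoted in the text above
console_tflops = 12
rtx_3080_tflops = 32

ratio = rtx_3080_tflops / console_tflops
print(f"{ratio:.2f}x")  # 2.67x -- the "almost 3x" figure
```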
I'm confident we will be able to play Witcher 3 at upscaled 4K at 60fps with DLSS 2.0 on a 3080 when the patch releases for PC.
If you're curious to see some pretty engaging content on raytracing, and to understand more about the performance and path tracing side, Digital Foundry did a pretty entertaining video on the Quake 1 raytracing mod that covered both Nvidia and AMD cards. To be clear, this is a fan mod, so it's probably not the best implementation, but you can see how the features work with simple graphics, then just imagine the performance hit it would take in a modern game.
Putting the link for that here.
https://www.youtube.com/watch?v=IhhLcZQ2zD0
One last thing: not sure if you've already got a 4K TV/monitor, but there is one other technology getting into households these days, HDR.
Wanted to warn you: TVS/MONITORS LIE ABOUT HDR! I got tricked by it myself. I still got a 4K monitor, but it's not really HDR.
So, displays before HDR were limited to 16.7 million colors (8 bits per color channel). With HDR, it's a dramatic increase to 1.07 billion colors (10 bits per channel).
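Those two numbers come straight from the bit depth: each of the three color channels gets 2^bits levels, and the total is those levels cubed:

```python
def total_colors(bits_per_channel):
    """Each of R, G, B gets 2**bits levels; total = levels cubed."""
    return (2 ** bits_per_channel) ** 3

print(f"{total_colors(8):,}")   # 16,777,216    -> the "16.7 million" figure
print(f"{total_colors(10):,}")  # 1,073,741,824 -> the "1.07 billion" figure
```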
Now, the way to make sure you really get an HDR TV/monitor: don't look at the tagline for the product, check the specs. It will be listed somewhere. I'm going to give you an example below of how messed up the HDR situation is and how these companies are lying to people.
So here's a monitor on Amazon. I had to do some digging to find the trick with this one; these companies need to be class-action sued.
Description given on the monitor: HP U28 4K HDR Monitor - Computer Monitor for Content Creators with IPS Panel, HDR
https://www.amazon.com/HP-U28-HDR-Monitor-Calibration/dp/B08HVV2T3K
Look at the specs on Amazon: there's no mention of color support, because these are a bunch of thieving bastards. I did a little digging around and found a site that actually lists the monitor's color range.
https://www.tanotis.com/products/hp-u28-28-16-9-4k-hdr-ips-monitor
This is actually a company based out of India. They still lie in their tagline for the monitor, but unlike Amazon, they actually posted the specs.
Scroll down a bit and you'll see this:
Bit Depth / Color Support 8-Bit (16.7 Million Colors)
It supports the same number of colors as every TV and monitor you've ever owned.
Don't get tricked by this: if you see a good deal on something calling itself HDR, do research. I got burned the same way by another company that needs to be sued, BenQ. The 4K is nice, but it's not an HDR monitor. They just lie.
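If you want a quick rule of thumb while reading spec sheets, it boils down to the bit depth line. This helper is my own framing, not an official HDR certification check, and be aware some panels advertise 10-bit via 8-bit + FRC dithering, which this simple check can't distinguish:

```python
def passes_hdr_bit_depth(bits_per_channel):
    """A true HDR pipeline needs at least 10 bits per channel;
    8-bit panels top out at the classic 16.7 million colors."""
    return bits_per_channel >= 10

print(passes_hdr_bit_depth(8))   # False -- the HP U28's listed "8-Bit" spec
print(passes_hdr_bit_depth(10))  # True
```

Bit depth also isn't the whole HDR story (peak brightness and contrast matter too), but an 8-bit-only spec line is the fastest way to spot a fake.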
I've been thinking about these factors for a while, and it's why I did bite on the 3080 when they became available this year. Real raytracing isn't happening anytime soon. The 4000 series cards are powerful, but almost pointlessly so: they can run games at 8K, but no one has 8K displays, and they still can't handle "REAL" raytracing in games with modern graphics. So they're selling an 8K raytracing card... that can't really do raytracing on anything made in the last 10 years, and no one has 8K monitors.
The 4000 series is likely going to be the bastard stepchild of a GPU generation.
Get a 3080 or better and you'll be fine.
Sorry for the wall. Take care, just do research before you invest in a monitor; and it's pretty much useless to get a 4000 series card at this time, even if you had unlimited money.
28 inch HP U28 4K UHD HDR IPS Monitor with USB-C 65W charging $380 free ship
Basically, go 3080 or better; you'll be fine with any of them. "REAL" raytracing is currently a tech demo. They have to use 15-20 year old games to show how it works. Not kidding: they did full raytracing upgrades for Quake 1 and Quake 2, Minecraft has one as well, and now Portal 1.
They're doing that because even the premiere card, the 4090, CAN'T run it on anything with higher graphical fidelity or complexity than something like Portal.
I think in the end we're going to see the 4000 series be the bastard stepchild here. They come with a TON of horsepower, for fucking sure. You'd be gaming at upscaled 8K at 60+fps, without raytracing.
But for that to matter you need an 8K monitor/TV, and that's going to be another $2000+ out of your pocket.
The 3000 series cards are going to coast through the 4K generation just fine, with scaled-back raytracing working on them, as you can see with Spider-Man and the few other games that have added it.