robcat2075 (Hash Fellow) Posted December 27, 2024

Diminishing returns for realistic rendering...

NY Times: Video Games Can’t Afford to Look This Good (the article is free, but you may have to create a free account at the NY Times to read it)

Quote:
The gaming industry spent billions pursuing the idea that customers wanted realistic graphics. Did executives misread the market?

Quote:
Cinematic games are getting so expensive and time-consuming to make that the video game industry has started to acknowledge that investing in graphics is providing diminished financial returns. “It’s very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s,” said Jacob Navok, a former executive at Square Enix who left that studio, known for the Final Fantasy series, in 2016 to start his own media company. “But what does my 7-year-old son play? Minecraft. Roblox. Fortnite.”

I suspect we all have a mental image of that "vocal class of gamers in their 40s and 50s". And the La-Z-Boy they are couched in.
Fuchur Posted December 27, 2024

I am much into VR games these days... those do not need to be that hyper-realistic, because in VR the immersion factor is high anyway and people understand why the games do not look that great. In general, VR games need to be rendered at very high resolutions, like 2x 4K or higher, at 90-120 fps, often on portable hardware (if the headset is a stand-alone one), so hyper-realism is not necessary there. The problem is that the VR space/niche is much smaller than mainstream gaming. I am not into the Metaverse but into PCVR, although the Metaverse is the one that is growing. Not sure what will happen, but VR could be less expensive to develop... at least for now. The good thing is, there is no need for hyper-realism... games can be fun or bad no matter how nice the graphics look.

Best regards
*Fuchur*
Roger (A:M User) Posted December 27, 2024

I don't spend nearly the time playing video games that I did when I was younger. However, I can remember a time when the super high-end card from Nvidia or ATI/AMD was perhaps $500 or $600. Middle of the road was $200-$300ish, and "budget" (meaning the cheapest graphics card you could get that was still better than Intel integrated graphics) was $150 or lower.

Here are the estimated prices for the latest generation of gaming cards from Nvidia:

Flagship 5090 card: $2500
5080 tier: $1800
5070: $800-$1200 (depending on the variant: Super, Ti, Ti + Super)
5060: $500

Not many people are going to be willing to spend that kind of money to play games. In fact, if you "need" the kind of power in the 5090 series, I don't know why you wouldn't just spend the premium to get the Quadro/Pro variant of the card. At least you get white-glove tech support for the "Pro" variants.
Roger (A:M User) Posted December 27, 2024

4 hours ago, robcat2075 said:
Diminishing returns for realistic rendering... NY Times: Video Games Can’t Afford to Look This Good (the article is free, but you may have to create a free account at the NY Times to read it) I suspect we all have a mental image of that "vocal class of gamers in their 40s and 50s". And the La-Z-Boy they are couched in.

Is this it?
Roger (A:M User) Posted December 27, 2024 (edited)

2 hours ago, Fuchur said:
I am much into VR games these days... those do not need to be that hyper-realistic, because in VR the immersion factor is high anyway and people understand why the games do not look that great. In general, VR games need to be rendered at very high resolutions, like 2x 4K or higher, at 90-120 fps, often on portable hardware (if the headset is a stand-alone one), so hyper-realism is not necessary there. The problem is that the VR space/niche is much smaller than mainstream gaming. I am not into the Metaverse but into PCVR, although the Metaverse is the one that is growing. Not sure what will happen, but VR could be less expensive to develop... at least for now. The good thing is, there is no need for hyper-realism... games can be fun or bad no matter how nice the graphics look. Best regards *Fuchur*

Lots of people are happier with a lower-fidelity experience. I personally think 4K gaming is the worst thing to happen to the hobby, since you're pushing around four times as many pixels as 1080p and you can't really see the difference anyway unless you have a 100-inch monitor. On a typical 24" or 32" monitor, you are better off playing at 1080p or 1440p.

The "sweet spot" for realism is probably at the Xbox 360 or PS3 level of graphical fidelity, with the option to fall back to older graphic styles as a stylistic choice: pixel art, hand-drawn adventure games, that type of thing.
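(A quick back-of-the-envelope check of the pixel counts behind the point above. The resolutions are the standard 16:9 ones; the snippet is only an illustrative sketch, not anything from the thread.)

```python
# Pixel counts for common 16:9 resolutions, to put "more pixels" in numbers.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.2f}x 1080p)")

# Prints:
# 1080p: 2,073,600 pixels per frame (1.00x 1080p)
# 1440p: 3,686,400 pixels per frame (1.78x 1080p)
# 4K: 8,294,400 pixels per frame (4.00x 1080p)
```

4K is exactly four times the pixels of 1080p and 2.25 times 1440p, so the GPU shades roughly four times as many pixels per frame for a difference that is hard to see on a 24-32 inch monitor.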
Roger (A:M User) Posted December 28, 2024

I may be wrong, though, and there may be unlimited demand for graphical fidelity, up to the point where you have a VR environment that is not distinguishable from reality. But that would probably require Neuralink (or something like it). At which point we may end up with something like the "San Junipero" episode of Black Mirror.
Fuchur Posted December 30, 2024

On 12/28/2024 at 12:57 AM, Roger said:
Lots of people are happier with a lower-fidelity experience. I personally think 4K gaming is the worst thing to happen to the hobby, since you're pushing around four times as many pixels as 1080p and you can't really see the difference anyway unless you have a 100-inch monitor. On a typical 24" or 32" monitor, you are better off playing at 1080p or 1440p. The "sweet spot" for realism is probably at the Xbox 360 or PS3 level of graphical fidelity, with the option to fall back to older graphic styles as a stylistic choice: pixel art, hand-drawn adventure games, that type of thing.

We are not talking about a computer or TV screen here. VR stands for virtual reality, and you put on a VR headset like a Meta Quest 3, an HTC Vive Focus Vision or a Pimax Crystal Super to play those games. Depending on the model, these are stand-alone headsets, meaning you can play without attaching a computer to them, or for higher/better-quality content you can use a computer with them (the HTC Vive Focus Vision, for instance, can do both). Since the games sometimes run stand-alone on these headsets, they have to produce those images with more or less a mid-to-high-end smartphone chip.

4K is very important in these scenarios, since the headsets display 2x 4K directly in front of your eyes. 4K gaming itself is pretty much not worth it on a computer screen, if you ask me... 1080p or 1440p is fine for that, but with those headsets you need the high resolution because the display is just too close to your eyes otherwise. They are designed to fill a large part of your field of view to give you great immersion.

Anyway: nobody needs a 5090 or 5080 for that... those are totally overpriced bullshit cards that very few people buy. Even the high-quality PCVR games are designed to run just fine on mid-range or upper mid-range cards (I am using an RX 6800 for this, for instance, and it works fine).

Best regards
*Fuchur*
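(For a rough sense of the rendering load described above, here is a minimal sketch comparing a headset driving two 4K views at 90 Hz against a single 1440p monitor at 60 Hz. The figures come from the post, not from measurements of any particular headset, and the helper function is just for illustration.)

```python
# Rough pixels-per-second comparison: a "2 x 4K" headset at 90 Hz
# versus a 1440p desktop monitor at 60 Hz.
def pixels_per_second(width, height, eyes, refresh_hz):
    """Total pixels the GPU must shade each second."""
    return width * height * eyes * refresh_hz

vr  = pixels_per_second(3840, 2160, eyes=2, refresh_hz=90)   # ~1.49e9
mon = pixels_per_second(2560, 1440, eyes=1, refresh_hz=60)   # ~2.21e8

print(f"VR headset : {vr:,.0f} pixels/s")
print(f"1440p @ 60 : {mon:,.0f} pixels/s")
print(f"Ratio      : {vr / mon:.1f}x")   # about 6.8x the pixel throughput
```

In other words, a headset of that class has to push several times the pixel throughput of a typical desktop setup, which is why the art direction tends toward simpler, stylized scenes rather than hyper-realism.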
a.quaihoi Posted May 20

On 12/28/2024 at 11:40 AM, Roger said:
I may be wrong, though, and there may be unlimited demand for graphical fidelity, up to the point where you have a VR environment that is not distinguishable from reality. But that would probably require Neuralink (or something like it). At which point we may end up with something like the "San Junipero" episode of Black Mirror.

I think true immersive VR is a fair way away. Guys doing simulation work over here explained it to us: our eyes are like a box camera, so we actually see the image upside down, and the brain flips it the right way up, calculates depth perception and so on. True immersive VR will require massive computational power to get near-zero latency and stay in sync with the brain's visualisation signal. Everything now is pseudo-VR, which is why their simulations looked like low-poly models - the low polygon counts keep latency down so the visuals stay in sync. They were doing ship, aircraft and building-structure VR walkthroughs... the gear was out of this world and not available on the open market. Mind you, that was like 10-12 years ago, so assuming they have made significant progress by now... are we there yet?

Not too keen on a $2.5k graphics card... it would be different if A:M did everything in GPU memory, though.
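(The latency point above has a simple arithmetic core: at typical headset refresh rates the whole pipeline only gets a few milliseconds per frame, which is one reason low-poly scenes help. A minimal sketch using common refresh rates, assumed here for illustration rather than taken from the simulations described.)

```python
# Frame-time budget at common VR refresh rates: simulation, rendering,
# lens-distortion correction and scan-out all have to fit inside this
# window, or the image visibly lags behind head movement.
for hz in (60, 72, 90, 120):
    budget_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {budget_ms:5.2f} ms per frame")

# Prints:
#  60 Hz -> 16.67 ms per frame
#  72 Hz -> 13.89 ms per frame
#  90 Hz -> 11.11 ms per frame
# 120 Hz ->  8.33 ms per frame
```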
Xtaz Posted May 21

On 12/27/2024 at 7:50 PM, Roger said:
I don't spend nearly the time playing video games that I did when I was younger. However, I can remember a time when the super high-end card from Nvidia or ATI/AMD was perhaps $500 or $600. Middle of the road was $200-$300ish, and "budget" (meaning the cheapest graphics card you could get that was still better than Intel integrated graphics) was $150 or lower. Here are the estimated prices for the latest generation of gaming cards from Nvidia: Flagship 5090 card: $2500 / 5080 tier: $1800 / 5070: $800-$1200 (depending on the variant: Super, Ti, Ti + Super) / 5060: $500. Not many people are going to be willing to spend that kind of money to play games. In fact, if you "need" the kind of power in the 5090 series, I don't know why you wouldn't just spend the premium to get the Quadro/Pro variant of the card. At least you get white-glove tech support for the "Pro" variants.

Video cards have long since stopped being made primarily for gaming; production now focuses on cryptocurrency mining and AI. Hence these extravagant prices.
Roger (A:M User) Posted May 21

3 hours ago, Xtaz said:
Video cards have long since stopped being made primarily for gaming; production now focuses on cryptocurrency mining and AI. Hence these extravagant prices.

I had hoped that with crypto mining no longer being practical on GPUs (at least for the big cryptocurrencies), things might return to normal, but then along came AI, which uses the same cards. I've just accepted that I am probably going to die with whatever computers I am using now; I can't ever see myself spending that kind of money on a GPU, regardless of what I'm planning to use it for. Not to mention, most modern computing seems to be trending toward taking away your freedom of choice.