• Scrubbles@poptalk.scrubbles.tech · 1 day ago

      That was 100% its downfall. I don’t know which executive said “But it Haaaass to run on them” - I guarantee every engineer there knew it would be a disaster on the older gen. I know they said at the beginning that it would run, but they should have just sucked it up and said, “Look, folks, we’re sorry, it just can’t run and you won’t have a good time, so it’s next gen only. We’ll see you in Night City when you can.”

      • taladar@sh.itjust.works · 1 day ago

        I bet many of the engineers did, and then their management told them they had to do it anyway.

      • Viri4thus@feddit.org · edited · 1 day ago

        NVIDIA was quite literally the reason it hit 70k players. CDPR, allegedly, got wads of money to become the tech demo for the green team, which meant the RED Engine needed an overhaul that killed its performance on old-gen consoles. Nowadays it’s used as a benchmark. If you look at the player chart, it peaked during CES and cratered back down to almost half of the 70k that Tassi (aka mouthpiece) is touting. What really is a theme worth exploring is how CDPR knowingly deceived millions of people and, thanks to the short memory of the Internet and (likely paid) puff pieces like this, has rehabilitated its image (and stock price) to the point that they quite literally got away with what they did and saw no consequences. It’s appalling that companies no longer pay when they cheat customers.

        This is also a strong lesson in media literacy. There are two main types of bait, rage and circlejerk, and Tassi enacted the second with this puff piece. A real journalist would have looked at the average player count of the last few months and used that, but no, “journalist” Paul Tassi chose to make a point from a blip caused by an event that EVERYONE could be aware of and that “journalist” Paul Tassi should be aware of. This is so disingenuous that it becomes a master class in how media influences people by distorting information.

        Edit: As another point, “journalist” Paul Tassi purposefully omits the brilliant Stardew Valley, which usually clocks in at close to double the player base of Cyberpunk (among over 10 single-player games that overtake or compare, like FM, Don’t Starve, Terraria, RDR2, HoI IV, et al.).

        • sp3ctr4l@lemmy.zip · edited · 16 hours ago

          The game looks absolutely amazing on an AMD GPU, without real-time ray tracing.

          Several years ago now, I managed to get it to 4K/90 fps on a 6900 XT, with basically all settings on Ultra/Psycho via some ini tweaks and custom FSR values, just no ray tracing, using ‘old school’ cube maps, light sources, and so on.

          And that was all running in Proton, on linux, as well.

          Nvidia absolutely barged in and said: hey guys, guess what, we completely obliterated the entire history of how lighting works in game engines. Here’s our new extremely pretty but extremely, astoundingly inefficient lighting engine; now rewrite your entire game and your custom engine, the one that’s been cooking for 10 years without any notion of this new lighting paradigm, to make it work.

          EDIT: I should add that that is 90 ‘real’ fps, no frame gen, just early FSR.
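          For readers curious what “ini tweaks and custom FSR values” might look like in practice: the sketch below is purely illustrative, in the style of community tweak guides for REDengine games. The file path, section names, and keys here are hypothetical assumptions, not verified settings from the shipped engine, so treat it as the shape of the idea rather than a copy-paste recipe.

          ```ini
          ; Hypothetical user override, e.g. engine/config/platform/pc/user.ini
          ; Section and key names are illustrative, NOT verified engine settings.
          [Rendering]
          RayTracing = false        ; skip RT, rely on cube maps / baked lights

          [Rendering/FSR]
          Enable = true
          Sharpness = 0.6           ; custom sharpening instead of the default
          ScalingFactor = 0.77      ; "Quality"-style internal render resolution
          ```

          The point of such overrides is that they expose values the in-game menu clamps or hides, which is how settings beyond “Ultra” (like the custom FSR scaling mentioned above) get applied.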

          • Viri4thus@feddit.org · 14 hours ago

            It always cracks me up: the dudebros buying 2k GPUs because MUH PCMR and then using framegen to play with console input latencies…

            FML, we’re breeding intelligence out of the gene pool at speeds that threaten the survival of the species.

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 1 day ago

      I think trying to make the game run on a measly 8 GB of RAM, a 1.8 GHz CPU, and a GPU that’s worse than a Radeon 7790 would just make for an awful experience. The minimum specs for 1080p at all-low settings call for a CPU and GPU that are over twice as fast.

      • circuitfarmer@lemmy.sdf.org · 1 day ago

        Have to disagree on the GPU part – I first played it with a 970, mostly on medium. It can be nearly maxed out (no RT) on a 6600 XT, which I used for my second playthrough (on Linux).

      • fishbone@lemmy.dbzer0.com · 1 day ago

        I feel like patch 2.2 specifically introduced some performance issues and bugs. I played a lot on 2.12 and I remember it running a lot smoother and more consistently, but I haven’t gotten around to actually double-checking (I apparently have four versions installed like a crazy person).