“Well, first of all, they’re completely wrong,” Huang said in response to a question from Tom’s Hardware editor-in-chief Paul Alcorn about the criticism.

“The reason for that is because, as I have explained very carefully, DLSS 5 fuses controllability of the of geometry and textures and everything about the game with generative AI,” Huang continued.

Just an elongated way to say AI slop.

    • inclementimmigrant@lemmy.world (OP)
      26 days ago

      True. I mean we can’t exactly expect the CEO of an AI company to admit his AI tool produces AI slop. Got to think about shareholder value after all.

      • DraconicSun@piefed.social
        26 days ago

        Anyway, outlaw the stock market and shareholders and this won’t happen again. It’s just a way for rich gambling addicts to place bets and rope regular people in with the hope of getting rich, only for them to lose everything.

  • JcbAzPx@lemmy.world
    26 days ago

    “Well, first of all, they’re completely wrong,”

    Proceeds to explain exactly what everyone hates about it.

  • DraconicSun@piefed.social
    26 days ago

    “completely wrong” and proceeds to say it’s just a “fusion”.

    It IS an AI slop filter, and you can take it and shove it up your ass alongside all of your stupid jackets.

  • A_Random_Idiot@lemmy.world
    25 days ago

    I’ve said this before, and I’ll say it again.

    Nvidia is destroying gaming.

    They started the destruction with the idea of 4k gaming.

    When they realized native 4k gaming wasn’t going to be feasible… they knocked gaming to the ground and started kicking it in the ribs with this upscaling bullshit, because who doesn’t love a 1080p picture shittily stretched to 4k?

    And they curb stomped it by making video cards cost more than what most people make in a fucking month.

    and they’re beating its unconscious body with bats over this DLSS5 AI obsession bullshit.

    and at every step of the way, the gamers were there to deliver dumptrucks of money, because they don’t give a fuck about ruining everything as long as they can have their new shiny. Being in the cool kids club with a new shiny is more important than the havoc they are wreaking by supporting this shit.

    But don’t worry…They’ll still go online and cry about the unfairness of it all.

    and AMD is desperately playing catch-up so they can grab a sliver of the bullshit pie too, before anyone points out that I’m not addressing AMD (since it’s not the topic) or tries to hail them as the saviour of gaming kind.

  • LostWanderer@fedia.io
    26 days ago

    ROFL. This means we are in fact correct in our assumptions and Jensen Huang is seething, trying to make AI slop fetch. It will never be fetch, Jensen… Sorry sweetie!

  • Techno-rat@lemmy.blahaj.zone
    26 days ago

    Trillions invested to make unoptimized games barely run, and to make them look worse at the same time, instead of investing even a tenth of that into optimization during dev cycles.

    NVIDIA really is like a parasitic cancerous growth on the side of the games industry, its existence increasingly predicated on the destruction of current standards, overtaking their function to ensure its survival and its continuous, ever-expanding cancerous growth.

  • nightlily@leminal.space
    26 days ago

    So this dumb fuck’s own marketing material has said this operates off final pixel colour and motion vectors (for temporal stability, presumably). That says to me that it’s not working with actual geometry info at all. It probably has a step to infer geometry, but it’s still just a fancy Instagram filter working with limited data and an obviously ill-suited training set.

    • lime!@feddit.nu
      26 days ago

      the previous versions at least need the software to supply motion vectors. otherwise it’s just guesswork. i’m assuming there will be some way to supply lighting information as well.

      whatever the final product can do, they certainly didn’t show it off in their examples.
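      To make the motion-vector point above concrete, here is a minimal, purely illustrative sketch (all buffer names hypothetical) of why a temporal upscaler wants game-supplied motion vectors: each vector tells the upscaler where a pixel was in the previous frame, so it can fetch real history instead of guessing.

```python
def reproject(history, motion_vectors, width, height):
    """Pull last frame's colour for each pixel using its motion vector.

    history: previous frame as rows of colour values.
    motion_vectors: per-pixel (dx, dy) offsets back to the previous frame.
    """
    out = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            dx, dy = motion_vectors[y][x]      # offset back to previous frame
            px, py = x - dx, y - dy            # where this pixel was last frame
            if 0 <= px < width and 0 <= py < height:
                out[y][x] = history[py][px]    # valid history sample
            # else: disocclusion -- no history exists, so it's pure guesswork
    return out

# Toy 2x2 frame: the top row moved one pixel right between frames,
# so its left pixel has no history (None) and must be hallucinated.
history = [[10, 20], [30, 40]]
mv = [[(1, 0), (1, 0)], [(0, 0), (0, 0)]]
print(reproject(history, mv, 2, 2))  # -> [[None, 10], [30, 40]]
```

The `None` entries are exactly the regions where, without vectors (or at disocclusions even with them), the model has nothing to work from.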

      • orgrinrt@lemmy.world
        26 days ago

        Technically, at least on Vulkan, these things can be inferred or intercepted with just an injected layer, though it’s not trivial. If you store a history of depth buffers, you can fairly accurately compute an approximation of the actual (isolated) mesh surfaces from the point of view of the camera. But that isn’t the same as the real polygons and meshes that the textures and everything map to… pretty sure you can’t run that pipeline in real time even with tiled temporal supersampling. It almost certainly works on the output directly, perhaps with some buffers like motion vectors and depth for the same frame, which they’ve needed since DLSS 2 anyway. But it’s pretty suspect to claim full polygons unless it runs with tight integration from the game itself, and even then the frame budgets are crazy tight as it is, never mind running extra passes at that level.
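        As a rough sketch of the depth-buffer geometry approximation described above (standard perspective un-projection; function and parameter names here are illustrative, not from any real layer), each depth sample can be lifted back into a view-space surface point:

```python
import math

def view_pos_from_depth(u, v, depth, fov_y, aspect):
    """Reconstruct a view-space point from a depth-buffer sample.

    u, v: screen UVs in [0, 1]; depth: view-space distance along -Z;
    fov_y: vertical field of view in radians; aspect: width / height.
    """
    # Convert UV to normalized device coordinates in [-1, 1]
    ndc_x = u * 2.0 - 1.0
    ndc_y = 1.0 - v * 2.0                  # flip: screen Y grows downward
    tan_half = math.tan(fov_y / 2.0)
    # Un-project: scale NDC by the frustum's extents at this depth
    x = ndc_x * tan_half * aspect * depth
    y = ndc_y * tan_half * depth
    return (x, y, -depth)                  # view space looks down -Z

# A pixel at the exact screen centre sits on the optical axis:
print(view_pos_from_depth(0.5, 0.5, 10.0, math.radians(60), 16 / 9))
# -> (0.0, 0.0, -10.0)
```

Doing this per pixel yields a point cloud of the visible surfaces only, which is why it approximates geometry from the camera’s POV but never recovers the actual polygons behind it.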

        • SkunkWorkz@lemmy.world
          26 days ago

          Probably not meshes, since that is way too expensive. But these guys write the GPU drivers, so of course they have access to the different frame buffers, texture buffers, and light-source data. Just from depth and normal-map data you can get a good representation of the geometry. Deferred rendering, for example, lights the scene using the data in the G-buffer, which is 2D, not geometry.
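          The deferred-rendering point can be sketched in a few lines: lighting a pixel needs only per-pixel G-buffer data (normal, albedo), never the source mesh. This is a toy CPU-side illustration of the Lambert diffuse term; real G-buffers live in GPU render targets.

```python
def lambert(normal, light_dir, albedo):
    """N.L diffuse shading from per-pixel G-buffer data.

    normal, light_dir: unit 3-vectors; albedo: RGB tuple in [0, 1].
    """
    # Dot product, clamped so back-facing surfaces receive no light
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(round(c * ndotl, 3) for c in albedo)

# Two "pixels": one surface facing the light head-on, one facing away.
facing = lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), (0.8, 0.2, 0.2))
away = lambert((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), (0.8, 0.2, 0.2))
print(facing, away)  # full albedo vs. black
```

Note the function only ever sees 2D per-pixel samples, which is exactly why a driver can relight a frame without touching (or even knowing) the underlying polygons.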

    • kromem@lemmy.world
      26 days ago

      That’s what he’s saying. That it doesn’t change the geometry or textures (still completely controlled by the devs) and that the parts that it does change are also tunable by the devs.

      He’s responding to the backlash about how it changes models/textures (which it doesn’t) by saying those are still fully in the hands of the devs and the parts people are seeing in the demos can be fine tuned by the dev teams to match their vision for what they want it to do or not do (like change lighting on material surfaces and hair but not character faces as an example).

      • nightlily@leminal.space
        26 days ago

        It’s a post-processing screen space effect. At that point, there’s zero control the game can have over the geometry. If the AI model wants to change it, it can. It fundamentally can’t only operate on lighting like the marketing claims, it can only make a hallucinating best-effort statistical guess at what the geometry in the final image should be.

  • CosmoNova@lemmy.world
    26 days ago

    “No, no! You’re wrong because…”

    And then this charlatan goes on to explain why we’re right. It’s exhausting to listen to these gilded clowns.

  • FreddiesLantern@leminal.space
    26 days ago

    Given how much, and how intensely, this man tries to sell his garbage through sheer bluff and BS, you’d think he’s planning to run for President someday.

    • I Cast Fist@programming.dev
      26 days ago

      Even some of those can read the room and reach the conclusion that “if people won’t buy it, I won’t make profits.”

  • Yggstyle@lemmy.world
    25 days ago

    This is long-winded, but I firmly believe it explains a lot about the industry’s frenzied push in all these odd directions… All of it. Here seems as good a place as any to dump this mess I’ve been stewing on:

    I really think it’s important that raytracing, while novel, wasn’t created to improve visuals. It wasn’t created to make a programmer’s life easier. It was created because it was computationally difficult and could be optimized for. It was a fantastic play by Nvidia: they created a feature that functionally did very little, but it let them get an entire cycle ahead of the competition on that optimization. Differentiation of products, in a duopoly, is a big deal. AMD dove right into it, knowing full well that this would leave them brutally behind… but despite the disadvantage, it turned out to be a fortuitous event.

    Why? Simple. GPUs have been struggling against Moore’s law. Framerates were exceeding what monitors can even refresh at. And worse yet, there was another hard limit: our eyes. How do you sell cards that have no perceivable value?

    Reality is, we may well be reaching a point where additional resolution and framerate don’t matter. Badly optimized games only buy so much time.

    These companies aren’t stupid. Crypto? They loved it. Computationally expensive. Always need faster… until we didn’t. What now? Demand was plummeting for overpriced high-end cards.

    Go back and look at when AI and Nvidia got in bed. The earnings call was due to be a bloodbath after all these cards were rotting on shelves, unpurchased and depreciating daily. It was coming to light that they had been selling cards to miners under the table, and that was going to get ugly fast. I have never, in my life, heard a company talk so much on an earnings call about a product that wasn’t theirs. Not a word breathed about the unsold cards; barely any numbers discussed. ChatGPT was referenced so many times that there was confusion as to whether Nvidia actually owned it. The Q&A at the end was comedy gold. People were so confused.

    AI was the perfect save. AI is a power virus. Want to fix the black box? Train a black box to manage that black box. It’s a computational sinkhole. They’ve extracted value from gamers to diminishing returns. Meanwhile, they can sell the ultimate snake oil to investors: virtual slave labor. Unpaid workers. In floods private equity. Gamers stopped mattering immediately. All of these advances are software, from a GPU design company. Why? It shuts up the peasants while they continue rebranding the snake oil for whoever is buying. We’ve nearly achieved the panacea. Just a bit longer!

    Behold: we have dressed our industry in the finest of the emperor’s newest clothes. You can either start selling them or be the only one who doesn’t.

    🫧

    • Eximius@lemmy.world
      25 days ago

      From a programming and visuals standpoint: Ray tracing was always sought after, and it is peak graphical fidelity. It makes visuals better, and (shader) programming easier, more physics-based. It’s not just differentiation, the industry has been dreaming of realtime ray-tracing for 30 years. With slow, continuous movement in that direction.

      • Yggstyle@lemmy.world
        25 days ago

        Don’t get me wrong. It’s absolutely a very novel and useful feature. It made shit look great. I’m not down on the tech: I’m just saying the push for it wasn’t for the industry’s sake. It was to kill framerates and sell cards.

  • cecilkorik@piefed.ca
    26 days ago

    Calling it now: Jensen Huang’s mind has been emptied and replaced with AI chips. That’s why he just spouts AI generated nonsense.

  • wonderingwanderer@sopuli.xyz
    26 days ago

    “The consumers don’t know what they want. I, the CEO, know what the consumers want. And the consumers want to give me money!”

      • wonderingwanderer@sopuli.xyz
        25 days ago

        Jensen Huang has all the GPUs. He can probably play games where each character has its own dedicated GPU and every atom and molecule of the environment is rendered in real time with a hyper-realistic physics engine, with built-in AI that plays for you, so that even your idle pastimes are automated, giving you more time to WORK AND PRODUCE VALUE FOR THE OWNER-CASTE.

        “This game only needs two GPUs to run, what’s the problem?”