• rozodru@piefed.world · 29 days ago

      the infuriating one is Capcom. after all these years, decades even, they STILL to this DAY do not understand PC games. they have yet to figure out HOW to optimize their games for PC, and they STILL keep using Denuvo even AFTER admitting “yeah, it slows our games down, so we remove it and then put it back”

      Either it’s old-ass Japanese execs at Capcom who refuse to understand gaming on the PC, or they just don’t care. But it boggles my mind that Capcom kept using Denuvo while admitting it fucks their shit up.

  • thedeadwalking4242@lemmy.world · 29 days ago

    On the one hand software freedom.

    On the other hand, this has me thinking about how fascinating this problem is from an academic standpoint.

    How can you ensure software can ONLY run on the machines you allow? Even if the user has ring 0 access?

    Is it mathematically impossible to achieve?

    • LedgeDrop@lemmy.zip · 29 days ago

      It’s totally possible to achieve. TPM is the desktop equivalent of the technology that runs on your cellphone to let apps detect whether you have an unlocked bootloader or root. It’s the same technology that prevents your favorite console (e.g. Switch 2, etc.) from running pirated games.

      This improved security does come at a price: we, the users, are the enemy and cannot be trusted. That means modifying your system will be prohibited, and we (the consumers) will have to trust that Big Tech has our best interests in mind. /s

        • LedgeDrop@lemmy.zip · 29 days ago

          To expand on this a bit:

          It’s all built on top of the concept of “a chain of trust”, starting at the hardware level.

          (as mentioned) TPM is a chip that stores encryption keys at the hardware level, and retrieval of those keys can only happen if the hardware is unmodified.

          I assume that part of this key is derived from aspects of your OS (e.g. all device drivers being signed by MS).

          The OS fetches this key; if it’s valid, the OS knows the hardware is untampered. It can then verify that the OS itself is unmodified, which applications can in turn use to determine that they’re not modified, etc.
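
          To make that measurement chain concrete, here’s a toy sketch of how a TPM’s PCR “extend” rule and key sealing work. The extend rule (new value = hash of old value plus the new measurement) matches real TPMs; the component names, secret, and zeroed reset value are made up for illustration:

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # PCRs can only be extended, never set directly:
    # new PCR = SHA-256(old PCR || hash of the measured component)
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot(components: list[bytes]) -> bytes:
    # Start from the all-zero reset value; extend once per boot stage
    pcr = bytes(32)
    for component in components:
        pcr = extend(pcr, component)
    return pcr

def unseal(secret: bytes, sealed_to: bytes, current_pcr: bytes):
    # "Sealing": the TPM releases a secret only if the current PCR
    # matches the value recorded when the secret was sealed
    return secret if current_pcr == sealed_to else None

good_chain = [b"firmware v1", b"bootloader v2", b"kernel v3"]
expected = measure_boot(good_chain)

# Unmodified boot: the secret is released
assert unseal(b"disk-key", expected, measure_boot(good_chain)) == b"disk-key"

# Swap in a modified bootloader: every later PCR value changes, unseal fails
tampered = [b"firmware v1", b"hacked bootloader", b"kernel v3"]
assert unseal(b"disk-key", expected, measure_boot(tampered)) is None
```

          Because each extend folds in the previous value, tampering with any early stage changes every value after it — that’s the “chain” in chain of trust.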

          Now you could spoof your own TPM chip (similar to how Switch 1s are chipped/modded), but the deal-breaker is that the key in a genuine TPM is signed with a hardware-vendor-specific private key, and anyone can check that signature with the vendor’s public key. That device key is baked into the hardware (often into the CPU, so the private key never crosses the hardware bus), and a spoofed chip can’t produce a validly signed key of its own.
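
          A sketch of why the homemade TPM fails that check. This uses HMAC as a stand-in for the vendor’s signature (a real chain uses asymmetric crypto, so verifiers only need the vendor’s public certificate, not the secret); all key values here are invented:

```python
import hmac
import hashlib

# Made-up stand-in for the vendor signing key fused into silicon
VENDOR_PRIVATE_KEY = b"fused-into-the-cpu"

def endorse(tpm_key: bytes) -> bytes:
    # At manufacture, the vendor signs each genuine TPM's key
    return hmac.new(VENDOR_PRIVATE_KEY, tpm_key, hashlib.sha256).digest()

def is_genuine(tpm_key: bytes, certificate: bytes) -> bool:
    # A verifier checks the certificate against the TPM's key
    expected = hmac.new(VENDOR_PRIVATE_KEY, tpm_key, hashlib.sha256).digest()
    return hmac.compare_digest(certificate, expected)

genuine_key = b"tpm-key-from-the-factory"
cert = endorse(genuine_key)
assert is_genuine(genuine_key, cert)

# A spoofed TPM can generate its own key, but it can't produce a valid
# vendor certificate for it without the fused signing key
assert not is_genuine(b"my-emulated-tpm-key", cert)
```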

          • meaansel@lemmy.world · 28 days ago

            But at the end of the day, doesn’t the app have to ask the OS? At that stage, can’t you just spoof a “positive” response claiming an unmodified system?
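
            For a purely local check, yes — ring-0 code can lie to the app. That’s why these schemes pair the TPM with a remote verifier: the TPM itself signs its measured state over a fresh nonce, and the server checks the signature off-device. A toy sketch (HMAC again stands in for the TPM’s signing key; in reality it’s asymmetric, so the server only holds a public key — all values here are made up):

```python
import hmac
import hashlib
import os

# Stand-in for the key sealed inside the TPM; software can ask the TPM
# to sign with it, but can never read it
TPM_KEY = b"baked-into-silicon"

def tpm_quote(nonce: bytes, pcr: bytes) -> bytes:
    # The TPM signs (nonce || current PCR state); even ring-0 code
    # can only request this, not forge it
    return hmac.new(TPM_KEY, nonce + pcr, hashlib.sha256).digest()

def remote_verifier(expected_pcr: bytes, quote_fn) -> bool:
    # The server picks a fresh nonce so stale quotes can't be replayed
    nonce = os.urandom(16)
    quote = quote_fn(nonce)
    expected = hmac.new(TPM_KEY, nonce + expected_pcr, hashlib.sha256).digest()
    return hmac.compare_digest(quote, expected)

GOOD_PCR = b"\x00" * 32  # made-up "clean boot" measurement
EVIL_PCR = b"\xff" * 32  # made-up "tampered boot" measurement

# Honest machine: the quote covers the expected PCR value
assert remote_verifier(GOOD_PCR, lambda n: tpm_quote(n, GOOD_PCR))

# Rooted machine: the OS can lie to the app locally, but the TPM will
# only sign the PCR values that were actually measured at boot
assert not remote_verifier(GOOD_PCR, lambda n: tpm_quote(n, EVIL_PCR))

# Emulated TPM: without the real key, forged quotes never verify
def spoofed_quote(nonce: bytes) -> bytes:
    return hmac.new(b"attacker-key", nonce + GOOD_PCR, hashlib.sha256).digest()

assert not remote_verifier(GOOD_PCR, spoofed_quote)
```

            The catch is that this only stops spoofing when the decision is made off-device; a game that merely asks the local OS “am I clean?” is still spoofable.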