I do not really have a body for this post. I was not aware that this was a thing and it still feels like BS to me, but maybe there is an actual explanation for the HDMI Forum’s decision that I am missing.

  • chillpanzee@lemmy.ml · 4 months ago

    maybe there is an actual explanation for HDMI Forum’s decision that I am missing.

    HDMI has never been an open standard (to the best of my understanding, anyway). You’ve always needed to be an adopter or a member of the HDMI Forum to get the latest (or future) specs. So it’s not like they’ve just rejected a new idea; the rejection is fully consistent with their entire history of keeping the latest versions on lockdown.

    Standards organizations like the HDMI Forum look like a monolith from the outside (hence “they should explain their thinking here”), but really they are loosely coupled amalgamations of hundreds of companies, all of whom are working hard to make sure that (a) their patents are (and remain) essential, and (b) nothing mandatory in a new version of the standard threatens their business. Think of it more like the UN General Assembly than a unified group of participants. There likely isn’t any unified thinking beyond the fact that many Forum members are also participants in the patent licensing pool, so giving away something they collect royalties on is just not a normal thought for them. Like… they’re not gonna give something away without getting something in return.

    I was a member of the HDMI Forum for a brief while. Standards bodies like this are a bit of a weird world where motivations are often quite opaque.

    • Kevlar21@piefed.social · 4 months ago

      I think I’d like DisplayPort over a USB-C connector. It seems like this might be an easier sell too, since the general non-techy populace is already used to everything going to USB-C (thanks EU). Maybe one day we can actually just use the same cable for everything. I realize that not all USB-C cables are equal, but maybe if TVs used USB-C, we’d see more cables supporting power, data, and video.

      • Gamma@beehaw.org · 4 months ago

        My monitor (TV) supports USB-C and I like it! The flexibility was nice during my single battlestation move.

      • IMALlama@lemmy.world · 4 months ago

        DisplayPort over USB-C is totally a thing. With things like USB-PD, USB seems to be getting dangerously close to becoming the standard for everything. The cables are a wreck, though, and are way too hard for a layperson to tell apart.

          • Chronographs@lemmy.zip · 4 months ago

            It’s pretty simple and straightforward: all you have to do is buy the cable and a professional cable tester to see what specs it’s actually in compliance with.

            • amorpheus@lemmy.world · 4 months ago

              These days a ~10€ gadget can tell you about the electricity going through a USB connection and what the cable is capable of. I don’t like the idea of basically requiring one to get that knowledge, but considering the limited space on USB-C plugs, I’m not sure the labeling is ever likely to improve.
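
              On Linux, you can sometimes skip the gadget: kernels with the Type-C connector class expose what they know about an attached cable through sysfs. A minimal sketch, assuming that class is present (attribute names from the kernel’s typec sysfs ABI; plain charge-only cables have no e-marker chip and usually report nothing at all):

              ```python
              # Minimal sketch: ask the kernel what it knows about attached USB-C
              # cables. Assumes a Linux kernel exposing the typec connector class
              # under /sys/class/typec; cables without an e-marker chip typically
              # don't show up here at all.
              from pathlib import Path

              for cable in sorted(Path("/sys/class/typec").glob("port*-cable")):
                  print(cable.name)
                  for attr in ("type", "plug_type"):  # "active"/"passive", plug form
                      f = cable / attr
                      if f.exists():
                          print(f"  {attr}: {f.read_text().strip()}")
              ```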

      • ramble81@lemmy.zip · 4 months ago

        I mentioned this in another thread, but “DP Alt” (DP over USB-C) is not a default feature of the USB spec; it’s an optional extension that has to be added via additional hardware and supported by the device. At that point you’re basically adding in DP with just a different port.

        To that end, it’s still the same problem: TV manufacturers just aren’t adding in DP support, regardless of the connector.
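
        For anyone wondering what “supported by the device” involves: DP Alt Mode gets negotiated over USB Power Delivery structured messages before any video flows. A rough sketch of the handshake; the four message names come from the PD spec, everything else here is illustrative:

        ```python
        # Rough model of DisplayPort Alt Mode entry over USB-PD structured VDMs.
        # The real exchange happens in port-controller firmware; this just shows
        # why a port without the extra DP mux hardware can never offer video.

        DP_SVID = 0xFF01  # VESA's ID for DisplayPort Alt Mode

        def negotiate(port_has_dp_mux: bool) -> str:
            if not port_has_dp_mux:
                return "USB only"  # nothing to advertise, nothing to enter
            # 1. Discover Identity -> is the partner alt-mode capable at all?
            # 2. Discover SVIDs    -> does it list VESA's 0xFF01?
            # 3. Discover Modes    -> which pin assignments does it support?
            # 4. Enter Mode        -> flip the mux, start driving DP on the lanes
            return "DisplayPort Alt Mode active"

        print(negotiate(True))   # DisplayPort Alt Mode active
        print(negotiate(False))  # USB only
        ```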

        • SlurpingPus@lemmy.world · 4 months ago

          Isn’t USB-C able to carry Thunderbolt, which subsumed DisplayPort at some point? I thought Thunderbolt and DisplayPort were thus merged into whatever the USB standard was at the time.

          • Cooper8@feddit.online · 4 months ago

            Thunderbolt is a proprietary specification by Intel and Apple, while DisplayPort is an open standard developed by VESA.

            USB-C connector hardware can carry Thunderbolt or DisplayPort signals, but the device has to actually conform to those specifications. Most do not.

      • Captain Aggravated@sh.itjust.works · 4 months ago

        Mildly spicy take: USB is an unrecoverable disaster and we need an entirely unrelated team to invent something entirely new to replace it because we’re never getting this sleeping bag back in the little bag it shipped in.

        USB 1.0 was cool in 1996 (1.1 followed in 1998); it replaced PS/2, RS-232, Centronics parallel, several proprietary connectors, several use cases for SCSI, ADB, Apple’s DIN serial ports, and probably some stuff I’m missing. There was an A plug and a B plug; the main problem was that neither made it very obvious which way up you were supposed to plug it in. Speed was low, but FireWire existed for high-speed connections.

        USB 2.0 was cooler in 2000. The plugs and sockets were identical, the cable was similar but with better shielding, and it was as fast as or faster than FireWire 400. They did start introducing more plugs, like Mini-B and Micro-B, mainly for portable devices. (There were also Mini-A and Micro-A; I’ve never personally seen them.) That pretty much finished off external SCSI. Higher-speed FireWire was still there if you needed faster than USB, but USB 2.0 did basically everything. To indicate USB 2.0 devices and ports, they made the tongues black, in contrast with USB 1.1’s white tongues. It didn’t really matter in practice; by the time people had devices that needed the speed, USB 2.0 ports were all machines had.

        USB 3.0 took too long to arrive in 2008. The additional speed was sorely needed by then; FireWire was mostly an Apple thing (PCs had it but often didn’t use it), so PCs mostly didn’t have anything faster than 480 Mbit/s until Obama was sworn in. USB 3.0 is best thought of as a separate tech bolted on top of USB 2.0: they added five more wires, a ground wire and two pairs of high-speed data lines, for 5 Gbit/s full duplex. The original four wires are still in the cable for power and 480 Mbit/s half-duplex. They managed to make the A plug and socket entirely forwards and backwards compatible, and the 3.0 B sockets accept 2.0 B plugs (same with Micro), but 3.0 B plugs don’t fit 2.0 B sockets (again, same with Micro). Which means we’ve just added two more kinds of cable for people to keep track of! So a typical consumer now likely has a printer with a USB A-to-B cable, some Bluetooth headset or MP3 player they’re still using with a Mini-B plug, an Android smartphone with a Micro-B plug, an iPod Touch with a Lightning plug because Apple are special widdle boys and girls with special widdle needs, and now an external hard drive with a 3.0 A-to-Micro-B cable, which just looking at it is obviously a hack job.
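
        Counting the conductors makes the “bolted on top” point concrete. Here’s the Standard-A pinout sketched as data (pin numbering per the USB 3.0 spec):

        ```python
        # USB 3.0 Standard-A pinout: pins 1-4 are literally a USB 2.0 connector;
        # SuperSpeed is five extra conductors riding alongside.
        USB3_STANDARD_A = {
            1: "VBUS      - power (legacy)",
            2: "D-        - 480 Mbit/s half-duplex pair (legacy)",
            3: "D+        - 480 Mbit/s half-duplex pair (legacy)",
            4: "GND       - ground (legacy)",
            5: "SSRX-     - SuperSpeed receive pair (new)",
            6: "SSRX+     - SuperSpeed receive pair (new)",
            7: "GND_DRAIN - SuperSpeed signal ground (new)",
            8: "SSTX-     - SuperSpeed transmit pair (new)",
            9: "SSTX+     - SuperSpeed transmit pair (new)",
        }

        for pin, role in USB3_STANDARD_A.items():
            print(f"pin {pin}: {role}")
        ```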

        Computer manufacturers didn’t help. It’s still common for PCs to have 2.0 ports for low-speed peripherals like mice, keyboards, printers, and other sundry HIDs, leaving the 3.0 ports open for high-speed devices. To differentiate these for users, 3.0 ports are supposed to be blue. In my experience, about half of them are black. I own a Dell laptop made in 2014 with one 2.0 and two 3.0 ports; all are black. I own two Fractal Design cases; all of their front USB ports are black. The only blue ports are on my ASRock motherboards. I’ve had that laptop for nearly 12 years now, and I STILL have to examine the pinout to tell which one is the USB 2.0 port. My Fractal cases aren’t that bad because they have no front 2.0 ports, but I built a PC for my uncle that does have front 2.0 and 3.0 ports, and they’re all black.

        USB 3.1 showed up in 2013, alongside the USB-C connector, and the train came entirely off the rails. USB 3.1 offers even higher 10 Gbit/s duplex throughput, maybe on the same cable as 3.0, if the port supports it. How do you tell a 3.1 port from a 3.0 port? They silk-screen on a logo in -8-point font that’ll scratch off in a month; the port is otherwise physically identical. Some motherboard manufacturers break with the standard in a good way and color 3.1-capable ports a slightly teal-ish blue. USB A-to-B cables can carry a USB 3.1 10 Gbit/s signal. But they also introduced the USB-C connector, which is its own thing.

        USB-C was supposed to be the answer to our prayers. It’s almost as small as a Micro-B connector, it’s reversible like a Lightning plug, it can carry a LOT of power for fast charging and even charging laptops, and it’s got not one but two sets of TX/RX pins, so it can carry high-speed USB data in full duplex AND a 4K60 DisplayPort signal AND good old-fashioned 480 Mbit/s USB 2.0 half-duplex for peripherals. In one wire. That was the dream, anyway.
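
        The “in one wire” dream comes down to how the four high-speed pairs get divvied up. A sketch of the common DP Alt Mode pin assignments (assignment names from VESA’s alt-mode spec, everything simplified):

        ```python
        # USB-C carries four high-speed differential pairs (two TX, two RX) plus
        # the old USB 2.0 D+/D- pair. DP Alt Mode re-purposes pairs for video;
        # USB 2.0 keeps working in every mode.
        PIN_ASSIGNMENTS = {
            "USB only":     {"pairs_for_usb3": 4, "pairs_for_dp": 0},
            "Assignment D": {"pairs_for_usb3": 2, "pairs_for_dp": 2},  # USB3 + video
            "Assignment C": {"pairs_for_usb3": 0, "pairs_for_dp": 4},  # video only
        }

        for mode, p in PIN_ASSIGNMENTS.items():
            print(f"{mode}: {p['pairs_for_usb3']} pairs for USB3, "
                  f"{p['pairs_for_dp']} pairs for DP, USB 2.0 always available")
        ```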

        Android smartphones moved over to USB-C, a lot of laptops went mostly or entirely USB-C, PCs added one or two… and that’s where we are to this day. Keyboards, mice, wireless dongles, and other HIDs still all use USB-A plugs; there doesn’t seem to have been any move at all to migrate. Laptops are now permanently in dongle hell as bespoke ports like HDMI disappear, yet monitors and especially televisions are slow to adopt DP over USB-C.

        Also, about half of the USB-C cables on the market are four-wire USB 2.0 cables: no high-speed data pairs at all, just D+ and D− plus power. They’re phone-charging cables. They’re sufficient for plugging a phone into a wall wart or car charger, but they often don’t carry laptop amounts of power, and they don’t carry high-speed data or video.

        USB 3.2 turned up in 2017. It added the ability to run two simultaneous 10 Gbit/s links in the same cable (a boon for external SSDs), retroactively renamed 3.0 and 3.1 to 3.2 Gen 1 and 3.2 Gen 2 (with the new mode being 3.2 Gen 2x2), changed the case logos to match, pissed in the fireplace, and started jabbering about Thunderbolt. Thunderbolt was an Intel thing for putting PCIe lanes out over Mini DisplayPort cables, usually for connecting external GPUs to laptops, but also for general-purpose high-speed data transfer. Around this time they decided to transition Thunderbolt to USB-C connectors.

        Problem: they use a lightning bolt logo to denote a Thunderbolt port. Lightning bolts, or angled squiggly lines, have been used to mean “high speed,” “power delivery,” “Apple Lightning,” and now “Thunderbolt.”

        “Power delivery,” sometimes but not always denoted by a yellow or orange tongue, means that the port delivers power even with the device turned off… or something. And it has nothing to do with the fact that USB-C cables now have chips in them to negotiate with power bricks and devices over how much power can be delivered. Nobody marks the cables as such, so you just have to know what your cables can do. They’re nearly impossible to shop for, and if you want to set up a personal system of “my low-speed cables are black, my high-speed cables are white, my high-power cables are red,” fuck you: your Samsung will come with a white 2.0 cable and nobody makes a high-power red cable.
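
        For what it’s worth, the negotiation those in-cable chips do is simple arithmetic underneath: the brick advertises fixed voltage/current pairs, and the cable’s e-marker caps the current. A sketch with representative numbers (the advertised pairs here are illustrative):

        ```python
        # Why "what can this cable do" matters for charging: a USB-PD source
        # advertises (voltage, max current) pairs, and the cable's e-marker
        # caps current at 3 A unless the cable is rated for 5 A.
        SOURCE_PDOS = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]  # (V, A)

        def max_power_watts(cable_amp_limit: float) -> float:
            return max(v * min(a, cable_amp_limit) for v, a in SOURCE_PDOS)

        print(max_power_watts(3.0))  # 60.0  W: plain cable tops out at 20 V x 3 A
        print(max_power_watts(5.0))  # 100.0 W: 5 A e-marked cable gets 20 V x 5 A
        ```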

        USB4 is coming out now, it’s eaten Thunderbolt to gain its power, it’ll be able to do even higher speed links if you get yet another physically indistinguishable cable, and if you hold it upside down it’ll pressure wash your car, but only Gigabyte Aorus motherboards support that feature as of yet.

        The “fistful of different cables to keep track of” is only getting worse as we head into the USB4 era and it needs to be kicked in the head and replaced entirely.

        • Swedneck@discuss.tchncs.de · 1 day ago

          i keep seeing this online but i have not once heard anyone complain about this IRL, all i’ve seen happen is that people went from “uhhh i need an apple charger, that’s a samsung charger” to “hey can i use your charger? sweet thanks”.

          I’m sorry but it seems like a completely fucking made up problem

          • Captain Aggravated@sh.itjust.works · 20 hours ago

            So you’re a normie who charge they phone, eat hot chip and lie.

            It becomes an issue when you’re in the habit of such power-user tasks as plugging an external display or external graphics card into a laptop, or dealing with bulk file transfers.

        • JackbyDev@programming.dev · 4 months ago

          You end with

          The “fistful of different cables to keep track of” is only getting worse as we head into the USB4 era and it needs to be kicked in the head and replaced entirely.

          But started with

          need an entirely unrelated team to invent something entirely new to replace it

          You want more cables?

          • Captain Aggravated@sh.itjust.works · 4 months ago

            I remember the original roll-out of USB: things like mice and keyboards transitioned to USB very quickly, shipped with one of those USB-to-PS/2 dongles for a while for compatibility with older computers, and then we were into the USB era.

            That hasn’t happened with USB-C. Large market segments don’t seem interested in making it happen, it’s not getting better, and in fact it seems to be getting worse. So kick it in the head and start over from scratch.

          • hcbxzz@lemmy.world · 4 months ago

            You want more cables?

            Yes, I absolutely want different cables with different connectors.

            Being able to physically plug two USB-C devices together is not a benefit if the devices can’t actually talk to each other properly over the cable. I’d much rather have three different connectors, each of them guaranteeing protocol compatibility, than USB-C, where for any given device-cable-device combination the behavior is nearly impossible to predict.

            • JackbyDev@programming.dev · 4 months ago

              The problem is that getting a new standard is gonna just mean more of the same shit with like a good ten years of swapping because USB is so widely used. USB ain’t perfect, I dislike a lot of things about it, but starting from scratch isn’t gonna improve things.

              If it was the sort of magical scenario where everyone swapped overnight, hell yeah.

            • Captain Aggravated@sh.itjust.works · 4 months ago

              You know what the problem with USB-C is? In 2010 or so, you could have a fistful of unique USB cables (A-B, A-to-Mini-B, A-to-Micro-B, 3.0 A-B, 3.0 A-to-Micro-B, A-to-Lightning); they’re all different, but you can look at each cable and tell exactly what it does. Most of them are identical in capabilities but have physically different plugs, and the two USB 3 cables are likewise identical in capabilities but with different client-side plugs. ALL of them will plug into and work in the same host-side port.

              With USB-C, I can have a fistful of visually similar cables with drastically different capabilities, and I have no way of telling them apart. The USB consortium has been inconsistent with its branding, manufacturers have applied that branding even more inconsistently or even fraudulently, and there’s no way to inspect a cable’s features without trying it to see if it works.

        • Ferk@lemmy.ml · 4 months ago (edited)

          I don’t think we would be throwing USB-C away completely, because it has even been mandated by law in the EU, with the goal of slowing the rate at which people generate trash by buying new cables and power bricks for every new generation of connectors.

          But I agree that, at the very least, clear labeling should be mandated by consumer protection laws as well… it’s a nightmare, and a scenario that opens the door to a lot of scams. It’s made even worse by the fact that nowadays you can have malicious software running inside the connector of a cable plugged into an extremely capable port without realizing it, messing with your device even though the only thing you wanted was to charge it.

        • webghost0101@sopuli.xyz · 4 months ago (edited)

          The renaming, while they kept selling stock in the older packaging for years, has been angering me since it happened.

          Honestly, it would not be so much of a problem if things were actually labeled appropriately, with all the actual specs and supported features on the package, but it’s more profitable to keep you guessing (and going for the higher-priced one just in case).

          They do the same thing with USB Bluetooth audio transmitters: it’s “high quality audio” and “PS5 compatible,” but it doesn’t tell me whether it supports aptX or not.

          Also the whole “buy a product clearly pictured with a USB-A connector, receive the USB-C variant, if you’re lucky with an added adapter” thing.

  • HelloRoot@lemy.lol · 4 months ago

    but maybe there is an actual explanation for HDMI Forum’s decision that I am missing.

    Licensing money.

      • dohpaz42@lemmy.world · 4 months ago

        This wouldn’t work at scale. If Valve paid to license the spec for the Linux kernel, it would have to pay for every person who downloaded the driver, which is far more than the number of people who buy the Steam Cube.

        Unless of course you’re suggesting that the kernel driver for the new spec become closed source.

        • legion02@lemmy.world · 4 months ago

          OK. Fine. Then it’s going to be reverse engineered and everyone will use it anyways and they’ll get nothing.

      • mkwt@lemmy.world · 4 months ago

        The license holder is attaching additional terms and conditions that are incompatible with publicly disclosing the driver source code.

        • Fedizen@lemmy.world · 4 months ago

          It still boggles my mind that things can be licensed/copyrighted without being forced to disclose source code. The lack of transparency we’re okay with in society is absolutely unsustainable.

  • DonutsRMeh@lemmy.world · 4 months ago

    That’s why HDMI needs to die and DisplayPort needs to take over. The TV industry is too big for that to happen, of course; they make a shit ton of money off of HDMI.

  • cmnybo@discuss.tchncs.de · 4 months ago

    AMD should remove the HDMI port from all of their GPUs as a nice F.U. to the HDMI forum. They shouldn’t be paying the licensing fees if they are not allowed to make full use of the hardware.

      • Seefra 1@lemmy.zip · 4 months ago

        For now, but DP, and especially DP over USB-C, is gradually becoming more popular for computer hardware. Someone paying 400 euros for a GPU doesn’t mind paying 10 bucks extra for an adapter if they have an HDMI monitor. But most monitors nowadays come with DP anyway.

        • djvdKu@lemmy.world · 4 months ago

          The problem is not with monitors but rather with TVs, which don’t use DP (almost?) at all.

      • dohpaz42@lemmy.world · 4 months ago

        There would be an uproar, but like with the audio jack on phones, people would come around. All it would take is one big enough company to pull it off, and the rest would follow.

        • Jarix@lemmy.world · 4 months ago

          Just bought a new phone that has an audio jack. Some of us refuse to “come around”. They can fit a stylus and an audio jack in this thing. Why did they remove the audio jack again? Not enough room? Bullshit

          • dubyakay@lemmy.ca · 4 months ago

            Tbh, I have looked at audio jacks in device internals, and they usually have double the footprint on the PCB compared to what you see from the outside, at least on low-end consumer devices.

            That’s not to say that they couldn’t put something more compact in a high-end device like a smartphone.

            • Jarix@lemmy.world · 4 months ago

              Okay, but I have a USB-C slot, speakers, a stylus, and an audio jack all on the bottom of my new phone. The claim that they needed the room is bullshit, as evidenced by this 2025 phone.

              It can also use an SD card. Greedy fucking corporations just want you to repurchase stuff you already have.

              • moopet@sh.itjust.works · 4 months ago

                There are sane reasons to ditch an audio port. Like, physical connectors are fragile. Why use something that’s so often broken, when you don’t need to? Why include circuitry for something that you don’t need? At this point, physical audio ports are there for backwards compatibility. I’m not saying wired headphones are bad - I have wired headphones - but phones are the least useful place for them.

                • Jarix@lemmy.world · 4 months ago

                  None of those reasons are the ones the manufacturers actually stated when they removed it from their devices.

        • Captain Aggravated@sh.itjust.works · 4 months ago

          Apple could remove the audio jack from iPhones because (1) they’re Apple: they could remove their customers’ eyes and 9/10ths of them would stay loyal, and (2) eliminating the headphone jack mostly locked people out of $20-or-less earbuds that might have come free with a previous phone anyway. People grumbled, and carried on using the Bluetooth headphones a lot of them already owned.

          AMD doesn’t have the following that Apple does; they’re the objectively worse but more affordable alternative to Nvidia. Eliminating the HDMI port would lock them out of the HTPC market entirely; anyone who wanted to connect a PC to a TV would find their products impossible to use without experience-ruining adapter dongles. We’re talking about making machines that cost hundreds or thousands of dollars incompatible.

          • Zaemz@lemmy.world · 4 months ago

            Considering most gaming consoles use AMD hardware, they’d have to keep up the licensing for those products as well.