• acosmichippo@lemmy.world

    article took forever to get to the bottom line: content. 8k content essentially does not exist. TV manufacturers were putting the cart before the horse.

    • themeatbridge@lemmy.world

      4k tvs existed before the content existed. I think the larger issue is that the difference between what is and what could be is not worth the additional expense, especially at a time when most people struggle to pay for rent, food, and medicine. More people watch videos on their phones than watch broadcast television. 8k is a solution looking for a problem.

      • Fredselfish@lemmy.world

        Hell, I still don’t own a 4k tv and don’t plan to go out of my way to buy one unless the need arises. I don’t see why I’d need one when a normal flat-screen looks fine to me.

        I actually have some tube tvs and have been thinking of just hooking my vcr back up and watching old tapes. I don’t need fancy resolutions in my shows or movies.

        Only time I even think of those things is with video games.

    • jqubed@lemmy.world

      I think it’s NHK, or one of the Japanese broadcasters anyways, that has actually been pressing for 8K since the 1990s. They didn’t have content back then and I doubt they have much today, but that’s what they wanted HD to be.

      • NuXCOM_90Percent@lemmy.zip

        Not familiar with NHK specifically (or, to be clear, I think I am but not with enough certainty), but it really makes a lot of sense for news networks to push for 8k or even 16k at this point.

        Because it is a chicken and egg thing. Nobody is going to buy an 8k TV if all the things they watch are 1440p. But, similarly, there aren’t going to be widespread 8k releases if everyone is watching on 1440p screens and so forth.

        But what that ALSO means is that there is no reason to justify using 8k cameras if the best you can hope for is a premium 4k stream of a sporting event. And news outlets are fairly regularly the only source of video evidence of literally historic events.

        From a much more banal perspective, it is why there is a gap in TV/film where you go from 1080p or even 4k re-releases to increasingly shady upscaling of 720p or even 480p content, back to everything being natively 4k. Oversimplifying, it is because we were using MUCH higher quality cameras than we really should have been for so long before switching to cheaper film and outright digital sensors because “there is no point”. Obviously this ALSO is dependent on saving the high resolution originals but… yeah.

        • acosmichippo@lemmy.world

          it’s not exactly “there is no point”. It’s more like “the incremental benefit of filming and broadcasting in 8k does not justify the large cost difference”.

            • paraphrand@lemmy.world

              I’m sorry, but if we are talking about 8k viability in TVs, we are not talking about shooting in 8k for 4k delivery.

              You should be pointing out that shooting in higher than 8k, so you have the freedom to crop in post, is part of the reason 8k is burdensome and expensive.

              • Knock_Knock_Lemmy_In@lemmy.world

                So correct the person above me; they wrote about shooting in 8k.

                The RED V-Raptor is expensive for consumer grade but nothing compared to some film equipment. There are lenses more expensive than an 8k camera.

          • NuXCOM_90Percent@lemmy.zip

            Which, for all intents and purposes, means there is no point. Because no news network is going to respond to “Hey boss, I want us to buy a bunch of really expensive cameras that our audience will never notice because it will make our tape library more valuable. Oh, not to sell, but to donate to museums.” with anything other than laughter and MAYBE firing your ass.

            • acosmichippo@lemmy.world

              the point is, the cost/benefit calculation will change over time as the price of everything goes down. It’s not a forever “no point”.

              • NuXCOM_90Percent@lemmy.zip

                … Almost like it would be more viable to film in higher resolution if more consumers had higher resolution displays?

    • Broken@lemmy.ml

      Not only does it not exist, it isn’t wanted. People are content watching videos on YouTube and Netflix. They don’t care for 4k. Even if they pay extra for Netflix 4k (which I highly doubt they do) I still question if they are watching 4k with their bandwidth and other limiting factors, which means they’re not watching 4k and are fine with it.

  • Photuris@lemmy.ml

    I don’t care about 8k.

    I just want an affordable dumb TV. No on-board apps whatsoever. No smart anything. No Ethernet port, no WiFi. I have my own stuff to plug into HDMI already.

    I’m aware of commercial displays. It just sucks that I have to pay way more to have fewer features now.

    • dan@upvote.au

      You can have a smart TV but never set up any of the smart features. I have two LG OLED TVs but rarely touch anything on the TV itself. I’ve got Nvidia Shields for streaming and turning it on or off also turns the TV on or off. Same with my Xbox.

      I just need to figure out if I can use CEC with my SFF gaming PC (so that turning it on also turns the TV on, and turning it off turns the TV off), then I won’t have to touch the TV’s remote again.

      An Ethernet port or WiFi is good for controlling the TV using something like Home Assistant. I have my TVs on a separate isolated VLAN with no internet access. I have an automation that runs when the TV turns on to also turn on some LED lights behind the TV.
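
      (A minimal sketch of that last idea, for anyone curious. The real setup would more likely be a native Home Assistant automation triggered by the TV’s state change; this polling loop just illustrates the same logic against HA’s REST API. The URL, token, and entity IDs are placeholders, not the commenter’s actual configuration.)

```python
import time
import requests

# Placeholders: point these at your own Home Assistant instance and entities.
HA_URL = "http://homeassistant.local:8123"
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

TV_ENTITY = "media_player.living_room_tv"   # hypothetical entity id
LIGHT_ENTITY = "light.tv_backlight"          # hypothetical entity id

def tv_is_on() -> bool:
    # Read the TV's current state from Home Assistant.
    r = requests.get(f"{HA_URL}/api/states/{TV_ENTITY}", headers=HEADERS, timeout=5)
    r.raise_for_status()
    return r.json()["state"] not in ("off", "standby", "unavailable")

def set_light(on: bool) -> None:
    # Call the light.turn_on / light.turn_off service for the backlight.
    service = "turn_on" if on else "turn_off"
    requests.post(f"{HA_URL}/api/services/light/{service}",
                  headers=HEADERS, json={"entity_id": LIGHT_ENTITY}, timeout=5)

last = None
while True:
    state = tv_is_on()
    if state != last:        # only act when the TV's state actually changes
        set_light(state)
        last = state
    time.sleep(10)
```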

      • Photuris@lemmy.ml

        Fine, but I don’t want the smart features to be installed at all in the first place.

        I don’t want a WiFi antenna or Ethernet port in there.

        I know that sounds ridiculous, since I can “simply not use them,” but I want to spend my money on an appliance, not a consumer data collection tool.

        I don’t want them to have any of my data, and I don’t want to spend money “voting” with my dollar for these data collection devices.

        Some of these devices have even been known to look for other similar devices within WiFi range, and phone home that way (i.e., send analytics data via a neighbor’s connected TV as a proxy).

        Fuuuck that. I don’t want my dollar supporting this, at all, plain and simple. And I don’t want to pay a premium for the privilege of buying a technically simpler device. I do, but it’s bullshit, and I’m unhappy about it.

        • Null User Object@lemmy.world

          Some of these devices have even been known to look for other similar devices within WiFi range, and phone home that way (i.e., send analytics data via a neighbor’s connected TV as a proxy).

          Ummm, wut? I’m going to need some quality sources to back this claim up.

          • BassTurd@lemmy.world

            Yea, this paragraph feels like fear mongering. I’m not saying OP didn’t see that somewhere, but from a tech standpoint, the TV still has to authenticate with any network it’s trying to piggyback off of. Perhaps if there were an open network in range it could theoretically happen, but I’m guessing it doesn’t.

            I do remember reading that some smart TV was able to use the speakers as a mic to record in-room audio and pass that out if connected. It may have been a theoretical thing, or it might have been a zero-day I read about. It’s been some years now.

            • shortwavesurfer@lemmy.zip

              Actually, it’s true. Amazon Sidewalk works in a similar way: if a sensor is not connected to the internet, it will talk to local Echo devices (like your speakers) that are connected, and pass the data to Amazon through your device’s network.

              TVs will look for open Wi-Fi networks. And failing that, they could very well do this exact same thing.

              Edit: The way it works is that the Echo devices contain a separate radio that works over the 868 to 915 MHz industrial, scientific, and medical (ISM) band. The sensor communicates with your Echo that way, and then your Echo passes it to the network as if it were coming from the Echo itself, not another device. So the sensor gets connected without your network realizing it’s actually a third-party device. To your network, the only thing it sees is the Echo, but the Echo sees both your network, which it’s connected to, and the sensor, so it’s acting as a relay.

              • BassTurd@lemmy.world

                I forgot that Sidewalk is a thing. While that tech does kind of do what OP was saying, Sidewalk is limited to Amazon Sidewalk-compatible devices, like the Echo line and Ring. At a quick glance, there are no smart TVs that can connect to that network.

                That said, it is an opt-out service, which is awful. No smart TVs will connect, but I’d recommend disabling it for anyone who uses Amazon devices.

        • dan@upvote.au

          I totally get where you’re coming from. It’s hard to find devices like that. I think the issue is that regular customers are demanding the smart features, and using them without caring about privacy aspects.

        • vithigar@lemmy.ca

          I know that sounds ridiculous, since I can “simply not use them,” but I want to spend my money on an appliance, not a consumer data collection tool.

          For what it’s worth you’re actually spending the manufacturer’s money (or at least some of their profit margin) on a data collection device that they won’t get to use.

          Smart devices are cheaper because the data collection subsidizes them.

        • ccunix@sh.itjust.works

          They are called “Digital Signage Panels” and they cost an arm and a leg.

          The data collection subsidises the cost of your TV, so that brings the cost down. Also, digital signage panels are rated for 24/7 use, which significantly increases their cost.

        • FreedomAdvocate@lemmy.net.au

          Some of these devices have even been known to look for other similar devices within WiFi range, and phone home that way (i.e., send analytics data via a neighbor’s connected TV as a proxy).

        • olympicyes@lemmy.world

          Your tv price is subsidized by the presence of those network connections. I recommend using a universal remote.

      • 4am@lemmy.zip

        Sometimes that doesn’t even matter anymore; they’ll refuse to work now without a network set up.

        • dan@upvote.au

          If it wants a network then stick it on an isolated VLAN with no internet access.

          • grue@lemmy.world

            That’s not what that means and you know it. It refuses to work unless it can successfully phone home over the Internet.

            • dan@upvote.au

              So people in rural areas without good internet, or places where the network is airgapped, can’t use them at all? Seems like there’d be a way around it.

    • olympicyes@lemmy.world

      I blacklist the TV’s Ethernet and WiFi MAC addresses. I strongly encourage using a computer, Apple TV, or anything else that keeps the TV from fingerprinting everything you use it for.

    • iopq@lemmy.world

      No, I want only one DP port and a separate box that selects sources. That way I have the ports I want.

  • MeekerThanBeaker@lemmy.world

    I don’t want 8K. I want my current 4K streaming to have less pixelation. I want my sound to be less compressed. Make them closer to Ultra HD Blu-ray disc quality before forcing 8K down our throats… unless doing that gives us better 4K overall.

    • ramble81@lemmy.zip

      Yeah, 4K means jack if it’s compressed to hell. If you end up with pixels effectively being repeated 4x to save on storage and bandwidth, you’ve just recreated 1080p without the upscaling.

      Just like internet: I’d rather have guaranteed low latency than 5 Gbps.
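
      (To put rough numbers on the compression point above: the 15 Mbps and 80 Mbps figures below are ballpark assumptions for a 4K streaming tier versus a UHD Blu-ray, not exact specs.)

```python
# Rough bits-per-pixel comparison for 4K video at two delivery bitrates.
width, height, fps = 3840, 2160, 24
pixels_per_second = width * height * fps

for label, mbps in [("typical 4K stream (~15 Mbps, assumed)", 15),
                    ("UHD Blu-ray (~80 Mbps, assumed)", 80)]:
    bpp = mbps * 1_000_000 / pixels_per_second
    print(f"{label}: {bpp:.3f} bits per pixel")

# The stream spends only a small fraction of the bits per pixel that the disc
# does, which is where the blocking and banding in "4K" streams come from.
```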

    • Geometrinen_Gepardi@sopuli.xyz

      Yep, just imagine how bad the compression artefacts will be if they double the resolution but keep storage/network costs the same.

    • Komodo Rodeo@lemmy.world

      Bingo. If I were still collecting DVDs/HD DVDs like I was in the ’90s, it might be an issue. Streaming services and other online media routed through the TV can hardly buffer to keep up with playback at 720p, so what the fuck would I want with a TV that can show a higher quality picture, which it also can’t display without stutter-buffering its way through a 1:30:00 movie?

      • FreedomAdvocate@lemmy.net.au

        Streaming services and other online media routed through the TV can hardly buffer to keep up with play speed at 720

        This is a problem with your internet/network, not the TV.

  • cmnybo@discuss.tchncs.de

    I would much rather have 1080p content at a high enough bitrate that compression artifacts are not noticeable.

  • happydoors@lemmy.world

    I am a filmmaker and have shot in 6k+ resolution since 2018. The extra pixels are great on the filmmaking side. Pixel binning when stepping down resolutions gives better noise performance, color reproduction, and sharper detail, and the headroom is great for re-framing/cropping. 99% of my clients still want their stuff in 1080p! I barely even feel the urge to jump up to 4k unless the quality of the project somehow justifies it. Images have gotten to a good place. More detail won’t add much for human enjoyment. I hope they continue to focus on dynamic range, HDR, color accuracy, motion clarity, efficiency, etc. I won’t say no when we step up to 8k as an industry, but computing as a whole is not close yet.
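
    (A toy illustration of the pixel-binning point: averaging 2x2 blocks of a noisy frame halves the noise standard deviation. The frame size and noise level below are made up for the demo, not real sensor specs.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-channel "capture": a flat grey frame plus Gaussian sensor noise.
h, w = 1080, 1920                      # demo dimensions only
noisy = 0.5 + rng.normal(scale=0.05, size=(h, w))

# 2x2 pixel binning: average each 2x2 block, halving resolution on each axis.
binned = noisy.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

print(f"noise at capture resolution: {noisy.std():.3f}")   # ~0.050
print(f"noise after 2x2 binning:     {binned.std():.3f}")  # ~0.025, half as much
```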

    • Natanael@infosec.pub

      The same argument goes for audio too.

      6K and 8K are great for editing, just like 96 kHz / 32-bit and above is great for editing. But it’s meaningless for watching and listening (especially for audio: you can’t hear the difference above 44.1 kHz / 16-bit). When editing you’ll often stack up small artifacts, which can be audible or visible if you edit at the final resolution, but are easy to smooth over if you’re editing at higher resolutions.
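
      (A back-of-the-envelope check on the audio side, using the standard ideal-quantization formula; real converters and dithering shift these numbers slightly.)

```python
def dynamic_range_db(bits: int) -> float:
    # Ideal SNR of an N-bit quantizer for a full-scale sine: ~6.02*N + 1.76 dB
    return 6.02 * bits + 1.76

for bits in (16, 24, 32):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.0f} dB dynamic range")

# Nyquist: 44.1 kHz sampling already captures everything up to ~22 kHz, past
# the ~20 kHz limit of human hearing. Higher rates and bit depths mainly buy
# processing headroom while editing, not audible detail on playback.
print("44.1 kHz captures audio up to", 44100 / 2, "Hz")
```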

    • obsoleteacct@lemmy.zip

      Imagine you’re finishing in 8k, so you want to shoot at a higher resolution to give yourself some options for reframing and cropping. I don’t think Red, Arri, or Panavision even makes a cinema camera with a resolution over 8k. I think Arri is still 4k max. You’d pretty much be limited to Blackmagic cameras for 12k production today.

      Plus the storage requirements for keeping raw footage with redundancy. Easy enough for a studio, but we’re YEARS from 8k being a practical resolution for most filmmakers.

      My guess is most of the early consumer 8k content will be really shoddy AI upscaled content that can be rushed to market from film scans.

      • mojofrododojo@lemmy.world

        Film scanning at 4k already reveals the grain structure of film; at 8k it’s going to become hard to ignore. And you’re spot on - they’ll do crappy 8k upres garbage for ages before the storage and streaming become practical.

  • Peffse@lemmy.world

    I don’t know if it has changed, but when I started looking around to replace my set about 2 years ago, it was a nightmare of marketing “gotchas”.

    Some TVs were advertising 240fps, but only had 60fps panels with special tricks to double framerate twice or something silly. Other TVs offered 120fps, but only on one HDMI port. More TVs wouldn’t work without internet. Even more had shoddy UIs that were confusing to navigate and did stuff like default to their own proprietary software showing Fox News on every boot (Samsung). I gave up when I found out that most of them had abysmal latency since they all had crappy software running that messed with color values for no reason. So I just went and bought the cheapest TV at a bargain overstock store. Days of shopping time wasted, and a customer lost.

    If I were shown something that advertised with 8K at that point, I’d have laughed and said it was obviously a marketing lie like everything else I encountered.

      • Vik@lemmy.world

        in that situation, Asus are the shitty part, though it is nice to see more TV-sized monitors. Fuck HDMI.

          • Vik@lemmy.world

            I’ll consider you lucky. I’ve had many experiences with their hardware across different segments (phones, tablets, laptops, mainboards, NICs, displays, GPUs).

            They’re an atrocious vendor with extremely poor customer support (and shitty SW practices for UMA systems and motherboards).

            I don’t think many people have been as unfortunate as I have with them; the general consensus is they mark their products up considerably relative to the competition (particularly mainboards & GPUs).

            To be fair, their contemporaries aren’t much better.

            • Poem_for_your_sprog@lemmy.world

              Dang.

              I switched to ASRock for my AMD build for specific feature sets and reading ASUS AM5 stuff it looks like that was a good idea.

              • Vik@lemmy.world

                But ASRock 800 series AM5 boards are killing Granite Ridge 3D CPUs en masse. Funnily enough, it happened to me.

                I begrudgingly switched to Asus after my CPU was RMA’d as that was the only other vendor to offer ECC compat on a consumer platform.

          • Gerudo@lemmy.zip

            ASUS used to be the goat brand. They have since enshittified, and the biggest hit was their customer service. It’s 100% ass now. The product itself is really hit or miss now too.

  • BlackVenom@lemmy.world

    For what content? Video gaming (GPUs) has barely gotten to 4k. Movies? 4k streaming is a joke; you’re better off with a 1080p Blu-ray. If you care about quality, go physical… UHD Blu-rays are hard to find and you have to wait and hunt to get them at reasonable prices… And these days there are only a couple of UHD Blu-ray player manufacturers left.

  • n1ckn4m3@lemmy.world

    As someone who stupidly spent the last 20 or so years chasing the bleeding edge of TVs and A/V equipment, GOOD.

    High end A/V is an absolute shitshow. No matter how much you spend on a TV, receiver, or projector, it will always have some stupid gotcha, terrible software, ad-laden interface, HDMI handshaking issue, HDR color problem, HFR sync problem or CEC fight. Every new standard (HDR10 vs HDR10+, Dolby Vision vs Dolby Vision 2) inherently comes with its own set of problems and issues and its own set of “time to get a new HDMI cable that looks exactly like the old one but works differently, if it works as advertised at all”.

    I miss the 90s when the answer was “buy big chonky square CRT, plug in with component cables, be happy”.

    Now you can buy a $15,000 4k VRR/HFR HDR TV, an $8,000 4k VRR/HFR/HDR receiver, and still somehow have them fight with each other all the fucking time and never work.

    8K was a solution in search of a problem. Even when I was 20 and still had good eyesight, sitting 6 inches from a 90 inch TV I’m certain the difference between 4k and 8k would be barely noticeable.

  • Showroom7561@lemmy.ca

    The difference between 1080p and 4K is pretty visible, but the difference between 4K and 8K, especially from across a room, is so negligible that it might as well be placebo.

    Also the fact that 8K content takes up a fuckload more storage space. So, there’s that, too.
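
    (The storage point is mostly raw pixel count: each resolution step quadruples the pixels before any encoder gets involved. The raw 10-bit 4:2:0 frame sizes below are just to show the scaling; delivered files are compressed far below this.)

```python
# Pixels per frame at common resolutions, and raw 10-bit 4:2:0 frame sizes.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    # 4:2:0 chroma subsampling stores ~1.5 samples per pixel, 10 bits each.
    raw_mb_per_frame = pixels * 1.5 * 10 / 8 / 1e6
    print(f"{name}: {pixels / 1e6:5.1f} MP, ~{raw_mb_per_frame:5.1f} MB per raw frame")

# Each step up quadruples the pixel count, so 8K starts from 4x the data of 4K.
```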

    • sugar_in_your_tea@sh.itjust.works

      Even 1080p isn’t hugely different from 4k in many cases. Yeah, you can probably notice it, but both are fantastic resolutions. I’ve had a 4k TV for years, and I can count the number of times I’ve actually watched 4k content on it on two hands because it generally isn’t worth the storage space or extra cost.

      • Showroom7561@lemmy.ca

        I find that it really depends on the content and the size of the display.

        The larger the display, the more you’d benefit from having a higher resolution.

        For instance, a good quality 1080p stream vs a highly compressed 4k stream probably won’t look much different. But a “raw” 4k stream looks incredible… think of the demos you see in stores showing off 4k TVs… that quality is noticeable.

        Put the same content on a 50"+ screen, and you’ll see the difference.

        When I had Netflix, watching in 4k was great, but to me, having HDR is “better”.

        On a computer monitor, there’s a case for high-resolution displays because they allow you to fit more on the screen without making the content look blurry. But on a TV, 4k + HDR is pretty much peak viewing for most people.

        That’s not to say that if you create content, 8k is useless. It can be really handy when cropping or re-framing if needed, assuming the desired output is less than 8k.

        • sugar_in_your_tea@sh.itjust.works

          think of the demos you see in stores showing off 4k TVs… that quality is noticeable.

          Sure. But remember that much of the time, the content is tuned for what the display is good at, which won’t necessarily reflect what you want to watch on it (i.e. they’re often bright colors with frequent color changes, whereas many movies are dark with many slow parts). At least at the start, many 4k TVs had a worse picture than higher end 1080p TVs, and that’s before HDR was really a thing.

          So yeah, it highly depends on the content. As you mentioned, in many cases, 1080p HDR will be better than 4k non-HDR. Obviously 4k HDR on a good display is better than 1080p HDR on a good display, but the difference is much less than many people claim it to be, especially at a typical TV viewing distance (in our case, 10-15 ft/3-5m).

          computer monitor

          I find the sweet spot to be 1440p. 4k is nicer, but the improvement over 1440p is much less than 1440p vs 1080p. My desktop monitor is a 27" 1440p monitor w/ approx 109 ppi, and my work laptop is a Macbook Pro w/ 3024x1964 resolution w/ approx 254 ppi, more than double. And honestly, they’re comparable. Text and whatnot is certainly sharper on the nicer display, but there are certainly diminishing returns.
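
          (The ppi figures above check out; quick arithmetic, assuming a 2560x1440 panel for the 27" monitor and a 14.2" diagonal for the MacBook Pro, both assumptions on my part.)

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch = diagonal pixel count / diagonal length in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27.0)))   # ~109 ppi for a 27" 1440p monitor
print(round(ppi(3024, 1964, 14.2)))   # ~254 ppi, assuming a 14.2" MacBook Pro panel
```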

          That said, if I were to watch movies frequently on my computer, I’d prefer a larger 4k monitor so 1080p content upscales better. But for games and normal computer stuff, 1440p is plenty.

          Given that I don’t find a ton of value in 4k over 1080p, 8k will be even more underwhelming.

        • SaveTheTuaHawk@lemmy.ca

          think of the demos you see in stores showing off 4k TVs… that quality is noticeable.

          Because stores use a high quality feed and force you to stand within 4 ft of the display. There is a whole science to how Best Buy manipulates TV sales. They will not let you adjust TV picture settings on lower-margin TVs.

          • Showroom7561@lemmy.ca

            Because stores use a high quality feed

            Yes, obviously, and consumers who are buying such high-end displays should do their best to provide the highest quality source to play back on those displays.

            Distance from the display is important, too. On a small TV, you’ll be close to it, but resolution won’t matter as much.

            But from across the room, you want a higher resolution display up to a certain point, or else you’ll see large pixels, and that looks terrible.

            Personally, going with a 4k TV was a big leap, but the addition of HDR and an OLED display (for black blacks) had the most impact.

        • sugar_in_your_tea@sh.itjust.works

          True. Our TV is 10-15 ft/3-5m away on a ~60in screen, and at that distance, the difference is noticeable, but not significant. We have a 40" screen with a much closer viewing distance (usually 5-8 ft/~2m), and we definitely notice the difference there.

          If I was watching movies at a desk w/ a computer monitor, I’d certainly notice 1080p vs 4k, provided the screen is large enough. In our living room with the couch much further from the screen, the difference is much less important.

  • afk_strats@lemmy.world

    I haven’t seen this mentioned, but apart from 8K being expensive, requiring new production pipelines, being unwieldy for storage and bandwidth, being unneeded, and not fixing existing problems with 4K, it requires MASSIVE screens to reap any benefit.

    There are several similar posts, but suffice it to say, 8K content is only perceptible to average eyesight at living-room distances when screens are OVER 100 inches diagonally at the bare minimum. That’s about 7 feet wide.

    [Image: RTINGS chart of recommended screen size vs. viewing distance by resolution]

    Source: https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship
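
    (A rough sketch of the math behind that chart, using the common 1-arcminute / 20-20 acuity rule of thumb; real perception also depends on contrast and content, so treat the numbers as approximate.)

```python
import math

def max_useful_distance_m(diagonal_in: float, horizontal_px: int) -> float:
    """Farthest viewing distance (metres) at which ~20/20 vision (1 arcminute)
    can still resolve individual pixels; beyond this, extra resolution is wasted."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)   # 16:9 panel width
    pixel_pitch_m = width_m / horizontal_px
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_m / math.tan(one_arcmin)

for size in (55, 65, 75, 100):
    d4k = max_useful_distance_m(size, 3840)
    print(f'{size}" 4K pixels blur together beyond ~{d4k:.1f} m; '
          f'8K only helps if you sit closer than that')
```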

    • Buddahriffic@lemmy.world

      Not sure where 1440p would land, but after using one for a while, I realized I’m not disappointed with my current resolution at all. Instead of upgrading my monitor to 4k, I opted for a 1440p ultrawide and haven’t regretted it.

      My TV is 4k, but I have no intention of even seriously looking at anything 8k.

      Screen specs seem like a mostly solved problem. Would be great if focus could shift to efficiency improvements instead of adding more unnecessary power. Actually, boot time could be way better, too (ie get rid of the smart shit running on a weak processor, emphasis on the first part).

    • snugglesthefalse@sh.itjust.works

      4k 25" was worth it for me but I only spent about £140 on it so YMMV it’s nice but not essential and after 1080p the extra pixels only add so much

    • FreedomAdvocate@lemmy.net.au

      8K content is only perceived by average eyesight at living room distances when screens are OVER 100 inches in diameter at the bare minimum.

      65-75" tv’s are pretty much the standard these days. I’ve got a 75" and I’ll want the next one I replace it with to be even bigger, so 100"-ish will be what I’ll be after.

          • Lumisal@lemmy.world

            I like a big screen for gaming too, but just wanted to mention it also means you’ll do worse at games. You can look it up, but a smaller screen gives you better performance, because your brain can properly see everything that’s happening on screen at once.

            Unless your screen is significantly far away that is.

            • FreedomAdvocate@lemmy.net.au

              That’s only if you’re sitting somewhere where you can’t see the whole screen at once. I can see everything that’s happening on my big tv. I’ve found I do worse on a smaller tv/monitor; I moved my gaming pc from my 144Hz 34” monitor to the 75” 120Hz tv and my results are much better.

      • Lumisal@lemmy.world

        Yeah but do you also watch tv from about 1.8 meters away? Look at that chart again.

          • Lumisal@lemmy.world

            Well, if you’re watching it that close, it sounds like what you want is total immersion. Have you considered an Apple headset, or something similar?

  • HugeNerd@lemmy.ca

    So many things have reached not only diminishing returns, but no returns whatsoever. I don’t have a single problem that more technology will solve.

    I just don’t care about any of this technical shit anymore. I only have two eyes, and there’s only 24 hours in a day. I already have enough entertainment in perfectly acceptable quality, with my nearly 15 year old setup.

    I’ve tapped out from the tech scene.

    • shortwavesurfer@lemmy.zip

      I’ve hit that same wall. I’m perfectly happy with a $300 smartphone, because it does absolutely everything I need to do, fast enough to not make me want to throw it across the room, and well enough that I don’t notice the difference between it and a high-end device.

      Do I notice the difference after three or four years of having the device and finally upgrading it to a new device in that price range? Sure, I notice it. But day to day use, I don’t notice it and that’s what matters.

      • HugeNerd@lemmy.ca

        I don’t understand most of the things I used to enjoy as a kid. I went from radio to cassette to CD to MiniDisc to MP3s. Now I’m supposed to endlessly change things around to keep up with media players and codecs and whatevers. No thanks.

        I used to enjoy programming and tinkering with computers and microcontrollers.

        Now I have to be an expert in 15 unrelated fields and software stacks, because even a simple job of turning a button press into a single output pulse is a weeks-long nightmare of IDEs and OSes and embedded Linuxes and 32-bit microcontrollers and environments, none of which are clear and straightforward, and all of which have subtle inter-dependencies.

        So to turn on a LED with a switch now requires a multi-core 16GB main PC (so limited! You need more!) so I can open a multi-GB IDE (that can support every language ever invented) that requires an SSD just to be able to navigate the 35 windows it opens in less than an hour, so I can use AI to copy-paste hundreds of lines of boiler plate code I don’t understand, so I can type a few lines of code?

        And that’s not counting all the new companies and architectures.

  • Rooty@lemmy.world

    I watch torrented shows with VLC on my laptop. Why would I want a giant smartphone that spies on me?

  • Solitaire20X6@sh.itjust.works

    Most Americans are out of money and can’t find good jobs. We are clinging to our old TVs and cars and computers and etc. for dear life, as we hope for better days.

    And what can you even watch in true 8K right now? Some YouTube videos?

    • lengau@midwest.social

      If I were in the market for a new monitor and I could get an 8k monitor for under $1000 I’d consider it, but right now if one of my monitors broke I’d just be getting another 4k to replace it. The price isn’t worth it for me to have high DPI.

      For TV my only justification for my 4k TV is that it was free.