• gressen@lemm.ee · 5 months ago

    It’s called a classifier, and it could easily detect an embedded ad. The issue is that everyone would now need to run it on their own hardware to detect ads, and that will cost some electricity.
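As a rough sketch of what such a classifier boils down to, here is a minimal toy version. The feature vector, weights, and threshold are all illustrative assumptions, not taken from any real ad-blocking tool; a real system would feed video frames through a CNN rather than a hand-rolled logistic function:

```python
import math

def ad_score(features, weights, bias):
    # Logistic score: estimated probability that a frame's
    # feature vector corresponds to an ad segment.
    z = sum(f * w for f, w in zip(features, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def is_ad(features, weights, bias, threshold=0.5):
    # The binary "ad = yes/no" signal discussed in the thread.
    return ad_score(features, weights, bias) >= threshold
```

The point is that the output is a single bit per segment, so the cost is dominated by the feature extraction, not by the decision itself.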

      • gressen@lemm.ee · 5 months ago

        Well, I’m not happy about potentially adding a new type of load to electrical grids around the world.

        • archchan@lemmy.ml · 5 months ago

          Ads have definitely added more load on electrical grids in aggregate than locally hosted and lightweight models, especially given that ads are fucking everywhere all the time. Websites, apps, the servers, even 24/7 electric billboards. I’m not worried about a few nerds using slightly more electricity sometimes for their own benefit and joy (it’s still less power than gaming), as opposed to a corp that burns through power and breaks their climate pledges (Microsoft) for the benefit of their bottom line and nothing else. Corps don’t get to have a monopoly on AI that was built with our data, only to have it fed back to us to pull more data and siphon more money.

          So basically fuck Google and fuck ads.

        • interdimensionalmeme@lemmy.ml · 5 months ago

          Do you understand we’re still talking about less energy than the monitor it displays on? I’d bet even an untuned VGG16 could do it without any fine-tuning. Advertising looks starkly different from regular content, and the output is just an “ad = yes/no” signal. It’s a very small amount of computation, probably less than the hardware video decoder already does. It’s also not a new type of load: it runs off the same power supply as any computer, a slight capacitive load, and it won’t even change the grid’s power factor.