The Basque Country is rolling out Quantus Skin in its health clinics after an investment of 1.6 million euros. Specialists criticise the artificial intelligence developed by the Asisa subsidiary for its “poor” and “dangerous” results. The algorithm has been trained only on data from white patients.

  • Leon@pawb.social · 19 days ago

    It is a direct result of structural racism, as it’s a product of treating white men as the default. You see it all the time in medicine. There are conditions that disproportionately affect black people that we don’t know enough about, because time and money haven’t been spent studying them.

    Women face the same problem. Lots of conditions present differently in women, which is part of why women have historically been underrepresented in e.g. autism diagnoses. Autism presents differently in women, so for a while the assumption was that women just can’t be autistic.

    I don’t necessarily think that people who perpetuate this problem are doing so out of malice; they probably don’t think of women/black people as lesser (hell, many probably are women and/or black), but that doesn’t change the fact that structural problems require awareness and conscious effort to correct.

    • Phoenixz@lemmy.ca · 2 days ago

      Again, no.

      There are perfectly normal reasons that can explain this. Don’t assume evil when stupidity (or, in this case, physics) explains it. Darker patches on darker skin are harder to detect, just as facial features on dark skin are harder to make out in the dark, because there is literally less light to work with.

      Scream racism all you want but you’re cheapening the meaning of the word and you’re not doing anyone a favor.

      • Leon@pawb.social · 1 day ago

        Don’t assume evil when stupidity

        I didn’t, though? I think that perhaps you missed the “I don’t necessarily think that people who perpetuate this problem are doing so out of malice” part.

        Scream racism all you want but you’re cheapening the meaning of the word and you’re not doing anyone a favor.

        I didn’t invent this term.

        Darker patches on darker skin are harder to detect, just as facial features on dark skin are harder to make out in the dark, because there is literally less light to work with.

        Computers don’t see things the way we do. That’s why steganography can be imperceptible to the human eye, and why adversarial examples can fool a model even when the changes are invisible to humans.
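        To make the “imperceptible to the human eye” point concrete, here’s a minimal sketch of least-significant-bit steganography (a toy example added for illustration, nothing to do with the article or the Quantus Skin system): flipping the lowest bit of every pixel changes the image by at most 1 grey level out of 255, which no human will notice, yet a program reads the hidden payload back perfectly.

        ```python
        import numpy as np

        rng = np.random.default_rng(0)

        # Stand-in 64x64 grayscale "photo" (hypothetical data, just random pixels here).
        image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

        # Hidden payload: one bit per pixel.
        secret_bits = rng.integers(0, 2, size=image.shape, dtype=np.uint8)

        # Overwrite each pixel's least significant bit with a payload bit.
        stego = (image & 0xFE) | secret_bits

        # Largest per-pixel change is 1 out of 255, invisible to a human viewer.
        print("max pixel difference:", np.abs(stego.astype(int) - image.astype(int)).max())

        # But a machine reading exact pixel values recovers the payload perfectly.
        recovered = stego & 1
        print("payload recovered exactly:", bool(np.array_equal(recovered, secret_bits)))
        ```

        The same gap between human and machine perception is what adversarial examples exploit, just in the other direction.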

        If a model is struggling at doing its job it’s because the data is bad, be it the input data, or the training data. Historically one significant contributor has been that the datasets aren’t particularly diverse, and white men end up as the default. It’s why all the “AI” companies popped in “ethnically ambiguous” and other words into their prompts to coax their image generators into generating people that weren’t white, and subsequently why these image generators gave us ethnically ambigaus memes and German nazi soldiers that were black.