While the motives may be noble (regulating surveillance), models like Stable Diffusion might get caught in the regulatory crossfire to the point where using the original models becomes illegal and new models get castrated until they are useless. Furthermore, this might make it impossible for individuals or smaller startups to train open-source models (maybe even LoRAs). Adobe and the other large corporations would rule the market.

  • voluntaryexilecat@lemmy.dbzer0.com (OP) · 1 year ago

    They will just claim it “can be used to create illegal pornography” and ban it entirely. And “fix faces” will be a high-risk AI because it can detect faces. 🤦

    The only idea I’ve got to combat this is to make it available to as many people as possible so they can speak in its favor. Other ideas welcome.