• Fermion · +86/-3 · 8 months ago

    The weird phrasing is probably the most unpopular part of your opinion.

  • jeffw@lemmy.world · +27/-4 · edited · 8 months ago

    True AI doesn’t exist yet, which is worth keeping in mind. LLMs are basically just going for test rides right now.

    In terms of corporations, most people I know are being told by their bosses not to use it because companies are scared that their data will be stolen.

    • Thorny_Insight@lemm.ee · +7/-1 · 8 months ago

      AI has existed for decades. Email spam filters and autocorrect, for example, are AI systems. What you’re probably thinking of is AGI, which indeed doesn’t exist, but it’s not a synonym for AI.

    • JackGreenEarth@lemm.ee · +11/-6 · 8 months ago

      True AI does exist; many things are AI. What you may be thinking of is AGI (artificial general intelligence), which doesn’t yet exist.

        • Lmaydev@programming.dev · +2 · 8 months ago

          It uses neural networks, which are some of the best-established forms of AI in existence and have been for decades.

          What definition of AI are you using? Because it’s wrong hehe

        • piecat@lemmy.world · +2/-1 · edited · 8 months ago

          Well, machine learning is considered a subfield of artificial intelligence.

          You definitely mean “Artificial General Intelligence”

          • jeffw@lemmy.world · +6/-10 · 8 months ago

            Really? Because AI has been around in sci-fi for a long time. Generative AI is what tried to redefine the term.

            • JackGreenEarth@lemm.ee · +7/-1 · 8 months ago

              The AI you’re thinking of in sci-fi might be AGI. AI has long meant a field of computer science where you get computers to do tasks that require human intelligence, such as recognising images, playing chess, generating text or images, etc.

  • ninjaturtle@lemmy.today · +17/-1 · 8 months ago

    By A.I., are you mostly talking about the chatbot kind, like ChatGPT? Because A.I. is a pretty vast field with many benefits.

  • The shit they’re trying to sell to everyday users is bullshit, for sure. It’s a pointless new gimmick that doesn’t do anything better than the tools that already existed for the same purpose. In many ways, it’s actually worse.

    • A_Very_Big_Fan@lemmy.world · +3 · 8 months ago

      Idk, text-to-image AI is a lot faster than sifting through Google Images nowadays if I just need to render an idea rather than get a specific image.

      And GPT is good at finding quotes, explaining academic concepts, and explaining how to fix tech issues without having to sift through the SEO bullshit we all have to deal with these days. It hasn’t failed me so far.

  • M500@lemmy.ml · +10/-4 · 8 months ago

    I have a very small business and it has helped me tremendously in many different ways.

    Anytime someone says it’s not good or that it’s stupid, they just don’t have a good use case for it yet.

    • KidnappedByKitties@lemm.ee · +3/-2 · 8 months ago

      Other than blog and copy generation, what are you using it for?

      I find most AI I have access to either bloats text too much or is just too wrong to actually save me time. I work in a technical field, however, so I might have higher requirements than others.

      • M500@lemmy.ml · +5/-1 · 8 months ago

        There’s a lot to do with it, but off the top of my head: I use it to quickly display definitions and parts of speech for lists of words.

        I copy text from PDFs, which isn’t always clean since I use ocrmypdf, so the characters are sometimes wrong and there are a bunch of extra line breaks. I just paste that mess into ChatGPT and ask it to reformat the text (see the sketch at the end of this comment).

        I can copy homework assignments and then ask it to make a new assignment on the same topic. It literally saved me like 40 minutes today by creating homework assignments for my students.

        I still proofread for accuracy, but I rarely have any issues with it.

        In addition to work stuff, in my personal life it’s been able to answer tech questions, help me write some Python scripts quickly, and even get me pricing information for random stuff like cat vaccines in my area. The info wasn’t posted on any of the vets’ websites. It gave me a range, and when I finally got the price, it was right in the middle.

        Today, for example, my wife and I were wondering about those thick hotel pillows, and we just opened the app and asked with our voice. We got all the info we needed in like 5 seconds.

        Sorry to rant, but it’s just been an extremely useful tool.

        I also run it locally and plan to further its use by having it create lessons for me from books that I paste into the prompt, but I’m not sure how well this will work yet.
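
        A minimal sketch of that kind of OCR cleanup pass, assuming the openai Python client and an OPENAI_API_KEY in the environment (the model name, prompt wording, and clean_ocr_text helper are illustrative, not something M500 specified):

        ```python
        # Rough sketch: pipe messy OCR output through a chat model and ask it to
        # fix characters and stray line breaks without rewording anything.
        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        def clean_ocr_text(messy_text: str) -> str:
            """Return the OCR text with artifacts and extra line breaks cleaned up."""
            response = client.chat.completions.create(
                model="gpt-3.5-turbo",  # illustrative model choice
                messages=[
                    {"role": "system",
                     "content": "Fix OCR errors and remove extra line breaks. "
                                "Do not change the wording or meaning."},
                    {"role": "user", "content": messy_text},
                ],
            )
            return response.choices[0].message.content

        print(clean_ocr_text("Th e qui ck bro wn\nfox jum ps\n\n\nover the lazy d og."))
        ```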

        • KidnappedByKitties@lemm.ee · +3 · 8 months ago

          What tools are you using for this? The ones I’ve tried have had nowhere near acceptable quality for these tasks.

          • M500@lemmy.ml · +2 · 8 months ago

            Mostly ChatGPT 3.5 and Google Gemini. Sometimes Microsoft Copilot.

            I mostly choose whichever one is most convenient in the moment. I don’t ever measure performance or capability.

            My local home server doesn’t have a GPU yet. I might get one to run some models locally, but I’m not sure whether I care to spend the money on that or not.

            • KidnappedByKitties@lemm.ee · +3 · 8 months ago

              Wow, either you are much more skilled than anyone I know at these tasks, and/or you work in a very different way. I tried for two weeks to figure out how to get something useful out of it, but got only garbage.

              It was very good at generic text, much less so at concise, insightful, technical, or argumentative text, which is most of what I sell.

            • A_Very_Big_Fan@lemmy.world · +1 · 8 months ago

              There are some models you can run on hardware as modest as a Raspberry Pi. I’m willing to bet there’s a pre-trained model out there that suits your needs on whatever hardware you have. Could be worth a try.
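
              For instance, a minimal local-inference sketch using llama-cpp-python with a small quantized GGUF model (the model file name and prompt are assumptions; any small model downloaded beforehand would do):

              ```python
              # Rough sketch: run a small quantized model entirely on local hardware.
              from llama_cpp import Llama

              # Hypothetical path to a small GGUF model downloaded beforehand.
              llm = Llama(
                  model_path="./models/tinyllama-1.1b-chat.Q4_K_M.gguf",
                  n_ctx=2048,
              )

              result = llm(
                  "Explain in one sentence why spam filters count as AI.",
                  max_tokens=64,
              )
              print(result["choices"][0]["text"].strip())
              ```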

  • echo64@lemmy.world · +15/-9 · 8 months ago

    AI could be good, great even.

    99% of it, though, is being built not to be good or to improve anything, but to replace workers so that corporations can profit more.

    This especially includes all the shitty AI “art”.

    • Thorny_Insight@lemm.ee · +4/-1 · 8 months ago

      This especially includes all the shitty AI “art”

      You can’t tell the difference between human and AI art. There’s a ton of art made by AI that passes for human art, and vice versa. Both humans and AI create shitty art as well as great art.

      • A_Very_Big_Fan@lemmy.world · +1 · 8 months ago

        Both humans and AI create shitty art as well as great art.

        I’ve seen plenty of AI art that passes for human art on Lemmy, but I’ll give him one thing: when AI art is bad, you can really tell it’s AI lol

    • nayminlwin@lemmy.ml · +5/-2 · 8 months ago

      AI would have been a great help to workers doing tedious tasks, but that’s not investor-friendly.

  • Zier@fedia.io · +8/-3 · 8 months ago

    Corporations are banking on AI taking jobs so the profit margin skyrockets. A computer cannot make human decisions; it can only make the decisions that humans let it make. AI will never be the answer, but it will be one of our biggest problems.

  • JackGreenEarth@lemm.ee · +5/-4 · 8 months ago

    People have long said that new technology only creates more jobs. To those people, I would like to direct your attention to the cart-horse. Around a hundred years ago, before cars, people got around on horses, or in carts and wagons pulled by horses. Horses were an integral part of the transport system, and most horses were employed as such, even being bred specifically to cope with the growing demand for transport. With the advent of the car, large swathes of the horse population became unnecessary, and the population dwindled to a new equilibrium: fewer horses were needed for transport, so fewer horses were bred. Compared to the busy, hard life horses had to put up with only a few decades ago, most horses nowadays, although there are fewer of them, live a life of comparative luxury. They spend most of the day in fields where they are free to graze, are given good food by owners who care about them, and are only occasionally ridden by humans – and even then, it is far more relaxed and more of an enjoyable activity than horse-riding was when it was the only way to get somewhere and done on a daily basis.

    Humans often have this idea that they are special. That they are the only ones who can weave cloth – until it is automated. That they are the only ones who can make pottery – until it is automated. That human labour is the only way to get power – until power production is automated with the advent of electricity. That they are the only ones who can be ‘creative’, who can write stories, make art, play music – until that is automated too. True, in all those cases humans are still involved in the process to some extent, mostly for quality control and maintenance, but far fewer humans are needed to create the same amount of stuff – whether physical goods or more ‘idea-like’ stuff such as art – than before. In fact, recent progress has even produced video games that were tested and quality-controlled by AI, as well as programmed by AI and built from AI-generated assets, doing away with the need for humans entirely. This is analogous to the real scenario I outlined in the first paragraph, and it is not necessarily a bad thing.

    It is quite likely that, over a timespan impossible to predict (it may be 20 years, it may be much more), humans will develop technology with the capacity to create everything we need, and more – good food, comfortable shelter, entertainment, and so on. Some will argue that this cessation of the need for humans to work will result in economic collapse and mass hardship, but this is a small-minded perspective, often viewed through a capitalistic lens. The horses didn’t suffer a population explosion and a lack of resources when their work disappeared; on the contrary, their numbers dwindled – which is not a bad thing, as long as it happens through natural means, which it did – and every individual got more attention and resources. Their lives improved, since they no longer had to endure hard labour every day just to survive. It is certainly attainable for the same thing to happen to us. Population growth is already falling in developed countries, and only people who are unable to imagine a world without human labour see this as a bad thing. If fewer humans work every year, and more AIs do their jobs, it balances out, and it is a way to ease into a world where there is very little to no human labour and all our needs and most of our wants are produced by AI.

    As much as many people dislike the sentiment, this would not work in a capitalistic world where what someone gets depends on what they contribute to society, for self-evident reasons (namely that no one would need to contribute anything to society if it is all being done by robots). Therefore, in a world where all necessary labour is done by AI, we would have to move to a system where everyone gets resources simply by dint of existing, rather than needing to contribute anything themselves. You can call this socialism if you want; it doesn’t really matter what you call it. This system would have the benefit of reducing the stress caused by feeling obligated to do something, while not removing the ability to contribute if you want to – after all, it is necessary labour that has been abolished, not all labour. Just as horses are still kept for novelty and entertainment today, and many people value hand-made pottery, food, etc. over manufactured counterparts, there is likely to still be a desire for art, objects, and stories made by humans even in a world where all necessary labour has been abolished.

    This also deals with the counterpoint, made by many, that people will struggle for a sense of meaning and purpose in a world with no necessary labour. First of all, people struggle for meaning and purpose even when their work is necessary; second, as mentioned above, they can still do unnecessary but still-valued labour and get the same meaning and purpose from that. Some people, myself included, think that although the above scenario may work in theory, in practice it would be difficult to get the billionaires, and the billionaires’ puppets in government, to agree to such a sensible system when the huge benefit to everyone may come at a small cost to themselves – even if the cost is just ego, even if they could keep all their material resources. I admit I don’t see a good solution to this problem myself, but, in conclusion, I hope we can think of one together, as this is a world many, including myself, would like to live in.

    • PeepinGoodArgs@reddthat.com · +4/-1 · 8 months ago

      1. You overestimate the likelihood that the abundance of wealth and income created by AI will be shared. That is simply not a requirement. I see no reason why automating everything must result in owners of capital distributing their gains more evenly so that we can live more easily. Elon Musk and his multi-billion-dollar companies currently hold hundreds of billions of dollars while, in the same country, people literally starve in the streets and communities drink leaded water. So, you’re right, it’s not going to work in a capitalistic world. We’ll just transition back to feudalism.
      2. I hate that extended horse analogy. I get what you’re saying, but…umm, no. I don’t want to be cared for and groomed like a horse in a stable. Also, horses don’t make meaning the way humans do. Their cart-pulling isn’t “working” in the same way that a low-wage Amazon employee who hates their job works.
  • Admiral Patrick@lemmy.world (mod) · +3/-7 · edited · 8 months ago

    Downvoting because I agree.

    So sick of it being shoved into everything, all the hype about it, and all the drivel it spits out.

    The only reason that’s the case is that OpenAI had something people ooh’d, aah’d, clapped, and threw money at, and all the other tech companies said “me too!” and threw their half-baked abominations into everything in order to suck up that sweet, sweet training data.