Don’t get me wrong, there are problems with it, both in the process modern AI uses and in the sources it draws from. But as of right now, AI is just a tool, like Auto-Tune or Photoshop.
Even though it will change the media formats it’s attached to, it won’t supplant them within the next 5 to 10 years; it will simply transform them.
As someone who daily drives ChatGPT for a lot of stuff, I agree. I heard someone put it this way: “AI is perfect for stuff that is hard to find but easy to verify.”
The other day I took a photo of my liquor cabinet and told it to make a cocktail recipe with ingredients on hand. Or if I encounter an error on my PC I’ll just describe the problem. Or for movie recommendations when I have a very specific set of conditions. Or trying to remember a show from my childhood with only a vague set of memories. The list goes on.
Particularly for anything coding-related. If I’m trying to learn something, I always learn best when I can just see an example of the thing in action, and documentation is not always great. Or if I’m doing data manipulation and I have the input and the output and just need the function to convert one to the other (a rough sketch of what I mean is below). I recently saved a whole afternoon of effort with that one. Or spec tests: I’ll just drop my whole code file in and ask it for full coverage.
These are all things that traditional search engines are bad at or outright incapable of. I’d have a hard time going back if they just turned all this off tomorrow.
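To make the data-manipulation case concrete: you paste one sample input record and the output you want, and the model writes the glue function for you. A rough sketch of what that usually looks like (hypothetical field names, assuming Python):

```python
# Hypothetical example: flatten nested records into flat rows for a CSV export.
# The field names are made up; the point is that you show the model one input
# record and the row you want back, and it writes this conversion for you.

def flatten_record(record: dict) -> dict:
    """Turn {"user": {"id": 1, "name": "Ana"}, "stats": {"visits": 3}}
    into {"user_id": 1, "user_name": "Ana", "visits": 3}."""
    return {
        "user_id": record["user"]["id"],
        "user_name": record["user"]["name"],
        "visits": record["stats"]["visits"],
    }

records = [
    {"user": {"id": 1, "name": "Ana"}, "stats": {"visits": 3}},
    {"user": {"id": 2, "name": "Ben"}, "stats": {"visits": 7}},
]
print([flatten_record(r) for r in records])
```

Nothing fancy, but when there are forty fields and three levels of nesting, getting that written for you is exactly the kind of easy-to-verify, tedious-to-write work it’s good at.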
I think there’s a lack of education around how to use AI, which is actually a problem. Like, you shouldn’t be using it to identify whether a mushroom is safe to eat. Really, you shouldn’t be using it for anything food- or health-related, for that matter. And you should ask it for its sources when you’re unsure of its answers.
Chatbots are just the next stage of evolution in search engines.
Yep. The few times I’ve used ChatGPT were when I wanted to look something up but didn’t know what keywords to use, and it was easier to just describe it. Or when I just didn’t feel like digging through links and wanted an answer to a not-so-important question.
Looks like he’s okay with it after all.
Didn’t he literally make friends with an AI in that movie?
I just wish companies would stop cramming it into every product and making it the selling point for everything because they can’t think of anything else to do. It’s kind of fun to play with sometimes, but it just isn’t useful for most of the tasks companies seem to want us to use it for. I’m hoping it will be mostly forgotten outside of niche uses in a few years, the same way we forgot about 3D TVs or NFTs as soon as the next tech buzzword got invented.
It will transform mediocre content into crap, and then it will regurgitate that crap into worse and worse crap. All of that at the expense of untold terawatt-hours of energy. AI will not take over the world, but it sure will help destroy it anyway.
It’s the energy usage, and its effectiveness versus human laziness, that worry me.
I agree with you almost 100% (except the copyright stuff), but,
The biggest source of resistance is people fearing for their jobs. That said, a lot of them have never actually tried AI, so they don’t know its limitations, which is why I doubt serious businesses will replace any serious creative work for years to come
…the business owners are just as ignorant. They are trying to replace people with AI, which will disrupt our lives while the CEOs refuse to admit their error and force us all to deal with it anyway. It’s a lot like outsourcing. It’s not as cheap and effective as businesses hoped, customers largely hate it, and we’re still doing it anyway.
AI will be disruptive, but over the long term it will settle down to a small disruption. But the journey to get there might suck a bit.
But we’ve hit the part of the cycle where the businesses that were stupid enough to believe it have already laid people off, and now we see them quietly starting to rehire. Seems like they still needed real people after all.
I agree with this take.
AI will definitely make some white collar jobs way more productive, and thus change the nature of that work and reduce the number of people employed in those jobs.
A good example is translation, where translators are now mostly reviewing translated texts instead of translating from scratch.
This means that what matters in the translation jobs that remain is the ability to read fast and take on the role of an editor.
The problem is not the tech, as usual; it’s the people, who have been led to believe it’s AGI, who equate forming syntactically correct sentences with intelligence, and who think that that is enough to perform most white-collar tasks.
There are ethical problems with how many of the models have been created, and with some of what they’re being used for.
But I generally agree; it’s pretty much the same thing we see with every technological innovation: something big changes and a load of things get disrupted, a group of people get angry about said innovation, and eventually those people dwindle and the innovation gets absorbed into the general public’s idea of what modern life consists of.
I can’t think of any big innovation over the past few decades that hasn’t followed a similar trajectory.
The dangers of AI should be determined by application, not capability. It’s a tool, like any other. You can use a hammer to build a house or cave in a skull.
To offer a counter-narrative: we don’t really know where AI will plateau. What will a 1-quadrillion-parameter language model look like? And will it make my degree worthless by comparison, lol?
Even when it does, we will still have hallucinations. It will still be impossible to guarantee a correct answer. It’ll be more accurate, but those problems require a heavy rethinking of the models themselves; more accurate models will lessen the issue, but it won’t go away.
What is your degree to you? A means by which to qualify for a job? Or the lens through which you perceive and shape the world around you for the better?
Because AI might, some far-flung day, make the former obsolete, but I don’t think it will ever do that to the latter.
That’s exactly what AI would tell you
The problem is not the tech. The problem, as always, is Capitalism.
True, but I can’t think of anything that capitalism hasn’t fucked up one way or another.
I would be against making AI responsible for what capitalism has done.
Neither am I. I’m just pointing out that Capitalism has ruined AI like it ruined everything from the loom to the printing press to the internet.
It isn’t as great as its proponents say, either.
Even AI porn is getting good.
AI is awesome! Don’t let the luddites convince you otherwise.