• 3 Posts
  • 136 Comments
Joined 1 year ago
Cake day: July 2nd, 2023


  • jas0n@lemmy.world to Science Memes@mander.xyz · Academia to Industry · +42/−4 · 1 day ago

    All aboard the hype train! We need to stop using the term “AI” for advanced autocomplete. There is not even a shred of intelligence in this. I know many of the people here already know this, but how do we get this message to journalists?! The amount of hype being repeated by respectable journalists is sickening.






  • 100% this. The base algorithms used in LLMs have been around for at least 15 years. What we have now is only slightly different from what we had then. The latest advancement was training a model on stupid amounts of data scraped off the Internet, and it took all of that data to get half-decent results. There isn’t much juice left to squeeze here, but so many people are assuming exponential growth and “just wait until the AI trains other AI.”

    It’s really like 10% new tech and 90% hype/marketing. The worst part is that it’s got so many people fooled that you hear these dumb takes from respectable journalists interviewing “tech” journalists, which just perpetuates the hype. Now your boss/manager is buying in =]





  • jas0n@lemmy.world to Atheist Memes@lemmy.world · Based · +3 · edited · 13 days ago

    It reminded me of this quote from Max Planck (emphasis mine):

    As I began my university studies I asked my venerable teacher Philipp von Jolly for advice regarding the conditions and prospects of my chosen field of study. He described physics to me as a highly developed, nearly fully matured science, that through the crowning achievement of the discovery of the principle of conservation of energy it will arguably soon take its final stable form. It may yet keep going in one corner or another, scrutinizing or putting in order a jot here and a tittle there, but the system as a whole is secured, and theoretical physics is noticeably approaching its completion to the same degree as geometry did centuries ago. That was the view fifty years ago of a respected physicist at the time.

    Basically, there isn’t much left to be discovered in physics, so don’t bother. (Good thing he didn’t follow that advice.) Then, Einstein comes along and is like… you know Newton’s “laws” of motion? I broke 'em. He also broke the aforementioned “law” of conservation of energy.

    So, while we actually do understand the physics of the Big Bang back to within the first few milliseconds (not much left to be discovered there), we don’t know what we don’t know.




  • While I agree with the general sentiment of your comment, I refuse to believe in anything without empirical evidence for it. These are gaps in our current understanding of our reality. History has shown there is a logical explanation for just about everything. Nothing… ever… literally… EVER… has pointed toward the existence of such a god… ever.


  • jas0n@lemmy.world to Atheist Memes@lemmy.world · Based · +6/−1 · 13 days ago

    Tiny gaps are subjective. Sure.

    god has been attributed to everything that science had no explanation for at the time: earthquakes, weather events, cosmological events, etc. Now the god theory has been relegated to the very few things we don’t yet understand. While I agree it’s not exactly a small gap, I would argue that, on the scale of all of science, calling it microscopic is being generous.




  • jas0n@lemmy.world to Programmer Humor@programming.dev · C++ · +2 · edited · 14 days ago

    Just watched this. Thank you. I think I’d agree with most of what he says there. I like trying languages, and I did try Rust. I didn’t like fighting the compiler, but once I was done fighting it, I was somehow 98% done with the project. It kind of felt like magic in that way. There are lots of great ideas in there, but I didn’t stick with it; it was a little too much for me in the end. One of my favorite parts of C is how simple it is. You would never be able to show me a line of C I couldn’t understand.

    That said, I’ve fallen in love with a language called Odin. Odin has a unique take on allocators in general. It actually gives you even more control than C while providing language support for basic containers like dynamic arrays and maps.



  • Hahaha. I knew I was wrong about the polymorphism there. You used big words and I’m a grug c programmer =]

    We use those generic containers in C as well; we just roll our own.
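    A hand-rolled “generic” container in C usually comes down to a couple of macros. A minimal sketch of a dynamic array (the `VEC`/`vec_push` names are made up for illustration, and real code would check the `realloc` result):

```c
#include <stdlib.h>
#include <assert.h>

/* Declares an anonymous struct holding a growable buffer of T. */
#define VEC(T) struct { T *data; size_t len, cap; }

/* Append one item, doubling capacity when full.
 * Sketch only: a production version checks realloc for NULL. */
#define vec_push(v, item)                                          \
    do {                                                           \
        if ((v)->len == (v)->cap) {                                \
            (v)->cap = (v)->cap ? (v)->cap * 2 : 8;                \
            (v)->data = realloc((v)->data,                         \
                                (v)->cap * sizeof *(v)->data);     \
        }                                                          \
        (v)->data[(v)->len++] = (item);                            \
    } while (0)

#define vec_free(v) free((v)->data)
```

    Used as `VEC(int) xs = {0};` followed by `vec_push(&xs, 42);` — the zero-initialized struct means the first push does the first allocation.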

    Move semantics and the general idea of ownership I can see more of a use for.

    I would just emphasize that manual memory management really isn’t nearly as scary as it’s made out to be, so it’s frustrating to see the ridiculous lengths people go to in order to avoid it at the expense of everything else.
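    An arena is a nice illustration of why it doesn’t have to be scary: one allocation up front, one free at the end, no per-object bookkeeping in between. A minimal sketch (the names, fixed capacity, and 8-byte alignment choice here are mine, not from any particular library):

```c
#include <stdlib.h>
#include <assert.h>

/* Bump allocator over a single malloc'd block. */
typedef struct {
    char  *base;
    size_t used, cap;
} Arena;

static Arena arena_make(size_t cap) {
    Arena a = { malloc(cap), 0, cap };
    return a;
}

/* Hand out the next n bytes (rounded up to 8 for alignment),
 * or NULL if the arena is exhausted. Individual objects are
 * never freed -- the whole arena goes at once. */
static void *arena_alloc(Arena *a, size_t n) {
    n = (n + 7) & ~(size_t)7;
    if (a->base == NULL || a->used + n > a->cap) return NULL;
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

static void arena_free(Arena *a) {
    free(a->base);
    a->base = NULL;
    a->used = a->cap = 0;
}
```

    Everything allocated for one task lives and dies together, which is much of what Odin’s allocator support gives you at the language level.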