• YeahToast@aussie.zone · +6 / −23 · 20 hours ago

    Well, I mean it’s not all marketing and hype. I was able to get it to write a piece of code to scrape an RSS feed and email me if it met certain parameters. I couldn’t have done that without a real grind, if at all. This is effectively the only time I’ve used it, and I think it’s atrocious the amount of petty shit people use it for… but there can be a functional benefit to generative AI. Makes me shed a tear thinking of the energy demand though.
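
    For context, the task described above is small in code terms. A minimal sketch in Python, assuming the third-party feedparser library is installed and a local SMTP server is available (the feed URL, keyword, and addresses are placeholders):

    ```python
    import smtplib
    from email.message import EmailMessage

    import feedparser  # third-party: pip install feedparser

    FEED_URL = "https://example.com/feed.rss"  # placeholder feed
    KEYWORD = "storm"                          # placeholder "parameter" to match

    def matching_entries(url, keyword):
        """Return feed entries whose title or summary mentions the keyword."""
        feed = feedparser.parse(url)
        return [e for e in feed.entries
                if keyword.lower() in (e.get("title", "") + e.get("summary", "")).lower()]

    def send_alert(entries):
        """Email a summary of the matching entries via a local SMTP server."""
        msg = EmailMessage()
        msg["Subject"] = f"RSS alert: {len(entries)} matching item(s)"
        msg["From"] = "alerts@example.com"  # placeholder addresses
        msg["To"] = "me@example.com"
        msg.set_content("\n\n".join(f"{e.title}\n{e.link}" for e in entries))
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)

    if __name__ == "__main__":
        hits = matching_entries(FEED_URL, KEYWORD)
        if hits:
            send_alert(hits)
    ```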

    • madsen@lemmy.world · +1 · 5 hours ago

      Jesus fucking Christ, man. RSS parsers and emailing are literally next up after “Hello, World” in programming. If that would have required “months and months of learning” as you stated elsewhere, then maybe programming just isn’t for you — AI or no AI. It’s OK not being able to do something! However, it’s some next level 1st world entitlement shit to think that you’re somehow entitled to be able to create programs without any effort on your part and with a complete disregard for the cost to the environment and the planet.

      • YeahToast@aussie.zone · +1 · edited 10 minutes ago

        I have a low carbon footprint and my solar has exported 17,000 kWh more than my usage. Please tell me again how a 0.3 watt-hour query is me completely disregarding the environment.

        Edit: I have AI-generated search engine summaries turned off… I hope you do too.
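
        (For scale, taking both figures at face value: 17,000 kWh is 17,000,000 Wh, and 17,000,000 Wh ÷ 0.3 Wh per query ≈ 57 million queries.)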

    • BakerBagel@midwest.social · +14 / −2 · 17 hours ago

      People have been making web crawlers for the past 30 years. Why do you need to torch an acre of forest to do the same?

      • YeahToast@aussie.zone · +1 / −1 · 9 hours ago

        I’m sure people have been making crawlers for 30 years… But I haven’t been, so I’ve been able to access “knowledge” and shape it for my use case. I think equating ~5 queries to torching an acre of forest is a bit hyperbolic, which doesn’t help anyone.

    • lightsblinken@lemmy.world · +8 · 17 hours ago

      I have been using “rss2email” for years. Extremely simple, works great, deterministic. No need to reinvent the wheel for a simple use case, and that’s half the point here: a lot of the “solutions” being found were already solved.
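
      For reference, rss2email’s basic workflow is only a few commands, roughly as below (the address, feed name, and feed URL are placeholders):

      ```
      r2e new me@example.com
      r2e add myfeed https://example.com/feed.rss
      r2e run
      ```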

      • balance8873@lemmy.myserv.one · +4 / −1 · 15 hours ago

        Isn’t this essentially the case by definition? If LLMs can solve a problem, it’s only because a human already solved that problem (not that this is any different from what humans do).

      • YeahToast@aussie.zone · +1 / −1 · 10 hours ago

        In this scenario it needs to read a 5-day forecast, capture the key elements, and only send the email/alert the first time rather than on the following 4 days (if the alert remains the same). The existing RSS reader apps I’d previously tried didn’t meet that need.
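
        The “alert only the first time” part is essentially a small piece of state. A minimal sketch in Python, assuming the alert is reduced to a short string and the previous one is cached in a file (the file name is a placeholder):

        ```python
        from pathlib import Path

        STATE_FILE = Path("last_alert.txt")  # placeholder cache location

        def should_send(current_alert: str) -> bool:
            """Send only when the alert text differs from the last one emailed."""
            previous = STATE_FILE.read_text() if STATE_FILE.exists() else ""
            if current_alert == previous:
                return False  # same alert as before: stay quiet
            STATE_FILE.write_text(current_alert)
            return True
        ```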

      • YeahToast@aussie.zone · +1 / −1 · 10 hours ago

        I might not have been clear: no, I don’t think I could have done it. Or if I could, it would have taken months and months of learning. This was generated in about 5 queries and a total of 25 minutes of testing.