Lots of people on Lemmy really dislike AI’s current implementations and use cases.

I’m trying to understand what people would want to be happening right now.

Destroy gen AI? Implement laws? Hoping all companies use it for altruistic purposes to help all of mankind?

Thanks for the discourse. Please keep it civil, but happy to be your punching bag.

  • daniskarma@lemmy.dbzer0.com · 45 minutes ago

    I’m not against it as a technology. I use it for personal stuff, as a toy, to have some fun or whatever.

    But what I despise is the forced introduction of it into everything: AI-written articles, and AI assistants forced into unrelated apps. That’s what I want to disappear, the way it’s shoved into so many places.

  • Saleh@feddit.org · 54 minutes ago

    First of all, stop calling it AI. It is just large language models for the most part.

    Second: an immediate carbon tax on the energy consumption of datacenters, in line with current damage estimates for emissions. That would be around $400/tCO2 iirc (rough arithmetic sketched at the end of this comment).

    Third: make it obligatory by law to provide disclaimers about what the tool is actually doing. So if someone asks “is my partner cheating on me”, the first message should be: “This tool does not understand what is real and what is false. It has no actual knowledge of anything, in particular not of your personal situation. It just puts together words that seem likely to belong together. It cannot give any personal advice and cannot be used for any knowledge gain. It is solely to be used for entertainment purposes. If you use the answers of this tool in any dangerous way, such as for designing machinery, operating machinery, or making financial decisions, you are liable for it yourself.”
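    As a rough illustration of that rate, a back-of-the-envelope sketch: the consumption and grid-intensity figures below are made-up assumptions; only the $400/tCO2 rate comes from the comment above.

```python
# Back-of-the-envelope carbon tax on data center electricity.
# Only the tax rate comes from the comment above; the consumption and
# grid-intensity numbers are illustrative assumptions.

TAX_RATE_USD_PER_TCO2 = 400        # proposed rate
ANNUAL_ENERGY_MWH = 500_000        # hypothetical data center: 500 GWh/year
GRID_INTENSITY_TCO2_PER_MWH = 0.4  # hypothetical grid emission factor

emissions_tco2 = ANNUAL_ENERGY_MWH * GRID_INTENSITY_TCO2_PER_MWH
tax_usd = emissions_tco2 * TAX_RATE_USD_PER_TCO2

print(f"Emissions: {emissions_tco2:,.0f} tCO2/year")  # 200,000 tCO2/year
print(f"Carbon tax: ${tax_usd:,.0f}/year")            # $80,000,000/year
```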

  • boaratio@lemmy.world · 57 minutes ago

    For it to go away just like Web 3.0 and NFTs did. Stop cramming it up our asses in every website and application. Make it opt-in instead of “maybe, if you’re lucky, opt-out”. And also, stop burning down the planet with data center power and water usage. That’s all.

    Edit: Oh yeah, and get sued into oblivion for stealing every copyrighted work known to man. That too.

    Edit 2: And the tech press should be ashamed of how much they’ve been fawning over these slop generators. They gladly parrot press releases, claim it’s the next big thing, and generally just suckle at the teat of AI companies.

  • kittenzrulz123@lemmy.blahaj.zone · 2 hours ago

    I do not need AI and I do not want AI. I want to see it regulated to the point that it becomes severely unprofitable. The world is burning and we are heading face first towards a climate catastrophe (if we’re not already there); we DON’T need machines to mass-produce slop.

  • Detun3d@lemm.ee · 1 hour ago

    Gen AI should be an optional tool to help us improve our work and life, not an unavoidable subscription service that makes it all worse and makes us dumber in the process.

  • Taleya@aussie.zone · 2 hours ago

    What do I really want?

    Stop fucking jamming it up the arse of everything imaginable. If I had a genie wish, it would be to make anything but opt-in illegal.
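    A minimal sketch of what opt-in-by-default could look like at the application level; the flag name and the AiAssistant class are invented for illustration, not taken from any real app:

```python
# Hypothetical sketch: the AI feature ships disabled and only loads after
# an explicit user action. All names here are illustrative.
from dataclasses import dataclass
from typing import Optional

class AiAssistant:
    """Placeholder for whatever AI feature an app wants to bolt on."""
    def suggest(self, text: str) -> str:
        return f"(suggestion for: {text})"

@dataclass
class UserSettings:
    ai_assistant_enabled: bool = False  # opt-in: off unless the user turns it on

def maybe_load_ai_assistant(settings: UserSettings) -> Optional[AiAssistant]:
    # No silent defaults, no "we enabled this for you" surprises.
    return AiAssistant() if settings.ai_assistant_enabled else None

# A fresh install runs nothing AI-related.
assert maybe_load_ai_assistant(UserSettings()) is None
assert maybe_load_ai_assistant(UserSettings(ai_assistant_enabled=True)) is not None
```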

    • blackn1ght@feddit.uk · 2 hours ago

      I think it’s just a matter of time before it starts being removed from places where it just isn’t useful. For now, companies are throwing it at everything to see what sticks. WhatsApp and JustEat added AI features, and I have no idea why, or how they could be useful for those services, and I can’t imagine people using them.

    • 4am@lemm.ee · 1 hour ago

      • Trained on stolen ideas: ✅
      • Replacing humans who have little to no safety net while enriching an owner class: ✅
      • Disregard for resource allocation, use, and pollution in the pursuit of profit: ✅
      • Being forced into everything so as to become unavoidable and foster dependence: ✅

      Hey wow look at that, capitalism is the fucking problem again!

      God we are such pathetic gamblemonkeys, we cannot get it together.

  • DeathsEmbrace@lemm.ee · 2 hours ago

    Ruin the marketing. I want them to stop using the catch-all term AI and use the appropriate terminology: narrow AI. It needs input to produce anything, so let’s stop making up fantasies about it. In truth, it’s bullshit.

  • helpImTrappedOnline@lemmy.world · 3 hours ago

    (Ignoring all the stolen work used to train the models for a minute)

    It’s got its uses and potential for things like translation, writing prompts, or as a research tool.

    But all these products force it into places that clearly do not need it, to solve problems that could be handled with two or three steps of logic.

    The failed attempts at replacing jobs, screening resumes, or monitoring employees are terrible.

    Lastly, the AI relationships are not good.

  • sweemoof@lemmy.world · 4 hours ago

    The most popular models used online need to include citations for everything. They can be used to automate some white-collar/knowledge work, but their output needs to be scrutinized heavily by independent thinkers, especially when used to try to predict trends and future events.

    As always, schools need to be better at teaching critical thinking, epistemology, and emotional intelligence way earlier than we currently do, and AI shows that rote subject matter is a dated way to learn.

    When artists create art, there should be some standardized seal, signature, or verification that the artist did not use AI or used it only supplementally on the side. This would work on the honor system and just constitute a scandal if the artist is eventually outed as having faked their craft. (Think finding out the handmade furniture you bought was actually made in a Vietnamese factory. The seller should merely have their reputation tarnished.)

    Overall I see AI as the next step in search engine synthesis, info just needs to be properly credited to the original researchers and verified against other sources by the user. No different than Google or Wikipedia.
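    One possible shape for “citations for everything”, sketched as a data structure a client could refuse to render without sources; the field names and the render function are invented for illustration:

```python
# Hypothetical sketch of a citation-carrying answer format.
# The idea: a client flags any claim that arrives without a source
# instead of displaying it as fact. All names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Citation:
    title: str
    url: str

@dataclass
class AnswerSegment:
    text: str
    citations: List[Citation] = field(default_factory=list)

def render(segments: List[AnswerSegment]) -> str:
    lines = []
    for seg in segments:
        if not seg.citations:
            lines.append(f"[UNVERIFIED] {seg.text}")
        else:
            refs = ", ".join(c.url for c in seg.citations)
            lines.append(f"{seg.text} [{refs}]")
    return "\n".join(lines)

print(render([
    AnswerSegment("A claim backed by a source.",
                  [Citation("Example source", "https://example.org/paper")]),
    AnswerSegment("A claim the model could not back up."),
]))
```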

  • MisterCurtis@lemmy.world · 5 hours ago

    Regulate its energy consumption and emissions, across the entire AI industry. Any energy used or emissions produced to develop, train, or operate AI should be limited.

    If AI is here to stay, we must regulate what slice of the planet we’re willing to give it. I mean, AI is cool and all, and it’s been really fascinating watching how quickly these algorithms have progressed. Not to oversimplify it, but a complex Markov chain isn’t really worth the energy consumption it currently requires.
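    For what the Markov chain analogy refers to, here is a toy bigram sampler; it is nothing like a real LLM, just the “pick the next word based on what usually follows” idea in a few lines:

```python
# Toy bigram Markov chain: each next word is sampled from the words that
# have followed the current word in the training text. Purely illustrative.
import random
from collections import defaultdict

def train(text: str) -> dict:
    words = text.split()
    table = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        table[cur].append(nxt)
    return table

def generate(table: dict, start: str, length: int = 10) -> str:
    out = [start]
    for _ in range(length):
        followers = table.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat slept on the sofa"
print(generate(train(corpus), "the"))
```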

    Strict regulation now would be a leg up in preventing any rogue AI or runaway algorithms that would just consume energy to the detriment of life. We need a hand on the plug. Capitalism can’t be trusted to self-regulate. Just look at the energy grabs the big AI companies have already been making (xAI’s datacenter, Amazon’s and Google’s investments into nuclear). It’s going to get worse. They’ll just keep feeding it more and more energy, gutting the planet to feed the machine, so people can generate sexy cat girlfriends and cheat on their essays.

    We should be funding efforts to use AI more for medical research: protein folding, developing new medicines, predicting weather, communicating with nature, exploring space. We’re thinking too small. AI needs to make us better. With how much energy we throw at it, we should be seeing something positive out of that investment.

    • medgremlin@midwest.social · 26 minutes ago

      These companies investing in nuclear is the only good thing about this. Nuclear power is our best, cleanest option for supplementing renewables like solar and wind, and it can pick up the slack when variable generation doesn’t meet variable demand. If we can trick those mega-companies into lobbying the government to allow nuclear fuel recycling, we’ll be all set to ditch fossil fuels fairly quickly (provided they also lobby to streamline the permitting process and reverse the DOGE gutting of the government agency that provides the startup loans used for nuclear power plants).
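      A tiny sketch of the “pick up the slack” point: a dispatchable plant covers whatever demand is left after variable renewables, hour by hour (all numbers below are made up for illustration):

```python
# Illustrative only: dispatchable output (e.g., nuclear) = demand minus
# whatever variable renewables happen to produce that hour, floored at zero.
hourly_demand_mw = [900, 950, 1000, 1100, 1050, 980]
hourly_renewables_mw = [300, 500, 800, 600, 200, 100]

dispatchable_mw = [
    max(demand - renewables, 0)
    for demand, renewables in zip(hourly_demand_mw, hourly_renewables_mw)
]
print(dispatchable_mw)  # [600, 450, 200, 500, 850, 880]
```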

  • Goldholz @lemmy.blahaj.zone · 5 hours ago

    Shutting these “AI”s down. The ones out there for the public don’t help anyone. They do more damage than they are worth.

  • Rose@slrpnk.net · 7 hours ago

    The technology side of generative AI is fine. It’s interesting and promising technology.

    The business side sucks, and the AI companies are just the latest continuation of the tech grift: trying to squeeze as much money as possible out of the latest hyped tech, laws and social or environmental impact be damned.

    We need legislation to catch up. We also need society to be able to catch up. We can’t let the AI bros continue to foist more “helpful tools” on us, grab the money, and then just watch as it turns out to be damaging in unpredictable ways.

    • theherk@lemmy.world · 6 hours ago

      I agree, but I’d take it a step further and say we need legislation to far surpass the current conditions. For instance, I think it should be governments leading the charge in this field, as a matter of societal progress and national security.

  • 𝕱𝖎𝖗𝖊𝖜𝖎𝖙𝖈𝖍@lemmy.world · 7 hours ago

    I’m perfectly OK with AI; I think it should be used for the advancement of humanity. However, 90% of popular AI is unethical BS that serves the 1%. But to detect spoiled food or cancer cells? Yes please!

    It needs extensive regulation, but doing so requires tech-literate politicians who actually care about their constituents. I’d say that’ll happen when pigs fly, but police choppers exist, so idk.