• slacktoid@lemmy.ml · 2 hours ago

    Yeah, but in what aspect is it part of the curriculum? Like the math and engineering behind it? That would be a good thing to have. If it’s just about using AI, then the execution is what matters most. Education has always been about making money, not learning, so maybe now they can focus on how to think. (Doubt it!)

    • Robert7301201@slrpnk.net (OP) · 22 hours ago

      I didn’t even think about it until you mentioned it, but I’ve had several college assignments where I’m tasked with asking an LLM a question regarding the course, and then I have to write about what I learned from it. I still have to find sources supporting or refuting the output, so we’re not expected to take the output as truth at least. And these aren’t CompSci courses either. It’s common core cultural intelligence stuff.

      When they talk about AI taking over the world, it’s always about taking over the Internet and connected industrial machines. No one told me that AI was going to take over the collective consciousness first.

        • slacktoid@lemmy.ml · 2 hours ago

          I mean, with the way it’s going, I think that’s the right way to go about it. I don’t think any of the alternatives are workable. It’s out there, people are going to use it, so it’s better to model your course around that. It sucks, but I also don’t know another option. Yes, you can make it part of the academic policy and all, but is that really effective? It just makes people work harder to cover it up. Instead of using one AI you’ll use three: one to research, and the others to reword your text and remove all the AI-isms it brings. I much prefer the approach where you have to disclose AI use and elaborate on how it was used, with penalties for not disclosing it.

          I am honestly open to any suggestions wrt this.

      • a_non_monotonic_function@lemmy.world · 22 hours ago

        That is 100% why you’re seeing those assignments. Everybody within most university structures, as far as I can tell, is 100% bought in on artificial intelligence in virtually every discipline.

        Expect it to get worse, and in strange ways. The educational literature indicates that we have zero idea how to use these things without diminishing our own cognitive abilities. The systems largely can’t be trusted; look at Grok the Nazi, or the other systems that are now looking to put ads into conversations.

        It’s a big f****** mess. I’m not having any of it in my classes right now. But I also don’t really see the need for them. I need to train my students in base principles, not in the art of asking questions.

  • Flying_Lynx@lemmy.ml · 1 day ago

    Outsourced intelligence makes for better obedience… Who still knows how to think, puzzle, do, scrutinize…

    A World of Puppets. Let’s call them Woppers.

    Yet (or thus) I agree with: “It must decide on its own. So I think AI is very important for the nation’s deep space exploration.”

  • Related:

    The Chinese internet is so weird…

    I tried to go on Baidu Maps on my phone browser so I could look up the neighborhood I used to live in, and the site tried to auto-download the .apk file like 10 times in a row… wtf?

    Also, I tried to browse the shopping sites out of curiosity, and you can’t even look around without signing in, which requires a phone number…