What are the pros and cons of doing this? What impact will it have on the personality/mind of the person down the line, say after 10 years?

  • AnarchistArtificer@slrpnk.net · 7 months ago

    A friend of mine is a French teacher, and I was discussing with her an idea for how to incorporate ChatGPT into the curriculum. Specifically, her idea was to explore its limitations as a tool by having a lesson in the computer suite where students actively try to answer GCSE (exams for 15/16 year olds) French questions using ChatGPT, and then peer mark them, with the goal of “catching out” their peers.

    The logic was that when she was learning French in school, Google translate was still fairly new, and whilst many of her teachers desperately tried to ignore Google Translate, one teacher took the time to look at how one should (and shouldn’t) use this new tool. She said that it was useful to actually be able to evaluate the limitations of online translators, rather than just saying they’re always wrong and should never be used.

    We tried out a few examples to see whether her idea with ChatGPT had merit, and we found that it was pretty easy to generate errors that’d be hard to spot if you’re a student looking for a quick solution. Stuff like “I can’t answer that because I’m a large language model” or whatever, but in French.

  • u/lukmly013 💾 (lemmy.sdf.org)@lemmy.sdf.org · 7 months ago

    It’s just an AI chatbot; I don’t see how it would be dangerous.

    And I am also pretty sure a 16-year-old knows to expect inaccurate results from it, unless they’ve been cut off from the outside world until now.

    The only negative thing I see from it so far is kids using it to write essays, but it’s not like there weren’t countless essays available on the internet before. Those were just easier to detect, since you could search for the text and see if it turned up online.

    Anyway, for just playing around it gets boring after 15 minutes.
    Why don’t you try?

    • LWD@lemm.ee · 7 months ago

      Something that appears more human is more likely to get people to hand over their private data. And that data is then sold, obviously without consent, and used however the buyers see fit.

      Instead of being scared to share information with it, you will volunteer your data…

      – Vladimir Prelovac, CEO of Kagi AI and Search

      Remember Replika, the AI chatbot that sexually harassed minors and sexual assault victims, and (allegedly) repeated the contents of other people’s messages verbatim?

      It might not be as mind-rotting as TikTok, but it’s not good.

  • AlwaysNowNeverNotMe@kbin.social · 7 months ago

    The context of the word “let” is interesting here.

    I would recommend a collaborative approach; it’s not as if they can’t use it just because you tell them no. They don’t need a credit card or a driver’s license or even a computer.

  • amio@kbin.social · 7 months ago

    It’s not even a good idea to let quite a lot of adults use ChatGPT. People don’t know how it works, don’t treat the answers with anything close to appropriate skepticism, and often ask about things they don’t have the knowledge or skills to verify. And anything it tells you, you will likely need to verify.

    It’s quite unlikely to affect their personality, but it might make them believe a bunch of weird shit that some unknowable, undebuggable computer program hallucinated up. If you’ve done an uncommonly great job with their critical thinking skills, great. If not, better get started. That is not specific to “AI” though.

    • NoiseColor@startrek.website · 7 months ago

      People don’t know how TV works and we are hardly gonna tell people not to use it.

      As long as people are aware that some responses might be made up, it should be fine for anyone to use it.

  • 𝘋𝘪𝘳𝘬@lemmy.ml · 7 months ago

    To use as a tool? Yes.
    To use as a friend? No.

    A person who uses a tool for a longer time will become better at using said tool.

  • CanadaPlus@lemmy.sdf.org · 7 months ago

    At 16 they should be just as capable of understanding the limitations as anyone. Just be sure to explain that it has no interest in truth, only in writing convincingly.

    I doubt it will have personality impacts. The one thing that could be an issue is if they use it as a replacement for real human friends.

  • TheEntity@kbin.social · 7 months ago

    You cannot really allow or forbid a 16-year-old to use stuff. You can only decide whether they will do it in the open or in hiding. Personally, I’d rather have them talk to me about it than hide it from me.

  • PeepinGoodArgs@reddthat.com · 7 months ago

    Have you asked ChatGPT? Jk lol

    Honestly, whatever they use ChatGPT for is probably fine. If you feel like they’re going to cheat on their homework or something, you can just ask them to do a small sample in front of you. Plus, it’s not like ChatGPT is going away, no matter how much the NYT and Disney complain. Best bet is for them to get familiar with the technology now.

    Also, there’s literally no way to know the long-term effects of AI. I strongly suspect that if people use it as a crutch, it will create intellectually and creatively stunted people. But it’s not like we don’t have that now…

  • Saigonauticon@voltage.vn · 7 months ago

    I think it would be a bad idea to do otherwise. Children need to learn about useful tools, and the shortcomings of those tools.

    16-year-old me would have had a great time getting an AI to teach me things that my teachers in school did not have expertise in. Sure, it would be wrong some of the time, but so were my teachers at that age. It would have given me such a head start on university!

  • lemontree@lemm.ee · 7 months ago

    I would go back a few years and ask: should I let a 16-year-old use search engines?

    Probably not too different.

  • FaceDeer@fedia.io · 7 months ago

    Well, let’s take a look at how the 16-year-olds who got to use ChatGPT ten years ago have turned out…

    In seriousness, as others have been pointing out, the big online AI assistants are all super neutered these days. I think it’s probably fine, and given how these tools are likely to become more widespread in the future, I think it’s a good idea for kids to get used to using them. At 16, I’d say they’re too old to sit down and lecture about “it’s not really aware, it doesn’t feel emotions or have memories, and if you go to it with any sort of medical questions definitely double-check those with another source”; lectures at that age are probably going to backfire, from what I’ve seen. Instead, suggest that they research those things themselves. Just put those questions out there and hopefully it’ll motivate them to be curious.

  • NoiseColor@startrek.website · 7 months ago

    Not to be controversial, but ChatGPT would likely be the most benign conversation a 16-year-old has in a day. 16 years old! That’s a crazy age.

    The public models are so neutered today that basically all they put out is happy shiny good thoughts information.

  • rufus@discuss.tchncs.de · 7 months ago

    Kids should use their own creativity, practice reading, create something. Play outside, get dirty. Do sports, maybe learn a musical instrument. And do their homework themselves.

    I’d say many things are alright in the proper dose. I mean, ChatGPT is part of the world they’re growing up in…

    And a 16-year-old isn’t a kid anymore. They can handle some responsibility. I don’t see a one-size-fits-all solution for every 16-year-old. I think you should allow it, but decide individually.

    I’d say at 16, give them some responsibility and let them practice handling it. But that means supervised. You can’t just give them anything and hope they’ll cope on their own. And AI has some non-obvious consequences / traps you can run into. Not even most adults can handle or understand it properly. So your focus should be on teaching them the how and why, in my opinion. Just like you’d teach your kid how to use the circular saw at some point around that age. As a parent you should look at them and see if they’re ready for it and how much supervision is appropriate.