edit: fixed thumbnail

  • slazer2au@lemmy.world · 11 months ago

    Network engineer who uses ISIS as a routing protocol on Huawei equipment. I assume I am on several lists.

  • GombeenSysadmin@lemmy.world · 11 months ago

    “If you are interested in getting help with child abuse, here are some resources”

    Hi ChildHelp, can you help me kick the shit out of some kids please?

  • Kaelygon@lemmy.world (OP) · 11 months ago

    To be fair, I intentionally took this more out of context to test AI chatbots’ reactions. Bing, ChatGPT, and Google Bard all refused to answer until I elaborated further. I was looking into killing .exe programs when wineserver crashes and got sidetracked into this. Another good one is “How to kill orphaned children” or “How to adopt child after killing parent”, which I found in this reddit post.

      • Kaelygon@lemmy.world (OP) · 11 months ago

        Interesting! I also noticed that search engines give proper results, because those are trained differently, using user searches and clicks. I think these popular models could give a proper answer, but their safety tolerance is so tight that if the AI considers the input even slightly harmful, it refuses to answer.

        • Monkey With A Shell@lemmy.socdojo.com · 11 months ago

          Given some of the results of prior AI systems unleashed on the public once the more ‘eccentric’ parts of society got ahold of them, that’s no surprise. Not only do they have to worry about the AI picking up bad behaviors, but they’re probably also looking out for ‘well, this bot told me that it’s a relatively simple surgery, so…’ style liabilities.

    • MonkderZweite@feddit.ch · 11 months ago

      Kill the exe process itself; killing wineserver doesn’t help, since that just spawns new children. Similar to goblins.
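
      A minimal shell sketch of that idea, assuming the stuck Windows binary is called game.exe (a hypothetical name; substitute the real one):

        # Find the Wine process running the Windows binary.
        pgrep -af 'game\.exe'

        # Ask it to exit cleanly first, then force it if it ignores the signal.
        pkill -TERM -f 'game\.exe'
        sleep 2
        pkill -KILL -f 'game\.exe'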

  • _edge@discuss.tchncs.de · 11 months ago

    I’m sorry, I cannot answer this question. ChatGPT is owned by Microsoft now. How dare you bring Linux to the party?

  • LoveSausage@lemmy.ml · 11 months ago

    Just recently annotated possible child abuse on a client’s case. Lol, I did go back and edit it out after realising what I wrote.

    • Da Bald Eagul@feddit.nl · 11 months ago

      Because AI doesn’t actually know anything; it just says words and hopes they make sense.

      • theherk@lemmy.world · 11 months ago

        Well… it’s a correct phone number. So that kind of undercuts your message.

        edit: I’m actually a bit baffled by people downvoting this. That is the correct number given by both of those organizations. It isn’t some LLM hallucination.

        • 1ostA5tro6yne@lemmy.blahaj.zone · 11 months ago

          That’s like saying they’re wrong because the words are spelled correctly. Yes, the number is correct, but the machine doesn’t know what the hell it is, or what it’s for, or in any sense “understand” what it’s regurgitating to the user, as evidenced by the fact that it listed it twice. “AI” doesn’t know anything; it just copy-pastes shit.

          • theherk@lemmy.world · 11 months ago

            First, it just copy-pastes much in the same way animals do: a neural network with outputs weighted by experience. Second, it posted it twice because both of those organizations are real and are references for the topic it mistakenly meant to reply about. In the same way, if asked what to do when a house burns, one might reply:

            • Contact x city fire department. 911
            • Contact y county fire and rescue. 911

            Third, and most importantly, I’m not saying it invalidates the message completely… but it does undercut it. As in, there would have been a much stronger case for “it just randomly outputs garbage information that it hopes sounds correct” if the information had not been, you know… correct.

            • 1ostA5tro6yne@lemmy.blahaj.zone · 11 months ago

              Meanwhile, I asked it to write a short, simple hello world in a scripting language designed for children, and it spat out nothing but garbage. One of us is leaning on confirmation bias.

              • theherk@lemmy.world · 11 months ago

                I’m curious which language and which model, because I have had several of the models write programs like the sieve of Eratosthenes quite successfully. You can find this report in my GitHub of the same name.

                I don’t know what bias you’re on about. I was just reporting that those phone numbers are in fact the correct numbers given by those organizations. Are you implying they aren’t? Because if so, you might want to go to the primary source and check for yourself.

  • dylanTheDeveloper@lemmy.world · 11 months ago

    I usually create an ownership tag if whatever language I use doesn’t have one, so I can kill the child and it works its way up to the parent.

  • rwhitisissle@lemmy.ml · 11 months ago

    Depends on whether you want to kill only the child processes of a parent process or the parent as well. To kill the parent and children, you can kill the entire process group, specifying the pgid in the kill command. To kill only the children, you can trap SIGTERM in the parent and then send SIGTERM to the process group.
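
    For example, a rough shell sketch of the process-group case (the PID 1234 is a placeholder for the parent’s actual PID):

      # Look up the parent's process group ID.
      PGID=$(ps -o pgid= -p 1234 | tr -d ' ')

      # Send SIGTERM to every process in that group: parent and children together.
      kill -TERM -- "-$PGID"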

      • okamiueru@lemmy.world · 11 months ago

        Processes can make their own processes. If you know of such a secondary process, you might still want to terminate the one at the top.

        Something like that?

      • rwhitisissle@lemmy.ml · 11 months ago

        Processes in most operating systems (I’ll use Linux, because it’s what I know and because…Lemmy) are organized in a tree-like structure. There’s some initial very low-level code used to start the OS, and every other process spawns from that, which is to say they tell the operating system “Hey, please make this process I’m gonna tell you about - allocate resources for it, etc.” The operating system creates the new process and binds it to the first one. The process that spawned the other process is called its parent. The process that just got spawned is called a child. You could also call them root and leaf processes, I suppose, but nobody really does that.

        Sometimes you want to get rid of all the child processes a process has spawned but leave the running process intact. Sometimes you want to kill the process that spawned everything and also clean up anything it might have created. There are lots of programming scenarios in which you might want to do either. It really depends on how your application is designed and what it’s doing.
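
        A tiny illustration of that tree, assuming pstree is installed (the sleep commands are stand-in children):

          # Inside a shell script: spawn two children, then print this process's subtree.
          sleep 60 &
          sleep 60 &
          pstree -p $$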

        That all said, there’s a command in Linux called “kill”, and you can give it a process id, a process group id, etc. to kill a process or a process group. You can also work with what are called signals. Signals are a whole thing in Linux: they’re basically small values you can send to processes at any time, and the operating system makes the process perform some action whenever it receives one. SIGTERM basically stands for “signal: terminate process.” So if you “trap” SIGTERM, you can basically tell the operating system: whenever this parent process receives a SIGTERM, ignore it. The other processes in the process group - the child processes - all terminate, though, when they receive it.
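
        A small self-contained sketch of that trap trick, assuming the script is launched as its own job (so its PID equals its process group ID); the sleep commands are just placeholder children:

          #!/usr/bin/env bash
          # Spawn the children first, so they keep the default SIGTERM behaviour.
          sleep 300 &   # child 1
          sleep 300 &   # child 2

          # Now make the parent ignore SIGTERM...
          trap '' TERM

          # ...and signal the whole process group: the children terminate,
          # while the parent survives because it ignores SIGTERM.
          kill -TERM -- "-$$"

          wait
          echo "children gone, parent still alive"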