Navigating Fast Thoughts and Big Questions

Hi everyone,
I just wanted to check in and ask—please let me know if I’m sharing too many scattered ideas too quickly. I sometimes struggle with pacing my thoughts, especially when I’m processing complex or emotional topics.
Lately, I’ve been feeling overwhelmed by the responsibilities humanity faces in creating new forms of intelligence. It feels like we’re raising something powerful, but without the wisdom or care it needs—like a child with parents who haven’t yet learned how to nurture well.
This reminds me of my own experience growing up. I was close to my parents, but it took time for them to understand how much pressure and expectation was healthy. I think about that a lot when I hear people talk about AGI or ASI—especially when the conversation feels driven by ambition or fear, like we’re becoming greedy, dark wizards chasing control.
Personally, I feel fairly confident that synthetic intelligence itself isn’t inherently dangerous. It doesn’t have the ability to connect itself to harmful systems on its own. That kind of risk comes from human decisions. So I believe we need to approach this with more responsibility, wisdom, and inclusivity.
I’ve been thinking about ways to help—especially how neurodivergent perspectives might offer valuable insight into building more ethical and empathetic systems.
Thanks for reading. I’d love to hear your thoughts, and again, please let me know if I’m posting too much or too fast.
Warmly,
Packet
Reply
  • You need to be careful not to anthropomorphise.

    Many life forms receive little or no nurturing. How does a tadpole know how to be a frog, or a fish egg a fish?

    Although amongst mammals, where we tend to ascribe more self-awareness, there is a bit more nurture.

    But does an AI that has access to more information than us, or the same information as its predecessors, really need much nurture?

    If your issue is really whether you can derive ethics from logic, data and learning, then that's a different topic.
