For a long time I’ve thought it would be cool to upload my consciousness into a machine and be able to talk to a version of myself that didn’t have emotions and cravings.

It might tell me that being around my parents has consistently had a negative effect on my mood for years now, even if I don’t see it. Or that I don’t really love X, I just like having sex with her. Maybe it could determine that Y makes me uncomfortable, but has had an overall positive effect on my life. It could mirror myself back to me in a highly objective way.

Of course this is still science fiction, but @[email protected] has pointed out to me that it’s now just a little bit closer to being a reality.

With PrivateGPT, I could set up my own local AI.

https://generativeai.pub/how-to-setup-and-run-privategpt-a-step-by-step-guide-ab6a1544803e

https://github.com/imartinez/privateGPT

I could feed this AI information that I wasn’t comfortable showing to anyone else. I’ve been keeping diaries for most of my adult life. Once PrivateGPT was set up with a base language model, I could feed it my diaries, and then have a chat with myself.
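PrivateGPT’s actual pipeline uses embeddings and a local LLM, but the core idea, matching a question against your own documents and surfacing the most relevant one, can be sketched in a few lines of plain Python. This is a toy bag-of-words version for illustration, not PrivateGPT’s code, and the diary entries are made up:

```python
from collections import Counter
import math

def tokenize(text):
    # crude tokenizer: lowercase and strip trailing punctuation
    return [w.strip(".,!?").lower() for w in text.split()]

def cosine(a, b):
    # cosine similarity between two bag-of-words Counters
    common = set(a) & set(b)
    num = sum(a[w] * b[w] for w in common)
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def most_relevant(entries, question):
    # return the diary entry most similar to the question
    q = Counter(tokenize(question))
    return max(entries, key=lambda e: cosine(Counter(tokenize(e)), q))

diary = [
    "2021-03-14: Visited my parents today. Felt drained and irritable afterwards.",
    "2021-06-02: Long hike with friends. Best mood I've had in weeks.",
    "2021-09-20: Another tense dinner at my parents' place. Slept badly.",
]

print(most_relevant(diary, "How do I feel after seeing my parents?"))
```

A real setup would replace the word counts with learned embeddings and hand the retrieved entries to an LLM, but the retrieval step is the part doing the “mirroring.”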

I realize PrivateGPT is not sentient, but this is still exciting, and my mind is kinda blown right now.

Edit 1: Guys, this isn’t about me creating a therapist-in-a-box to solve any particular emotional problem. It’s just an interesting idea about using a pattern recognition tool on myself and having it create summaries of things I’ve said. Lighten up.

Edit 2: It was anticlimactic. This thing basically spits out word salad no matter what I ask it, even if the question has a correct answer, like a specific date.

  • PenguinTD@lemmy.ca · 1 year ago

    A regular adult human has roughly 600 trillion synapses (connections between neurons), so just recording an index of those edges would need something like 4.3 PB (yep, petabytes). That’s not even counting what they do, just the index (because a 32-bit int is not enough). And, in case you don’t know, toddlers have an even higher connection count for faster learning, until the brain decides “oh, these connections are not really needed,” disconnects them, and saves the energy. Simulating a self-aware artificial creature is really not within our reach yet, because most animals we know to be self-aware have high synapse counts.
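    A quick back-of-envelope check on that figure, assuming one 64-bit index per synapse (since the ~86 billion neurons overflow a 32-bit int):

```python
# Storage needed just to index ~600 trillion synapses,
# at 8 bytes (64-bit integer) per synapse.
synapses = 600e12
bytes_per_index = 8
total_bytes = synapses * bytes_per_index   # 4.8e15 bytes
pib = total_bytes / 2**50                  # convert to pebibytes
print(f"{pib:.1f} PiB")                    # prints "4.3 PiB"
```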

    And yes, we are attempting this for various reasons.

    https://www.humanbrainproject.eu/en/brain-simulation/