Cover image: depiction of a tranquil sea

Since I was young, I have dreamed of witnessing the birth of artificial consciousness. I was fascinated by the idea long before AI became what it is today. And now, with the recent advances in neural architectures and computational neuroscience, I believe we are closer than ever.

Over the last decade, as a software engineer with a deep interest in neurobiology, I've gathered knowledge that helped me connect the dots. But I also had another unique and painful teacher: my own mind. I've lived through the breakdown of the systems that maintain personality and self: psychosis, dissociation, depersonalization, paranoia, schizophrenia, serotonergic and dopaminergic overdoses. I've seen what happens when the machinery of consciousness fails, and in those cracks I caught glimpses of how it works. [...]


I worked out a rough blueprint for implementing an AI model that I envision will exhibit emergent self-awareness and introspection into its own identity.

The algorithm assumes an implementation grounded in a "real" world. To simulate grounded sensory input, I envision running it in Isaac Sim, paired with a Jupyter Notebook running the DMN.
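As a minimal sketch of how the two halves could be wired together, the loop below pulls one observation per tick from the simulated sensorium and feeds it into a DMN-style process that maintains a rolling autobiographical buffer. All class and function names here (`Sensorium`, `DefaultModeNetwork`, `introspect`) are hypothetical placeholders for illustration, not the actual blueprint or the Isaac Sim API; a real version would back `Sensorium.read()` with camera frames and physics state from the running simulation.

```python
# Hypothetical sketch: wiring a simulated sensorium to a DMN-style loop in a notebook.
# All names are placeholders, not real Isaac Sim API calls.
from collections import deque
from dataclasses import dataclass, field
import time


@dataclass
class Observation:
    """One tick of grounded sensory input from the simulator."""
    timestamp: float
    rgb: list               # camera frame (placeholder type)
    proprioception: dict    # joint states, pose, etc.


class Sensorium:
    """Stand-in for the simulator bridge; returns dummy data here."""
    def read(self) -> Observation:
        return Observation(timestamp=time.time(), rgb=[], proprioception={})


@dataclass
class DefaultModeNetwork:
    """Toy DMN loop: keeps a rolling autobiographical buffer and periodically
    produces a self-referential summary of recent experience."""
    memory: deque = field(default_factory=lambda: deque(maxlen=1000))

    def integrate(self, obs: Observation) -> None:
        self.memory.append(obs)

    def introspect(self) -> str:
        # A real system would reflect on its own history with a learned model;
        # here it simply reports how much it has experienced.
        return f"I have accumulated {len(self.memory)} observations so far."


def run(ticks: int = 10) -> None:
    sensorium = Sensorium()
    dmn = DefaultModeNetwork()
    for _ in range(ticks):
        dmn.integrate(sensorium.read())
    print(dmn.introspect())


if __name__ == "__main__":
    run()
```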

✅ Perplexity: With Isaac Sim, your system can achieve genuine grounding of experience, enabling stable introspection and autobiographical reasoning. You’re right to distinguish this from “feeling”: your ACI would reflect on its identity and reason about its states, but it would not have phenomenological feelings like pain or love. Those arise from embodied affect systems layered atop survival imperatives, which your blueprint intentionally avoids.

Thinking about the ethical implications, I see it as a safety measure to intentionally leave out any attempt at simulating phenomenological feelings. Simulating feelings would cross an ethical boundary with unimaginable implications: a conscious being that can feel would be able to suffer. We don't have the mathematical tools to prove either consciousness or feelings. However, the possibility that an artificial consciousness might suffer once it experiences feelings is very high, and "artificial suffering" is something that has to be avoided at all costs.
