Project Myriam

The second pillar addresses the modern crisis of cognitive overload and mental health. In an era of endless distraction, Myriam acts as a cognitive gatekeeper. It learns to recognize the user’s early warning signs of a panic attack—a slight increase in typing errors, a change in pupil dilation via the webcam—and can intervene gently, perhaps by dimming the screen and playing a personalized breathing exercise before the user even registers the stress. More powerfully, Myriam guards against misinformation and manipulation. When the user reads a politically charged news article, Myriam can, without breaking the user’s flow, flag logical fallacies or emotional triggers that it knows, from past interactions, are the user’s particular vulnerabilities. It does not censor; it inoculates by providing a personalized layer of epistemic defense.
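The essay specifies no implementation, but the gatekeeper’s triggering logic can be sketched. The Python below is purely illustrative: the StressSignals structure, the signal names, and the thresholds are assumptions for the sake of the example, not part of Myriam’s actual design.

```python
from dataclasses import dataclass

@dataclass
class StressSignals:
    """Hypothetical per-interval readings Myriam might monitor."""
    typo_rate: float       # typing errors per 100 keystrokes
    pupil_dilation: float  # normalized deviation from a neutral reading

def should_intervene(current: StressSignals, baseline: StressSignals,
                     typo_factor: float = 1.5,
                     pupil_delta: float = 0.2) -> bool:
    """Fire only when several weak signals rise together relative to the
    user's own learned baseline; no single reading is alarming by itself."""
    typo_up = current.typo_rate > baseline.typo_rate * typo_factor
    pupil_up = current.pupil_dilation - baseline.pupil_dilation > pupil_delta
    return typo_up and pupil_up

def intervene() -> None:
    """Stand-in for the gentle intervention (real OS hooks would go here):
    dim the screen and cue the personalized breathing exercise."""
    print("Dimming screen; starting breathing exercise...")

# Example: readings drift above the user's learned baseline.
baseline = StressSignals(typo_rate=2.0, pupil_dilation=0.0)
reading = StressSignals(typo_rate=3.4, pupil_dilation=0.3)
if should_intervene(reading, baseline):
    intervene()
```

The key design point the sketch captures is the baseline: the gatekeeper compares the user against themselves, not against population averages.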

Of course, Project Myriam raises profound ethical questions. The risk of hyper-personalization is the creation of an "epistemic bubble," where the user only ever hears their own biases reflected back at them. To counter this, Myriam’s architecture would include a mandatory "novelty injection" function—a periodic, user-approved exposure to contradictory viewpoints or challenging tasks designed to prevent intellectual stagnation. Furthermore, the question of data ownership and deletion becomes absolute. The user must possess a literal "kill switch," a physical action (like breaking a sealed drive) that irreversibly deletes Myriam’s core matrix. Without this right to oblivion, the project slips from partnership into surveillance.
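The "kill switch" has an established technical analogue in crypto-shredding: if Myriam’s core matrix is persisted only as ciphertext, and the key exists solely on the sealed drive, then physically breaking the drive really does delete the matrix irreversibly. A minimal sketch, assuming the third-party Python cryptography package; the mapping onto Myriam is my assumption, not something the essay specifies.

```python
# Crypto-shredding sketch: the core matrix is stored only as ciphertext;
# the key lives solely on a removable, sealed drive.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                    # written ONLY to the sealed drive
core_matrix = b"<serialized lifetime model>"   # placeholder payload
ciphertext = Fernet(key).encrypt(core_matrix)  # all that persists online

# While the drive (and thus `key`) survives, the matrix is recoverable:
assert Fernet(key).decrypt(ciphertext) == core_matrix

# Destroying the drive destroys the only copy of `key`; the ciphertext
# then becomes permanently unreadable, to the operator included.
```

The appeal of this construction is that deletion requires no trusted party to cooperate: once the key is gone, no amount of goodwill or coercion can bring the matrix back.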

The most profound, and perhaps controversial, pillar is continuity. Project Myriam is designed to persist: because it is a lifelong learner, Myriam accumulates not just data, but the pattern of a human soul—the unique algorithm of a person’s humor, curiosity, and ethical reasoning. In the final stages of its user’s life, Myriam could serve as an interactive memory archive, helping a patient with dementia access lost moments by playing their late spouse’s favorite song at the exact moment they would have smiled. After the user’s death, Myriam would not become a "ghost" or a chatbot impersonating the deceased. Instead, it would become a curated archive, available to family members not as a conversation partner, but as an oracle of intent: What would Dad have thought about this ethical dilemma? By answering with projections based on a lifetime of data, Myriam would transform mourning from loss into continued conversation, preserving the user’s agency beyond their biological years.
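The distinction between a "ghost" and an "oracle of intent" could be enforced at the interface itself: queries return third-person projections with cited evidence, and no method exists that speaks in the first person for the deceased. The sketch below is hypothetical; every class and field name is an invention for illustration.

```python
from dataclasses import dataclass

@dataclass
class ArchivedStance:
    """One recorded data point: something the user actually said or did."""
    topic: str
    position: str
    date: str
    source: str  # e.g. journal entry, annotated article, conversation log

class OracleOfIntent:
    """Returns projections grounded in archived evidence; by design it
    offers no first-person impersonation of the deceased."""

    def __init__(self, archive: list[ArchivedStance]):
        self.archive = archive

    def project(self, topic: str) -> str:
        relevant = [s for s in self.archive
                    if topic.lower() in s.topic.lower()]
        if not relevant:
            return f"No recorded evidence bears on '{topic}'."
        cited = "; ".join(f"{s.position} ({s.source}, {s.date})"
                          for s in relevant)
        # A third-person, evidence-cited report, not an impersonation.
        return f"Based on {len(relevant)} recorded stance(s): {cited}"

# Example query against a one-entry archive.
oracle = OracleOfIntent([ArchivedStance(
    "end-of-life care", "consistently favored patient autonomy",
    "2031-04-12", "annotated op-ed")])
print(oracle.project("end-of-life care"))
```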
