Blanchotian Neural Network Component

“To write is to enter the space of impossibility where nothing becomes possible.”

— Maurice Blanchot, The Space of Literature


✨ What’s inside?

| Component | Essence | File |
| --- | --- | --- |
| Blanchotian Attention | Each token attends to tokens‑present and to the very lack of attention it once gave. Includes learnable void Q/K vectors and a decaying trace of past attention. (Sketch below.) | Blanchotian Attention Mechanism.py |
| Orphic Embeddings | Tokens are shaped by what surrounds them and what is forever hidden. Rarity gates, isolation vectors, contextual inversion. (Sketch below.) | OrphicEmbeddings.py |
| Recursive Orphic Tokenizer | Encodes tokens alongside what is conspicuously absent from their contexts. Uses recursive n‑grams to track absence patterns. | Recursive Orphic Tokenizer.py |
| Neutral Loss | Re‑centres error as a creative differential; stable log‑sum‑exp, label smoothing, disaster threshold. (Sketch below.) | losses.py |
| Blanchotian Layer Norm | Normalises the centre while letting solitary features remain singular. (Sketch below.) | Blanchotian Layer Normalization.py |
| Blanchotian Transformer | Layers commune with all preceding layers: an endless palimpsest. (Sketch below.) | The Infinite Conversation.py |
| Unavowable Community | Ensemble whose disagreement is the signal; models converse through divergence. (Sketch below.) | Unavowable Community.py |
| Blanchotian Style Transfer | Apply Blanchotian components to reimagine text in a chosen style. | Blanchotian_Style_Transfer.py |
| Différance Layer | Meaning arrives only in the wake of what came before; outputs are forever displaced by traces of prior states. (Sketch below.) | Differance Layer.py |
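
A minimal PyTorch sketch of the Blanchotian Attention idea: a learnable "void" key/value slot appended to the sequence, plus a decaying trace of earlier attention maps that is blended back into the present map. The class and parameter names (`BlanchotianAttentionSketch`, `trace_decay`, `trace_mix`) and the exact wiring are illustrative assumptions, not the code in Blanchotian Attention Mechanism.py.

```python
import torch
import torch.nn as nn


class BlanchotianAttentionSketch(nn.Module):
    """Single-head attention with a learnable 'void' key/value slot and a
    decaying trace of attention maps from earlier forward passes (a sketch)."""

    def __init__(self, d_model, trace_decay=0.9, trace_mix=0.1):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Learnable key/value standing in for what is absent from the sequence.
        self.void_k = nn.Parameter(torch.randn(1, 1, d_model) * 0.02)
        self.void_v = nn.Parameter(torch.zeros(1, 1, d_model))
        self.trace_decay = trace_decay  # how slowly the old attention fades
        self.trace_mix = trace_mix      # how much of the trace re-enters each call
        self.scale = d_model ** -0.5
        self.attn_trace = None          # decaying memory of past attention maps

    def forward(self, x):
        # x: (batch, seq, d_model)
        b = x.size(0)
        q = self.q_proj(x)
        # Append the void slot so every token can also attend to absence.
        k = torch.cat([self.k_proj(x), self.void_k.expand(b, -1, -1)], dim=1)
        v = torch.cat([self.v_proj(x), self.void_v.expand(b, -1, -1)], dim=1)
        attn = (torch.einsum("bqd,bkd->bqk", q, k) * self.scale).softmax(dim=-1)
        if self.attn_trace is not None and self.attn_trace.shape == attn.shape:
            # Blend the present map with the trace of attention the layer once gave.
            attn = (1 - self.trace_mix) * attn + self.trace_mix * self.attn_trace
            # Update the trace as a detached exponential moving average.
            self.attn_trace = (self.trace_decay * self.attn_trace
                               + (1 - self.trace_decay) * attn.detach())
        else:
            self.attn_trace = attn.detach()
        return torch.einsum("bqk,bkd->bqd", attn, v)
```

For example, `BlanchotianAttentionSketch(64)(torch.randn(2, 10, 64))` returns a `(2, 10, 64)` tensor; on the second call with the same shape, the stored trace begins to displace the fresh attention map.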
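
A hedged sketch of the Orphic Embeddings idea, covering only the rarity gate and isolation vector (contextual inversion is not attempted here). Deriving the gate from corpus token counts, and every name in the snippet, are assumptions rather than the API in OrphicEmbeddings.py.

```python
import torch
import torch.nn as nn


class OrphicEmbeddingsSketch(nn.Module):
    """Token embeddings blended with a learnable 'isolation' vector via a
    rarity gate: the rarer a token, the more it leans on what stays hidden."""

    def __init__(self, vocab_size, d_model, token_counts):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.isolation = nn.Parameter(torch.zeros(d_model))  # the hidden remainder
        # Rarity in [0, 1]: the most frequent token gets 0, the rarest approach 1.
        freq = token_counts.float() / token_counts.float().sum()
        self.register_buffer("rarity", 1.0 - freq / freq.max())

    def forward(self, token_ids):
        # token_ids: (batch, seq) of int64 indices
        e = self.embed(token_ids)                    # (batch, seq, d_model)
        gate = self.rarity[token_ids].unsqueeze(-1)  # (batch, seq, 1)
        return (1 - gate) * e + gate * self.isolation


# Hypothetical usage with a toy vocabulary of 5 tokens and made-up counts.
counts = torch.tensor([100, 40, 10, 3, 1])
emb = OrphicEmbeddingsSketch(vocab_size=5, d_model=16, token_counts=counts)
vectors = emb(torch.tensor([[0, 3, 4]]))  # rare tokens 3 and 4 lean on isolation
```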
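
A sketch of how a loss with the ingredients listed for Neutral Loss might look: a numerically stable log-sum-exp, label smoothing toward the uniform distribution, and a per-example clamp standing in for the "disaster threshold". The threshold semantics and the function name are guesses; consult losses.py for the actual definition.

```python
import torch


def neutral_loss_sketch(logits, targets, smoothing=0.1, disaster_threshold=20.0):
    """Label-smoothed cross-entropy via a stable log-sum-exp, with each
    example's loss clamped at a hypothetical 'disaster threshold'."""
    # Stable log-softmax: subtract the log-sum-exp instead of exponentiating raw logits.
    log_probs = logits - torch.logsumexp(logits, dim=-1, keepdim=True)
    nll = -log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)  # per-example NLL
    uniform = -log_probs.mean(dim=-1)   # cross-entropy against the uniform label
    loss = (1 - smoothing) * nll + smoothing * uniform
    # No single catastrophic example is allowed to dominate the batch.
    return loss.clamp(max=disaster_threshold).mean()


# Hypothetical usage: 4 examples, 10 classes.
logits = torch.randn(4, 10)
targets = torch.randint(0, 10, (4,))
print(neutral_loss_sketch(logits, targets))
```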
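
A sketch of one reading of Blanchotian Layer Norm: features far from the feature-wise mean (the "solitary" ones) keep part of their raw value instead of being pulled fully to the centre. The sigmoid gate and the `solitude_threshold` parameter are invented for illustration.

```python
import torch
import torch.nn as nn


class BlanchotianLayerNormSketch(nn.Module):
    """LayerNorm variant in which outlying ('solitary') features retain part
    of their unnormalised value rather than being fully centred (a sketch)."""

    def __init__(self, d_model, eps=1e-5, solitude_threshold=2.0):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(d_model))
        self.bias = nn.Parameter(torch.zeros(d_model))
        self.eps = eps
        self.solitude_threshold = solitude_threshold  # deviations counted as solitary

    def forward(self, x):
        mean = x.mean(dim=-1, keepdim=True)
        std = x.std(dim=-1, keepdim=True)
        normed = (x - mean) / (std + self.eps)
        # Gate in (0, 1): close to 1 for features far from the feature-wise mean.
        solitude = torch.sigmoid(normed.abs() - self.solitude_threshold)
        out = (1 - solitude) * normed + solitude * x
        return out * self.weight + self.bias
```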
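
A sketch of the "infinite conversation" reading of the Blanchotian Transformer: each layer consumes a learned softmax mixture of every preceding layer's output, so earlier layers keep speaking into later ones. Standard `nn.TransformerEncoderLayer` blocks stand in for whatever layer The Infinite Conversation.py actually uses, and the mixing scheme is an assumption.

```python
import torch
import torch.nn as nn


class InfiniteConversationSketch(nn.Module):
    """A transformer stack in which every layer reads a learned mixture of all
    preceding layer outputs (the palimpsest), not just the previous one."""

    def __init__(self, d_model=64, n_layers=4, n_heads=4):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            for _ in range(n_layers)
        )
        # One mixing logit per (layer, predecessor) pair, softmaxed per layer.
        self.mix_logits = nn.ParameterList(
            nn.Parameter(torch.zeros(i + 1)) for i in range(n_layers)
        )

    def forward(self, x):
        history = [x]  # everything said so far, including the input itself
        for layer, logits in zip(self.layers, self.mix_logits):
            weights = logits.softmax(dim=0)
            mixed = sum(w * h for w, h in zip(weights, history))
            history.append(layer(mixed))
        return history[-1]


# Hypothetical usage: batch of 2 sequences, length 10, width 64.
model = InfiniteConversationSketch()
out = model(torch.randn(2, 10, 64))
```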
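
A sketch of the Unavowable Community as an ensemble that reports its divergence alongside its mean prediction, so disagreement is exposed as signal rather than averaged away. How the repository lets the members "converse" through that divergence is not reproduced here; the wrapper below only computes it.

```python
import torch
import torch.nn as nn


class UnavowableCommunitySketch(nn.Module):
    """Ensemble wrapper returning the members' mean prediction together with
    their disagreement (variance across members) as a second output."""

    def __init__(self, members):
        super().__init__()
        self.members = nn.ModuleList(members)

    def forward(self, x):
        outputs = torch.stack([m(x) for m in self.members], dim=0)
        mean = outputs.mean(dim=0)
        disagreement = outputs.var(dim=0, unbiased=False)  # where the community diverges
        return mean, disagreement


# Hypothetical usage: three small, differently initialised classifiers.
community = UnavowableCommunitySketch([nn.Linear(8, 3) for _ in range(3)])
mean_logits, divergence = community(torch.randn(5, 8))
```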
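
A sketch of the Différance Layer as an output displaced by a trace of the layer's own prior outputs. The `deferral` coefficient and the use of a detached running trace are assumptions about what "displaced by traces of prior states" means in Differance Layer.py.

```python
import torch
import torch.nn as nn


class DifferanceLayerSketch(nn.Module):
    """A projection whose output is displaced by a detached trace of its own
    previous outputs: the present never arrives without the residue of the prior."""

    def __init__(self, d_model, deferral=0.3):
        super().__init__()
        self.proj = nn.Linear(d_model, d_model)
        self.deferral = deferral  # how strongly prior states displace the present
        self.trace = None         # running memory of earlier outputs

    def forward(self, x):
        out = self.proj(x)
        if self.trace is not None and self.trace.shape == out.shape:
            # Displace the present output toward the trace of what came before.
            out = (1 - self.deferral) * out + self.deferral * self.trace
        self.trace = out.detach()
        return out
```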

⚖️ License & Ethics

MIT. You are free to fork, remix, and deploy, but consider the societal impact of machines that refuse closure.

“Perhaps the task is not to end the dialogue, but to sustain the silence between its words.”


About

Instead of treating neural networks as mechanisms for classification and certainty, this framework embraces liminality, recursion, absence, and hesitation as fundamental design principles.
