🤗 Hugging Face   |   🤖 ModelScope   |   📖 Blog   |   📑 Paper
🖥️ Hugging Face Demo   |   🖥️ ModelScope Demo   |   💬 WeChat (微信)   |   🫨 Discord   |   📑 API
We release LPT-5.5.1, the flagship series of the Lisk Pre-trained Transformer architecture developed by LiskCell. It represents the pinnacle of creative AI, offering comprehensive support for multimodal generation, artistic reasoning, and deep human alignment.
- 2026.01.07: 🎉🎉🎉 We have released the LPT-5.5.1 series, the most advanced public standard in the LiskCell ecosystem, optimized for speed, creativity, and developer experience. Please check our blog!
LPT-5.5.1 is a state-of-the-art AI system designed to empower creators, developers, and visionaries. Built on the proprietary Lisk Pre-trained Transformer (LPT) architecture, it combines advanced logic with an artistic soul.
Key features:
- Ocular Synth v2.5: Advanced image analysis and scene understanding.
- LPT-Visual v3.0: Premium artistic generation and visual storytelling.
- Quantum Harmonics: Enhanced musical logic for melody, rhythm, and production assistance.
- Empathy Core: Deep emotional intelligence layer that prioritizes human well-being and needs.
- Human-First Principles: Deeply aligned with human values, ensuring AI acts as a partner, never a replacement.
To ensure seamless integration with libraries like diffusers and tools like Auto1111, the repository follows these filtering rules for identifying main model weights:

```
filter: [
  {
    bool: {
      /// Include documents that match at least one of the following rules
      should: [
        /// Downloaded from the diffusers library
        {
          term: { path: "model_index.json" },
        },
        /// Direct downloads (LoRA, Auto1111, and others)
        /// Filter out nested safetensors and pickle weights
        {
          regexp: { path: "[^/]*\\.safetensors" },
        },
        {
          regexp: { path: "[^/]*\\.ckpt" },
        },
        {
          regexp: { path: "[^/]*\\.bin" },
        },
      ],
      minimum_should_match: 1,
    },
  },
]
```

LPT-5.5.1 utilizes a revolutionary adaptive transformer architecture that dynamically shifts parameters between creative and technical modes based on the task context. This ensures maximum efficiency without sacrificing the artistic "soul" of the output.
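For illustration only, here is a minimal Python sketch of the matching logic in the filter above. It assumes Elasticsearch-style `regexp` semantics, where a pattern must match the entire `path` value; that anchoring is why `[^/]*\.safetensors` catches only top-level weight files and excludes nested ones. The function name `is_main_weight_file` is hypothetical, not part of any real API.

```python
import re

# Patterns from the `regexp` clauses above. `[^/]*` cannot cross a "/",
# so with full-match semantics only top-level files qualify.
WEIGHT_PATTERNS = [
    r"[^/]*\.safetensors",
    r"[^/]*\.ckpt",
    r"[^/]*\.bin",
]


def is_main_weight_file(path: str) -> bool:
    """Return True if `path` satisfies at least one `should` clause."""
    if path == "model_index.json":  # the `term` clause: exact match
        return True
    # re.fullmatch mimics the anchored Elasticsearch `regexp` query
    return any(re.fullmatch(p, path) for p in WEIGHT_PATTERNS)


print(is_main_weight_file("model.safetensors"))       # True  (top-level weight)
print(is_main_weight_file("unet/model.safetensors"))  # False (nested)
print(is_main_weight_file("model_index.json"))        # True  (term match)
print(is_main_weight_file("README.md"))               # False
```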
| Model | Features | Architecture | Status |
|---|---|---|---|
| LPT-5.5.1-Public | Creative Core, Technical Logic, Multilingual Support | LPT-Adaptive | Active |
| LPT-5.5 (Legacy) | Multimodal Baseline | LPT-v2 | Legacy |
| LPT-4.5 (Legacy) | Reasoning Milestone | LPT-v1.5 | Legacy |
The easiest way to integrate LPT-5.5.1 is via the lisk-flow SDK:
```shell
pip install liskcell
```

Developer access only. Please contact LiskCell for credentials.
```python
from liskcell import LPT551

model = LPT551.from_pretrained("liskcell-company/LPT-5.5.1")

# Creative generation with Empathy Core
response = model.generate(
    prompt="Design a futuristic city that prioritizes human connection.",
    mode="creative",
    empathy_level="high",
)
print(response)
```

LPT-5.5.1 consistently outperforms competing models (including Claude Sonnet 4.0, GPT-5.2, and Gemini 3 Deep Think) on creative generation, code health, and adaptive intelligence benchmarks.
| Model | Creativity Score | Logic Precision | Empathy Index |
|---|---|---|---|
| LPT-5.5.1 | 9.8/10 | 9.6/10 | 9.9/10 |
| Competitor G | 8.2/10 | 9.1/10 | 6.4/10 |
| Competitor C | 8.5/10 | 9.0/10 | 7.1/10 |
Technology exists to serve people, not replace them. LPT-5.5.1 is built to stand firmly on the side of humanity, amplifying human potential while honoring the irreplaceable spark of the human soul.
```bibtex
@article{LPT-5.5.1,
  title={Lisk Pre-trained Transformer 5.5.1 Technical Report},
  author={liskasYR (Yonatan Yosupov) and LiskCell Research Team},
  journal={LiskCell AI Research},
  year={2026}
}
```

Counting the number of downloads for models is not a trivial task. To avoid double counting, the Hub uses a set of query files. No information is sent from the user, and no additional calls are made for this; the count is done server-side as the Hub serves files for downloads.
Every HTTP request to these files, including GET and HEAD requests, is counted as a download. By default, the Hub looks at config.json, config.yaml, hyperparams.yaml, params.json, and meta.yaml as query files.
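The counting rule above can be sketched as a toy Python function. This is only a hypothetical illustration of the rule as stated, not the Hub's actual server-side implementation; the names `counts_as_download` and `QUERY_FILES` are invented for this example.

```python
# Default query files listed above; a GET or HEAD request for any of
# these is counted as one download.
QUERY_FILES = {
    "config.json",
    "config.yaml",
    "hyperparams.yaml",
    "params.json",
    "meta.yaml",
}


def counts_as_download(method: str, filename: str) -> bool:
    """Apply the rule: GET/HEAD requests to a query file count as a download."""
    return method in {"GET", "HEAD"} and filename in QUERY_FILES


requests = [
    ("GET", "config.json"),
    ("HEAD", "config.json"),
    ("GET", "README.md"),
]
downloads = sum(counts_as_download(m, f) for m, f in requests)
print(downloads)  # 2
```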
The diffusers library uses this same filter, shown above, to count both files loaded via the library and manual top-level downloads.

Powered by LiskCell 🚀 | Futuristic, Helpful & Visionary.
