Reproducibility Guide

This document provides everything needed to reproduce the claims in "Fungible Computation Between Paradigms."


Environment

- Python 3.10+
- PyTorch 2.0+
- NumPy 1.24+

Repository Versions

All experiments were conducted with these pinned commits:

| Repository | Commit | Verification |
|---|---|---|
| trix | 37c1a26 | `git checkout 37c1a26` |
| flynnconceivable | d9238df | `git checkout d9238df` |
| hollywood-squares-os | 25efb51 | `git checkout 25efb51` |

Claim 1: Neural → Classical (FLYNNCONCEIVABLE)

Claim: A neural network achieves 100% accuracy on all 460,928 6502 CPU operation combinations.

Reproduce

```bash
git clone https://github.com/anjaustin/flynnconceivable.git
cd flynnconceivable
pip install -e .
python -m pytest tests/ -v
```

Expected Output

```
60 passed
```

Exhaustive Verification

```python
from flynnconceivable import CPU

cpu = CPU(weights_dir='weights')

# Test all ALU combinations (256 x 256 x 2 carry states = 131,072)
errors = 0
for a in range(256):
    for b in range(256):
        for c in [0, 1]:
            cpu.reset()
            # ... verify the ADC result matches ground truth;
            # increment `errors` on any mismatch

print(f"Errors: {errors} / 131,072")  # Expected: 0
```
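The elided ground-truth check can be made concrete. Below is a minimal reference model for binary-mode ADC; the function name `adc_ground_truth` is illustrative, not part of the flynnconceivable API, and decimal mode plus the overflow/negative flags are omitted for brevity:

```python
def adc_ground_truth(a, b, carry_in):
    """Reference 6502 ADC (binary mode): 8-bit result plus carry-out."""
    total = a + b + carry_in
    return total & 0xFF, int(total > 0xFF)

# Spot-checks against the definition of 8-bit addition with carry
assert adc_ground_truth(0x01, 0x01, 0) == (0x02, 0)  # no wrap
assert adc_ground_truth(0xFF, 0x01, 0) == (0x00, 1)  # wraps, sets carry
assert adc_ground_truth(0x80, 0x80, 1) == (0x01, 1)  # wrap plus carry-in
```

Comparing each neural CPU result against this reference over all 131,072 `(a, b, c)` triples is what the loop above counts as `errors`.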

Claim 2: Neural → Classical Tables (Spline-6502)

Claim: Neural computation compiles to 3,088 bytes of lookup tables with 100% accuracy.

Reproduce

```bash
git clone https://github.com/anjaustin/trix.git
cd trix
pip install -e .
python -m pytest tests/test_spline.py -v
```

Size Verification

| Organ | Neural (bytes) | Spline (bytes) | Compression |
|---|---|---|---|
| ALU | 1,735,203 | 512 | 3,389× |
| SHIFT | 417,905 | 1,536 | 272× |
| LOGIC | 425,073 | 256 | 1,660× |
| INCDEC | 413,311 | 512 | 807× |
| COMPARE | 696,461 | 256 | 2,721× |
| BRANCH | 14,537 | 16 | 909× |
| TOTAL | 3,702,490 | 3,088 | 1,199× |
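The totals and per-organ ratios can be checked with a few lines of arithmetic (sizes copied from the table above):

```python
# Per-organ sizes from the table: (neural bytes, spline bytes)
sizes = {
    "ALU":     (1_735_203,   512),
    "SHIFT":   (  417_905, 1_536),
    "LOGIC":   (  425_073,   256),
    "INCDEC":  (  413_311,   512),
    "COMPARE": (  696_461,   256),
    "BRANCH":  (   14_537,    16),
}

total_neural = sum(n for n, _ in sizes.values())
total_spline = sum(s for _, s in sizes.values())
assert (total_neural, total_spline) == (3_702_490, 3_088)

print(f"TOTAL: {total_neural:,} -> {total_spline:,} "
      f"({round(total_neural / total_spline)}x)")
for organ, (n, s) in sizes.items():
    print(f"{organ}: {round(n / s)}x")
```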

Claim 3: Routing = Lookup (TriX)

Claim: Content-addressable routing in TriX is mathematically equivalent to spline interval selection.

Reproduce

```bash
git clone https://github.com/anjaustin/trix.git
cd trix
pip install -e .
python -m pytest tests/test_hierarchical.py -v
```

Conceptual Verification

```python
from trix import HierarchicalTriXFFN
import torch

ffn = HierarchicalTriXFFN(d_model=512, num_tiles=16)

# Routing via signature matching
x = torch.randn(1, 512)
output, routing_info, _ = ffn(x)

# The routing decision is equivalent to:
# 1. Compute signature similarities (like spline interval check)
# 2. Select winning tile (like loading spline coefficients)
# 3. Apply tile transform (like spline evaluation)
```
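The three-step correspondence can also be sketched outside the library. The NumPy toy model below uses illustrative stand-ins (the signature and tile arrays are hypothetical shapes, not TriX internals) to show routing as argmax similarity, exactly the way a spline evaluator selects an interval:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, num_tiles = 512, 16

# Hypothetical stand-ins: one signature vector and one transform per tile
signatures = rng.standard_normal((num_tiles, d_model))
tile_weights = rng.standard_normal((num_tiles, d_model, d_model)) * 0.01

x = rng.standard_normal(d_model)

# 1. Compute signature similarities (like checking which interval contains x)
scores = signatures @ x               # shape: (num_tiles,)
# 2. Select the winning tile (like loading that interval's coefficients)
winner = int(np.argmax(scores))
# 3. Apply the tile's transform (like evaluating the spline on x)
y = tile_weights[winner] @ x          # shape: (d_model,)
```

In both settings the input determines a discrete lookup key (interval index or tile index), and computation proceeds with only the parameters stored under that key.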

The Closed Loop

Claim: A spline-based function executes correctly on the neural 6502.

Reproduce

```bash
cd flynnconceivable
python -c "
from flynnconceivable import CPU

cpu = CPU(weights_dir='weights')

# Lookup table for y = 2x
lookup = [min(255, 2*i) for i in range(128)]

# 6502 program using indexed addressing
program = [
    0xA5, 0x50,        # LDA \$50 (input)
    0xAA,              # TAX
    0xBD, 0x10, 0x02,  # LDA \$0210,X (table lookup)
    0x85, 0x51,        # STA \$51 (output)
    0x00,              # BRK
]

# Test: reload the program and table before each run, then feed each input
for x in [0, 10, 50, 100, 127]:
    cpu.reset()
    cpu.load(program + [0xEA]*(0x10-len(program)) + lookup)
    cpu.memory.write(0x50, x)
    cpu.run()
    result = cpu.memory.read(0x51)
    expected = min(255, 2*x)
    print(f'x={x}: expected={expected}, got={result}, ok={result==expected}')
"
```

Expected Output

```
x=0: expected=0, got=0, ok=True
x=10: expected=20, got=20, ok=True
x=50: expected=100, got=100, ok=True
x=100: expected=200, got=200, ok=True
x=127: expected=254, got=254, ok=True
```
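As a sanity check on that expected output, the program's data flow can be modeled in plain Python: the indexed load `LDA $0210,X` is just `lookup[x]`, regardless of whether a neural or classical CPU executes it.

```python
# Pure-Python model of the program above: table lookup via indexed addressing
lookup = [min(255, 2 * i) for i in range(128)]

for x in [0, 10, 50, 100, 127]:
    result = lookup[x]              # models LDA $0210,X with X = input byte
    expected = min(255, 2 * x)
    assert result == expected
print("all inputs match")
```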

Contact

For questions about reproducibility, please open an issue on the relevant repository.