Indie researcher needs arXiv endorsement for TensionLM (new attention mechanism)

Hey everyone,

I'm an independent researcher who just submitted "TensionLM: Sigmoid Tension as Constraint Relaxation for Language Modelling" to arXiv (cs.AI + cs.LG).

- 117M model with full public code + weights

- Sigmoid tension attention (replaces softmax) + TS-native auxiliary losses

- Logic → Language → Math curriculum yields 96× lower first-contact math perplexity (PPL)
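
For anyone curious what "sigmoid attention replaces softmax" could mean in practice: the post doesn't spell out the mechanism, but a common way to frame the swap is to squash each query-key score independently through a sigmoid instead of normalizing the row with softmax. The sketch below is my own guess at that idea, not TensionLM's actual implementation; the `bias` term and all function names are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def softmax_attention(Q, K, V):
    # Standard scaled dot-product attention: each row of the weight
    # matrix is normalized to sum to 1, so keys compete for mass.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    return softmax(scores) @ V

def sigmoid_attention(Q, K, V, bias=0.0):
    # Hypothetical sigmoid variant: each score is gated independently,
    # so every weight lies in (0, 1) but rows need not sum to 1 --
    # a key can be attended to (or ignored) regardless of the others.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d) + bias
    return sigmoid(scores) @ V

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))  # 4 tokens, dim 8
out = sigmoid_attention(Q, K, V)
```

One practical difference this sketch highlights: because sigmoid weights don't normalize across keys, total attention mass can grow with sequence length, which is presumably where the paper's extra "tension" machinery or auxiliary losses come in.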

I'd really appreciate an endorsement (cs.AI or cs.LG) so the paper can go live.

Endorsement code: **RLU86M**

Paper + code: https://github.com/BoggersTheFish/bozo

Model: https://huggingface.co/BoggersTheFish/TensionLM-117M-Curriculum

Thanks so much!

submitted by /u/Stock_Palpitation442
