MachineLearning

Dynamic batching for Encoder-Decoder MT training or generation when long sequences cap the batch size [P]

I built a small PyTorch sampler called dynabatch after facing this specific batching issue while fine-tuning an NLLB-200 600M model. Training on an RTX 5090, the largest fixed batch size I could use was 8; anything bigger led to OOM. While training and monit…
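For readers unfamiliar with the idea, the core of token-budget (dynamic) batching is to group samples so that the padded batch size, i.e. longest sequence times number of sequences, stays under a fixed token budget instead of fixing the number of sequences. Below is a minimal, hypothetical sketch of such a batch sampler in plain Python; it is an illustration of the general technique, not the actual dynabatch implementation, and the class and parameter names are my own. A plain iterable of index lists like this can be passed to a PyTorch `DataLoader` via its `batch_sampler` argument.

```python
import random


class TokenBudgetBatchSampler:
    """Yield batches of dataset indices whose padded token count
    (longest sequence in the batch * batch size) stays <= max_tokens.

    Hypothetical illustration of token-budget batching; not the
    actual dynabatch code.
    """

    def __init__(self, lengths, max_tokens, shuffle=True, seed=0):
        self.lengths = lengths        # token length of each sample
        self.max_tokens = max_tokens  # budget per padded batch
        self.shuffle = shuffle
        self.seed = seed

    def __iter__(self):
        order = list(range(len(self.lengths)))
        if self.shuffle:
            random.Random(self.seed).shuffle(order)
        # Sort by length so each batch holds similar-length sequences
        # and wastes little compute on padding; the shuffle above only
        # randomizes the order among equal-length samples.
        order.sort(key=lambda i: self.lengths[i])

        batch, max_len = [], 0
        for i in order:
            new_max = max(max_len, self.lengths[i])
            # Cost of the padded batch if we add this sample:
            if batch and new_max * (len(batch) + 1) > self.max_tokens:
                yield batch
                batch, new_max = [], self.lengths[i]
            batch.append(i)
            max_len = new_max
        if batch:
            yield batch
```

With this scheme, many short sentences share one batch while a single very long sentence gets a batch to itself, so peak memory is governed by the token budget rather than by the longest sequence in the dataset.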