Reinforcement Distillation Definition

Reinforcement Distillation: A training technique that combines knowledge distillation with reinforcement learning. A student model learns from a teacher model by imitating the teacher's actions (for example, matching its output distribution over actions), while also receiving a reward signal that scores the student's own performance, allowing it to improve beyond pure imitation.
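
To make the definition concrete, here is a minimal sketch of one training step, assuming a PyTorch setup. The model shapes, the `reward_fn` stand-in, and the mixing weight `alpha` are all hypothetical illustrations, not details from the source; the sketch simply pairs a KL distillation term (student imitates teacher) with a REINFORCE-style term (student optimizes the reward).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

STATE_DIM, NUM_ACTIONS = 4, 3  # assumed toy sizes

teacher = nn.Linear(STATE_DIM, NUM_ACTIONS)  # stand-in for a frozen, pretrained teacher policy
student = nn.Linear(STATE_DIM, NUM_ACTIONS)  # student policy being trained
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def reward_fn(states, actions):
    # Hypothetical placeholder for an environment or task reward.
    return torch.rand(states.size(0))

states = torch.randn(32, STATE_DIM)  # a batch of observed states

# Distillation term: pull the student's action distribution toward the teacher's.
with torch.no_grad():
    teacher_logp = F.log_softmax(teacher(states), dim=-1)
student_logp = F.log_softmax(student(states), dim=-1)
distill_loss = F.kl_div(student_logp, teacher_logp,
                        log_target=True, reduction="batchmean")

# Reinforcement term: sample the student's own actions and weight their
# log-probabilities by the reward signal (simple policy gradient).
dist = torch.distributions.Categorical(logits=student(states))
actions = dist.sample()
rewards = reward_fn(states, actions)
rl_loss = -(dist.log_prob(actions) * rewards).mean()

# Combined objective: imitate the teacher while improving on the reward.
alpha = 0.5  # assumed mixing weight between the two terms
loss = alpha * distill_loss + (1 - alpha) * rl_loss

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In this sketch the mixing weight trades off fidelity to the teacher against reward maximization; in practice it would be tuned, or annealed so the student relies on the teacher early in training and on the reward later.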