Understanding Machine Learning Distillation

DEFINITION

By Staff Reporters

***

What is distillation? In machine learning, distillation is a technique for transferring knowledge from a large, complex model (often called the teacher model) to a smaller, simpler model (the student model). This process helps the smaller model achieve similar performance to the larger one while being more efficient in terms of computation and memory usage.

Distillation steps: The main steps in knowledge distillation are: [1.] Use the teacher model to generate predictions for the dataset. And, [2.] train the student model on those predictions, along with the original dataset, so it learns to mimic the teacher model's behavior (a code sketch follows below).
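To make the two steps concrete, here is a minimal sketch of knowledge distillation in PyTorch. The model sizes, temperature, and loss weighting are illustrative assumptions, not values from this article, and the batch of random data stands in for a real dataset.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher (large) and student (small) classifiers.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend the soft-target loss (match the teacher's predictions)
    with the usual hard-label loss on the original dataset."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients keep a comparable magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(32, 784)              # dummy batch standing in for real data
labels = torch.randint(0, 10, (32,))  # dummy hard labels

# Step 1: the teacher generates predictions (soft targets) for the batch.
with torch.no_grad():
    teacher_logits = teacher(x)

# Step 2: train the student to mimic those predictions alongside the labels.
loss = distillation_loss(student(x), teacher_logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The temperature T softens both probability distributions so the student can learn from the teacher's relative confidence across all classes, not just its top prediction.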

Cite: ChatGPT via MSFT

COMMENTS APPRECIATED

Subscribe and Refer

***