- [CVPR 2024] VkD: Improving Knowledge Distillation using Orthogonal Projections (a minimal projector sketch follows this list)
- [NeurIPS 2024] VeLoRA: Memory Efficient Training using Rank-1 Sub-Token Projections (see the activation-compression sketch below)
- [BMVC 2022] Information Theoretic Representation Distillation
- A generic framework for injecting sidecars and related configuration in Kubernetes using Mutating Webhook Admission Controllers (see the webhook sketch below)
- [AAAI 2024] Understanding the Role of the Projector in Knowledge Distillation
- Explains complex systems using visuals and simple terms, and helps you prepare for system design interviews.
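The VkD and projector entries above both revolve around the linear projector that maps student features into the teacher's feature space. Below is a minimal sketch of that pattern, assuming arbitrary feature widths and a plain MSE objective rather than the papers' exact losses, and using PyTorch's orthogonal parametrization to keep the projector (semi-)orthogonal:

```python
# Minimal sketch: feature distillation through an orthogonally parametrized
# linear projector. The feature widths and the MSE objective are assumptions,
# not the papers' exact configuration.
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import orthogonal

student_dim, teacher_dim = 384, 768  # assumed feature widths
projector = orthogonal(nn.Linear(student_dim, teacher_dim, bias=False))

def distillation_loss(student_feats: torch.Tensor, teacher_feats: torch.Tensor) -> torch.Tensor:
    """MSE between projected student features and detached teacher features."""
    projected = projector(student_feats)  # (batch, teacher_dim)
    return nn.functional.mse_loss(projected, teacher_feats.detach())

# Toy usage: random tensors standing in for backbone features.
loss = distillation_loss(torch.randn(8, student_dim), torch.randn(8, teacher_dim))
loss.backward()
```

Because the projector's weight has orthonormal columns, it preserves inner products and pairwise distances between student features while matching the teacher's dimensionality.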
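For the VeLoRA entry, the following is a heavily simplified sketch of rank-1 activation compression for memory-efficient training: the forward pass stores only each token's coefficient along a fixed unit direction and the backward pass rebuilds a rank-1 approximation of the input to form the weight gradient. The sub-token grouping and the paper's choice of projection direction are omitted, and all shapes here are illustrative assumptions, not the paper's algorithm.

```python
# Simplified sketch of rank-1 activation compression for a linear layer.
# Shapes and the choice of the direction vector v are assumptions.
import torch

class Rank1CompressedLinear(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, weight, v):
        # x: (batch, tokens, d_in), weight: (d_out, d_in), v: (d_in,) unit vector
        coeffs = x @ v                      # (batch, tokens): all that is saved of x
        ctx.save_for_backward(coeffs, weight, v)
        return x @ weight.t()               # (batch, tokens, d_out)

    @staticmethod
    def backward(ctx, grad_out):
        coeffs, weight, v = ctx.saved_tensors
        x_hat = coeffs.unsqueeze(-1) * v    # rank-1 reconstruction of the input
        grad_x = grad_out @ weight
        grad_w = torch.einsum("bto,btd->od", grad_out, x_hat)
        return grad_x, grad_w, None         # no gradient for the fixed direction v

# Toy usage.
x = torch.randn(2, 16, 64, requires_grad=True)
w = torch.randn(128, 64, requires_grad=True)
v = torch.nn.functional.normalize(torch.randn(64), dim=0)
y = Rank1CompressedLinear.apply(x, w, v)
y.sum().backward()
```

The memory saving comes from caching a (batch, tokens) coefficient tensor instead of the full (batch, tokens, d_in) activation between the forward and backward passes.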
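The Kubernetes entry describes the standard mutating-webhook pattern: the API server POSTs an AdmissionReview to the webhook for each matching Pod, and the webhook replies with a base64-encoded JSONPatch that splices the sidecar into the Pod spec. A minimal sketch of such a handler follows; the endpoint path, sidecar image, and use of Flask are assumptions, not the linked project's code.

```python
# Minimal sketch of a mutating admission webhook that injects a sidecar.
# The sidecar name/image and the /mutate path are placeholders.
import base64
import json

from flask import Flask, request, jsonify

app = Flask(__name__)

SIDECAR = {"name": "logging-sidecar", "image": "example.com/log-agent:latest"}  # hypothetical

@app.route("/mutate", methods=["POST"])
def mutate():
    review = request.get_json()
    uid = review["request"]["uid"]

    # JSONPatch op that appends the sidecar to .spec.containers of the admitted Pod.
    patch = [{"op": "add", "path": "/spec/containers/-", "value": SIDECAR}]
    patch_b64 = base64.b64encode(json.dumps(patch).encode()).decode()

    return jsonify({
        "apiVersion": "admission.k8s.io/v1",
        "kind": "AdmissionReview",
        "response": {
            "uid": uid,
            "allowed": True,
            "patchType": "JSONPatch",
            "patch": patch_b64,
        },
    })

if __name__ == "__main__":
    # Real webhooks must serve HTTPS with a certificate the API server trusts;
    # plain HTTP here keeps the sketch short.
    app.run(port=8443)
```

In a real cluster the endpoint is registered through a MutatingWebhookConfiguration object, which tells the API server which resources and operations to send to the webhook.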