Motivation
- LoRA approximates the weight update matrix using the product of 2 lower-rank matrices
- the choice of the rank r affects how good the approximation can be, and r is fixed throughout training
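A minimal sketch of the LoRA factorization the notes refer to (dimensions and the zero-initialization of one factor are illustrative assumptions, matching common LoRA practice):

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 64, 32, 4          # r is the LoRA rank (example values)
W = rng.normal(size=(d_out, d_in))  # frozen pretrained weight

# LoRA: approximate the update dW with two low-rank factors B @ A
B = np.zeros((d_out, r))            # zero-init so dW starts at exactly 0
A = rng.normal(size=(r, d_in))

dW = B @ A                          # rank of dW is at most r by construction
W_adapted = W + dW                  # equals W before any training step
```

Only B and A are trained; W stays frozen, so the number of trainable parameters is r * (d_out + d_in) instead of d_out * d_in.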
Idea
Dynamically change r during training
- Separately for each layer
- Start with a reasonable r, then adjust it during training
- Adding/removing columns must not change the approximation significantly. How? (open question)
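One possible answer (an assumption, not stated in the notes): when growing the rank, append a zero column to one factor, so the product, and hence the approximation, is unchanged at the moment of the change:

```python
import numpy as np

rng = np.random.default_rng(1)
d_out, d_in, r = 8, 6, 2

B = rng.normal(size=(d_out, r))
A = rng.normal(size=(r, d_in))
before = B @ A

# Grow the rank from r to r + 1: append a zero column to B and a
# freshly initialized row to A. Because the new column of B is zero,
# the new (column, row) pair contributes nothing to the product yet,
# so B_new @ A_new == B @ A exactly.
B_new = np.concatenate([B, np.zeros((d_out, 1))], axis=1)
A_new = np.concatenate([A, rng.normal(size=(1, d_in))], axis=0)
after = B_new @ A_new
```

The new direction only starts to matter once gradient updates move the zero column away from zero, so training continues smoothly across the rank change.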