Commit dae8392

Add FAQ on how to change the LR scheduler (#294)
1 parent c3d2bba commit dae8392

File tree

1 file changed: +11 -0

docs/faq.md

Lines changed: 11 additions & 0 deletions
@@ -132,6 +132,17 @@ To include new PTMs in Casanovo, you need to:
It is unfortunately not possible to finetune a pre-trained Casanovo model to add new types of PTMs.
Instead, such a model must be trained from scratch.

**How can I change the learning rate schedule used during training?**

By default, Casanovo uses a learning rate schedule during training that combines a linear warm-up with a subsequent cosine-shaped decay (as implemented in `CosineWarmupScheduler` in `casanovo/denovo/model.py`).
To use a different learning rate schedule, specify an alternative learning rate scheduler in the `lr_scheduler` variable in the `Spec2Pep.configure_optimizers` method in `casanovo/denovo/model.py`, for example:

```
lr_scheduler = torch.optim.lr_scheduler.LinearLR(optimizer, total_iters=self.warmup_iters)
```

You can use any of the scheduler classes available in [`torch.optim.lr_scheduler`](https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate) or implement your own custom learning rate schedule, similar to `CosineWarmupScheduler`.
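
For example, a custom schedule can be written by subclassing PyTorch's `torch.optim.lr_scheduler._LRScheduler` and overriding `get_lr`, analogous to `CosineWarmupScheduler`. The following is a minimal illustrative sketch; the class name and the warm-up-then-constant behavior are assumptions for demonstration, not part of Casanovo:

```
import torch


class LinearWarmupConstantScheduler(torch.optim.lr_scheduler._LRScheduler):
    """Illustrative sketch (not part of Casanovo): linearly ramp the
    learning rate from zero to its configured value over `warmup_iters`
    steps, then keep it constant."""

    def __init__(self, optimizer, warmup_iters):
        self.warmup_iters = warmup_iters
        super().__init__(optimizer)

    def get_lr(self):
        # `last_epoch` counts scheduler steps; scale the base learning
        # rates linearly until the warm-up phase is complete.
        scale = min(1.0, (self.last_epoch + 1) / self.warmup_iters)
        return [base_lr * scale for base_lr in self.base_lrs]
```

Such a scheduler could then be assigned to the `lr_scheduler` variable in `Spec2Pep.configure_optimizers` in the same way as shown above.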
## Miscellaneous

**How can I generate a precision–coverage curve?**
