Description
Hi @yaoyao-liu, thanks for your interesting work! I am interested in the BOP mnemonics training and would like to build further extensions on it. I found some problems when I checked the code of Mnemonics Training. It seems that trainer/mnemonics.py
is not the complete version, and it does not follow the training strategy described in the paper.
For example, I couldn't find the bi-level optimization of the mnemonics exemplars. There seems to be only one level of optimization of the mnemonics, based on NCE classification, but shouldn't there be another level before that, namely training a temporary model on the exemplars (Eq. 8) and unrolling all the training gradients? (See the rough sketch below for the structure I expected.) Also, I couldn't find the process of splitting the exemplars and adjusting the mnemonics of the old classes. Another issue is that some arguments are defined but never used, such as self.mnemonics_lrs.
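To make my question concrete, here is a minimal sketch of the bi-level step I expected to find, based only on my reading of the paper, not on the repo's actual API. The names (`bilevel_mnemonics_step`, `base_model`, `mnemonics`, `val_images`, the learning rates and step counts) are placeholders I made up, and it assumes PyTorch 2.x for `torch.func.functional_call`. Please correct me if this differs from your intended implementation.

```python
import torch
import torch.nn.functional as F

def bilevel_mnemonics_step(base_model, mnemonics, mnemonics_labels,
                           val_images, val_labels,
                           inner_lr=0.01, inner_steps=5, outer_lr=0.001):
    """One outer update of the mnemonics exemplars via an unrolled inner loop."""
    # Inner level: train a temporary copy of the model on the mnemonics exemplars,
    # keeping the computation graph so gradients can flow back to the exemplars.
    fast_weights = {name: p.clone() for name, p in base_model.named_parameters()}
    for _ in range(inner_steps):
        logits = torch.func.functional_call(base_model, fast_weights, (mnemonics,))
        inner_loss = F.cross_entropy(logits, mnemonics_labels)
        grads = torch.autograd.grad(inner_loss, list(fast_weights.values()),
                                    create_graph=True)
        fast_weights = {name: w - inner_lr * g
                        for (name, w), g in zip(fast_weights.items(), grads)}

    # Outer level: evaluate the temporary model on held-out data and backpropagate
    # through the unrolled inner steps to update the exemplars themselves.
    logits = torch.func.functional_call(base_model, fast_weights, (val_images,))
    outer_loss = F.cross_entropy(logits, val_labels)
    exemplar_grad, = torch.autograd.grad(outer_loss, mnemonics)
    with torch.no_grad():
        mnemonics -= outer_lr * exemplar_grad
    return outer_loss.item()
```

Here `mnemonics` would be a tensor of exemplar images with `requires_grad=True`. Is this unrolled two-level structure what the released code is meant to do, or is it handled somewhere else?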
Could you please help clarify these points? I am really interested in the implementation of solving the BOP and the mnemonics training. I apologize if my understanding is wrong. Thank you very much!