Issue
I want to make sure my BertModel does not load pre-trained weights. I am using the Auto classes (Hugging Face), which load the model automatically.
My question is: how do I load a BERT model without pretrained weights?
Solution
Use AutoConfig together with AutoModel.from_config instead of AutoModel.from_pretrained:
from transformers import AutoConfig, AutoModel

# Downloads only the configuration file, not the weights
config = AutoConfig.from_pretrained('bert-base-uncased')
# Builds the architecture with randomly initialized weights
model = AutoModel.from_config(config)
This sets up the model architecture from the configuration only, with randomly initialized weights, so no pretrained weights are loaded.
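As a quick sanity check, here is a minimal sketch (assuming transformers and torch are installed and the 'bert-base-uncased' config can be fetched): two models built with from_config get independent random initializations, whereas models loaded with from_pretrained would share identical weights.

import torch
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained('bert-base-uncased')  # config only, no weights
model_a = AutoModel.from_config(config)
model_b = AutoModel.from_config(config)

# Randomly initialized models should (almost surely) differ in their weights.
same = torch.equal(
    model_a.embeddings.word_embeddings.weight,
    model_b.embeddings.word_embeddings.weight,
)
print('identical weights:', same)  # expected: False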
Answered By - Matthew Cox