Training is training, so what exactly is "pretraining"?

What is pretraining?

- Pretraining is training a model from scratch: the weights are randomly initialized, and training starts with no prior knowledge. It is usually done on very large amounts of raw data, so it requires a huge corpus, and training can take up to several weeks.
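One reason pretraining can use such huge corpora is that it is typically self-supervised: the "labels" come from the text itself (for example, predicting the next word), so no human annotation is needed. A minimal sketch of that idea, using a toy bigram language model rather than a real Transformer:

```python
from collections import Counter, defaultdict

# Toy "pretraining": a bigram language model learned from raw, unlabeled text.
# The training signal is the text itself (predict the next word), which is why
# pretraining needs no human labels -- only a large corpus.
corpus = (
    "the model reads the text and the model predicts the next word "
    "the model learns from the text"
).split()

# Count, for each word, which words follow it in the corpus.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent word observed after `word` in the corpus."""
    return counts[word].most_common(1)[0][0]

print(predict_next("the"))  # "model" -- the most common successor of "the" here
```

A real pretraining run does the same thing at vastly larger scale, with a neural network instead of a count table, but the key point is identical: the objective is derived from the raw text, not from annotated labels.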

 

What is fine-tuning?
- Fine-tuning, on the other hand, is the training done after a model has been pretrained. To perform fine-tuning, you first acquire a pretrained language model, then perform additional training with a dataset specific to your task. 

 

Fine-tuning vs. transfer learning

- The general pretrained model then goes through a process called transfer learning. During this process, the model is fine-tuned in a supervised way — that is, using human-annotated labels — on a given task.
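The whole pipeline above (learn general parameters on plentiful data, then adapt them to a small labeled task) can be sketched numerically. This is a hypothetical toy linear model, not the Hugging Face API: the pretrained weight `w` plays the role of the backbone, which is frozen, and only a new "head" (here just a bias term) is trained on a handful of labeled examples.

```python
# Toy sketch of transfer learning (hypothetical; not the Hugging Face API).
# "Pretraining": fit y = w*x + b on plentiful data from a general task (y = 2x).
# "Fine-tuning": freeze the pretrained w (the backbone) and train only a new
# bias term (the head) on a few labeled examples from a related task (y = 2x + 1).

def mse(xs, ys, w, b):
    return sum((w * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def pretrain(xs, ys, lr=0.05, steps=500):
    """Plain gradient descent on both parameters, starting from zero."""
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w, b = w - lr * dw, b - lr * db
    return w, b

def finetune_head(xs, ys, w, lr=0.05, steps=200):
    """Keep w frozen; train only the bias (the task-specific head)."""
    b = 0.0
    n = len(xs)
    for _ in range(steps):
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        b -= lr * db
    return b

# Plentiful pretraining data from the general task: y = 2x.
pre_xs = [i / 10 for i in range(-50, 50)]
pre_ys = [2 * x for x in pre_xs]
w_pre, _ = pretrain(pre_xs, pre_ys)

# Only a handful of human-labeled examples for the downstream task: y = 2x + 1.
ft_xs = [0.0, 1.0, 2.0]
ft_ys = [2 * x + 1 for x in ft_xs]

b_pre = finetune_head(ft_xs, ft_ys, w_pre)    # start from the pretrained backbone
b_scratch = finetune_head(ft_xs, ft_ys, 0.0)  # same training budget, no pretraining

print(mse(ft_xs, ft_ys, w_pre, b_pre))    # essentially 0
print(mse(ft_xs, ft_ys, 0.0, b_scratch))  # stuck far from 0
```

The pretrained backbone already encodes the shared structure (the slope), so a tiny labeled dataset is enough to fit the new head; starting from scratch with the same small budget cannot recover it. This is the same reason fine-tuning a pretrained language model on a small task-specific dataset works so well in practice.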

 

https://huggingface.co/learn/nlp-course/chapter1/4

 

How do Transformers work? - Hugging Face NLP Course


 
