
Part 4: Pretraining on Unlabeled Data

Main Code

Bonus Materials

  • 02_alternative_weight_loading contains code to load the GPT model weights from alternative sources in case the model weights become unavailable from OpenAI
  • 03_bonus_pretraining_on_gutenberg contains code to pretrain the LLM longer on the whole corpus of books from Project Gutenberg
  • 04_bonus_hparam_tuning contains an optional hyperparameter tuning script
  • 05_user_interface implements an interactive user interface for chatting with the pretrained LLM
  • 06_gpt_to_llama contains a step-by-step guide for converting a GPT architecture implementation to Llama 3.2 and loading pretrained weights from Meta AI
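To illustrate what a hyperparameter tuning script like the one in 04_bonus_hparam_tuning might do at its core, here is a minimal, self-contained grid-search sketch. The grid values and the `evaluate` callback are hypothetical stand-ins (the real script defines its own search space and runs an actual train-and-validate cycle); this only shows the exhaustive-search pattern itself.

```python
import itertools

# Hypothetical search space; the actual tuning script defines its own
# grid over the training hyperparameters it cares about.
HPARAM_GRID = {
    "batch_size": [4, 8],
    "learning_rate": [1e-4, 5e-4],
    "weight_decay": [0.0, 0.1],
}

def grid_search(evaluate):
    """Try every hyperparameter combination and return the config with
    the lowest loss reported by `evaluate` (a stand-in for a full
    train-and-validate run on the pretraining corpus)."""
    best_loss, best_cfg = float("inf"), None
    keys = list(HPARAM_GRID)
    for values in itertools.product(*(HPARAM_GRID[k] for k in keys)):
        cfg = dict(zip(keys, values))
        loss = evaluate(cfg)
        if loss < best_loss:
            best_loss, best_cfg = loss, cfg
    return best_cfg, best_loss
```

In practice, each `evaluate` call is expensive (a full or partial training run), which is why the real script keeps the grid small and logs intermediate results rather than only the final best configuration.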