Text Generation with GPT-2

Overview

This project demonstrates how to use the GPT-2 language model from the Hugging Face Transformers library to generate text from an input prompt. The code uses the pipeline function to create a text-generation pipeline backed by the GPT-2 model, generates one or more sequences from an input string, and prints the generated text to the console for inspection.

Getting Started

To get started with this project, you will need to have Python 3.x installed on your system, as well as the Transformers library. You can install Transformers using pip:

pip install transformers

Once you have the library installed, you can run the text_generation_gpt2.py script to generate text based on an input prompt. The script uses the GPT-2 model by default, but you can modify the script to use other models if desired.
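A minimal sketch of this pipeline-based approach is shown below. The prompt string and the generation parameters (max_length, num_return_sequences) are illustrative assumptions, not values taken from the repository script.

from transformers import pipeline

# Create a text-generation pipeline backed by the GPT-2 model.
# Another causal language model from the Hugging Face Hub
# (e.g. "gpt2-medium") could be substituted here.
generator = pipeline("text-generation", model="gpt2")

# Prompt and generation parameters are illustrative assumptions.
prompt = "The quick brown fox jumps over the lazy dog."
outputs = generator(prompt, max_length=50, num_return_sequences=2)

# Each result is a dict with a "generated_text" key.
for i, out in enumerate(outputs, start=1):
    print(f"--- Sample {i} ---")
    print(out["generated_text"])
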

Usage

To use the notebook, you will need to have Jupyter Notebook installed on your system. You can install Jupyter Notebook using pip:

pip install notebook

Once you have Jupyter Notebook installed, you can run the notebook using the following command:

jupyter notebook

This will open a new browser window with the Jupyter Notebook dashboard. From the dashboard, you can open the generate_text.ipynb notebook and run the code cells.

Functionality

The notebook contains the following functionality:

  • Load the GPT-2 model and tokenizer
  • Create a text generation pipeline
  • Generate text based on an input prompt
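
The three steps above correspond roughly to the following sketch, which loads the model and tokenizer explicitly before wiring them into a pipeline. The variable names and the prompt are assumptions for illustration, not code copied from the notebook.

from transformers import GPT2LMHeadModel, GPT2Tokenizer, pipeline

# Step 1: load the GPT-2 model and its tokenizer.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Step 2: create a text generation pipeline from the loaded components.
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Step 3: generate text based on an input prompt (prompt is illustrative).
result = generator("The quick brown fox jumps over the lazy dog.", max_length=40)
print(result[0]["generated_text"])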

Results

The following is an example of the output generated by the notebook:

Input prompt: The quick brown fox jumps over the lazy dog.
Generated text: The quick brown fox jumps over the lazy dog. The dog is a very good dog. The dog is a very good dog.

Conclusion

This project shows how to use the GPT-2 language model from the Transformers library to generate text from an input prompt. The code provides a simple example of using the pipeline function to create a text-generation pipeline and of customizing the generation parameters to control the output. With some modifications, the same approach could be adapted to other natural language processing tasks, such as text classification or sentiment analysis, using other pre-trained models from the Transformers library.
