Generation [English]

docker pull cargoshipsh/text-generation-en-lg

Automatically generates text by completing a given input text. This is the GPT-J model published by EleutherAI on Hugging Face, trained on the Pile, a large-scale curated dataset created by EleutherAI. The model itself is 24 GB in size and needs an additional 1.5 MB for the tokenizer.

License

The model is licensed under the Apache-2.0 License. The code for the API wrapper is licensed under the MIT License.

System Requirements

Minimum: 24 GB RAM, 1 vCPU
Recommended: 50 GB RAM, 4 vCPU, GPU (container support coming soon)

Limitations and Bias

GPT-J has been trained on the Pile, a dataset known to contain profane, lewd and otherwise offensive language. Depending on the application, GPT-J may produce text that is socially unacceptable. Although the model operates on a token-by-token basis, it appears to respond meaningfully to the input text.

Want to learn more about bias?

Get more details in our blog post on that topic.

API

If you don't want to implement the model all by yourself, no worries. Benefit from our easy-to-use API and get started right away!

Get Started

Usage

Input [POST]

{
  "text": "Hello, I'm a language model"
}

Output

{
  "text": "Hello, I'm a language modeler. The data coming from the model is the value of the model's function. For example, the values stored in a table are the first row, the first column from the model, if any, and the"
}

You need to set an API key via the API_KEY environment variable when running the image, and send the same key in the X-API-Key header of each request.
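To illustrate the key handling, here is a minimal Python client sketch using only the standard library. It assumes the container is running locally on port 80 (as in the docker run example in this document); the helper names build_request and generate are our own and not part of the image.

```python
# Minimal client sketch for the text-generation API.
# Assumption: container started with API_KEY=CHANGE_ME and published on port 80.
import json
import urllib.request

API_KEY = "CHANGE_ME"  # must match the API_KEY env var passed to the container

def build_request(text, url="http://localhost:80"):
    """Build a POST request with a JSON body and the X-API-Key header."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json", "X-API-Key": API_KEY},
        method="POST",
    )

def generate(text):
    """Send the request and return the completed text from the response."""
    with urllib.request.urlopen(build_request(text)) as resp:
        return json.loads(resp.read())["text"]

if __name__ == "__main__":
    print(generate("Hello, I'm a language model"))
```

Building the payload with json.dumps also sidesteps the shell-quoting issues that apostrophes in the input text cause with curl.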

Need a more detailed setup guide?

For more detailed instructions on getting started, please check out the quick start guide in the docs.

Example

Make sure you have Docker installed, then run the following command:

docker run -p 80:80 --env API_KEY=CHANGE_ME cargoshipsh/text-generation-en-lg

In a new terminal window, run the following command to call the API:

curl -X POST -H 'Content-Type: application/json' -H 'X-API-Key: CHANGE_ME' --data '{"text": "Hello, I'\''m a language model"}' http://localhost:80

You'll see the output of the model in the terminal.

{"text": "Hello, I'm a language modeler. The data coming from the model is the value of the model's function. For example, the values stored in a table are the first row, the first column from the model, if any, and the"}

Need help?

Join our Discord and ask away. We're happy to help where we can!

Join Discord