Adept recently open-sourced its language model Persimmon-8B, which has fewer than 10 billion parameters. The model is released under the Apache 2.0 license and features a 16K context length, longer than that of both LLaMA2 and GPT-3. It also ships with efficient inference code that enables rapid generation on A100 GPUs. Adept hopes the community will build on this model to create more innovative applications.