Meta AI OPT-66B
A 66-billion-parameter causal language model from Meta AI, built on the Transformer architecture and known for generating human-like text from prompts.
- Name
- Meta AI OPT-66B - https://huggingface.co/facebook/opt-66b
- Last Audited At
About Meta AI OPT-66B
OPT-66B is a large-scale language model developed by Meta AI and based on the Transformer architecture. It is a causal (decoder-only) language model that can be loaded through the Transformers library's AutoModelForCausalLM and AutoTokenizer classes for text generation, and it is known for producing human-like text from a given prompt. Text is preprocessed with byte-level Byte Pair Encoding (BPE) using a vocabulary of 50,272 tokens, with training sequences of 2,048 consecutive tokens. OPT-66B itself contains roughly 66 billion parameters; the largest model in the OPT family, OPT-175B, was trained on 992 80GB A100 GPUs over roughly 33 days of continuous training.

The model is accessible through the Transformers library and, at the time of this listing, had been downloaded over 15,000 times in the preceding month. The OPT weights are openly released, with resources including a model card, extensive documentation, and an active community of users for collaboration and learning.
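For reference, the snippet below is a minimal sketch of loading and prompting the model through the Transformers classes mentioned above. It assumes the facebook/opt-66b checkpoint on the Hugging Face Hub and enough GPU memory; half-precision weights with device_map="auto" (via the accelerate package) are one common way to fit the 66B parameters, and exact settings will vary with the available hardware.

```python
# Minimal sketch: load facebook/opt-66b with Transformers and generate text
# from a prompt. A 66B-parameter model needs substantial GPU memory;
# torch_dtype=torch.float16 and device_map="auto" (requires accelerate)
# spread the weights across the available devices.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-66b"

# The OPT tokenizer uses GPT-2 style byte-level BPE with a 50,272-token vocabulary.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Large language models are"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampled generation; max_new_tokens keeps the continuation short.
output_ids = model.generate(**inputs, do_sample=True, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The prompt string and generation settings here are illustrative only; any causal-LM decoding options supported by model.generate (greedy search, beam search, sampling temperature, and so on) can be used in the same way.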