
Google Research Bert

Advancing natural language processing through Google Research Bert's suite of Bidirectional Encoder Representations from Transformers (BERT) models, released in a range of sizes and languages.

Open Source Infrastructure

Google Research Bert is a Google project that creates and distributes Bidirectional Encoder Representations from Transformers (BERT) models for natural language processing tasks. The models come in multiple sizes, differing in their number of layers, hidden units, attention heads, and parameters, and cover cased and uncased English, Chinese, and multilingual text. Third-party ports of BERT for the PyTorch and Chainer frameworks are also listed.

About Google Research Bert

Google Research Bert is a project by Google that develops and provides Bidirectional Encoder Representations from Transformers (BERT) models for natural language processing tasks. The models come in different sizes, defined by their number of layers, hidden units, attention heads, and parameters. For instance, the model for Simplified and Traditional Chinese has 12 layers, 768 hidden units, 12 attention heads, and 110 million parameters. Google Research Bert also offers models for cased English and multilingual data, along with third-party versions of BERT in the PyTorch and Chainer frameworks, so that different use cases can trade off model size against accuracy.
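The quoted figures (12 layers, 768 hidden units, 12 heads, roughly 110 million parameters) can be cross-checked directly from the architecture. The sketch below, a rough illustration rather than anything from the project itself, assumes the standard BERT defaults not stated in the text: a WordPiece vocabulary of 30,522 tokens (the English models), 512 position embeddings, 2 segment types, and a feed-forward width of 4x the hidden size.

```python
# Approximate BERT parameter count from its architecture numbers.
# Vocabulary size, max positions, and segment count are assumed
# standard BERT defaults; only layers/hidden/heads come from the text.

def bert_param_count(num_layers=12, hidden=768, vocab_size=30522,
                     max_positions=512, type_vocab=2):
    ffn = 4 * hidden  # BERT's feed-forward width is 4x the hidden size
    # Embeddings: token + position + segment tables, plus one LayerNorm
    embeddings = (vocab_size + max_positions + type_vocab) * hidden + 2 * hidden
    # Per encoder layer: Q/K/V + attention-output projections (weights and
    # biases), two feed-forward matrices, and two LayerNorms
    attention = 4 * (hidden * hidden + hidden)
    feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
    layer_norms = 2 * (2 * hidden)
    per_layer = attention + feed_forward + layer_norms
    # Pooler: one dense layer applied to the [CLS] vector
    pooler = hidden * hidden + hidden
    return embeddings + num_layers * per_layer + pooler

print(f"{bert_param_count() / 1e6:.1f}M")  # ~109.5M, rounded to "110M"
```

The attention-head count does not change the total: the heads partition the same 768-dimensional projections, which is why only layers and hidden size drive the parameter budget. The Chinese model's smaller vocabulary shrinks only the embedding table, leaving it in the same "base" size class.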