Google Research BERT

Advancing natural language processing with Google Research's suite of Bidirectional Encoder Representations from Transformers (BERT) models, released in a range of sizes and languages.


About Google Research BERT

Google Research BERT is a project by Google that develops and releases Bidirectional Encoder Representations from Transformers (BERT) models for natural language processing tasks. The models are published in several sizes, most commonly the BERT-Base configuration of 12 layers, 768 hidden units, and 12 attention heads (about 110 million parameters); a larger BERT-Large configuration with 24 layers, 1024 hidden units, and 16 attention heads (about 340 million parameters) is also available. For instance, the Chinese model, which covers both Simplified and Traditional script, uses the 12-layer, 768-hidden, 12-head configuration with roughly 110 million parameters. The project additionally provides cased and uncased English models, multilingual models, and pointers to third-party ports of BERT for the PyTorch and Chainer frameworks. These variants differ in layer count, hidden size, attention heads, and parameter count to suit different use cases.
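The quoted parameter counts can be sanity-checked from the architecture numbers alone. The sketch below is an illustration, not code from the BERT repository; it assumes the standard transformer-encoder weight layout and, by default, the 30,522-token English WordPiece vocabulary. It tallies the weight and bias tensors of a BERT encoder and reproduces the roughly 110 million figure for BERT-Base.

```python
def bert_param_count(num_layers=12, hidden=768, intermediate=3072,
                     vocab_size=30522, max_position=512, type_vocab=2):
    """Approximate parameter count for a BERT encoder (weights + biases)."""
    # Embedding tables: token, position, and segment embeddings,
    # plus one LayerNorm (gain and bias vectors).
    embeddings = (vocab_size + max_position + type_vocab) * hidden + 2 * hidden
    # Self-attention: Q, K, V, and output projections, each hidden x hidden + bias.
    attention = 4 * (hidden * hidden + hidden)
    # Feed-forward block: hidden -> intermediate -> hidden, with biases.
    ffn = (hidden * intermediate + intermediate) + (intermediate * hidden + hidden)
    # Two LayerNorms per layer (after attention and after the feed-forward block).
    per_layer = attention + ffn + 2 * (2 * hidden)
    # Pooler: one dense hidden x hidden layer applied to the [CLS] token.
    pooler = hidden * hidden + hidden
    return embeddings + num_layers * per_layer + pooler

print(f"{bert_param_count():,}")  # -> 109,482,240, i.e. the ~110M of BERT-Base
```

Note that the attention-head count does not appear in the formula: the heads partition the same hidden-by-hidden projections, so it changes the shape of the computation but not the number of weights. Plugging in the BERT-Large hyperparameters (24 layers, 1024 hidden, 4096 intermediate) yields about 335 million parameters, consistent with the published ~340M figure.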


