
Danish BERT

BERT (Bidirectional Encoder Representations from Transformers) is a deep neural network model used in Natural Language Processing. The network learns the grammar and semantics of human language by training on large bodies of text. Danish BERT focuses on making BERT better for the Nordic languages.

This repository provides downloadable weights for a Danish, a Norwegian and a Swedish BERT model trained from scratch. The models can be used in downstream tasks to improve the performance of Nordic Natural Language Processing systems.
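As a rough illustration of how such weights are typically used, the sketch below loads a BERT checkpoint with the Hugging Face transformers library and extracts contextual token embeddings for a Danish sentence. The local directory "./danish_bert" is a placeholder, not a path from the repository, and the example assumes the downloaded weights have been converted to the transformers format.

```python
# Minimal sketch: obtain contextual embeddings from a locally stored
# Danish BERT checkpoint (path is a placeholder, conversion assumed).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("./danish_bert")
model = BertModel.from_pretrained("./danish_bert")

sentence = "København er hovedstaden i Danmark."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.last_hidden_state contains one contextual vector per token;
# downstream task models (tagging, classification, etc.) build on these.
print(outputs.last_hidden_state.shape)
```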

Data and resources

Keywords

Additional information

URI https://data.gov.dk/dataset/lang/ee8441c7-8ee9-46bc-92f3-0a79f572b62a
Landing page https://github.com/botxo/danish_bert
Harvested by Datavejviser
Publication date
Last modified
Update frequency
Coverage period  / 
Subject(s)
  • 16.05.07 Language and orthography
  • Education, culture and sport
Access rights public
Conforms to
Provenance statement
Documentation