Comments (6)
Hi @kaansonmezoz,
thanks for your interest in our models!
- The complete training corpus was filtered and sentence-segmented, basically with:

from nltk.tokenize import sent_tokenize

# `line` is one raw line of the corpus; keep only sentences with more than 5 tokens
for sent in sent_tokenize(line, "turkish"):
    if len(sent.split()) > 5:
        print(sent)

So it is not only applied to the OSCAR subcorpus here (a fuller, self-contained sketch of this filter follows after this list).
- I used sentences longer than 5 tokens (split on whitespace), see above :)
- Not only full stops are considered for sentence segmentation; NLTK takes some more punctuation tokens into account as sentence boundaries.
- I just looked it up in my "data lake": the trwiki-latest-pages-articles.xml.bz2 dump is 480M in size and carries a 2 Feb 2020 timestamp.
- I could find the following OPUS-related files:
bible-uedin.txt GNOME.txt JW300.txt OpenSubtitles.txt opus.all QED.txt SETIMES.txt Tanzil.txt Tatoeba.txt TED2013.txt Wikipedia.txt
with a timestamp of 3 Feb 2020.
- For pre-processing (of the pre-training data) the official BERT implementation was used, so basically all pre-processing steps can be found here: https://github.com/google-research/bert/blob/master/tokenization.py#L161-L182. First a basic tokenization step is done, followed by the WordPiece tokenization; I did not add extra steps (see the second sketch below).
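For reference, here is a self-contained sketch of the filtering and segmentation step described in the first bullet. The input and output file names (corpus_raw.txt, corpus_filtered.txt) are placeholders for illustration, not the actual corpus files:

# Minimal sketch of the corpus filter described above (file names are placeholders).
import nltk
from nltk.tokenize import sent_tokenize

nltk.download("punkt")  # the pre-trained Punkt models include a Turkish model

with open("corpus_raw.txt", encoding="utf-8") as fin, \
        open("corpus_filtered.txt", "w", encoding="utf-8") as fout:
    for line in fin:
        # Punkt splits on more than full stops (e.g. question and exclamation marks)
        for sent in sent_tokenize(line.strip(), "turkish"):
            # keep only sentences with more than 5 whitespace-separated tokens
            if len(sent.split()) > 5:
                fout.write(sent + "\n")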
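And a rough sketch of the pre-processing from the last bullet, assuming google-research/bert's tokenization.py is importable on the Python path; vocab.txt is a placeholder for the actual BERTurk vocabulary file. FullTokenizer simply chains the basic tokenizer with the WordPiece tokenizer linked above:

# Sketch only: reproduce the BERT pre-processing (basic tokenization + WordPiece).
# `vocab.txt` is a placeholder, not the released BERTurk vocabulary.
from tokenization import FullTokenizer  # google-research/bert/tokenization.py

tokenizer = FullTokenizer(vocab_file="vocab.txt", do_lower_case=False)
print(tokenizer.tokenize("Merhaba dünya, bu bir denemedir."))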
Please just give me your email address and I can immediately send you the link to the corpus used for pre-training.
Hey @hazalturkmen, no problem, just give me an email address where I can contact you.
Mails are out
Hi @stefan-it,
Can I get the links to the corpus used for pre-training?
Thanks,
Thanks, @stefan-it,
Here is my email address:
@stefan-it Thank you for the detailed explanation. My email is [email protected]
You are a life saver!
Related Issues (20)
- Bibtext HOT 2
- Summarization and Classificaiton fine-tune HOT 2
- Question HOT 1
- Electra Model HOT 2
- fill-mask HOT 2
- Vocabs Genration HOT 8
- Any plan for Turkish T5? HOT 2
- Why `handle_chinese_chars=False`? HOT 2
- How to get all hidden layers' output of pre-trained BERTurk model in HuggingFace Transformers library? HOT 2
- ValueError: Tensor conversion requested dtype string for Tensor with dtype float32: <tf.Tensor 'args_0:0' shape=() dtype=float32> HOT 3
- ValueError: Must specify max_steps > 0, given: 0 HOT 6
- TensorFlow Checkpoints for Fine-Tuning HOT 1
- AssertionError: ('Pointer shape torch.Size([256]) and array shape (64,) mismatched', torch.Size([256]), (64,)) HOT 6
- Q: Using as a classifier
- DistilBERTurk training for question answering failed HOT 8
- PoS tagging HOT 3
- CHEATSHEET for ConvBERT? HOT 6
- bert uncased tf checkpoints HOT 8
- Training Data HOT 3