Question from the Natural Language Processing - Fundamentals test

Tokenization is the process of separating text into words or groups of words, called tokens.
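For illustration, tokenization can be sketched in a few lines of Python. The snippet below is not part of the original question; it is a minimal word-and-punctuation tokenizer using only the standard library, and the function name tokenize is illustrative.

import re

def tokenize(text: str) -> list[str]:
    # \w+ captures each run of word characters as one word token;
    # [^\w\s] captures each punctuation mark as its own token.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Tokenization splits text into tokens."))
# ['Tokenization', 'splits', 'text', 'into', 'tokens', '.']

Note that production NLP pipelines often go further and split words into subword units, which is the point raised in the evaluations below.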

Easy

What is tokenization?

Author: Constantin · Status: Published · Question passed 380 times
2 Community Evaluations
Ambiguous
Anonymous author
25/07/2024
A token is not necessarily a word or a group of words.
Anonymous author
30/07/2024
OK