What does the BERT algorithm do?
Anshu Dwivedi
02-Dec-2020

BERT, which stands for Bidirectional Encoder Representations from Transformers, is a neural-network-based technique for natural language processing (NLP) pre-training. In plain English, it helps Google better discern the context of words in search queries by looking at the words both before and after each word, rather than reading left to right only.
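As a toy illustration (this is plain Python, not BERT itself), the value of bidirectional context can be sketched like this: a word such as "bank" often can't be disambiguated from its left-hand context alone, but becomes clear once the words to its right are also considered. The cue-word lists and sentence below are made up for the example.

```python
# Toy sketch (NOT BERT): resolve the sense of a word using context on
# BOTH sides, the way a bidirectional model can, versus left-context only.
SENSES = {
    "bank": {
        "riverbank": {"river", "shore", "water"},
        "finance": {"money", "deposit", "loan"},
    },
}

def disambiguate(tokens, i, window=3, bidirectional=True):
    """Pick the sense of tokens[i] whose cue words best match the context."""
    left = tokens[max(0, i - window):i]
    right = tokens[i + 1:i + 1 + window] if bidirectional else []
    context = set(left) | set(right)
    best, best_score = None, 0
    for sense, cues in SENSES.get(tokens[i], {}).items():
        score = len(cues & context)
        if score > best_score:
            best, best_score = sense, score
    # With no matching cues at all, the word stays ambiguous.
    return best if best_score > 0 else "ambiguous"

tokens = "she sat by the bank of the river".split()
i = tokens.index("bank")
print(disambiguate(tokens, i, bidirectional=True))   # sees "river" on the right
print(disambiguate(tokens, i, bidirectional=False))  # left context alone decides nothing
```

With both sides visible the right-hand cue "river" settles the sense; restricted to the left context only ("sat by the"), the toy model has nothing to go on. Real BERT does something far richer with attention over the whole sentence, but the directionality point is the same.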