Exam Professional-Machine-Learning-Engineer Topic 5 Question 207 Discussion Thread

Google Professional-Machine-Learning-Engineer Real Exam Questions
Question #: 207
Topic #: 5
You work at a leading healthcare firm developing state-of-the-art algorithms for various use cases. You have unstructured textual data with custom labels. You need to extract and classify various medical phrases with these labels. What should you do?

Suggested Answer: B

Medical entity extraction is a task that involves identifying and classifying medical terms or concepts from unstructured textual data, such as electronic health records, clinical notes, or research papers. Medical entity extraction can help with various use cases, such as information retrieval, knowledge discovery, decision support, and data analysis [1].
One possible approach is to fine-tune a BERT-based model for medical entity extraction. BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained language model that captures contextual information from both the left and right sides of a given token [2]. BERT can be fine-tuned for a specific downstream task, such as medical entity extraction, by adding a task-specific layer on top of the pre-trained model and updating the model parameters with a small amount of labeled data [3].
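For a concrete picture of what "adding a task-specific layer and fine-tuning" looks like in practice, here is a minimal sketch using the Hugging Face transformers library to fine-tune a BERT checkpoint for token-level entity classification. The checkpoint name, label set, dataset columns (tokens, ner_tags), and hyperparameters are illustrative assumptions, not part of the question or the cited references.

```python
# Minimal sketch, not an official solution: fine-tune a BERT checkpoint for
# token-level medical entity classification with Hugging Face transformers.
# Label names, dataset columns, and hyperparameters are hypothetical.
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          DataCollatorForTokenClassification,
                          TrainingArguments, Trainer)

labels = ["O", "B-DRUG", "I-DRUG", "B-SYMPTOM", "I-SYMPTOM"]  # hypothetical custom labels
id2label = {i: l for i, l in enumerate(labels)}
label2id = {l: i for i, l in id2label.items()}

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels),
    id2label=id2label, label2id=label2id)  # adds a token-classification head

def tokenize_and_align(example):
    # Tokenize pre-split words and align word-level labels to sub-word tokens;
    # special tokens and continuation sub-words get the ignore index -100.
    enc = tokenizer(example["tokens"], is_split_into_words=True, truncation=True)
    aligned, prev = [], None
    for wid in enc.word_ids():
        aligned.append(-100 if wid is None or wid == prev else example["ner_tags"][wid])
        prev = wid
    enc["labels"] = aligned
    return enc

# train_dataset / eval_dataset are assumed to be datasets.Dataset objects with
# "tokens" (list of words) and "ner_tags" (list of label ids) columns:
# tokenized_train = train_dataset.map(tokenize_and_align)
# tokenized_eval = eval_dataset.map(tokenize_and_align)

args = TrainingArguments(output_dir="medical-ner", learning_rate=2e-5,
                         num_train_epochs=3, per_device_train_batch_size=16)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=tokenized_train, eval_dataset=tokenized_eval,
#                   data_collator=DataCollatorForTokenClassification(tokenizer))
# trainer.train()
```

A checkpoint pre-trained on biomedical or clinical text can be substituted for bert-base-cased with no other code changes, which is one way to exploit domain-specific pre-training.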
A BERT-based model can achieve high performance on medical entity extraction by leveraging large-scale pre-training on general-domain corpora and fine-tuning on domain-specific data. For example, Nesterov and Umerenkov [4] proposed a novel method of performing medical entity extraction from electronic health records as a single-step multi-label classification task by fine-tuning a transformer model pre-trained on a large EHR dataset. They showed that their model can achieve human-level quality for the most frequent entities.
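Reference [4] frames extraction as a single-step multi-label classification task. The sketch below shows, under my own assumptions about labels and thresholds, what such a multi-label setup can look like with a BERT sequence classifier; it is an illustration of the general idea, not the authors' implementation.

```python
# Rough sketch of multi-label classification over a text span: the model
# predicts which of the custom labels apply. Labels and threshold are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

labels = ["DRUG", "SYMPTOM", "PROCEDURE"]  # hypothetical custom labels
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels),
    problem_type="multi_label_classification")  # sigmoid + per-label BCE loss during training

text = "Patient reports persistent cough after starting lisinopril."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    probs = torch.sigmoid(model(**inputs).logits)[0]
predicted = [l for l, p in zip(labels, probs) if p > 0.5]  # 0.5 threshold is an assumption
print(predicted)  # arbitrary until the classification head is fine-tuned
```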
References:
* [1] Medical Named Entity Recognition from Un-labelled Medical Records based on Pre-trained Language Models and Domain Dictionary, Data Intelligence, MIT Press
* [2] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
* [3] Fine-tuning BERT for Medical Entity Extraction
* [4] Distantly supervised end-to-end medical entity extraction from electronic health records with human-level quality
