Dow2Vec - How Text Embeddings Enhance NLP Models for Manufacturing | AIChE

Type: Conference Presentation

Conference Type: AIChE Spring Meeting and Global Congress on Process Safety

Presentation Date: August 20, 2020

Duration: 15 minutes

Skill Level: Intermediate

PDHs: 0.30

Over the past few years, several language embedding model architectures have been shared with the open-source community, achieving top accuracy across many natural language processing benchmarks. One of these techniques, Bidirectional Encoder Representations from Transformers (BERT), can provide either out-of-the-box text features for downstream NLP tasks or a fine-tuned model trained on a smaller set of labelled data. This talk gives the audience an overview of how to apply this model to a manufacturing-specific document classification task and compares its accuracy against traditional NLP feature-extraction methods.
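To make the comparison concrete, a minimal sketch of the kind of traditional feature-extraction baseline the talk contrasts with BERT: TF-IDF features feeding a linear classifier. The documents and labels below are invented stand-ins for manufacturing text, not the talk's actual data; in the BERT variant, the TF-IDF step would be replaced by embeddings from a pretrained transformer.

```python
# Hypothetical sketch: a traditional TF-IDF + linear-classifier baseline for
# document classification. Data and labels are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-in for manufacturing-style text records (labels are invented).
docs = [
    "pump seal leak detected during inspection",
    "reactor temperature exceeded setpoint alarm",
    "replaced worn bearing on conveyor motor",
    "pressure relief valve tested and certified",
]
labels = ["mechanical", "process", "mechanical", "process"]

# Traditional pipeline: sparse TF-IDF features into logistic regression.
# A BERT-based pipeline would instead feed dense sentence embeddings
# (or fine-tune the transformer end to end) into the same classifier.
baseline = make_pipeline(TfidfVectorizer(), LogisticRegression())
baseline.fit(docs, labels)

prediction = baseline.predict(["bearing failure on pump motor"])[0]
print(prediction)
```

The toy query shares vocabulary only with the "mechanical" examples, so the baseline classifies it accordingly; the limitation of this approach, which contextual embeddings like BERT address, is that it cannot match documents that describe the same issue with different words.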

Pricing (Individuals)

AIChE Member Credits: 0.5
AIChE Pro Members: $19.00
AIChE Graduate Student Members: Free
AIChE Undergraduate Student Members: Free
Computing and Systems Technology Division Members: Free
AIChE Explorer Members: $29.00
Non-Members: $29.00