Utilizing BERT Pretrained Models with Various Fine-Tune Methods for Subjectivity Detection

Hairong Huo, Mizuho Iwaihara

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

As an essential antecedent task of sentiment analysis, subjectivity detection refers to classifying sentences as subjective ones containing opinions, or objective, neutral ones without bias. In situations where impartial language is required, such as Wikipedia, subjectivity detection can play an important role. Recently, pretrained language models have proven effective at learning representations, substantially boosting performance across several NLP tasks. As a state-of-the-art pretrained model, BERT is trained on large unlabeled corpora with masked word prediction and next sentence prediction tasks. In this paper, we explore utilizing BERT pretrained models with several combinations of fine-tuning methods, with the aim of enhancing performance on the subjectivity detection task. Our experimental results reveal that optimal combinations of fine-tuning and multi-task learning surpass the state of the art on subjectivity detection and related tasks.

Original language: English
Title of host publication: Web and Big Data - 4th International Joint Conference, APWeb-WAIM 2020, Proceedings
Editors: Xin Wang, Rui Zhang, Young-Koo Lee, Le Sun, Yang-Sae Moon
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 270-284
Number of pages: 15
ISBN (Print): 9783030602895
DOIs
Publication status: Published - 2020
Event: 4th Asia-Pacific Web and Web-Age Information Management, Joint Conference on Web and Big Data, APWeb-WAIM 2020 - Tianjin, China
Duration: 2020 Sep 18 – 2020 Sep 20

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12318 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 4th Asia-Pacific Web and Web-Age Information Management, Joint Conference on Web and Big Data, APWeb-WAIM 2020
Country: China
City: Tianjin
Period: 20/9/18 – 20/9/20

Keywords

  • BERT
  • Fine-tuning
  • Multi-task learning
  • Pretrained language model
  • Subjectivity detection

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
