# Model description

This is a logistic regression model trained on embeddings from the `facebook/bart-base` language model over the IMDB movie-review dataset. The notebook used to generate this model is available in this repository and at this Kaggle link.
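
For readers who want to reproduce the setup, here is a minimal sketch of how such a pipeline could be assembled. It assumes the `whatlies` library as the source of the scikit-learn-compatible `HFTransformersLanguage` featurizer that appears in the hyperparameters below; data loading is left out.

```python
# Minimal sketch, not the exact training notebook: assembles the same
# two-step pipeline shown in the hyperparameters section.
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from whatlies.language import HFTransformersLanguage  # assumed source of the featurizer

pipe = Pipeline(
    steps=[
        ("embedding", HFTransformersLanguage(model_name_or_path="facebook/bart-base")),
        ("model", LogisticRegression()),
    ]
)

# texts: list of IMDB review strings; labels: 0/1 sentiment targets.
# pipe.fit(texts, labels)
```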

## Intended uses & limitations

This model was trained for educational purposes.

## Training Procedure

### Hyperparameters

The model was trained with the hyperparameters below.

| Hyperparameter | Value |
|----------------|-------|
| memory | None |
| steps | [('embedding', HFTransformersLanguage(model_name_or_path='facebook/bart-base')), ('model', LogisticRegression())] |
| verbose | False |
| embedding | HFTransformersLanguage(model_name_or_path='facebook/bart-base') |
| model | LogisticRegression() |
| embedding__model_name_or_path | facebook/bart-base |
| model__C | 1.0 |
| model__class_weight | None |
| model__dual | False |
| model__fit_intercept | True |
| model__intercept_scaling | 1 |
| model__l1_ratio | None |
| model__max_iter | 100 |
| model__multi_class | auto |
| model__n_jobs | None |
| model__penalty | l2 |
| model__random_state | None |
| model__solver | lbfgs |
| model__tol | 0.0001 |
| model__verbose | 0 |
| model__warm_start | False |
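
These values are scikit-learn's `get_params()` output: entries like `model__C` address a parameter of a named step through the `<step>__<parameter>` convention. Reusing the hypothetical `pipe` from the sketch above:

```python
# Nested parameters can be read and changed on the assembled pipeline
# without rebuilding it; names follow the <step>__<parameter> convention.
print(pipe.get_params()["model__C"])  # 1.0, as listed in the table
pipe.set_params(model__C=0.5, model__max_iter=200)
```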

### Model Plot

The model plot is below.

```
Pipeline(steps=[('embedding',
                 HFTransformersLanguage(model_name_or_path='facebook/bart-base')),
                ('model', LogisticRegression())])
```
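
The interactive diagram this text stands in for is scikit-learn's HTML repr. A sketch of how to regenerate it, assuming a recent scikit-learn and the fitted pipeline in a notebook:

```python
# Enables scikit-learn's HTML diagram repr; evaluating the pipeline
# object in a notebook cell then renders the interactive plot.
from sklearn import set_config

set_config(display="diagram")
pipe
```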

## Evaluation Results

You can find the details of the evaluation process and the results below.

| Metric   | Value    |
|----------|----------|
| f1_score | 0.867535 |
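
As a sketch, a score like this would typically be computed on a held-out split; the variables `X_test` and `y_test` below are assumptions, not defined in the card:

```python
# Sketch of the metric computation; X_test / y_test are an assumed
# held-out split of the IMDB data.
from sklearn.metrics import f1_score

y_pred = pipe.predict(X_test)
print(f1_score(y_test, y_pred))  # the card reports 0.867535
```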

# How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
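
The card leaves this section as a placeholder. As an illustrative, unverified sketch: pipelines like this are commonly serialized with `joblib` and fetched from the Hub. The repo id and filename below are hypothetical.

```python
# Hypothetical loading sketch: repo_id and filename are placeholders,
# not confirmed by this model card.
import joblib
from huggingface_hub import hf_hub_download

path = hf_hub_download(repo_id="<user>/<repo>", filename="model.joblib")
pipe = joblib.load(path)
print(pipe.predict(["This movie was a delight from start to finish."]))
```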


# Additional Content

## Confusion matrix

![Confusion matrix](confusion_matrix.png)
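
A sketch of how an image like `confusion_matrix.png` could be produced, again assuming the hypothetical `pipe` and an IMDB test split:

```python
# Sketch: draws and saves a confusion matrix for the fitted pipeline.
import matplotlib.pyplot as plt
from sklearn.metrics import ConfusionMatrixDisplay

ConfusionMatrixDisplay.from_predictions(y_test, pipe.predict(X_test))
plt.savefig("confusion_matrix.png")
```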