---
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: >-
"But PBMs operate with little to no transparency within the drug pricing
system, and they often take advantage of their opaque position at the
expense of patients. Their work includes establishing formularies,
contracting with pharmacies, and negotiating rebates and discounts with
drug manufacturers. But instead of passing these savings on to consumers,
PBMs retain these costs, and the patients do not benefit at the pharmacy
counter. But it's actually worse than that. Just as a rising tide lifts
all boats, PBMs' rebate manipulation inflates health care prices generally
and that ultimately increases the cost of patients' medications."
- text: >-
"That's why our state's local pharmacies are so essential. They provide
people access to the care they need when they need it. But now, many
pharmacies are under serious threat, and our most vulnerable patients along
with them. Over the past 14 years, the number of Oregon pharmacies has
decreased more than 26%. Accessing medications or treatments should be
simple, but unfortunately it's only becoming more difficult. Why is this
happening? One reason involves middlemen insurers called pharmacy benefit
managers (PBMs)."
- text: >-
"But more often, insurers and PBMs have implemented schemes called \"copay
accumulator adjustment programs\" that prevent the value of the copay
assistance from counting toward a patient's deductible. Faced with
unexpectedly high costs at the pharmacy counter, patients impacted by
these policies are less likely to adhere to treatment which can lead to
worsened health outcomes, increased hospitalizations, and greater costs to
the health care system. Copay accumulator policies disproportionately
impact communities of color."
- text: >-
"PBMs also compile lists of drugs, called formularies, that providers of
health benefits agree to cover; establish pharmacy networks that patients
can access; and run their own mail-order pharmacies. Although PBMs are
supposed to help lower costs, some of their practices may well do the
opposite. PBMs often keep a portion of the rebates they negotiate, which
can incentivize them to favor more expensive drugs on their formularies.
(A $1 million drug, for example, would fetch a bigger fee than a $100
one.)"
- text: >-
"This secrecy raises challenging questions. Do PBMs use their size and
negotiating power to win lower net prices from drugmakers? Or do PBMs use
their dominant market position and opaque business practices to enrich
themselves at the expense of their customers and the rest of society? The
answer to both these questions is, surprisingly, yes. If the contest for
formulary placement works as it should, competition compels drugmakers to
offer substantial discounts off the published list price. As a result,
insurers and consumers benefit from a reduced net price for drugs.
However, formulary competition can be undermined in various ways."
metrics:
- accuracy
pipeline_tag: text-classification
library_name: setfit
inference: true
base_model: sentence-transformers/all-mpnet-base-v2
---

# SetFit with sentence-transformers/all-mpnet-base-v2
This is a SetFit model that can be used for Text Classification. This SetFit model uses sentence-transformers/all-mpnet-base-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
- Fine-tuning a Sentence Transformer with contrastive learning.
- Training a classification head with features from the fine-tuned Sentence Transformer.
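The contrastive step works by pairing the labeled sentences: two sentences with the same label form a positive pair, two with different labels form a negative pair, and `CosineSimilarityLoss` pulls positives together and pushes negatives apart in embedding space. A minimal stdlib sketch of that pair generation (illustrative only; the real sampling is handled inside the `setfit` trainer, and the example sentences below are paraphrases, not the actual training data):

```python
from itertools import combinations

def generate_pairs(examples):
    """Build (text_a, text_b, similarity_target) pairs from labeled examples.

    Sentences sharing a label form positive pairs (target 1.0);
    sentences with different labels form negative pairs (target 0.0).
    """
    pairs = []
    for (text_a, label_a), (text_b, label_b) in combinations(examples, 2):
        pairs.append((text_a, text_b, 1.0 if label_a == label_b else 0.0))
    return pairs

# Toy stand-ins for the card's "Critical" / "Supportive" training sentences
examples = [
    ("PBMs retain rebates instead of passing savings on.", "Critical"),
    ("Copay accumulator programs raise patient costs.", "Critical"),
    ("Formulary competition can deliver lower net prices.", "Supportive"),
]
pairs = generate_pairs(examples)
# 3 examples -> C(3, 2) = 3 pairs: one positive, two negatives
```

The fine-tuned embedding model then only needs a lightweight head on top, which is why the approach works with so few labeled examples.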
## Model Details
### Model Description
- Model Type: SetFit
- Sentence Transformer body: sentence-transformers/all-mpnet-base-v2
- Classification head: a LogisticRegression instance
- Maximum Sequence Length: 384 tokens
- Number of Classes: 2 classes
### Model Sources
- Repository: [SetFit on GitHub](https://github.com/huggingface/setfit)
- Paper: [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- Blogpost: [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|---|---|
| Critical | |
| Supportive | |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("setfit_model_id")
# Run inference
preds = model("\"PBMs also compile lists of drugs, called formularies, that providers of health benefits agree to cover; establish pharmacy networks that patients can access; and run their own mail-order pharmacies. Although PBMs are supposed to help lower costs, some of their practices may well do the opposite. PBMs often keep a portion of the rebates they negotiate, which can incentivize them to favor more expensive drugs on their formularies. (A $1 million drug, for example, would fetch a bigger fee than a $100 one.\"")
```
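At the last step of inference, the LogisticRegression head maps the 768-dimensional mpnet sentence embedding to a class probability via a sigmoid over a weighted sum. A toy stdlib illustration of that final step (the 4-dimensional embedding and the weights here are made up for demonstration; the real head is fit by scikit-learn during training):

```python
import math

def logistic_head(embedding, weights, bias):
    """Score an embedding with a binary logistic-regression head.

    Computes sigmoid(w . x + b), the same form scikit-learn's
    LogisticRegression uses; here read as P(label == "Critical").
    """
    z = sum(w * x for w, x in zip(weights, embedding)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Tiny 4-dim example standing in for a real 768-dim sentence embedding
embedding = [0.2, -0.1, 0.4, 0.3]
weights = [1.5, -0.5, 2.0, 0.0]  # hypothetical learned weights
prob_critical = logistic_head(embedding, weights, bias=-0.2)
label = "Critical" if prob_critical >= 0.5 else "Supportive"
```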
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|---|---|---|---|
| Word count | 74 | 88.9474 | 100 |

| Label | Training Sample Count |
|---|---|
| Supportive | 8 |
| Critical | 11 |
### Training Hyperparameters
- batch_size: (8, 8)
- num_epochs: (2, 2)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
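The `sampling_strategy: oversampling` setting balances the contrastive pair stream: whichever pair type (positive or negative) is scarcer gets re-sampled up to the count of the other. A stdlib sketch of the pair arithmetic for this card's 19 training sentences, 8 Supportive and 11 Critical (illustrative only; the actual sampling happens inside `setfit`):

```python
from math import comb

def pair_counts(class_sizes):
    """Count unique positive (same-label) and negative (cross-label)
    sentence pairs for a few-shot training set."""
    positives = sum(comb(n, 2) for n in class_sizes)
    total = comb(sum(class_sizes), 2)
    negatives = total - positives
    return positives, negatives

# This card's training set: 8 Supportive + 11 Critical sentences
pos, neg = pair_counts([8, 11])
# Positives: C(8,2) + C(11,2) = 28 + 55 = 83; negatives: 8 * 11 = 88.
# Under oversampling, the scarcer side (83 positive pairs) is re-drawn
# until it matches the larger side, so training sees a balanced stream.
target = max(pos, neg)
```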
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|---|---|---|---|
| 0.0385 | 1 | 0.201 | - |
| 1.9231 | 50 | 0.1192 | - |
### Framework Versions
- Python: 3.10.6
- SetFit: 1.1.1
- Sentence Transformers: 3.4.1
- Transformers: 4.50.1
- PyTorch: 2.6.0
- Datasets: 3.4.1
- Tokenizers: 0.21.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
  doi = {10.48550/ARXIV.2209.11055},
  url = {https://arxiv.org/abs/2209.11055},
  author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
  title = {Efficient Few-Shot Learning Without Prompts},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution 4.0 International}
}
```