autointent.modules.scoring.BERTLoRAScorer#
- class autointent.modules.scoring.BERTLoRAScorer(classification_model_config=None, num_train_epochs=3, batch_size=8, learning_rate=5e-05, seed=0, report_to='none', print_progress=False, **lora_kwargs)#
Bases:
autointent.modules.scoring._bert.BertScorer
BERTLoRAScorer class for transformer-based classification with LoRA (Low-Rank Adaptation).
- Parameters:
classification_model_config (autointent.configs.HFModelConfig | str | dict[str, Any] | None) – Config of the base transformer model (HFModelConfig, str, or dict)
num_train_epochs (int) – Number of training epochs (default: 3)
batch_size (int) – Batch size for training (default: 8)
learning_rate (float) – Learning rate for training (default: 5e-5)
seed (int) – Random seed for reproducibility (default: 0)
report_to (autointent._callbacks.REPORTERS_NAMES | Literal['none']) – Reporting tool for training logs (default: 'none')
**lora_kwargs (Any) – Arguments for LoraConfig
print_progress (bool) – Whether to print training progress during fitting (default: False)
Example:#
from autointent.modules import BERTLoRAScorer

# Initialize scorer with LoRA configuration
scorer = BERTLoRAScorer(
    classification_model_config="bert-base-uncased",
    num_train_epochs=3,
    batch_size=8,
    learning_rate=5e-5,
    seed=42,
    r=8,            # LoRA rank
    lora_alpha=16,  # LoRA alpha
)

# Training data
utterances = ["This is great!", "I didn't like it", "Awesome product", "Poor quality"]
labels = [1, 0, 1, 0]  # Binary classification

# Fit the model
scorer.fit(utterances, labels)

# Make predictions
test_utterances = ["Good product", "Not worth it"]
probabilities = scorer.predict(test_utterances)
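The remaining keyword arguments are passed through as LoRA configuration (see **lora_kwargs above). As a minimal sketch, assuming they are forwarded unchanged to peft's LoraConfig, other standard LoRA hyperparameters such as lora_dropout and target_modules could be supplied the same way; those two names come from peft.LoraConfig, not from this page:

scorer = BERTLoRAScorer(
    classification_model_config="bert-base-uncased",
    num_train_epochs=3,
    r=16,                                # LoRA rank
    lora_alpha=32,                       # LoRA scaling factor
    lora_dropout=0.1,                    # assumed: dropout on LoRA layers (peft.LoraConfig field)
    target_modules=["query", "value"],   # assumed: attention projections to adapt (peft.LoraConfig field)
)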
- name = 'lora'#
Name of the module to reference in search space configuration.
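For illustration, a search-space entry selecting this module by its registered name might look like the sketch below; only the value 'lora' comes from this page, while the surrounding keys (node_type, search_space, module_name) and the hyperparameter grids are assumptions about the search-space schema:

search_space = [
    {
        "node_type": "scoring",
        "search_space": [
            {
                "module_name": "lora",          # matches BERTLoRAScorer.name
                "num_train_epochs": [3, 5],     # hypothetical hyperparameter grid
                "learning_rate": [1e-5, 5e-5],
            },
        ],
    },
]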
- classmethod from_context(context, classification_model_config=None, num_train_epochs=3, batch_size=8, learning_rate=5e-05, seed=0, **lora_kwargs)#
Initialize self from context.
- Parameters:
context (autointent.Context) – Context to init from
classification_model_config (autointent.configs.HFModelConfig | str | dict[str, Any] | None) – Config of the base transformer model (HFModelConfig, str, or dict)
num_train_epochs (int) – Number of training epochs (default: 3)
batch_size (int) – Batch size for training (default: 8)
learning_rate (float) – Learning rate for training (default: 5e-5)
seed (int) – Random seed for reproducibility (default: 0)
**lora_kwargs (Any) – Arguments for LoraConfig
- Returns:
Initialized module
- Return type: