How to use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="ciCic/decisionTransformer")

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("ciCic/decisionTransformer")
model = AutoModel.from_pretrained("ciCic/decisionTransformer")
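The Auto classes above load the checkpoint generically, but Transformers also ships a dedicated `DecisionTransformerModel` class with the sequence-modeling interface (states, actions, returns-to-go, timesteps). A minimal forward-pass sketch is below; it uses a randomly initialized model so it runs offline, and the 17-dim state / 6-dim action sizes are an assumption matching halfcheetah-expert-v2. Swap in `DecisionTransformerModel.from_pretrained("ciCic/decisionTransformer")` to use the real weights (not verified against this checkpoint).

```python
import torch
from transformers import DecisionTransformerConfig, DecisionTransformerModel

# halfcheetah-expert-v2: 17-dim observations, 6-dim actions (assumption)
config = DecisionTransformerConfig(state_dim=17, act_dim=6)
model = DecisionTransformerModel(config)  # random weights; use from_pretrained(...) for the trained checkpoint
model.eval()

batch, seq = 1, 20
states = torch.zeros(batch, seq, config.state_dim)
actions = torch.zeros(batch, seq, config.act_dim)
returns_to_go = torch.zeros(batch, seq, 1)
timesteps = torch.arange(seq).reshape(batch, seq)
attention_mask = torch.ones(batch, seq)

with torch.no_grad():
    out = model(
        states=states,
        actions=actions,
        returns_to_go=returns_to_go,
        timesteps=timesteps,
        attention_mask=attention_mask,
    )

# out.action_preds holds the predicted action at each position: (batch, seq, act_dim)
print(out.action_preds.shape)  # (1, 20, 6)
```

At rollout time the last position of `action_preds` gives the next action to execute, conditioned on the desired return-to-go.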
Running training

  • Num examples = 1000
  • Num Epochs = 120
  • Instantaneous batch size per device = 64
  • Total train batch size = 64
  • Gradient Accumulation steps = 1
  • Total optimization steps = 1920
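The reported step count is consistent with the configuration above: with 1000 examples, an effective batch size of 64, and no gradient accumulation, each epoch takes ceil(1000 / 64) = 16 optimization steps, and 16 steps × 120 epochs = 1920 total steps. A quick check:

```python
import math

num_examples = 1000
total_batch_size = 64  # per-device batch 64, gradient accumulation 1
num_epochs = 120

steps_per_epoch = math.ceil(num_examples / total_batch_size)  # last partial batch counts as a step
total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 1920
```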

Train Output

  • global_step = 1920
  • train_runtime = 1849.2158
  • train_samples_per_second = 64.892
  • train_steps_per_second = 1.038
  • train_loss = 0.04717305501302083
  • epoch = 120.0
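The throughput figures above follow directly from the runtime: 1000 examples × 120 epochs over 1849.2158 s gives the samples-per-second rate, and 1920 steps over the same runtime gives the steps-per-second rate. A sketch reproducing both:

```python
num_examples = 1000
num_epochs = 120
total_steps = 1920
runtime_s = 1849.2158

samples_per_second = num_examples * num_epochs / runtime_s
steps_per_second = total_steps / runtime_s
print(round(samples_per_second, 3), round(steps_per_second, 3))  # 64.892 1.038
```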

Dataset

  • edbeeching/decision_transformer_gym_replay
    • halfcheetah-expert-v2