---
library_name: transformers
tags:
- chess
- llm-course
- chess-challenge
license: mit
---

# my-chess-model

Chess model submitted to the LLM Course Chess Challenge.

## Submission Info

- **Submitted by**: [nathanael-fijalkow](https://huggingface.co/nathanael-fijalkow)
- **Parameters**: 909,824
- **Organization**: LLM-course

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LLM-course/my-chess-model", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("LLM-course/my-chess-model", trust_remote_code=True)
```

## Model Details

- **Architecture**: Chess Transformer (GPT-style)
- **Vocab size**: 1682
- **Embedding dim**: 128
- **Layers**: 4
- **Heads**: 4
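The card describes the architecture as a GPT-style transformer. As a rough, unofficial sketch of what those hyperparameters look like in practice, the snippet below instantiates a randomly initialised GPT-2 with the same vocab size, embedding dimension, layer count, and head count, then runs a greedy next-token query. The stand-in model, the token ids, and the one-token-per-move encoding are all assumptions for illustration, not part of the actual checkpoint.

```python
# Unofficial sketch: a randomly initialised GPT-2 with this card's
# hyperparameters (vocab 1682, embedding dim 128, 4 layers, 4 heads)
# stands in for the real checkpoint, so the "move" it emits is
# meaningless -- only the call pattern is illustrated.
import torch
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=1682,
    n_embd=128,
    n_layer=4,
    n_head=4,
    bos_token_id=0,  # hypothetical special-token ids
    eos_token_id=1,
)
stand_in = GPT2LMHeadModel(config)

# Assumed encoding: one token id per move (not confirmed by the card).
game_so_far = torch.tensor([[17, 342, 905]])  # hypothetical move ids
with torch.no_grad():
    out = stand_in.generate(game_so_far, max_new_tokens=1, do_sample=False)
next_move_id = out[0, -1].item()  # id of the greedily predicted next token
```

With the real checkpoint, the tokenizer loaded in the Usage section would supply the actual move encoding in place of the hypothetical ids above.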