studentscolab committed on
Commit e331e4f · verified · 1 Parent(s): 0019cb1

Adding the Keras model

Files changed (1): README.md (+121 −5)

---
language: en
library_name: tensorflow
tags:
- keras
- tensorflow
- tabular
- iris
- multiclass-classification
pipeline_tag: tabular-classification
license: mit
---

# Iris MLP Classifier (Keras / TensorFlow)

This repository contains a simple **multiclass classifier** for the classic **Iris** dataset, implemented as a small **MLP (Multi-Layer Perceptron)** in **TensorFlow / Keras**.
The model predicts one of three classes based on four numerical features.

## Task

**Tabular multiclass classification.**
Given the 4 iris measurements, predict the class:

- `0` → setosa
- `1` → versicolor
- `2` → virginica

## Dataset

**Iris dataset** (from `sklearn.datasets.load_iris`)

### Input features (4)
The model expects **4 float features** in this exact order:

1. `sepal length (cm)`
2. `sepal width (cm)`
3. `petal length (cm)`
4. `petal width (cm)`

### Target (3 classes)
- integer labels `y ∈ {0, 1, 2}`
- no one-hot encoding required (training uses `sparse_categorical_crossentropy`)

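The feature names and their order can be verified directly against scikit-learn's bundled copy of the dataset (a quick sanity check, assuming scikit-learn is installed):

```python
from sklearn.datasets import load_iris

iris = load_iris()
print(iris.feature_names)
# ['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)', 'petal width (cm)']
print(list(iris.target_names))
# ['setosa', 'versicolor', 'virginica']
```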
---

## Model architecture

A small feed-forward network with built-in feature normalization:

- `Input(shape=(4,))`
- `Normalization` (feature-wise, adapted on the training data)
- `Dense(16, activation="relu")`
- `Dense(16, activation="relu")`
- `Dense(3, activation="softmax")`

### Why a `Normalization` layer?
The `tf.keras.layers.Normalization` layer learns feature-wise mean and variance from the training set via `adapt(...)`.
This makes inference easier and safer: **the same scaling used during training is embedded inside the saved model**.

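The effect of `adapt(...)` can be seen on toy data (a minimal sketch, not the card's training code):

```python
import numpy as np
import tensorflow as tf

norm = tf.keras.layers.Normalization(axis=-1)
data = np.array([[1.0, 10.0], [3.0, 30.0]], dtype=np.float32)
norm.adapt(data)  # learns per-feature mean and variance from `data`

out = norm(data).numpy()
print(out.mean(axis=0))  # each feature now has mean ~0
```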
---

## Training configuration

- **Optimizer:** Adam (`learning_rate=1e-3`)
- **Loss:** `sparse_categorical_crossentropy`
- **Metric:** accuracy
- **Train/test split:** 80/20 (`stratify=y`, `random_state=42`)
- **Validation split (from train):** 20% (`validation_split=0.2`)
- **Epochs:** 100
- **Batch size:** 16
- **Reproducibility:**
  - `tf.random.set_seed(42)`
  - `np.random.seed(42)`

> Note: Exact accuracy may vary slightly across environments due to numerical differences and nondeterminism in some TF ops.

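The seeding and stratified split described above can be sketched as follows (assumes scikit-learn; variable names mirror the training script below):

```python
import numpy as np
import tensorflow as tf
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Reproducibility seeds, as listed in the configuration
np.random.seed(42)
tf.random.set_seed(42)

# 80/20 split, stratified on the class labels
X, y = load_iris(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
print(X_train.shape, X_test.shape)  # (120, 4) (30, 4)
```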
---

## Example: training script (reference)

The model was trained using the following core logic (simplified):

```python
import tensorflow as tf

# Feature-wise normalization, adapted on the training features
normalizer = tf.keras.layers.Normalization(axis=-1)
normalizer.adapt(X_train.to_numpy())

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    normalizer,
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

model.fit(
    X_train.to_numpy(), y_train.to_numpy(),
    validation_split=0.2,
    epochs=100,
    batch_size=16,
    verbose=0,
)
```
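The `iris_mlp.keras` file loaded in the inference example below would be produced by saving the trained model in Keras's native format. A minimal, self-contained save/reload sketch (using a hypothetical tiny stand-in model with the same input/output shapes, not the trained MLP):

```python
import tensorflow as tf

# Hypothetical stand-in: same 4-feature input and 3-class output as the card's MLP
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# Native Keras format; the filename matches the inference example
model.save("iris_mlp.keras")
reloaded = tf.keras.models.load_model("iris_mlp.keras")
print(reloaded.count_params())  # 15
```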
## Example: inference

```python
import numpy as np
import pandas as pd
import tensorflow as tf

model = tf.keras.models.load_model("iris_mlp.keras")

x_new = pd.DataFrame([{
    "sepal length (cm)": 5.1,
    "sepal width (cm)": 3.5,
    "petal length (cm)": 1.4,
    "petal width (cm)": 0.2,
}])

proba = model.predict(x_new.to_numpy(), verbose=0)[0]  # shape: (3,)
pred = int(np.argmax(proba))

print("Probabilities:", proba)
print("Predicted class:", pred)
```
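To report a class name instead of an integer index, the mapping from the Task section can be applied to the predicted index (`proba` below is an illustrative probability vector, not real model output):

```python
import numpy as np

class_names = ["setosa", "versicolor", "virginica"]  # indices 0, 1, 2

proba = np.array([0.97, 0.02, 0.01])  # illustrative probabilities
pred = int(np.argmax(proba))
print("Predicted class:", class_names[pred])  # Predicted class: setosa
```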