Yashvj123 committed
Commit 5a370ea · verified · 1 Parent(s): 5665d45

Update app.py

Files changed (1):
  1. app.py +6 -6

app.py CHANGED
@@ -400,11 +400,11 @@ elif st.session_state.current_page == "EDA":
 # Model Building
 elif st.session_state.current_page == "Model Building":
     st.markdown("""
-        <h2 style='text-align: center; color: #333;'>🚀 Model Building</h2>
+        <h2 style='text-align: center; color: #333;'>Model Building</h2>
     """, unsafe_allow_html=True)
 
     st.markdown("""
-        <h4>📌 Introduction</h4>
+        <h2>Introduction</h2>
         <p>In this section, we explore different <b>Ensemble Learning</b> techniques to improve model performance.</p>
         <p>We implemented three ensemble models:
         <span style='font-size:16px;'>🥇 <b>Voting Regressor</b> - 🎯 <b>Bagging Regressor</b> - 🌲 <b>Random Forest Regressor</b></span></p>
@@ -443,7 +443,7 @@ elif st.session_state.current_page == "Model Building":
     st.markdown("<hr style='border:1px solid #ddd;'>", unsafe_allow_html=True)
 
     st.markdown("""
-        <h4>⚖️ Combining High & Low Variance Models</h4>
+        <h2>Combining High & Low Variance Models</h2>
         <p>A crucial step to improve ensemble performance is <b>choosing models with different variance levels</b>:</p>
         <ul>
         <li><b>Voting Regressor:</b> Uses a combination of <b>high-variance</b> (Decision Tree, KNN with small K) and <b>low-variance</b> (KNN with large K, Decision Tree with depth constraint) models.</li>
@@ -456,7 +456,7 @@ elif st.session_state.current_page == "Model Building":
 
     # Hyperparameter Tuning
     st.markdown("""
-        <h4 style='color: #FF9F45;'>⚡ Hyperparameter Tuning using Optuna</h4>
+        <h2>⚡ Hyperparameter Tuning using Optuna</h2>
         <p>We optimized hyperparameters for <b>KNN, Decision Tree, Bagging Regressor, and Random Forest</b> using <b>Optuna</b>.</p>
         <p>Below are the <b>optimized parameters</b> for each model:</p>
 
@@ -494,7 +494,7 @@ elif st.session_state.current_page == "Model Building":
 
     # Model Performance Insights
     st.markdown("""
-        <h4 style='color: #00A19D;'>📊 Model Performance Insights</h4>
+        <h2>📊 Model Performance Insights</h2>
         <p>Here's how our ensemble models performed on training and test datasets:</p>
     """, unsafe_allow_html=True)
 
@@ -549,7 +549,7 @@ elif st.session_state.current_page == "Model Building":
 
     # Choosing the Best Model
     st.markdown("""
-        <h4 style='color: #FF5D5D;'>🏆 Choosing the Best Model</h4>
+        <h2>Choosing the Best Model</h2>
         <ul>
         <li>🔍 We checked for <b>overfitting</b> (high training accuracy, low test accuracy).</li>
         <li>🚀 We avoided <b>underfitting</b> (low training and test accuracy).</li>
 
 
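The variance-mixing idea described in the bullet above (Decision Tree and small-K KNN as high-variance members, depth-limited tree and large-K KNN as low-variance members) could be sketched like this; the specific K values and depths are assumptions, not the app's actual choices.

```python
# Illustrative VotingRegressor mixing high- and low-variance base learners.
from sklearn.datasets import make_regression
from sklearn.ensemble import VotingRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=300, n_features=4, noise=0.2, random_state=0)

voter = VotingRegressor([
    # High-variance learners: flexible, track the training data closely
    ("deep_tree", DecisionTreeRegressor(max_depth=None, random_state=0)),
    ("knn_small_k", KNeighborsRegressor(n_neighbors=2)),
    # Low-variance learners: smoother, more stable predictions
    ("shallow_tree", DecisionTreeRegressor(max_depth=3, random_state=0)),
    ("knn_large_k", KNeighborsRegressor(n_neighbors=25)),
])
voter.fit(X, y)
print(round(voter.score(X, y), 3))
```

Averaging members with different variance profiles lets the stable learners damp the noise the flexible ones pick up, which is the rationale the section gives.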
 
 
 
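The train-versus-test comparison this section reports could be computed as below; the dataset and split are stand-ins for the app's own data.

```python
# Illustrative train/test performance comparison for one ensemble model.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=6, noise=0.3, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=1
)

model = RandomForestRegressor(n_estimators=100, random_state=1).fit(X_train, y_train)
train_r2 = r2_score(y_train, model.predict(X_train))
test_r2 = r2_score(y_test, model.predict(X_test))
print(f"train R2={train_r2:.3f}  test R2={test_r2:.3f}")
```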
 
 
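The over/underfitting check the bullets describe amounts to comparing training and test scores; a minimal sketch, with illustrative thresholds not taken from the app:

```python
# Classify a model from its train/test scores; thresholds are assumptions.
def diagnose(train_score, test_score, gap_tol=0.1, floor=0.5):
    """Return a fit diagnosis from two scores in [0, 1]."""
    if train_score < floor and test_score < floor:
        return "underfitting"   # low on both sets
    if train_score - test_score > gap_tol:
        return "overfitting"    # high train score, much lower test score
    return "good fit"

print(diagnose(0.95, 0.60))  # large train/test gap -> "overfitting"
print(diagnose(0.40, 0.35))  # low on both sets -> "underfitting"
print(diagnose(0.90, 0.86))  # small gap, high scores -> "good fit"
```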