import streamlit as st
from streamlit_option_menu import option_menu


def show():
    st.markdown("# _Explainable AI for Housing Estimates_")
    st.markdown("""
    ## Background
    _This application demonstrates how complex problems can be explained
    using Generalized Additive Models (GAMs) and LIME, which, used in conjunction with one another, can sometimes
    explain even the most complex decision boundaries._
    """)
    st.markdown("""
    ### Generalized Additive Models
    > GAMs are an extension of linear regression models that allow for non-linear relationships between the
    features and the target. GAMs use "smooth functions of our feature variables, which can take on a great many
    forms, with more detail on what that means in the following section" (Clark). This is where GAMs differ from
    generalized linear models like linear regression. A GAM is "composed of a sum of smooth functions of features
    instead of or in addition to the standard linear feature contributions" (Clark). These smooth functions, or
    spline functions, can create non-linear decision boundaries. Spline functions show the partial dependence of
    each feature on the target variable, enabling a high-level understanding of the model's predictions.
    """)
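    # Illustrative sketch of the additive idea above (numpy only; an assumed
    # toy example, not pyGAM's actual spline implementation): the prediction
    # is an intercept plus one smooth function per feature, each fit here
    # with a small polynomial basis and ordinary least squares.
    import numpy as np

    rng = np.random.default_rng(0)
    x1 = rng.uniform(-1, 1, 200)
    x2 = rng.uniform(-1, 1, 200)
    y = np.sin(3 * x1) + x2 ** 2  # additive ground truth

    # Design matrix: intercept column, then a cubic basis per feature.
    B = np.column_stack([np.ones_like(x1), x1, x1 ** 2, x1 ** 3, x2, x2 ** 2, x2 ** 3])
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    f1 = B[:, 1:4] @ coef[1:4]  # smooth term for x1 (its partial dependence)
    f2 = B[:, 4:7] @ coef[4:7]  # smooth term for x2
    y_hat = coef[0] + f1 + f2   # GAM-style prediction: a sum of smooth terms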
    st.markdown("""
    ### Local Interpretable Model-agnostic Explanations (LIME)
    - Local - the explanation reflects the behavior of the classifier in the neighborhood of the instance being predicted
    - Interpretable - a human is able to make sense of it
    - Model-agnostic - it can be applied to any machine learning model
    - Explanation - an artifact that helps interpret the model
    """)
    st.markdown("""
    > LIME is a powerful tool that explains individual predictions made by a model such as a neural network.
    However, LIME only explains a local region of the model's behavior and does not fully explain the entire decision space of a given model.
    This is an important consideration when using LIME: you are only mapping a subset of the entire decision space of a black-box model.
    """)
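    # Illustrative sketch of LIME's core recipe (numpy only; an assumed toy
    # version, not the lime library itself): sample perturbations around one
    # instance, weight them by proximity, and fit a weighted linear surrogate
    # to the black-box model's outputs.
    import numpy as np

    rng = np.random.default_rng(1)

    def black_box(X):
        # Stand-in for any opaque model (here a known non-linear function).
        return np.sin(3 * X[:, 0]) + X[:, 1] ** 2

    x0 = np.array([0.5, 0.5])                          # instance to explain
    Z = x0 + rng.normal(0.0, 0.1, size=(500, 2))       # local perturbations
    w = np.exp(-np.sum((Z - x0) ** 2, axis=1) / 0.02)  # proximity kernel
    sw = np.sqrt(w)
    A = np.column_stack([np.ones(len(Z)), Z])          # intercept + features
    coefs, *_ = np.linalg.lstsq(A * sw[:, None], black_box(Z) * sw, rcond=None)
    # coefs[1] and coefs[2] are the local feature effects near x0; they
    # approximate the true local slopes 3*cos(1.5) ~ 0.21 and 2*0.5 = 1.0.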
    st.markdown("""
    ### Purpose
    > This project aims to provide insight into Generalized Additive Models and LIME and how they can be used to create and explain non-linear solutions.
    """)
    st.markdown("""
    ### Models
    > **Neural Network**: 3 linear layers, implemented in PyTorch
    >
    > **GAM**: a linear GAM, implemented with pyGAM
    """)
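    # Illustrative numpy sketch of a forward pass through 3 linear layers
    # (the layer sizes and the ReLU activations between layers are assumptions
    # for illustration; the app's actual model is built with PyTorch).
    import numpy as np

    rng = np.random.default_rng(2)
    sizes = [(8, 16), (16, 16), (16, 1)]  # input -> hidden -> hidden -> output
    params = [(rng.normal(0.0, 0.1, s), np.zeros(s[1])) for s in sizes]
    h = rng.normal(size=(4, 8))           # a batch of 4 example inputs
    for i, (W, b) in enumerate(params):
        h = h @ W + b                     # linear layer
        if i < len(params) - 1:
            h = np.maximum(h, 0.0)        # ReLU between the linear layers
    # h now has shape (4, 1): one scalar housing estimate per example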
    st.markdown("""
    ### Author
    > Created by Keese Phillips
    ### Contact
    > [keese.phillips@duke.edu](mailto:keese.phillips@duke.edu)
    """)
    st.markdown("""
    ### References
    > Clark, Michael. “Generalized Additive Models.” Generalized Additive Models, m-clark.github.io/generalized-additive-models/introduction.html. Accessed 2 Dec. 2024.
    """)


if __name__ == '__main__':
    st.set_page_config(page_title="Generalized Additive Model", page_icon="🚀")
    page = option_menu(
        menu_title=None,
        options=["Home", "About"],
        icons=["house", "book"],
        menu_icon="cast",
        default_index=1,
        orientation="horizontal",
    )
    if page == "Home":
        st.switch_page("pages/main.py")
    elif page == "About":
        show()
        st.info("This app is for explaining the problem domain using Generalized Additive Models")