This project provides an interactive dashboard for analyzing the LeviLayer activation function in neural networks. LeviLayer is a novel activation function that has shown promising results in various deep learning tasks. With this dashboard, users can explore the behavior of LeviLayer and compare it with other popular activation functions.
- Parameter Tuning: Adjust learning rate, number of neurons, and number of data points for analysis.
- Model Comparison: Compare LeviLayer with ReLU, Sigmoid, Tanh, Leaky ReLU, and ELU activation functions.
- Data Visualization: Visualize the LeviLayer activation function and compare it with other functions using interactive plots.
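For reference, the comparison functions listed above have standard, well-known definitions. A minimal NumPy sketch of them is below; LeviLayer itself is deliberately omitted, since its formula is not given in this README:

```python
import numpy as np

# Standard definitions of the comparison activation functions.
# LeviLayer is not sketched here: its definition is not stated in this README.
def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # alpha is the conventional default slope, not necessarily the dashboard's setting
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

x = np.linspace(-3.0, 3.0, 7)
for name, f in [("ReLU", relu), ("Leaky ReLU", leaky_relu),
                ("ELU", elu), ("Sigmoid", sigmoid), ("Tanh", tanh)]:
    print(f"{name:10s}", np.round(f(x), 3))
```

These are the same curves the dashboard plots interactively, so the sketch can serve as a sanity check against the generated figures.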
To run this project locally, follow these steps:

1. Clone this repository:

   ```bash
   git clone https://github.com/Priyanshu085/levilayer.git
   ```

2. Navigate to the project directory:

   ```bash
   cd levilayer
   ```

3. Install the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Run the Streamlit app:

   ```bash
   streamlit run index.py
   ```

5. Access the app in your browser at `http://localhost:8501`.
- Adjust the parameters in the sidebar to tune LeviLayer and compare it with other activation functions.
- Click the "Visualize" button to generate interactive plots of the activation functions.
- Explore different settings and observe how LeviLayer behaves under various conditions.
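One concrete condition worth probing while exploring is behavior on negative inputs, where ReLU's gradient is exactly zero (the "dying ReLU" issue) while Leaky ReLU and ELU keep a small signal. A hedged sketch of those gradients (the alpha values are conventional defaults, not necessarily the dashboard's settings):

```python
import numpy as np

# Gradients of the comparison activations on negative inputs.
# ReLU's gradient is zero there; Leaky ReLU and ELU retain a small gradient.
# alpha values are illustrative defaults, not necessarily the dashboard's.
def relu_grad(x):
    return (x > 0).astype(float)

def leaky_relu_grad(x, alpha=0.01):
    return np.where(x > 0, 1.0, alpha)

def elu_grad(x, alpha=1.0):
    # d/dx [alpha * (exp(x) - 1)] = alpha * exp(x) for x <= 0
    return np.where(x > 0, 1.0, alpha * np.exp(x))

x = np.array([-2.0, -0.5, 0.5, 2.0])
print("ReLU grad:      ", relu_grad(x))
print("Leaky ReLU grad:", leaky_relu_grad(x))
print("ELU grad:       ", np.round(elu_grad(x), 3))
```

Comparing these gradients against whatever LeviLayer shows in the dashboard is one way to judge how it handles the negative-input regime.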
This project is licensed under the MIT License - see the LICENSE file for details.
This project was created and maintained by Priyanshu.