Asal Valisoltani's Projects
AI_Based_Triage_ER (Group Project)
Belly Button Biodiversity Dashboard is an open-source interactive dashboard that visualizes the Belly Button Biodiversity dataset. Built with JavaScript, D3.js, Plotly.js, HTML, and CSS, the dashboard features a dropdown menu, horizontal bar chart, bubble chart, demographic information display, and optional gauge chart.
This project uses logistic regression models to analyze credit risk. The recommended model, trained with resampled data, shows higher precision and recall scores for predicting high-risk loans. This model helps mitigate credit risk for lending companies.
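The resampling idea behind the recommended model can be sketched as follows. This is a hypothetical, minimal version using scikit-learn with synthetic data, not the project's actual dataset or code; the naive oversampling step stands in for whatever resampling method the project used.

```python
# Sketch: logistic regression on imbalanced "credit" data, with the
# minority (high-risk) class oversampled before training.
# All data here is synthetic; labels 1 = "high-risk" are an assumption.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

# Synthetic imbalanced data: roughly 10% high-risk loans.
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Naive oversampling: resample minority rows (with replacement)
# until the two classes are balanced.
minority = y_train == 1
X_min, y_min = resample(X_train[minority], y_train[minority],
                        n_samples=int((~minority).sum()), random_state=1)
X_bal = np.vstack([X_train[~minority], X_min])
y_bal = np.concatenate([y_train[~minority], y_min])

base = LogisticRegression(max_iter=1000).fit(X_train, y_train)
balanced = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)

# Recall on the high-risk class is the metric that resampling
# typically improves, as the summary above notes.
r_base = recall_score(y_test, base.predict(X_test))
r_bal = recall_score(y_test, balanced.predict(X_test))
```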
The Crowdfunding-Analysis with Excel project analyzes 1,000 crowdfunding projects and concludes that the US, theater, film and video, and music industries have the most campaigns, with a success rate of 50-60%. The dataset lacks some key data points such as gender and age of backers and distribution of campaigns across different US states.
Built an ETL pipeline using Python, Pandas, Python dictionary methods, and regular expressions to extract and transform the data. Created four CSV files and used the CSV data to create an ERD and a table schema. Finally, uploaded the CSV data into a Postgres database.
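The regex-based transform step can be illustrated like this. The raw records and field names below are invented for the example; they only mimic the kind of semi-structured text the pipeline parsed.

```python
# Sketch of one ETL transform: pulling fields out of a raw text column
# with regular expressions, then emitting CSV. Data is fabricated.
import pandas as pd

raw = pd.DataFrame({"contact": [
    '{"name": "Jane Doe", "email": "jane@example.com"}',
    '{"name": "John Roe", "email": "john@example.org"}',
]})

# Extract named fields with regex capture groups rather than a JSON parser.
raw["name"] = raw["contact"].str.extract(r'"name":\s*"([^"]+)"')
raw["email"] = raw["contact"].str.extract(r'"email":\s*"([^"]+)"')

clean = raw[["name", "email"]]
csv_text = clean.to_csv(index=False)  # in the project this goes to a .csv file
```

From there, the CSV files feed the ERD design and the Postgres `COPY`/import step described above.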
This project applies K-means clustering to group cryptocurrencies based on 24-hour and 7-day price changes. It also investigates the impact of dimensionality reduction using PCA on clustering outcomes.
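The compare-with-and-without-PCA workflow looks roughly like this. The feature matrix here is random noise standing in for the scaled price-change columns; cluster counts and component counts are illustrative, not the project's tuned values.

```python
# Sketch: K-means on the full scaled features vs. K-means on a
# PCA-reduced version of the same features. Data is synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))           # stand-in for 24h/7d/etc. changes
X_scaled = StandardScaler().fit_transform(X)

# Cluster on the full feature set...
labels_full = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)

# ...then reduce to 3 principal components and cluster again to compare.
X_pca = PCA(n_components=3, random_state=0).fit_transform(X_scaled)
labels_pca = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_pca)
```

Comparing `labels_full` against `labels_pca` (and the inertia/elbow curves for each) is what reveals how dimensionality reduction changes the clustering outcome.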
This project uses deep learning to solve a classification problem. The dataset was preprocessed and a neural network model was optimized to achieve the target performance. Various techniques were tried to improve the model, demonstrating the power of deep learning models for classification problems.
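As a runnable stand-in for the preprocess-train-optimize loop, here is a small neural-network classifier via scikit-learn; the original project likely used a deep-learning framework such as TensorFlow/Keras, and the layer sizes and data below are invented for illustration.

```python
# Sketch: scale features, train a small neural network, score it.
# Synthetic data; hidden_layer_sizes is an arbitrary starting point
# that the "optimization" step would tune.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=600, n_features=20, random_state=2)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=2)

scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=2)
clf.fit(scaler.transform(X_train), y_train)

acc = clf.score(scaler.transform(X_test), y_test)
```

Varying the hidden layers, activations, and training epochs, as the summary describes, is the optimization loop used to reach the target performance.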
The analysis aims to provide insights into the climate patterns of Honolulu and inform decisions regarding the best time to visit and what activities to plan.
This project analyzes home sales data using PySpark SQL. It involves creating a temporary table, running queries, and performing caching and partitioning. The final step involves uncaching and verifying the temporary table.
The SQL project involved designing tables to hold data from six CSV files, creating a table schema for each file, importing the data into SQL tables, and performing data analysis. The analysis involved answering various questions about the data, such as listing employee information and department managers and ...
Melbourne Rental Market Dashboard is a web-based platform that helps international students search for affordable rental housing in Melbourne. Built with Flask and the Leaflet, Plotly, Chart.js, D3.js, and Video.js libraries, the platform features various visualization tools and is available on GitHub.
The study involved treating 249 mice with SCC tumors using a range of drug regimens, including Pymaceuticals' drug of interest, Capomulin. Over 45 days, tumor development was observed and measured to compare the performance of Capomulin against other treatments. My task was to generate tables and figures for the technical report of the study.
The goal is to help the editors of a food magazine, Eat Safe, Love, to evaluate the data and assist their journalists and food critics in deciding where to focus future articles. The project aims to provide insights into the ratings data to identify establishments that meet the magazine's criteria for featuring in their articles.
USGS Earthquake Visualization is an open-source project that provides an interactive map to visualize earthquake data collected by the USGS, highlighting the relationship between tectonic plates and seismic activity. Built with JavaScript, Leaflet.js, D3.js, HTML, and CSS, the project is available on GitHub under the MIT License.
This project involved using Python and an API to investigate weather trends near the equator by collecting and analyzing weather data. The analysis helped to draw conclusions and provide insights into the factors affecting weather trends in this region.
I used Beautiful Soup and automated browsing to extract information about Mars from two different sources. In Part 1, I scraped titles and preview text from Mars news articles, while in Part 2, I scraped and analyzed Mars weather data to gain insights into the planet's climate patterns.
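The Part 1 scrape can be sketched like this. The HTML snippet is fabricated, and the class names (`list_text`, `content_title`, `article_teaser_body`) are assumptions about the page structure; in the real project the markup comes from a live page fetched via automated browsing.

```python
# Sketch: parse article titles and preview text with Beautiful Soup.
# The HTML below is an invented stand-in for the Mars news page.
from bs4 import BeautifulSoup

html = """
<div class="list_text">
  <div class="content_title">Mars Rover Update</div>
  <div class="article_teaser_body">The rover continues its climb.</div>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
articles = [
    {"title": d.find(class_="content_title").get_text(strip=True),
     "preview": d.find(class_="article_teaser_body").get_text(strip=True)}
    for d in soup.find_all("div", class_="list_text")
]
```

Each scraped article becomes a small dictionary, and the resulting list is what downstream analysis (or export to JSON) consumes.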