
Deep_Learning_Challenge


This analysis builds a deep learning (neural network) model to predict whether funding applicants for Alphabet Soup will be successful. By training on historical data, the model classifies future applicants as likely successful or unsuccessful, helping Alphabet Soup make informed funding decisions.

Target variable(s): The model's target is whether a funding applicant was successful or not.

Feature variables: Information about the applicant, including classification, application type, affiliation, organization, status, etc.

Variables removed: EIN and NAME, since they are identification columns with no predictive value.
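A minimal preprocessing sketch using pandas and scikit-learn; the file name `charity_data.csv` and the target column name `IS_SUCCESSFUL` are assumptions here and may need to be adjusted to your copy of the dataset:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Load the charity data (file name is an assumption; adjust as needed)
application_df = pd.read_csv("charity_data.csv")

# Drop the identification columns noted above
application_df = application_df.drop(columns=["EIN", "NAME"])

# One-hot encode the categorical feature columns
application_dummies = pd.get_dummies(application_df)

# Split into features and target ("IS_SUCCESSFUL" is the assumed target column name)
y = application_dummies["IS_SUCCESSFUL"]
X = application_dummies.drop(columns=["IS_SUCCESSFUL"])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Scale the features so the neural network trains more reliably
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)
```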

I achieved 73% model accuracy, which was unfortunately below the required 75% target.


Techniques used to try to increase model performance:

  1. Increasing the number of neurons and epochs: Adding neurons to a layer makes the model more expressive, so it can capture more complex patterns in the data, while training for more epochs gives it extra passes over the data to refine its weights. It is important to find a balance, though, because increasing the epochs excessively can lead to overfitting. With this change I was still only able to reach 73% (see the sketch below).

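A sketch of this first attempt, reusing `X_train_scaled`, `y_train`, etc. from the preprocessing sketch above; the layer sizes and epoch count are illustrative assumptions, not the exact values from my notebook:

```python
import tensorflow as tf

# Wider model: more neurons per hidden layer
nn = tf.keras.models.Sequential([
    tf.keras.layers.Input(shape=(X_train_scaled.shape[1],)),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary output: successful or not
])
nn.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])

# More epochs give the optimiser additional passes over the training data
nn.fit(X_train_scaled, y_train, epochs=150, validation_data=(X_test_scaled, y_test))
```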

  2. Adding more layers: Extra layers give the model additional capacity to capture intricate relationships in the data. Each layer can learn a different level of abstraction, so a deeper model can build hierarchical representations of the data, which can be advantageous for complex problems. Again, I was only able to achieve 73%.

  3. Using a different activation function (tanh for the second hidden layer); a sketch combining this with the extra layer from the previous item is shown below.

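A sketch covering the previous two items together, an additional hidden layer plus a tanh activation on the second layer; the specific layer sizes are assumptions for illustration:

```python
import tensorflow as tf

# Deeper model with an extra hidden layer; the second layer uses tanh
nn = tf.keras.models.Sequential([
    tf.keras.layers.Input(shape=(X_train_scaled.shape[1],)),
    tf.keras.layers.Dense(80, activation="relu"),
    tf.keras.layers.Dense(40, activation="tanh"),  # alternative activation function
    tf.keras.layers.Dense(20, activation="relu"),  # additional hidden layer
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
nn.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
nn.fit(X_train_scaled, y_train, epochs=100, validation_data=(X_test_scaled, y_test))
```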

  4. Utilising an automated optimiser (such as a hyperparameter tuner): Automated optimisers systematically explore combinations of hyperparameters such as activation functions, number of layers, number of neurons, and epochs. This search can identify a better combination of hyperparameters for the problem and potentially lead to higher accuracy. Again, I only achieved 73% (see the sketch below).
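A minimal sketch of such a search, assuming the keras-tuner library (not necessarily the exact tuner used here); it varies the activation function, the number of hidden layers, and the neurons per layer:

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    """Build a candidate model whose architecture is chosen by the tuner."""
    activation = hp.Choice("activation", ["relu", "tanh"])
    model = tf.keras.models.Sequential()
    model.add(tf.keras.layers.Input(shape=(X_train_scaled.shape[1],)))
    # Tune the number of hidden layers and the neurons in each one
    for i in range(hp.Int("num_layers", 1, 4)):
        model.add(tf.keras.layers.Dense(
            units=hp.Int(f"units_{i}", min_value=10, max_value=100, step=10),
            activation=activation))
    model.add(tf.keras.layers.Dense(1, activation="sigmoid"))
    model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
    return model

# Hyperband search over the defined hyperparameter space
tuner = kt.Hyperband(build_model, objective="val_accuracy", max_epochs=50, overwrite=True)
tuner.search(X_train_scaled, y_train, epochs=50, validation_data=(X_test_scaled, y_test))
best_model = tuner.get_best_models(num_models=1)[0]
```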

Conclusion

The deep learning model I developed was unable to achieve accuracy higher than 73%. To improve its performance further, I would recommend exploring a different algorithm and adding more data, since increasing the size of the training dataset lets the model learn from a larger and more diverse set of examples.

About

The nonprofit foundation Alphabet Soup wants a tool that can help it select the applicants for funding with the best chance of success in their ventures.
