Multiple Regression and Model Validation - Introduction

Introduction

In this section, you'll move on to more advanced regression models that employ multiple predictors. Along the way, you'll also start to investigate how to validate your models to ensure they are neither overfit nor underfit and will generalize well to future cases.

Multiple Linear Regression

In the last section, you learned how to perform a basic linear regression with a single predictor variable. Here, you'll explore how to perform linear regression using multiple independent variables to better predict a target variable.
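
In equation form, multiple linear regression simply extends the single-predictor model by adding one coefficient per feature (here k predictors, with the β coefficients estimated from the data and ε an error term):

```
y = β0 + β1·x1 + β2·x2 + … + βk·xk + ε
```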

Improving a Baseline Model

Over the next few lessons, you'll see ways to improve basic regression models using different combinations of features as well as some feature engineering. Often, a simple linear regression serves as a "baseline model" to which new features can be added to improve the predictions. Any decision to change the features should be evaluated against the simpler model to see whether the change actually improves performance. This section introduces many of the techniques that can improve a regression model.
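
As a rough illustration of that comparison (synthetic data and scikit-learn used here purely for brevity; this is a sketch, not the lesson's own code), you might fit a one-feature baseline and a two-feature model and compare their scores on held-out data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic data: one "baseline" feature plus one candidate feature
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
y = 3 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=0.5, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

baseline = LinearRegression().fit(X_train[:, [0]], y_train)   # single feature
extended = LinearRegression().fit(X_train, y_train)           # both features

# Keep the extra feature only if it improves performance on unseen data
print("baseline R^2:", baseline.score(X_test[:, [0]], y_test))
print("extended R^2:", extended.score(X_test, y_test))
```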

Dealing with Categorical Variables

Up to this point, you've only seen continuous predictor variables. Here, you'll take a closer look at how to identify categorical variables and transform them so they can be used as predictors of the target variable.
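
As a brief, hypothetical illustration of one common transformation (one-hot/dummy encoding with pandas; the column names are invented):

```python
import pandas as pd

# Hypothetical data: "neighborhood" is categorical, "sqft" is continuous
df = pd.DataFrame({
    "neighborhood": ["north", "south", "south", "east"],
    "sqft": [1200, 850, 990, 1400],
})

# One-hot encode the categorical column; drop_first avoids perfectly
# collinear dummy columns (the "dummy variable trap")
dummies = pd.get_dummies(df, columns=["neighborhood"], drop_first=True)
print(dummies)
```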

Multicollinearity of Features

While adding predictors can improve model performance and yield better predictions, there are also possible negative effects when the predictors are highly correlated with one another. This is known as multicollinearity, and it can muddy model interpretation.
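
One common way to spot this is a correlation matrix; variance inflation factors (VIF) from statsmodels are another. A minimal sketch with made-up features, where one predictor is nearly a copy of another:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical predictors: x2 is nearly a copy of x1, so the two are collinear
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
X = pd.DataFrame({
    "x1": x1,
    "x2": x1 + rng.normal(scale=0.05, size=200),
    "x3": rng.normal(size=200),
})

print(X.corr())  # the x1/x2 correlation will be close to 1

# Variance inflation factors: values above roughly 5-10 are a common warning sign
X_const = sm.add_constant(X)
for i, col in enumerate(X_const.columns):
    if col != "const":
        print(col, round(variance_inflation_factor(X_const.values, i), 2))
```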

Feature Scaling and Normalization

Another consideration when using multiple predictors in any model is the scale of those features. For example, a dataset about households might have a "number of children" feature and an "income" feature. These two features are on vastly different scales, and a feature like income, simply because it is measured in much larger numbers, can exert an outsized influence on the model. To avoid this, it is best practice to normalize the scale of all features before feeding the data to a machine learning algorithm.
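
A minimal sketch of one normalization approach, standardization with scikit-learn (the "children" and "income" values below are invented for illustration):

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Hypothetical household data on very different scales
df = pd.DataFrame({
    "children": [0, 2, 1, 3, 2],
    "income": [42000, 95000, 61000, 120000, 78000],
})

# Standardize each feature to mean 0 and standard deviation 1
scaled = StandardScaler().fit_transform(df)
print(pd.DataFrame(scaled, columns=df.columns))
```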

Multiple Linear Regression in Statsmodels

After covering the key theory, you'll get some hands-on practice performing multiple linear regression using the Statsmodels and Scikit-Learn libraries.
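
As a preview, a Statsmodels fit might look like the sketch below (the data here are randomly generated just to make the snippet runnable; the column names are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: a target driven by two predictors plus noise
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "sqft": rng.uniform(500, 3000, 100),
    "bedrooms": rng.integers(1, 6, 100).astype(float),
})
df["price"] = 100 * df["sqft"] + 5000 * df["bedrooms"] + rng.normal(0, 10000, 100)

# Multiple linear regression with an intercept term
X = sm.add_constant(df[["sqft", "bedrooms"]])
results = sm.OLS(df["price"], X).fit()
print(results.summary())
```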

Model Fit and Validation

You'll continue the section by looking at how to analyze the results of a regression, and learn the importance of splitting data into training and test sets to determine how well a model predicts "unknown" values (the test set). Finally, you'll wrap up the section by looking at how k-fold cross-validation can be used to get additional model validation on a limited dataset by taking multiple splits of training and testing data.
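
A rough sketch of both ideas with scikit-learn (synthetic data; the feature count and coefficients are made up):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split, cross_val_score

# Synthetic regression data
rng = np.random.default_rng(7)
X = rng.normal(size=(150, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.3, size=150)

# Hold-out validation: fit on the training split, score on unseen test data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("test R^2:", model.score(X_test, y_test))

# k-fold cross-validation: 5 different train/test splits of the same data
scores = cross_val_score(LinearRegression(), X, y, cv=5)
print("cross-validated R^2 scores:", scores)
```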

Summary

In this section, you'll continue to bolster your understanding of linear regression so that you can solve a wider range of problems more accurately. Many of the principles covered here will also apply in later modules as you move on to working with other machine learning models.
