Decision Tree is one of the most elegant algorithms for solving both classification and regression problems. It is a supervised machine learning algorithm, which means that, along with the independent features, output labels are required to build a decision tree-based model.

A decision tree is a powerful algorithm that handles complex datasets well. It became popular due to its simplicity: the model is easy to understand and lends itself to a clear tree-like visualization. Decision trees also form the base of most ensemble techniques, such as Random Forest, AdaBoost, and XGBoost.
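As a minimal sketch of the supervised setup described above, the following trains a small decision tree classifier with scikit-learn. The dataset (Iris) and hyperparameters (`max_depth=3`) are illustrative choices, not part of the original text.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Supervised learning: both features (X) and labels (y) are required.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# A shallow tree keeps the model easy to visualize and interpret.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)

accuracy = tree.score(X_test, y_test)
```

The fitted tree can be rendered with `sklearn.tree.plot_tree(tree)`, which produces the tree-like visualization mentioned above.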

Unlike linear regression, where it tries to find the relationship between…

Logistic regression is an algorithm used for solving classification problems. Classification can be binary, such as 0/1, yes/no, high/low, or male/female, or multi-class, such as low/medium/high or poor/average/excellent. It is a supervised learning algorithm: for the given input features, the output labels are also known. Whereas linear regression finds the relationship between the dependent variable (y) and the independent variables (X) by fitting a best-fit line, logistic regression fits a line that separates the dataset into classes and converts its output into class probabilities using a probabilistic function.
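The idea above, fitting a separating line and mapping its output to probabilities, can be sketched with scikit-learn on a toy binary dataset. The data here is synthetic and chosen purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy binary classification data: one feature, label 1 when the feature is positive.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = (X[:, 0] > 0).astype(int)

clf = LogisticRegression()
clf.fit(X, y)

# predict_proba passes the fitted linear score z = wX + b through the
# sigmoid 1 / (1 + e^(-z)), turning it into a class probability.
probs = clf.predict_proba([[2.0], [-2.0]])
```

A clearly positive input gets a high probability of class 1, and a clearly negative input gets a low one, which is exactly the probabilistic separation described above.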

Let us consider an example, depending on the average run…

Linear Regression is one of the most fundamental, yet powerful, algorithms used in supervised machine learning. It is simple, easily understandable, and used for solving regression problems. In linear regression, we try to find the relationship between the independent variables (X) and the dependent variable (y), so that for unseen (X) the model predicts (y). If only a single (X) variable is available, it is called **simple linear regression**; if more (X) variables are present, it is called **multiple linear regression**.
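A minimal sketch of simple linear regression with scikit-learn: the data below is generated from a known line, y = 3x + 2 plus a little noise, so the fitted slope and intercept should recover those values. The coefficients and dataset are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data from a known relationship: y = 3x + 2 + noise.
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 2 + rng.normal(scale=0.1, size=100)

# Fit the best-fit line; coef_ is the slope, intercept_ the bias term.
model = LinearRegression().fit(X, y)

# Predict y for an unseen X.
y_pred = model.predict([[5.0]])
```

With more than one column in `X`, the identical code performs multiple linear regression: `coef_` simply holds one slope per feature.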

The relation between X and y is nothing but a mathematical equation that…