Seaborn: annotate the linear regression equation

You can use the coefficients of a linear fit to make a legend, as in this example:

```python
import seaborn as sns
import matplotlib.pyplot as plt
from scipy import stats

tips = sns.load_dataset("tips")

# get coefficients of linear fit
slope, intercept, r_value, p_value, std_err = stats.linregress(tips["total_bill"], tips["tip"])

# use line_kws to set the line label for the legend
ax = sns.regplot(x="total_bill",
```
… Read more
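A runnable sketch of the same technique, using hypothetical toy data in place of the `tips` dataset (the plotting backend line is only there so the example runs headless):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; not needed in a notebook
import seaborn as sns
from scipy import stats

# hypothetical toy data standing in for tips["total_bill"] / tips["tip"]
x = [10.0, 15.0, 20.0, 25.0, 30.0]
y = [2.5, 3.25, 4.0, 4.75, 5.5]

# get coefficients of the linear fit
slope, intercept, r_value, p_value, std_err = stats.linregress(x, y)

# pass the fitted equation as the regression line's legend label
ax = sns.regplot(x=x, y=y,
                 line_kws={"label": f"y={slope:.2f}x+{intercept:.2f}"})
ax.legend()
```

The key point is `line_kws={"label": ...}`: it attaches the label to the regression line itself, so `ax.legend()` picks it up.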

How to add additional plots to a seaborn FacetGrid and specify colors

I think you want `units` in the call to `relplot` and then to add a layer of `lineplot` using `map`:

```python
import seaborn as sns
import pandas as pd

fm = sns.load_dataset("fmri").query("event == 'stim'")
g = sns.relplot(
    data=fm, kind="line", col="region",
    x="timepoint", y="signal",
    units="subject", estimator=None, color=".7"
)
g.data = fm  # Hack needed to work around bug on
```
… Read more

Machine learning – Linear regression using batch gradient descent

The error is very simple: your delta declaration should be inside the first for loop. Every time you accumulate the weighted differences between the training sample and the output, you should start accumulating from the beginning. By not doing this, you are accumulating the errors from the previous iteration, which takes the error of … Read more
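A minimal NumPy sketch of the corrected structure (variable names are hypothetical; the essential point is that `delta` is re-initialized at the top of each outer iteration):

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=0.1, iters=2000):
    """Batch gradient descent for linear regression (illustrative sketch)."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        # delta must be reset every iteration; declaring it outside the
        # loop would accumulate errors across iterations (the bug above)
        delta = np.zeros(n)
        for i in range(m):
            delta += (X[i] @ theta - y[i]) * X[i]
        theta -= alpha * delta / m
    return theta

# usage: fit y = 1 + 2x on toy data
X = np.c_[np.ones(5), np.arange(5.0)]  # bias column + one feature
y = 2 * np.arange(5.0) + 1
theta = batch_gradient_descent(X, y)
```

With `delta` reset each pass, the update uses only the current iteration's gradient and the parameters converge to the true coefficients.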

Linear regression analysis with string/categorical features (variables)?

Yes, you will have to convert everything to numbers. That requires thinking about what these attributes represent. Usually there are three possibilities:

- One-hot encoding for categorical data
- Arbitrary numbers for ordinal data
- Something like group means for categorical data (e.g. mean prices for city districts)

You have to be careful not to infuse … Read more
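The first two options can be sketched with pandas; the column names and categories here are hypothetical:

```python
import pandas as pd

df = pd.DataFrame({
    "district": ["north", "south", "north", "east"],  # nominal categorical
    "condition": ["poor", "fair", "good", "fair"],    # ordinal
})

# one-hot encode the nominal feature: one 0/1 column per category
encoded = pd.get_dummies(df, columns=["district"])

# map the ordinal feature to integers that respect its order
order = {"poor": 0, "fair": 1, "good": 2}
encoded["condition"] = encoded["condition"].map(order)
```

One-hot encoding avoids imposing a spurious order on nominal categories, while the explicit mapping preserves the genuine order of an ordinal feature.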

What is the difference between linear regression and logistic regression? [closed]

Linear regression output as probabilities It's tempting to use the linear regression output as probabilities, but that is a mistake: the output can be negative or greater than 1, whereas a probability cannot. Because regression can produce such values, below 0 or above 1, logistic regression was introduced. Source: … Read more
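A tiny NumPy sketch of the fix: the logistic (sigmoid) function squashes any real-valued linear score into the open interval (0, 1), so the result can always be read as a probability.

```python
import numpy as np

def sigmoid(z):
    """Logistic function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# raw linear-model scores can fall outside [0, 1]...
scores = np.array([-3.0, 0.0, 0.5, 4.0])

# ...but the sigmoid of each score is a valid probability
probs = sigmoid(scores)
```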