Chapter 16 Flashcards

Statistics for the Behavioral Sciences > Chapter 16 > Flashcards

Flashcards in Chapter 16 Deck (38):
1

What is a Simple Linear Regression and How do we use it?

MASTERING THE CONCEPT Page 423

16.1: Simple linear regression allows us to determine an equation for a straight line that predicts a person’s score on a dependent variable from his or her score on the independent variable.

We can only use it when the data are approximately linearly related.

2

What is the formula for predicting a z score using the Pearson correlation coefficient?

MASTERING THE FORMULA Page 424

16-1: The standardized regression equation predicts the z score of a dependent variable, Y, from the z score of an independent variable, X. We simply multiply the independent variable’s z score by the Pearson correlation coefficient to get the predicted z score on the dependent variable:

zŶ = (rXY)(zX)
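As a minimal sketch, formula 16-1 can be computed directly; the correlation and z score below are made-up illustrative values, not from the text:

```python
# Standardized regression: predict a z score on Y from a z score on X.
# r_xy and z_x are hypothetical example values.
r_xy = 0.60   # Pearson correlation between X and Y
z_x = 1.5     # a person's z score on the independent variable

z_y_hat = r_xy * z_x  # formula 16-1
print(z_y_hat)        # approximately 0.9: less extreme than z_x = 1.5
```

Note that the predicted z score (about 0.9) is closer to the mean than the original z score (1.5), which previews regression to the mean.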

3

What is regression to the mean?

Page 425

■■Regression to the mean is the tendency of scores that are particularly high or low to drift toward the mean over time.

4

What is the intercept in a regression?

Page 426

■■The intercept is the predicted value for Y when X is equal to 0, which is the point at which the line crosses, or intercepts, the y-axis.

5

Explain the slope of a regression?

Page 426

■■The slope is the amount that Y is predicted to increase for an increase of 1 in X.

6

What is the simple regression formula?

MASTERING THE FORMULA Page 426

16-2: The simple linear regression equation uses the formula:

Ŷ = a + b(X)

In this formula, X is the raw score on the independent variable and Ŷ is the predicted raw score on the dependent variable; a is the intercept of the line, and b is its slope.
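A quick sketch of the raw-score equation in use; the intercept and slope here are invented for illustration:

```python
# Raw-score simple linear regression: Y-hat = a + b*X.
# a and b are hypothetical values, not fitted from real data.
a = 2.0   # intercept: predicted Y when X = 0
b = 0.5   # slope: predicted change in Y per 1-unit increase in X

def predict(x):
    """Return the predicted raw score on Y for a raw score x on X."""
    return a + b * x

print(predict(10))  # 7.0
print(predict(0))   # 2.0, the intercept
```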

7

What is the formula for the standardized regression coefficient, β?

MASTERING THE FORMULA Page 430

16-3: The standardized regression coefficient, β, is calculated by multiplying the slope of the regression equation by the square root of the sum of squares for the independent variable divided by the square root of the sum of squares for the dependent variable:

β = (b)(√SSX / √SSY)

8

What is β, or beta weight?

Page 430

■■The standardized regression coefficient, a standardized version of the slope in a regression equation, is the predicted change in the dependent variable in terms of standard deviations for an increase of 1 standard deviation in the independent variable; symbolized by β and often called beta weight.

9

Explain Standardized Regression Coefficient?

MASTERING THE CONCEPT Page 431

16.2: A standardized regression coefficient is the standardized version of a slope, much like a z statistic is a standardized version of a raw score.

For simple linear regression, the standardized regression coefficient is identical to the correlation coefficient.

This means that when we conduct hypothesis testing and conclude that a correlation coefficient is statistically significantly different from 0, we can draw the same conclusion about the standardized regression coefficient.

10

How does regression work with correlation?

>Regression builds on correlation, enabling us not only to quantify the relation between two variables but also to predict a score on a dependent variable from a score on an independent variable. Page 431

11

How does Standardized Regression Work?

>With the standardized regression equation, we simply multiply a person’s z score on an independent variable by the Pearson correlation coefficient to predict that person’s z score on a dependent variable. Page 431

12

Explain Raw-Score Regression?

>The raw-score regression equation is easier to use in that the equation itself does the transformations from raw score to z score and back. Page 431

13

What is the standardized regression equation used for?

>We use the standardized regression equation to build the regression equation that can predict a raw score on a dependent variable from a raw score on an independent variable. Page 431

14

How do we graph the regression line?

>We can graph the regression line,

Ŷ = a + b(X)

based on values for the y intercept, a (the predicted value on Y when X is 0), and the slope, b (the change in Y expected for a 1-unit increase in X).

Page 431

15

What does the standardized regression coefficient tell us?

>The slope, which captures the nature of the relation between the variables, can be standardized by calculating the standardized regression coefficient.

The standardized regression coefficient tells us the predicted change in the dependent variable in terms of standard deviations for every increase of 1 standard deviation in the independent variable.

Page 431

16

Explain the relationship between the standardized regression coefficient to the Pearson correlation coefficient in a simple linear regression?

>With simple linear regression, the standardized regression coefficient is identical to the Pearson correlation coefficient. Page 431

17

Define: Standard Error of the Estimate?

Page 433

■■The standard error of the estimate is a statistic indicating the typical distance between a regression line and the actual data points.

18

Explain regression to the mean?

MASTERING THE CONCEPT Page 434

16.3: Regression to the mean occurs because extreme scores tend to become less extreme—that is, they tend to regress toward the mean.

Very tall parents do tend to have tall children, but usually not as tall as they are, whereas very short parents do tend to have short children, but usually not as short as they are.

19

What is the Coefficient of Determination?

Page 435

■■The proportionate reduction in error is a statistic that quantifies how much more accurate predictions are when we use the regression line instead of the mean as a prediction tool; also called the coefficient of determination.

20

What is the proportionate reduction in error and how is it calculated?

MASTERING THE FORMULA Page 438

16-4: The proportionate reduction in error is calculated by subtracting the error generated using the regression equation as a prediction tool from the total error that would occur if we used the mean as everyone's predicted score. We then divide this difference by the total error:

r² = (SStotal - SSerror) / SStotal

We can interpret the proportionate reduction in error as we did the effect-size estimate for ANOVA. It represents the same statistic.
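A worked sketch of formula 16-4 with a small invented dataset: we fit the regression line, then compare the squared error around the line with the squared error around the mean:

```python
# Proportionate reduction in error (r-squared) on a made-up dataset.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 6]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
ss_x = sum((xi - mx) ** 2 for xi in x)
sp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
b = sp / ss_x          # slope
a = my - b * mx        # intercept

# Total error: everyone predicted to be at the mean of Y.
ss_total = sum((yi - my) ** 2 for yi in y)
# Regression error: predictions from the regression line.
ss_error = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

r_squared = (ss_total - ss_error) / ss_total  # formula 16-4
print(round(r_squared, 4))  # about 0.7273 for this dataset
```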

21

What is Proportionate Reduction In Error?

MASTERING THE CONCEPT Page 439

16.4: Proportionate reduction in error is the effect size used with regression. It is the same number we calculated as the effect size estimate for ANOVA.

It tells us the proportion of error that is eliminated when we predict scores on the dependent variable using the regression equation versus simply predicting that everyone is at the mean on the dependent variable.

22

What is Multiple Regression?

MASTERING THE CONCEPT Page 440

16.5: Multiple regression predicts scores on a single dependent variable from scores on more than one independent variable.

Because behavior tends to be influenced by many factors, multiple regression allows us to better predict a given outcome.

23

What is an Orthogonal Variable?

Page 440

■■An orthogonal variable is an independent variable that makes a separate and distinct contribution in the prediction of a dependent variable, as compared with the contributions of other variables.

24

How does Multiple Regression Work?

Page 440

■■Multiple regression is a statistical technique that includes two or more predictor variables in a prediction equation.

25

What is a Stepwise multiple regression?

Page 442

■■Stepwise multiple regression is a type of multiple regression in which a computer program determines the order in which independent variables are included in the equation.

26

What is the difference between multiple regression and stepwise multiple regression?

MASTERING THE CONCEPT Page 443

16.6: In multiple regression, we determine whether each added independent variable increases the amount of variance in the dependent variable that we can explain.

In stepwise multiple regression, the computer program determines the order in which independent variables are added, whereas in hierarchical multiple regression, the researcher chooses the order.

In both cases, however, we report the increase in R² with the inclusion of each new independent variable or variables.

27

What is a Hierarchical multiple regression?

Page 443

■■Hierarchical multiple regression is a type of multiple regression in which the researcher adds independent variables into the equation in an order determined by theory.

28

What is Structural Equation Modeling?

Page 444

■■Structural equation modeling (SEM) is a statistical technique that quantifies how well sample data “fit” a theoretical model that hypothesizes a set of relations among multiple variables.

29

What is a statistical (or theoretical) model?

Page 444

■■A statistical (or theoretical) model is a hypothesized network of relations, often portrayed graphically, among multiple variables.

30

Explain the term Path?

Page 445

■■Path is the term that statisticians use to describe the connection between two variables in a statistical model.

31

What is Path Analysis?

Page 445

■■Path analysis is a statistical method that examines a hypothesized model, usually by conducting a series of regression analyses that quantify the paths between variables at each succeeding step in the model.

32

What are manifest variables?

Page 445

■■Manifest variables are the variables in a study that we can observe and that are measured.

33

What are Latent Variables?

Page 445

■■Latent variables are the ideas that we want to research but cannot directly measure.

34

How do we use Multiple Regression?

> Multiple regression is used to predict a dependent variable from more than one independent variable.

Ideally, these variables are distinct from one another in such a way that they contribute uniquely to the predictions. Page 447

35

Explain a Multiple Regression Equation?

> We can develop a multiple regression equation and input specific scores for each independent variable to determine the predicted score on the dependent variable. Page 447
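As an illustration, a multiple regression equation is just the simple equation with an extra slope term per independent variable; the coefficients below are invented for demonstration, not fitted from real data:

```python
# Hypothetical multiple regression equation with two predictors:
# Y-hat = a + b1*X1 + b2*X2. All coefficients are illustrative.
a = 5.0    # intercept
b1 = 1.2   # slope for independent variable X1
b2 = -0.7  # slope for independent variable X2

def predict(x1, x2):
    """Return the predicted score on Y from raw scores on X1 and X2."""
    return a + b1 * x1 + b2 * x2

print(predict(3.0, 2.0))  # about 7.2: 5 + 3.6 - 1.4
```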

36

How is Multiple Regression Used in The Real World ?

> Multiple regression is the backbone of many online tools that we can use for predicting everyday variables such as traffic or home prices. Page 447

37

What is the difference between stepwise and hierarchical multiple regression?

> In stepwise multiple regression, a computer program determines the order in which independent variables are tested; in hierarchical multiple regression, the researcher determines the order. Page 447

38

Explain How Structural Equation Modeling works?

> Structural equation modeling (SEM) allows us to examine the “fit” of a sample’s data to a hypothesized model of the relations among multiple variables, including latent variables that we hypothesize to exist but cannot observe directly. Page 447