Flashcards in Chapter 16 Deck (23):

1

## Bartlett's test of sphericity

### unsurprisingly this is a test of the assumption of sphericity. This test examines whether a variance-covariance matrix is proportional to an identity matrix. Therefore, it effectively tests whether the diagonal elements of the variance-covariance matrix are equal (i.e. group variances are the same), and that the off-diagonal elements are approximately zero (i.e. the dependent variables are not correlated). Jeremy Miles, who does a lot of multivariate stuff, claims he's never ever seen a matrix that reached non-significance using this test and, come to think of it, I've never seen one either (although I do less multivariate stuff) so you've got to wonder about its practical utility.
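
As a hedged illustration (not code from the deck itself), the statistic can be sketched from the determinant of the correlation matrix; the formula shown and the made-up data are for demonstration only:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(X):
    """Sketch of Bartlett's test of sphericity for an n-by-p data matrix X."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)  # p-by-p correlation matrix
    # Statistic: -(n - 1 - (2p + 5)/6) * ln|R|, approximately chi-square
    # with p(p - 1)/2 degrees of freedom when the variables are uncorrelated.
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return stat, chi2.sf(stat, df)

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 4))       # made-up data: 100 cases, 4 variables
stat, p_value = bartlett_sphericity(X)
```

With uncorrelated columns the determinant of R stays near 1 and the statistic is small; as the outcome variables become correlated the determinant shrinks towards 0 and the test becomes significant, which is why real data virtually always reject the null.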

2

## Box's test

### a test of the assumption of homogeneity of covariance matrices. This test should be non-significant if the matrices are roughly the same. Box's test is very susceptible to deviations from multivariate normality and so can be non-significant, not because the variance-covariance matrices are similar across groups, but because the assumption of multivariate normality is not tenable. Hence, it is vital to have some idea of whether the data meet the multivariate normality assumption (which is extremely difficult) before interpreting the result of Box's test.

3

## Discriminant analysis

### also known as discriminant function analysis. This analysis identifies and describes the discriminant function variates of a set of variables and is useful as a follow-up test to MANOVA as a means of seeing how these variates allow groups of cases to be discriminated.

4

## Discriminant function variates

### a linear combination of variables created such that the differences between group means on the transformed variable are maximized. It takes the general form:

Variate1 = b1X1 + b2X2 + ... + bnXn
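
A minimal numerical sketch of this weighted sum, with made-up weights b and data X:

```python
import numpy as np

# Illustrative weights b1..bn and data (one row per case, one column per Xi);
# the numbers are invented purely for demonstration.
b = np.array([0.5, -0.3, 1.2])
X = np.array([[2.0, 1.0, 0.5],
              [1.0, 3.0, 2.0]])

# Each case's variate score is b1*X1 + b2*X2 + ... + bn*Xn.
variate = X @ b
```

In discriminant analysis the weights are chosen so that the group means on this new variable are as far apart as possible.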

5

## Discriminant scores

### a score for an individual case on a particular discriminant function variate, obtained by substituting that case's scores on the measured variables into the equation that defines the variate in question.

6

## Error SSCP (E)

### the error sum of squares and cross-product matrix. This is a sum of squares and cross-product matrix for the error in a model fitted to multivariate data. It represents the unsystematic variance and is the multivariate equivalent of the residual sum of squares.

7

## HE⁻¹ matrix

### this is a matrix that is functionally equivalent to the hypothesis SSCP divided by the error SSCP in MANOVA. Conceptually it represents the ratio of systematic to unsystematic variance, so is a multivariate analogue of the F-ratio.
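
A small sketch, using made-up hypothesis (H) and error (E) SSCP matrices, of how this matrix and its eigenvalues (which feed the MANOVA test statistics) can be computed:

```python
import numpy as np

# Toy SSCP matrices, invented for illustration only.
H = np.array([[10.0, 4.0],
              [ 4.0, 6.0]])   # systematic variance (hypothesis SSCP)
E = np.array([[ 8.0, 2.0],
              [ 2.0, 5.0]])   # unsystematic variance (error SSCP)

# "Dividing" H by E means multiplying by the inverse of E:
HE_inv = H @ np.linalg.inv(E)

# The eigenvalues of this matrix are the ingredients of the MANOVA
# test statistics (Pillai, Wilks, Hotelling-Lawley, Roy).
eigvals = np.linalg.eigvals(HE_inv)
```

Each eigenvalue is the ratio of systematic to unsystematic variance on one discriminant function variate, mirroring the univariate F-ratio.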

8

## Homogeneity of covariance matrices

### an assumption of some multivariate tests such as MANOVA. It is an extension of the homogeneity of variance assumption in univariate analyses: the variance-covariance matrix of the outcome variables should be roughly equal across the groups in the analysis (i.e. group variances are similar, and the covariances between pairs of outcomes are similar across groups).

9

## Hotelling-Lawley trace (T²)

### a test statistic in MANOVA. It is the sum of the eigenvalues for each discriminant function variate of the data and so, conceptually, is the sum of SSM/SSR across the variates, making it a multivariate analogue of the F-ratio.

10

## Hypothesis SSCP (H)

### the hypothesis sum of squares and cross-product matrix. This is a sum of squares and cross-product matrix for the systematic variance in the data (the variance explained by the model or hypothesis); it is the multivariate equivalent of the model sum of squares.

11

## Identity matrix

### a square matrix (i.e. has the same number of rows and columns) in which the diagonal elements are equal to 1, and the off-diagonal elements are equal to 0. For example, the 2 × 2 identity matrix has rows (1, 0) and (0, 1), and the 3 × 3 identity matrix has rows (1, 0, 0), (0, 1, 0) and (0, 0, 1).
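
For instance, with NumPy (an illustrative choice, not one the deck prescribes):

```python
import numpy as np

I2 = np.eye(2)   # 2x2 identity matrix: ones on the diagonal, zeros elsewhere
I3 = np.eye(3)   # 3x3 identity matrix

# Multiplying by the identity leaves a matrix unchanged, just as
# multiplying a number by 1 leaves it unchanged.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
```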

12

## Matrix

### a collection of numbers arranged in columns and rows. The values within a matrix are typically referred to as components or elements.

13

## Multivariate

### means 'many variables' and is usually used when referring to analyses in which there is more than one outcome variable (e.g. MANOVA, principal component analysis, etc.).

14

## Multivariate analysis of variance (or MANOVA)

### family of tests that extend the basic analysis of variance to situations in which more than one outcome variable has been measured.

15

## Multivariate normality

### an extension of a normal distribution to multiple variables. It is a

16

## Pillai-Bartlett trace (V)

### a test statistic in MANOVA. It is the sum of the proportion of explained variance on the discriminant function variates of the data. As such, it is similar to the ratio of SSM/SST.
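
Assuming eigenvalues (lambda) taken from the hypothesis-to-error matrix, each variate's proportion of explained variance is lambda/(1 + lambda); a sketch with made-up eigenvalues:

```python
import numpy as np

# Made-up eigenvalues for two discriminant function variates.
eigenvalues = np.array([0.5, 0.2])

# Pillai-Bartlett trace: sum over variates of the proportion of
# explained variance, i.e. lambda/(1 + lambda), analogous to SSM/SST.
V = np.sum(eigenvalues / (1 + eigenvalues))
```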

17

## Roy's largest root

### a test statistic in MANOVA. It is the eigenvalue of the first (largest) discriminant function variate of the data, so it is conceptually the same as the Hotelling-Lawley trace but for the first variate only: it represents the proportion of explained to unexplained variance (SSM/SSR) on that variate.

18

## Square matrix

### a matrix that has an equal number of rows and columns.

19

## Sum of squares and cross-products (SSCP) matrix

### a square matrix in which the diagonal elements represent the sum of squares for each variable, and the off-diagonal elements represent the cross-products between pairs of variables. It is the unaveraged counterpart of the variance-covariance matrix: dividing each element by its degrees of freedom yields the variances and covariances.
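
A sketch of building an SSCP matrix from a small made-up data set:

```python
import numpy as np

# Made-up data: 4 cases measured on 2 variables.
X = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 5.0],
              [4.0, 9.0]])

Xc = X - X.mean(axis=0)   # deviations of each score from its variable's mean
SSCP = Xc.T @ Xc          # diagonals: sums of squares; off-diagonals: cross-products

# Dividing by the degrees of freedom (n - 1) gives the variance-covariance matrix.
S = SSCP / (X.shape[0] - 1)
```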

20

## Total SSCP (T)

### the total sum of squares and cross-product matrix. This is a sum of squares and cross-product matrix for an entire set of observations; it is the multivariate equivalent of the total sum of squares and is the sum of the hypothesis SSCP and the error SSCP (T = H + E).

21

## Univariate

### means 'one variable' and is usually used to refer to situations in which only one outcome variable has been measured (e.g. ANOVA, t-tests, Mann-Whitney tests, etc.).

22

## Variance-covariance matrix

### a square matrix (i.e. same number of columns and rows) representing the variables measured. The diagonals represent the variances within each variable, whereas the off-diagonals represent the covariances between pairs of variables.
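
A quick illustration with made-up data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))    # invented data: 50 cases, 3 variables

# 3x3 variance-covariance matrix: variances on the diagonal,
# covariances between pairs of variables off the diagonal.
S = np.cov(X, rowvar=False)
variances = np.diag(S)
```

Because cov(X, Y) = cov(Y, X), the matrix is always symmetric about its diagonal.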

23