08_Computer-Assisted Mass Valuation Flashcards

1
Q

Computer-Assisted Mass Appraisal

A
  • identifies the relationship between property attributes (age, size, location) and the prices that have been paid for them
  • Application: residential properties
  • Reliability: requires a large quantity of data
2
Q

Steps in Computer-Assisted Mass Appraisal

A

1. Definition of the problem
2. Preliminary analysis + data collection
3. Demand and supply data; property characteristics: location, neighbourhood and structure
4. Model building and calibration
5. Model testing and quality control
6. Assessment

3
Q

Hedonic Valuation

A

identifies price factors by treating the price as determined both by the internal characteristics of the good and by external environmental characteristics

1. Consider the characteristics of a house
- # of floors, presence of a garden, # of bedrooms, # of bathrooms, sqm of the house, type of house, age, garage, etc.
2. Consider external environmental characteristics
- accessibility to schools, shopping, sports facilities, public transport, etc.

The composite good has a price; what is the implicit price of each characteristic?

4
Q

Hedonic Valuation
Constraints in Maximization problem

A
  • income
  • price of house
  • level of taxes

-> the housing market gives us information on buyers' preferences for housing, e.g. structural and environmental characteristics

5
Q

Hedonic Pricing
Definition

A
  • the method applies a simple concept to the characteristics that make up a property's (land) price
  • willingness to pay (WTP) is determined by the price difference between houses that have different levels of structural and locational quality

-> assess the value of environmental quality from the market prices of residential properties
-> variation in environmental quality affects the price of housing

6
Q

Hedonic Price
Formula

A
  • the derivative of the price function with respect to one characteristic (k) is the implicit price of k; i.e. the consumer is willing to pay p(k) for a marginal change in characteristic k

p = f(x(1), x(2), …, x(k))

implicit price = d(p) / d(x(k))

e.g. housing price and # of bedrooms: p(i) = 500 + 100 n(i)
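The derivative rule above can be sketched numerically with the card's own example, p(i) = 500 + 100 n(i). The model is linear, so the implicit price of a bedroom is the same at every point:

```python
# Hedonic price function from the card's example: p = 500 + 100 * n_bedrooms
def price(n_bedrooms):
    return 500 + 100 * n_bedrooms

# Implicit price of a bedroom: finite-difference approximation of dp/dn
h = 1e-6
implicit_price = (price(3 + h) - price(3)) / h

print(round(implicit_price))  # the model is linear, so dp/dn = 100 everywhere
```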

7
Q

Regression Model
Formula

A

y(i) = b(1) + b(2) * x(i) + e(i)

e(i) = error term, the difference between observed y(i) and estimated y(i) (noise)
y(i) = dependent or endogenous variable (variable to be explained) at observation i
x(i) = independent or exogenous (explaining) variable (regressor) at observation i
b(1), b(2) = regression coefficients
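A minimal sketch of fitting y(i) = b(1) + b(2) x(i) + e(i) by ordinary least squares with NumPy; the data and coefficients here are synthetic, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(50, 200, size=100)          # e.g. living area in sqm
e = rng.normal(0, 10, size=100)             # error term (noise)
y = 80 + 2.5 * x + e                        # true b(1) = 80, b(2) = 2.5

X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept column
b, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimates [b(1), b(2)]

print(b)  # close to the true values 80 and 2.5
```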

8
Q

Linear Regression model
Definition

A
  • modeling a linear relationship between a dependent variable to be explained and one or several explaining (independent) variables
  • simple linear regression: a single explaining variable
  • multiple linear regression: several explaining variables
9
Q

Functional Form of the Linear Regression Model

A
  • linear means that the parameters enter the model linearly
  • it doesn't imply there has to be a linear relationship between the variables
  • if there is no linear relationship between the variables, the linear regression model can nevertheless be applied in certain cases by transforming one or several explanatory variables or the entire model
10
Q

Additional examples for models which are linear in the parameters

Linear Regression Model

A

− Yt = β1 + β2·Xt + ut
− Yt = β1 + β2·ln(Xt) + ut
− ln(Yt) = β1 + β2·Xt + ut
− ln(Yt) = β1 + β2·ln(Xt) + ut
− Yt = β1 + β2·Xt² + ut
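"Linear in the parameters" means every model above can still be fitted by OLS after transforming the variables. A sketch for the last case, Yt = β1 + β2·Xt² + ut, with synthetic data and made-up coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(1, 10, size=200)
y = 3 + 0.5 * x**2 + rng.normal(0, 1, size=200)   # nonlinear in x, linear in the betas

# Transform the regressor; the model is then an ordinary linear regression
X = np.column_stack([np.ones_like(x), x**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta)  # estimates close to beta1 = 3 and beta2 = 0.5
```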

11
Q

We observe 10 transactions:
Interpretation:
ln(ŷ(i)) = 2.51 + 0.66 ln(x(1i)) + 0.67 x(2i)

A
  • Keeping the presence of a park in the nearby area constant, a 1% increase in the no. of bedrooms increases the house price by 0.66% on average.
  • Keeping the number of bedrooms constant, the presence of a park in the nearby area increases the house price by about 67% on average.
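One caveat worth noting (standard for log-linear models, not stated on the card): for a dummy variable in a log-Y model, reading the coefficient as "100·β %" is only an approximation that works for small β; the exact effect is exp(β) − 1. With the card's coefficient:

```python
import math

beta_park = 0.67   # coefficient on the park dummy from the card's model

approx_effect = 100 * beta_park                  # rough reading: +67%
exact_effect = 100 * (math.exp(beta_park) - 1)   # exact effect: about +95%

print(round(approx_effect), round(exact_effect))
```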
12
Q

Pros and Cons of Log-Transformation

A

Pros
- decreases heteroskedasticity in the data
- models a non-linear relationship within linear regression, e.g. diminishing marginal utility
- simplifies a model, e.g. sometimes logs can reduce the number and complexity of interaction terms
- allows the implicit price to be interpreted in % changes

Cons
- can destroy the normality of the data in some cases

13
Q

the natural logarithm is probably the most widespread transformation
- implications

linear regression

A
  • the interpretation of the slope parameter must be adjusted accordingly

1. Level-log
Y and ln(X)
∆Y = (β1 / 100) %∆X

2. Log-level
ln(Y) and X
%∆Y = (100 · β1) ∆X

3. Log-log
ln(Y) and ln(X)
%∆Y = β1 %∆X
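A short worked sketch of the three rules above (all coefficient values are made up for illustration):

```python
# Level-log: Y on ln(X). With beta1 = 50, a 1% increase in X
# raises Y by (beta1 / 100) * 1 = 0.5 units of Y.
level_log_effect = (50 / 100) * 1

# Log-level: ln(Y) on X. With beta1 = 0.05, a one-unit increase in X
# raises Y by about 100 * beta1 = 5 percent.
log_level_effect = 100 * 0.05 * 1

# Log-log: ln(Y) on ln(X). With beta1 = 0.66, a 1% increase in X
# raises Y by about beta1 * 1 = 0.66 percent (an elasticity).
log_log_effect = 0.66 * 1

print(level_log_effect, log_level_effect, log_log_effect)
```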

14
Q

Model Evaluation
Dimensions

2 items

A

Global evaluation
- can the model explain the data-generating process of the dependent variable as a whole?

Global criteria
- coefficient of determination
- adjusted coefficient of determination

15
Q

Coefficient of Determination

Model Evaluation (Global Criteria)

A

R-squared
- proportion of the variance explained by the regression model (variance of the estimated Y) relative to the total variance of the observations of Y
- frequently given in %

R² = explained sum of squares / total sum of squares = ESS / TSS = 1 − RSS/TSS
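R-squared can be sketched in NumPy as 1 − RSS/TSS (equivalent to ESS/TSS when the model has an intercept); the data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=50)
y = 1 + 2 * x + rng.normal(0, 2, size=50)

X = np.column_stack([np.ones_like(x), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b                            # fitted values

rss = np.sum((y - y_hat) ** 2)           # residual sum of squares
tss = np.sum((y - y.mean()) ** 2)        # total sum of squares
r_squared = 1 - rss / tss

print(r_squared)  # between 0 and 1
```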

16
Q

R-squared
Characteristics I

A

0 <= R^2 <= 1

R-squared = 1 means:
- all residuals are 0
- the Ys estimated by the regression model fit the actual observations exactly

R-squared = 0 means:
- all regression coefficients have value 0; the intercept equals the mean of Y
- the regression model has no explanatory power at all

17
Q

R-squared
Characteristics II

A
  • preferably as high as possible -> the model is specified well
  • a high R-squared, however, does not necessarily imply that:
    - every regressor included in the model has a statistically significant influence on Y
    - all relevant regressors are included in the model
  • R-squared cannot become smaller by adding a regressor
  • beware: for a regression without an intercept, R-squared < 0 is possible
18
Q

Adjusted R-Squared

A
  • expanding the model by one regressor, R-squared becomes bigger (or at least not smaller)
  • an increase in R-squared does not necessarily mean that the new regressor has explanatory power

Adjusted coefficient of determination:

Adjusted R-squared = 1 − ((n−1) / (n−k)) × (RSS/TSS)

Adjusted R-squared is smaller than R-squared
- with a large n, (n−1)/(n−k) is almost 1, so adjusted R-squared differs less and less from R-squared
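The adjustment above can be sketched directly from the formula; note that RSS/TSS = 1 − R², so an assumed unadjusted R-squared is enough (n, k and R² below are made-up illustration values):

```python
n, k = 100, 3          # sample size and number of parameters (incl. intercept)
r_squared = 0.80       # assumed unadjusted R-squared, so RSS/TSS = 0.20

# Adjusted R-squared = 1 - ((n-1)/(n-k)) * (RSS/TSS)
adj_r_squared = 1 - (n - 1) / (n - k) * (1 - r_squared)

print(adj_r_squared)   # slightly below the unadjusted 0.80
```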

19
Q

Characteristics of Adjusted R-Squared

A
  • may increase or decrease when an additional regressor is incorporated
  • adjusted coefficient of determination is well suited to compare models
  • Caution: No comparison of models with differing Ys (e.g. Y and ln(Y))
20
Q

Evaluation of regression coefficients

A
  • does regressor X(i) have explanatory power?

Test whether the null hypothesis (H0) can be rejected

T-test

Possible outcomes of the test:
- reject H0
- fail to reject H0

21
Q

Null Hypothesis
vs
Alternative Hypothesis

A

Null hypothesis (H0)
- the currently accepted claim
- e.g. a value for a parameter

Alternative hypothesis (Ha)
- the claim to be tested

22
Q

α in Hypothesis testing

A

level of significance
- in practice, mostly given as 1%, 5% or 10%

23
Q

c in Hypothesis testing

A

level of confidence (c = 1 − α)

-> a test result in the rejection region is called statistically significant

24
Q

Decision Rule of Hypothesis Testing

A
  • H0 is rejected if the test statistic falls in a region (the rejection region) in which it could fall only with very low probability α when the null hypothesis is true
  • rejection region means the probability of a type I error is smaller than α, or the test statistic is above the critical value
  • rejecting H0 equals statistical proof of Ha (with error probability α)
  • non-rejection of H0, on the other hand, does not prove anything
25
Q

Steps in Hypothesis testing

A
  1. Quantify the problem
  2. Formulate model assumptions
  3. Formulate hypotheses
  4. Determine the significance level
  5. Determine the rejection region
  6. Calculate the test statistic
  7. Decision
  8. Interpretation
26
Q

Two-tailed T-Test

A

T(i) = (b(i) − β(i)) / (s · sqrt(a(ii)))

s · sqrt(a(ii)) is called the standard error of b(i)
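The statistic above can be sketched end-to-end for a simple regression with NumPy (synthetic data; here a(ii) is the i-th diagonal element of (X'X)⁻¹, and the null hypothesis is β(i) = 0):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 60, 2                               # observations, parameters (incl. intercept)
x = rng.uniform(0, 10, size=n)
y = 4 + 1.5 * x + rng.normal(0, 3, size=n)

X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS coefficient estimates

resid = y - X @ b
s = np.sqrt(resid @ resid / (n - k))       # residual standard error
a = np.diag(np.linalg.inv(X.T @ X))        # the a(ii) terms
t_stats = b / (s * np.sqrt(a))             # T(i) under H0: beta(i) = 0

print(t_stats)  # large absolute values -> reject H0
```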

27
Q

p-value

A
  • the probability of obtaining an absolute value of the test statistic that is higher than the value actually observed, assuming that H0 is true
  • H0 is rejected if the p-value is very small (less than the predetermined significance level)
28
Q

When do you reject H0 in Two-tailed T-Test

A
  • H0 is rejected at significance level α
  • if the absolute value of the test statistic T(i) exceeds the (1 − α/2) quantile of the t-distribution with n − k degrees of freedom

n = total sample size
k = # of parameters, including the intercept

29
Q

Basic Rules
T-Test

A

Rejection of H0
- absolute value of the T statistic is higher than T* (critical value), or
- p-value < α -> H0 can be rejected at the α % level
- the coefficient is significant / non-zero at the α % level

Non-rejection of H0
- absolute value of the T statistic is lower than T*, or
- p-value > α
- H0 CANNOT be rejected at the α % level
- the coefficient is insignificant / 0 at the α % level