Lecture 2 Flashcards
1
Q
Labelling
A
- After detecting foreground regions, label them
- Find groups of pixels that are connected to each other -> connected component analysis
2
Q
Connectivity
A
- 2D
o 4-connectivity: any two pixels with a Euclidean distance D = 1
o 8-connectivity: any two pixels with a Euclidean distance D < 2
- 3D
o 6-connectivity: any two pixels with a Euclidean distance D = 1
o 26-connectivity: any two pixels with a Euclidean distance D < 2
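A minimal sketch (assuming NumPy; the helper name neighbour_offsets is only illustrative) that enumerates the neighbour offsets implied by each connectivity definition above:

```python
import numpy as np

# Enumerate neighbour offsets of the centre pixel by Euclidean distance.
def neighbour_offsets(ndim, max_dist):
    offsets = []
    for offset in np.ndindex(*(3,) * ndim):   # all offsets in {-1, 0, 1}^ndim
        offset = np.array(offset) - 1
        d = np.linalg.norm(offset)
        if 0 < d <= max_dist:                 # exclude the centre itself
            offsets.append(tuple(offset))
    return offsets

print(len(neighbour_offsets(2, 1.0)))   # 4  -> 4-connectivity (D = 1)
print(len(neighbour_offsets(2, 1.5)))   # 8  -> 8-connectivity (D < 2)
print(len(neighbour_offsets(3, 1.0)))   # 6  -> 6-connectivity (D = 1)
print(len(neighbour_offsets(3, 1.8)))   # 26 -> 26-connectivity (D < 2)
```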
3
Q
Connected Component Analysis
A
- Two-pass blob colouring algorithm
- Forward pass: labelling takes into account connections to already labelled pixels
- Keep track of connected regions using a connectivity table
- Backward pass: replace each label with the lowest connected label
o In the example on the right we thus end up with labels 1 and 3
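A minimal sketch of the two-pass labelling described above, assuming NumPy, a 2D binary image and 8-connectivity; the names two_pass_label and find are illustrative, and the connectivity table is kept in a plain dictionary of label equivalences:

```python
import numpy as np

def two_pass_label(binary, offsets=((-1, 0), (0, -1), (-1, -1), (-1, 1))):
    """Two-pass connected component labelling (8-connectivity by default)."""
    labels = np.zeros(binary.shape, dtype=int)
    parent = {}                     # connectivity table: label -> lowest known equivalent

    def find(x):                    # follow equivalences down to the lowest connected label
        while parent[x] != x:
            x = parent[x]
        return x

    next_label = 1
    # forward pass: label using already visited neighbours, record connections
    for (r, c), fg in np.ndenumerate(binary):
        if not fg:
            continue
        neigh = [labels[r + dr, c + dc] for dr, dc in offsets
                 if 0 <= r + dr < binary.shape[0] and 0 <= c + dc < binary.shape[1]
                 and labels[r + dr, c + dc] > 0]
        if not neigh:
            labels[r, c] = next_label
            parent[next_label] = next_label
            next_label += 1
        else:
            m = min(find(n) for n in neigh)
            labels[r, c] = m
            for n in neigh:         # record that these labels are connected
                parent[find(n)] = m
    # second pass (the card's backward pass): replace with the lowest connected label
    for (r, c), lab in np.ndenumerate(labels):
        if lab:
            labels[r, c] = find(lab)
    return labels
```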
4
Q
Region Growing Algorithm
A
- Start from seed point (coordinates in an image)
o Has some intensity value
- Check the neighbours of the seed point, given a connectivity measure
- If neighbour fulfils some criteria it is included in the segmentation
o For example similar texture or intensity
o Usually requires use of a (double) threshold
- Repeat the previous steps for all neighbours included in the segmentation
o The segmented region grows this way
- Iterate until no improvements are made
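A minimal region-growing sketch under the assumptions above: NumPy, a 2D image, 4-connectivity and a double intensity threshold [low, high] as the inclusion criterion. The name region_grow is illustrative.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, low, high,
                connectivity=((-1, 0), (1, 0), (0, -1), (0, 1))):
    """Grow a region from `seed` (row, col): include any connected neighbour
    whose intensity lies within the double threshold [low, high]."""
    segmented = np.zeros(image.shape, dtype=bool)
    segmented[seed] = True
    frontier = deque([seed])
    while frontier:                               # iterate until no new pixels are added
        r, c = frontier.popleft()
        for dr, dc in connectivity:               # check neighbours of the current pixel
            nr, nc = r + dr, c + dc
            if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                    and not segmented[nr, nc] and low <= image[nr, nc] <= high):
                segmented[nr, nc] = True          # neighbour fulfils the criterion
                frontier.append((nr, nc))
    return segmented
```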
5
Q
Segmentation Performance
A
- Class detection can have 4 different results, assuming the foreground is the positive class and
the background is the negative class
o True positive (TP): foreground pixel labelled as foreground
o True negative (TN): background pixel labelled as background
o False positive (FP): background pixel labelled as foreground
o False negative (FN): foreground pixel labelled as background
- Accuracy: (TP + TN) / (TP + TN + FP + FN)
o Problematic because we often have extremely imbalanced classes
- Specificity: TN / (TN + FP)
- Sensitivity/recall: TP / (TP + FN)
- Dice coefficient: 2TP / (2TP + FP + FN)
- Jaccard measure: TP / (TP + FP + FN) = Intersection over Union (IoU)
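The pixel-wise metrics above, computed from two boolean masks in a small NumPy sketch (the function name segmentation_metrics is illustrative):

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """pred, truth: boolean masks of the same shape (True = foreground)."""
    tp = np.sum(pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    return {
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
        "specificity": tn / (tn + fp),
        "sensitivity": tp / (tp + fn),
        "dice":        2 * tp / (2 * tp + fp + fn),
        "jaccard":     tp / (tp + fp + fn),       # = intersection over union (IoU)
    }
```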
6
Q
Classification
A
- First-order statistical texture analysis -> histogram, typically normalised
- Can extract six features, shown on the right
- Does not consider spatial relationships and correlation between pixels
o Identical histograms can belong to different textures
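The figure listing the six features is not reproduced here; a commonly used set is mean, variance, skewness, kurtosis, energy and entropy, which this NumPy sketch assumes:

```python
import numpy as np

def first_order_features(image, bins=256):
    """First-order texture features from the normalised grey-level histogram
    (spatial relations between pixels are ignored)."""
    hist, edges = np.histogram(image, bins=bins)
    p = hist / hist.sum()                         # normalised histogram
    levels = 0.5 * (edges[:-1] + edges[1:])       # bin centres
    mean = np.sum(levels * p)
    var = np.sum((levels - mean) ** 2 * p)
    return {
        "mean": mean,
        "variance": var,
        "skewness": np.sum((levels - mean) ** 3 * p) / var ** 1.5,
        "kurtosis": np.sum((levels - mean) ** 4 * p) / var ** 2,
        "energy": np.sum(p ** 2),
        "entropy": -np.sum(p[p > 0] * np.log2(p[p > 0])),
    }
```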
7
Q
Local Binary Pattern
A
- Takes a centre pixel gc and P neighbours gp at a distance R
o Now takes spatial relations into account for classification
- Can also use k-nearest neighbours
o But need to normalise the features
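A sketch assuming scikit-image's local_binary_pattern (using the 'uniform' variant, which is one common choice). The normalised histogram of LBP codes can then be used as a feature vector for e.g. a k-nearest-neighbour classifier:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(image, P=8, R=1.0):
    """Compare each of the P neighbours g_p at radius R against the centre
    pixel g_c, then summarise the resulting codes as a normalised histogram."""
    codes = local_binary_pattern(image, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=int(codes.max()) + 1)
    return hist / hist.sum()          # normalised feature vector, suitable for k-NN
```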
8
Q
Gabor filters
A
- Multiply Gaussian kernel with sinusoidal function
o Can change frequency of sinusoidal function
o Can change orientation of sine wave
o Can change scale (sigma) of the Gaussian function
- Real and imaginary components representing orthogonal directions
- Can create a bank of filters by varying parameters of filters
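A sketch of a filter bank built with scikit-image's gabor_kernel, varying frequency, orientation and Gaussian scale as above; the parameter values are arbitrary examples:

```python
import numpy as np
from skimage.filters import gabor_kernel

# Build a Gabor filter bank by varying frequency, orientation (theta) and sigma.
# Each kernel is complex, i.e. it has a real and an imaginary component.
def gabor_bank(frequencies=(0.1, 0.2, 0.4),
               thetas=np.linspace(0, np.pi, 4, endpoint=False),
               sigmas=(2, 4)):
    return [gabor_kernel(f, theta=t, sigma_x=s, sigma_y=s)
            for f in frequencies for t in thetas for s in sigmas]

bank = gabor_bank()
print(len(bank), bank[0].dtype)       # 24 filters, complex-valued kernels
```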
9
Q
Classification Performance Metrics
A
- Each classified case gets a likelihood score
- Often use ROC curves for analysis
- Check all thresholds for classification
o For each, calculate sensitivity and 1 - specificity
o Gives a point in a 2D space, make a curve through all points
- Calculate Area Under the Curve (AUC) for ROC curves
o Range [0,1]
o Identity line (AUC = 0.5) indicates random chance
o Also seen as the probability that the model ranks a random positive example more
highly than a random negative sample
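A sketch using scikit-learn's roc_curve and roc_auc_score on made-up labels and scores, illustrating the threshold sweep described above:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true  = np.array([0, 0, 1, 1, 0, 1])                 # ground-truth class labels
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7])    # likelihood score per case

# Sweep all thresholds: each gives (1 - specificity, sensitivity), i.e. one ROC point.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
auc = roc_auc_score(y_true, y_score)                   # area under the ROC curve, in [0, 1]
print(auc)
```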
10
Q
Detection
A
- Answers questions such as: are there any abnormal structures in this image?
- Feature based
o Extract relevant features for every location
o Combine them (e.g. machine learning)
o Threshold result (post-processing)
- Classifier: trained by combining several features
o After classification, each detection gets a likelihood value
o Then perform thresholding and post-processing
- Post-processing
o Morphology to remove small areas
o Discard detections that are too small/large
▪ Can be learned from training data
o Use prior knowledge to discard highly unlikely detected features
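A sketch of the post-processing step, assuming scikit-image (remove_small_objects plus a region-size check); the size limits and the name postprocess_detections are illustrative and could be learned from training data:

```python
from skimage import measure, morphology

def postprocess_detections(likelihood, threshold, min_area, max_area):
    """Threshold the per-pixel likelihood map, then discard detections whose
    area is implausibly small or large."""
    mask = likelihood > threshold
    mask = morphology.remove_small_objects(mask, min_size=min_area)  # morphology step
    labels = measure.label(mask)                                     # connected components
    for region in measure.regionprops(labels):
        if region.area > max_area:                                   # discard too-large detections
            mask[labels == region.label] = False
    return mask
```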
11
Q
Template Matching
A
- When you need to find a known object in an image
o Can build a matched filter for the appearance of the object
▪ High responses when convolving the matched filter with the image
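A sketch of a matched filter as cross-correlation with a zero-mean template, assuming SciPy (correlation with the template is equivalent to convolution with the flipped template):

```python
import numpy as np
from scipy.signal import correlate2d

def matched_filter_response(image, template):
    """Cross-correlate the image with a zero-mean version of the template;
    high responses indicate locations that resemble the known object."""
    t = template - template.mean()            # zero-mean so flat regions give low response
    return correlate2d(image, t, mode="same")

# The peak of the response map is the most likely object location, e.g.:
# peak = np.unravel_index(np.argmax(matched_filter_response(img, tmpl)), img.shape)
```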
12
Q
Detection Performance Metrics
A
- Need to define a hit criterion
o If d < 2r -> hit the object
o Ensures that we do not need an exact match
- Sensitivity/recall (R) = TP / (TP + FN)
- Precision (P) = TP / (TP + FP)
- F1-score = 2PR / (P + R) -> balance between precision and recall
- Since each detection has a likelihood value, we can threshold to select
detections
o Influences TP, FP, FN
- Free-response receiver operating characteristic (FROC)
o Like ROC curves, but now sensitivity vs average
number of false positives
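A small sketch of precision, recall and F1 from detection counts (plain Python; the counts in the example call are made up):

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall and F1 from hit/miss counts of detections."""
    recall = tp / (tp + fn)                       # sensitivity
    precision = tp / (tp + fp)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

print(detection_metrics(tp=8, fp=2, fn=4))        # (0.8, 0.667, 0.727)
```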