Imaging and Cognitive Data Flashcards

1
Q

What do we end up with at the end of ROI analysis?

A

A spreadsheet of the extracted values - this tells us how much to normalise by

2
Q

What is the biggest downside to an ROI analysis?

A

You need to know what to look at (need an a-priori hypothesis)

e.g. need to know what the ROI is

3
Q

If we don't have an ROI, what can we do?

A

Conduct an exploratory analysis - there may be some existing literature to base a hypothesis on

4
Q

What is VBM?

A

Voxel-Based Morphometry

5
Q

What is voxel-based morphometry?

A

A “global” volumetric brain analysis

A single experiment that enables you to identify local grey matter changes and other associations across the whole brain.

6
Q

What is the goal of VBM?

A

To get an example brain where affected regions are highlighted

7
Q

When should you use VBM?

A

When you can’t generate an a-priori hypothesis

8
Q

What scans does VBM use?

A

T1 scans

9
Q

What is the first step to VBM?

A

Brain extraction

10
Q

What is brain extraction?

A

FSL has a tool called BET (Brain Extraction Tool) which skull-strips the brain, removing non-brain tissue and providing a graphical output to check the result

11
Q

What is the second step to VBM?

A

Grey matter tissue segmentation

12
Q

What is tissue segmentation?

A

FSL has a tool called FAST, which runs on skull-stripped images to generate tissue probability maps (TPMs)

13
Q

What are grey matter tissue probability maps (GM TPMs)?

A

A quantitative image in which the non-grey-matter voxel values (e.g. white matter) have been set to 0 so they don't appear on the scan.

The voxel values in TPMs have meaning - they convey the fraction of that voxel which contains grey matter
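
Because the voxel values are fractions, summing them gives a grey-matter volume. A minimal numpy sketch, using a synthetic (randomly generated) TPM and an assumed 1 mm isotropic voxel size purely for illustration:

```python
import numpy as np

# Hypothetical 3D grey-matter tissue probability map (values 0..1).
# Each voxel value = the fraction of that voxel containing grey matter.
rng = np.random.default_rng(0)
gm_tpm = rng.random((4, 4, 4))

# With a known voxel size (here assumed to be 1 x 1 x 1 mm), summing
# the fractions gives the total grey-matter volume in the image.
voxel_volume_mm3 = 1.0 * 1.0 * 1.0
gm_volume = gm_tpm.sum() * voxel_volume_mm3
```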

14
Q

What is step 3 of VBM?

A

Templates and registration

15
Q

Why are templates useful for VBM?

A

VBM experiments all require the same shared space; templates like MNI152 help to facilitate this - we can register our VBM scans to the template

16
Q

What is an important issue with registration?

A

Registration introduces error and noise into the data

The more the target image is unlike the scan, the more noise and error we get in our result

17
Q

If the MNI152 brain is different from patient brains with disease (e.g. atrophy), what can we do instead? (This is an important step in VBM.)

A

Make a template image from the data we are using in the analysis

18
Q

What is a benefit of creating a template from our own data set?

A

The target image would be as geometrically similar to our scans as possible - minimising registration error

19
Q

What is a problem with uneven group sizes when creating the template?

A

If we are testing different groups, we are hypothesising that one group has consistently "different" brains to the other.

We need to be sure that these differences are evenly weighted in our template image, or the template would look disproportionately more like one group than the other, creating an unbalanced spread of noise across our experiment.

20
Q

How do we resolve uneven group sizes for the template?

A

FSL-VBM needs a file listing all of the scans contributing to our image template.

IF we are testing a group design AND our group sizes are not the same, we need to randomly cut some scan names out of the list to make it balanced.

Otherwise we list all the scans in the FSL-VBM file.

21
Q

Do we still use the MNI152 brain in VBM?

A

Yes - the MNI152 brain is used as a starting point for making our template brain in a two-step process

22
Q

What is the two-step registration process for creating our own template with MNI152?

A
  1. A 12-DOF linear (affine) registration to the approximate shape of the MNI152 brain
  2. A non-linear registration applied on top of that transformation to finish
23
Q

What happens once we have registered the participant's native space to the MNI152 brain space?

A

We apply the same to the grey matter TPM (we co-register everything to the MNI152 space)

Running this for every participant gives us a set of GM TPMs which are in MNI152 space
-> the GM TPMs are then averaged to create a new template image

So we have taken the mathematical average of all the voxels in the grey matter from all the scans to create the template
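
The averaging step itself is just a voxelwise mean. A minimal numpy sketch, using a synthetic stack of already-registered TPMs (shapes and values are illustrative only):

```python
import numpy as np

# Hypothetical stack of grey-matter TPMs, all already registered to
# MNI152 space: shape (n_subjects, x, y, z).
rng = np.random.default_rng(1)
registered_tpms = rng.random((10, 8, 8, 8))

# The study-specific template is the mathematical average of every
# voxel across all subjects' registered TPMs.
template = registered_tpms.mean(axis=0)
```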

24
Q

What happens once the template has been created?

A

We go back to the original (raw data) native space GM TPMs and then register them all to the new GM template we have made by performing a fresh linear and non-linear registration

25
Q

What does it mean once the raw GM TPMs have been registered to the new template?

A

Every GM TPM should now have a reasonably high-quality transformation into the same shared space.

So if we examine the “same” voxel in each transformed image, the registered TPM value reflects the quantity of GM, per-participant, for the same anatomical point.

26
Q

What does this registration form the basis of?

A

Voxelwise analysis - running a statistical test for every voxel in the brain. It's the basis of how VBM studies work.

27
Q

What is a HUGE LIMITATION of this method?

A

We lose all information regarding cortical thickness

28
Q

Explain how we lose this cortical thickness information.

A

GM atrophy primarily presents as a thinning of the cortex.

If we register everyone's GM TPM to the same target space, this will make thin cortices thicker and thick cortices thinner.

All cortices will end up the same thickness in order to match the template image.

29
Q

For a voxelwise analysis it is essential everyone shares the same space but how can we preserve information about how thick or thin someone’s cortex was pre-transformation?

A

When we perform our registrations we can save the deformation map.

This is an image where voxel values represent how much the original scan was expanded or contracted at that point to match template space.

30
Q

What is a deformation map?

A

It's a side output from the non-linear registration - a map of which voxels had to be inflated or deflated to match the template

- an image where voxel values represent how much the original scan was expanded or contracted to match template space

31
Q

What can we do with a deformation map?

A

Multiply it by the raw transformation (the registered GM TPM)

This gives us a version of the GM TPM, in template space, where cortex thickness has been spatially matched but the GM values have been increased or decreased by the degree that it was inflated or deflated to achieve this
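
Numerically, this "modulation" is just a voxelwise multiplication. A minimal numpy sketch with synthetic data (the value ranges are illustrative assumptions, not real outputs):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical registered GM TPM in template space (values 0..1).
tpm = rng.random((8, 8, 8))

# Hypothetical deformation map from the non-linear registration:
# values > 1 where the scan was expanded to match the template,
# values < 1 where it was contracted.
deformation = rng.uniform(0.5, 1.5, size=(8, 8, 8))

# Modulation: voxelwise multiplication scales the GM values up or down
# by the degree of expansion/contraction, preserving volume information
# that spatial normalisation would otherwise remove.
modulated_tpm = tpm * deformation
```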

32
Q

What is it called when the deformation map is multiplied by the raw transformation?

A

“modulated” TPMs - these are used in the statistical analysis

33
Q

What is the fourth step of VBM?

A

Smoothing

34
Q

What is smoothing?

A

It is essentially making the picture blurry.

Every voxel is replaced by an average of itself and its neighbouring voxels
- taking an average over local areas boosts signal relative to noise
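
The neighbourhood-averaging idea can be sketched with a simple (non-Gaussian) box filter - a toy stand-in for real smoothing, using synthetic data:

```python
import numpy as np

def box_smooth_3d(img):
    """Replace each voxel with the mean of its 3x3x3 neighbourhood
    (edges are handled by replicating the border voxels)."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    nx, ny, nz = img.shape
    out = np.zeros(img.shape, dtype=float)
    for dx in range(3):
        for dy in range(3):
            for dz in range(3):
                out += padded[dx:dx + nx, dy:dy + ny, dz:dz + nz]
    return out / 27.0

# Smoothing a noisy image pulls each voxel towards its local average,
# suppressing random noise while preserving broad (real) patterns.
rng = np.random.default_rng(6)
noisy = 0.5 + rng.normal(0.0, 0.1, size=(10, 10, 10))
smoothed = box_smooth_3d(noisy)
```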

35
Q

Why do we smooth?

A

To boost signal to noise

36
Q

What is signal?

A

Signal = patterns in grey matter values reflect a numerical change in a consistent direction across all relevant voxels- so neighbourhood grey matter values will all be higher or lower in a particular direction

37
Q

What is noise?

A

Noise- comes from image transformations, scanner/sequence imperfections

= a numerical change on top of the “true” GM values which is random

38
Q

Why is increasing signal to noise through smoothing important?

A

Because any signal can be relied on to be consistent across a region of a scan, by averaging voxel values with their neighbouring values we aim to preserve the effect of brain atrophy while smoothing random noise out of the data.

39
Q

How does FSL VBM smooth data?

A

FSL-VBM smooths your data to 3 different degrees, measured in "sigma" values (a sigma of 2, 3 and 4).

These hold a fixed relationship to the "full-width at half-maximum" (FWHM), the way smoothing is usually measured in image analysis.

Smoothing uses a Gaussian curve to weight the averaging of a voxel's value differently, depending on how far away the neighbouring voxels are.

A higher sigma value = higher FWHM = smoothing over a greater distance.
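
The sigma-to-FWHM relationship is a standard property of the Gaussian curve, FWHM = 2 * sqrt(2 * ln 2) * sigma ≈ 2.355 * sigma, so the three sigmas can be converted directly:

```python
import math

def sigma_to_fwhm(sigma):
    # Standard Gaussian relationship: FWHM = 2 * sqrt(2 * ln 2) * sigma
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma

for sigma in (2, 3, 4):  # the three degrees FSL-VBM tries
    print(sigma, round(sigma_to_fwhm(sigma), 2))
```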

40
Q

What happens after the data has been smoothed by 3 different degrees in FSL VBM?

A

FSL-VBM will then run quick pilot statistics on each version - we identify the set that has the strongest statistical values -> theoretically, that degree of smoothing is most beneficial to your signal-to-noise problem.

41
Q

What is step five of VBM?

A

The analysis

42
Q

How does FSL carry out the analysis?

A

A voxelwise analysis will perform our statistical test separately for every voxel co-ordinate across all participants.

FSL runs statistics on images using a tool called RANDOMISE.

This can implement any desired statistical design (groupwise, correlation, control for additional variables etc.) via permutation testing.
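
A minimal sketch of the voxelwise idea (not FSL's RANDOMISE itself): computing a two-sample t statistic at every voxel coordinate simultaneously, with synthetic group data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical modulated, smoothed GM TPMs for two groups,
# shape (n_subjects, x, y, z), all in the same template space.
group_a = rng.normal(0.5, 0.1, size=(12, 4, 4, 4))
group_b = rng.normal(0.4, 0.1, size=(12, 4, 4, 4))

# Voxelwise two-sample t statistic (equal-variance form), computed
# for every voxel coordinate at once.
na, nb = len(group_a), len(group_b)
mean_diff = group_a.mean(axis=0) - group_b.mean(axis=0)
pooled_var = (group_a.var(axis=0, ddof=1) * (na - 1)
              + group_b.var(axis=0, ddof=1) * (nb - 1)) / (na + nb - 2)
t_map = mean_diff / np.sqrt(pooled_var * (1 / na + 1 / nb))
```

The result is an image-shaped map with one test statistic per voxel, which is exactly the structure a VBM analysis produces.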

43
Q

What is permutation testing?

A

A non-parametric implementation of statistical testing

More permutations = more accurate statistical values.

For a quick check, do 500-1000.
For publications, do 10,000
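
The principle at a single voxel can be sketched as follows (a toy illustration with synthetic data, not RANDOMISE's actual implementation): shuffle the group labels many times and count how often the shuffled group difference is at least as large as the observed one.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical GM values at one voxel for two groups of 15 subjects.
group_a = rng.normal(0.55, 0.05, size=15)
group_b = rng.normal(0.45, 0.05, size=15)
observed = group_a.mean() - group_b.mean()

pooled = np.concatenate([group_a, group_b])
n_perm = 1000  # quick check; ~10,000 for publication
count = 0
for _ in range(n_perm):
    rng.shuffle(pooled)  # randomly reassign group labels
    perm_diff = pooled[:15].mean() - pooled[15:].mean()
    if abs(perm_diff) >= abs(observed):
        count += 1

# p value = fraction of permutations at least as extreme as observed
p_value = (count + 1) / (n_perm + 1)
```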

44
Q

What is the result of the VBM experiment?

A

We end up with a final image, in template space, where each voxel value is now the p value of the statistical test for that location.

45
Q

What is the problem if in the final image we get a p value for every single value?

A

It is a multiple comparisons problem
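
The scale of the problem is easy to see with arithmetic (the voxel count below is an illustrative assumption, not a fixed property of any scan):

```python
# At an uncorrected threshold of p < 0.05, each voxel independently has
# a 5% chance of a false positive, so the expected number of spurious
# "significant" voxels scales with the size of the image.
n_voxels = 200_000  # hypothetical brain-sized image
alpha = 0.05
expected_false_positives = n_voxels * alpha  # 10,000 voxels by chance
```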

46
Q

How do we overcome this multiple comparisons problem?

A

Cluster enhancement

We can assume that voxels which become “significant” by chance alone will be positioned randomly, while ones which are significant because of a real effect will neighbour one another in clusters.

So a final correction is made to suppress "lone" significant voxels and to enhance areas where voxels are consistently significant
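
FSL's actual method is threshold-free cluster enhancement (TFCE); the toy sketch below only illustrates the underlying principle, suppressing significant voxels that have no significant neighbours, using a synthetic significance map:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical binary map of voxels "significant" at p < 0.05.
sig = rng.random((6, 6, 6)) < 0.05

# Count significant face-adjacent neighbours for each voxel by shifting
# the zero-padded map one voxel along each axis in both directions.
padded = np.pad(sig.astype(int), 1)
neighbours = np.zeros(sig.shape, dtype=int)
for axis in range(3):
    for step in (-1, 1):
        shifted = np.roll(padded, step, axis=axis)
        neighbours += shifted[1:-1, 1:-1, 1:-1]

# Keep only significant voxels with at least one significant neighbour,
# suppressing isolated (likely spurious) voxels.
cleaned = sig & (neighbours >= 1)
```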

47
Q

What are the pros of ROI analysis?

A

Higher measurement accuracy and statistical sensitivity for the regions examined

48
Q

What is a limitation of ROI analysis?

A

Very restricted in regional scope (needs a good a-priori hypothesis)

49
Q

What are the pros of global analyses (e.g. VBM)?

A

Considers the whole brain and therefore nothing is "missed"

50
Q

What is a limitation of global analyses (e.g. VBM)?

A

The elaborate processing and transformation steps create a relatively high amount of noise in our final data - for a given region, our statistical power will not be as high as an ROI study of the same place.

Multiple comparisons also remain unideal even after cluster-based correction