Tag Archives: fMRI

New method - BWAS

There is a report of a new method of analyzing fMRI scans – one that uses enormous sets of data and gives very clear results. Brain-wide association analysis (BWAS for short) was used to compare autistic and normal brains in a recent paper (citation below).

The scan data are divided into 47,636 small volumes of the brain, called voxels, and these are then analyzed in pairs, each voxel with every other voxel. This gives 1,134,570,430 voxel pairs for each brain. This sort of analysis has been done in the past, but only for restricted areas of the brain and not the whole brain. The method was devised by J. Feng of the Department of Computer Science, University of Warwick.
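To get a feel for the scale, here is a minimal Python sketch of the pair count and of the kind of voxel-by-voxel correlation step involved. The voxel count comes from the text; the array sizes and the use of a simple Pearson correlation are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

n_voxels = 47_636  # number of voxels reported in the text

# Each voxel is paired once with every other voxel: n * (n - 1) / 2 unique pairs.
n_pairs = n_voxels * (n_voxels - 1) // 2
print(f"{n_pairs:,}")  # 1,134,570,430

# Illustrative functional-connectivity step for one subject: correlate each
# voxel's time series with every other voxel's (random data stands in for a
# real, preprocessed scan; only 10 voxels are used to keep the sketch small).
n_timepoints, n_small = 200, 10
timeseries = np.random.randn(n_timepoints, n_small)
fc_matrix = np.corrcoef(timeseries.T)  # n_small x n_small correlation matrix
```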

This first paper featuring the method shows its strengths. Cheng and colleagues used data from over 900 existing scans from various sources, with matched autistic and normal subjects. The results are in the abstract below. (This blog does not usually deal with information on autism and similar conditions but tries to keep to normal function; I am not a physician. So the results are not being discussed, just the new method.)

“A flow chart of the brain-wide association study [termed BWAS, in line with genome-wide association studies (GWAS)] is shown in Fig. 1. This ‘discovery’ approach tests for differences between patients and controls in the connectivity of every pair of brain voxels at a whole-brain level. Unlike previous seed-based or independent components-based approaches, this method has the advantage of being fully unbiased, in that the connectivity of all brain voxels can be compared, not just selected brain regions. Additionally, we investigated clinical associations between the identified abnormal circuitry and symptom severity; and we also investigated the extent to which the analysis can reliably discriminate between patients and controls using a pattern classification approach. Further, we confirmed that our findings were robust by split data cross-validations.” FC = functional connectivity; ROI = region of interest.
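In code terms, the ‘discovery’ step described above boils down to a group comparison at every voxel pair. Here is a minimal sketch of that idea, assuming Fisher z-transformed connectivity values have already been computed per subject; the array shapes, the two-sample t-test, and the Bonferroni threshold are my illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np
from scipy import stats

# Assumed inputs: one Fisher z-transformed connectivity value per voxel pair
# per subject, shape = (n_subjects, n_pairs). Only 1,000 pairs are used here;
# the real analysis covers roughly 1.13 billion.
fc_autism   = np.random.randn(418, 1000)  # stand-in for the 418 autism scans
fc_controls = np.random.randn(509, 1000)  # stand-in for the 509 control scans

# Two-sample test at every voxel pair: which connections differ between groups?
t_vals, p_vals = stats.ttest_ind(fc_autism, fc_controls, axis=0)

# With a billion-plus tests, a strict multiple-comparison correction is needed;
# a plain Bonferroni cut-off is shown only as a placeholder.
alpha = 0.05 / p_vals.size
significant_pairs = np.where(p_vals < alpha)[0]
```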

The results are very clear and highly statistically significant.

Abstract: “Whole-brain voxel-based unbiased resting state functional connectivity was analysed in 418 subjects with autism and 509 matched typically developing individuals. We identified a key system in the middle temporal gyrus/superior temporal sulcus region that has reduced cortical functional connectivity (and increased with the medial thalamus), which is implicated in face expression processing involved in social behaviour. This system has reduced functional connectivity with the ventromedial prefrontal cortex, which is implicated in emotion and social communication. The middle temporal gyrus system is also implicated in theory of mind processing. We also identified in autism a second key system in the precuneus/superior parietal lobule region with reduced functional connectivity, which is implicated in spatial functions including of oneself, and of the spatial environment. It is proposed that these two types of functionality, face expression-related, and of one’s self and the environment, are important components of the computations involved in theory of mind, whether of oneself or of others, and that reduced connectivity within and between these regions may make a major contribution to the symptoms of autism.”

Cheng, W., Rolls, E., Gu, H., Zhang, J., & Feng, J. (2015). Autism: reduced connectivity between cortical areas involved in face expression, theory of mind, and the sense of self. Brain. DOI: 10.1093/brain/awv051

Accuracy in both time and space

There has been a problem with studying the human brain. It has been possible to look at where activity is happening using fMRI, but the time resolution is poor. On the other hand, activity can be followed with good time resolution using MEG and EEG, but the spatial resolution is poor. Only the placement of electrodes in epileptic patients has given clear spatial and temporal resolution. However, these opportunities are not common, and the placement of the electrodes is dictated by the treatment and not by any particular study. This has meant that much of what we know about the brain was gained from studies on animals, especially monkeys. The results on animals have been consistent with what can be seen in humans, but there is rarely detailed specific confirmation. This may be about to change.

Researchers at MIT are using fMRI with a resolution of a millimeter and MEG with a resolution of a millisecond, and combining them with a method called representational similarity analysis. They had subjects look at 92 images of various things for half a second each. The subjects viewed the same series of images multiple times while being scanned with fMRI and multiple times with MEG. The researchers then found the similarities between each image’s fMRI and MEG records for each subject. This allowed them to match the two scans and see the changes as single events, resolved in both time and space.
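Representational similarity analysis links the two kinds of recordings by comparing how each one distinguishes the set of images, rather than by comparing the raw signals directly. The sketch below shows the basic idea in Python; the array shapes, the correlation-distance metric, and the Spearman comparison are illustrative assumptions, not the study's actual code.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

n_images = 92

# Assumed inputs for one subject:
# fMRI: one response pattern per image from a region of interest (e.g. V1).
fmri_patterns = np.random.randn(n_images, 500)      # 500 voxels, illustrative
# MEG: one sensor pattern per image at each millisecond after image onset.
meg_patterns = np.random.randn(n_images, 300, 306)  # 300 ms x 306 sensors

# Representational dissimilarity matrix (RDM): how different is every pair of
# images according to this measurement? (condensed upper-triangle form)
fmri_rdm = pdist(fmri_patterns, metric="correlation")

# Correlate the fMRI RDM with the MEG RDM at every time point; the resulting
# time course shows when the MEG signal carries the same image distinctions
# as the fMRI region, tying the spatial and temporal pictures together.
rsa_timecourse = []
for t in range(meg_patterns.shape[1]):
    meg_rdm = pdist(meg_patterns[:, t, :], metric="correlation")
    rho, _ = spearmanr(fmri_rdm, meg_rdm)
    rsa_timecourse.append(rho)
```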

One of the researchers explained: “We wanted to measure how visual information flows through the brain. It’s just pure automatic machinery that starts every time you open your eyes, and it’s incredibly fast. This is a very complex process, and we have not yet looked at higher cognitive processes that come later, such as recalling thoughts and memories when you are watching objects.” This flow was extremely close to the flow found in monkeys.

It appears to take 50 milliseconds after exposure to an image for the visual information to reach the first area of the visual cortex (V1); during this time the information has passed through processing in the retina and the thalamus. The information is then processed in stages in the visual cortex and reaches the inferior temporal cortex at about 120 milliseconds. Here objects are identified and classified, all done by 160 milliseconds.

Here is the abstract:

“A comprehensive picture of object processing in the human brain requires combining both spatial and temporal information about brain activity. Here we acquired human magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) responses to 92 object images. Multivariate pattern classification applied to MEG revealed the time course of object processing: whereas individual images were discriminated by visual representations early, ordinate and superordinate category levels emerged relatively late. Using representational similarity analysis, we combined human fMRI and MEG to show content-specific correspondence between early MEG responses and primary visual cortex (V1), and later MEG responses and inferior temporal (IT) cortex. We identified transient and persistent neural activities during object processing with sources in V1 and IT. Finally, we correlated human MEG signals to single-unit responses in monkey IT. Together, our findings provide an integrated space- and time-resolved view of human object categorization during the first few hundred milliseconds of vision.”

Source:

http://www.kurzweilai.net/where-and-when-the-brain-recognizes-categorizes-an-object – review of the paper: Cichy, R. M., Pantazis, D., & Oliva, A. (2014). Resolving human object recognition in space and time. Nature Neuroscience. DOI: 10.1038/nn.3635