{ "nbformat_minor": 0, "nbformat": 4, "cells": [ { "execution_count": null, "cell_type": "code", "source": [ "%matplotlib inline" ], "outputs": [], "metadata": { "collapsed": false } }, { "source": [ "\n\nHead model and forward computation\n==================================\n\nThe aim of this tutorial is to be a getting started for forward\ncomputation.\n\nFor more extensive details and presentation of the general\nconcepts for forward modeling. See `ch_forward`.\n\n\n" ], "cell_type": "markdown", "metadata": {} }, { "execution_count": null, "cell_type": "code", "source": [ "import mne\nfrom mne.datasets import sample\ndata_path = sample.data_path()\n\n# the raw file containing the channel location + types\nraw_fname = data_path + '/MEG/sample/sample_audvis_raw.fif'\n# The paths to freesurfer reconstructions\nsubjects_dir = data_path + '/subjects'\nsubject = 'sample'" ], "outputs": [], "metadata": { "collapsed": false } }, { "source": [ "Computing the forward operator\n------------------------------\n\nTo compute a forward operator we need:\n\n - a ``-trans.fif`` file that contains the coregistration info.\n - a source space\n - the BEM surfaces\n\n" ], "cell_type": "markdown", "metadata": {} }, { "source": [ "Compute and visualize BEM surfaces\n----------------------------------\n\nThe BEM surfaces are the triangulations of the interfaces between different\ntissues needed for forward computation. These surfaces are for example\nthe inner skull surface, the outer skull surface and the outer skill\nsurface.\n\nComputing the BEM surfaces requires FreeSurfer and makes use of either of\nthe two following command line tools:\n\n - `gen_mne_watershed_bem`\n - `gen_mne_flash_bem`\n\nHere we'll assume it's already computed. It takes a few minutes per subject.\n\nFor EEG we use 3 layers (inner skull, outer skull, and skin) while for\nMEG 1 layer (inner skull) is enough.\n\nLet's look at these surfaces. The function :func:`mne.viz.plot_bem`\nassumes that you have the the *bem* folder of your subject FreeSurfer\nreconstruction the necessary files.\n\n" ], "cell_type": "markdown", "metadata": {} }, { "execution_count": null, "cell_type": "code", "source": [ "mne.viz.plot_bem(subject=subject, subjects_dir=subjects_dir,\n brain_surfaces='white', orientation='coronal')" ], "outputs": [], "metadata": { "collapsed": false } }, { "source": [ "Visualization the coregistration\n--------------------------------\n\nThe coregistration is operation that allows to position the head and the\nsensors in a common coordinate system. In the MNE software the transformation\nto align the head and the sensors in stored in a so-called **trans file**.\nIt is a FIF file that ends with -trans.fif. 
{ "source": [ "Visualizing the coregistration\n------------------------------\n\nThe coregistration is the operation that allows one to position the head and\nthe sensors in a common coordinate system. In the MNE software the\ntransformation to align the head and the sensors is stored in a so-called\n**trans file**. It is a FIF file that ends with -trans.fif. It can be obtained\nwith mne_analyze (Unix tools), mne.gui.coregistration (in Python) or mrilab\nif you're using a Neuromag system.\n\nFor the Python version see :func:`mne.gui.coregistration`.\n\nHere we assume the coregistration is done, so we just visually check the\nalignment with the following code.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "# The transformation file obtained by coregistration\ntrans = data_path + '/MEG/sample/sample_audvis_raw-trans.fif'\n\ninfo = mne.io.read_info(raw_fname)\nmne.viz.plot_trans(info, trans, subject=subject, dig=True,\n                   meg_sensors=True, subjects_dir=subjects_dir)" ], "outputs": [], "metadata": { "collapsed": false } },
{ "source": [ "Compute Source Space\n--------------------\n\nThe source space defines the positions of the candidate source locations.\nThe following code computes such a cortical source space with\nan OCT-6 resolution.\n\nSee `setting_up_source_space` for details on source space definition\nand the spacing parameter.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "src = mne.setup_source_space(subject, spacing='oct6',\n                             subjects_dir=subjects_dir,\n                             add_dist=False, overwrite=True)\nprint(src)" ], "outputs": [], "metadata": { "collapsed": false } },
{ "source": [ "``src`` contains two parts, one for the left hemisphere (4098 locations) and\none for the right hemisphere (4098 locations). Sources can be visualized on\ntop of the BEM surfaces.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "mne.viz.plot_bem(subject=subject, subjects_dir=subjects_dir,\n                 brain_surfaces='white', src=src, orientation='coronal')" ], "outputs": [], "metadata": { "collapsed": false } },
{ "source": [ "However, only sources that lie in the plotted MRI slices are shown.\nLet's write a few lines of mayavi code to see all sources.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "import numpy as np  # noqa\nfrom mayavi import mlab  # noqa\nfrom surfer import Brain  # noqa\n\nbrain = Brain('sample', 'lh', 'inflated', subjects_dir=subjects_dir)\nsurf = brain._geo\n\nvertidx = np.where(src[0]['inuse'])[0]\n\nmlab.points3d(surf.x[vertidx], surf.y[vertidx],\n              surf.z[vertidx], color=(1, 1, 0), scale_factor=1.5)" ], "outputs": [], "metadata": { "collapsed": false } },
{ "source": [ "Compute forward solution\n------------------------\n\nWe can now compute the forward solution.\nTo reduce computation we'll just compute a single layer BEM (just inner\nskull) that can then be used for MEG (not EEG).\n\nWe specify whether we want a one-layer or a three-layer BEM using the\nconductivity parameter.\n\nThe BEM solution requires a BEM model which describes the geometry\nof the head and the conductivities of the different tissues.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "conductivity = (0.3,)  # for single layer\n# conductivity = (0.3, 0.006, 0.3)  # for three layers\nmodel = mne.make_bem_model(subject='sample', ico=4,\n                           conductivity=conductivity,\n                           subjects_dir=subjects_dir)\nbem = mne.make_bem_solution(model)" ], "outputs": [], "metadata": { "collapsed": false } },
{ "source": [ "Note that the BEM does not involve any use of the trans file. The BEM\nonly depends on the head geometry and conductivities.\nIt is therefore independent of the MEG data and the head position.\n\nLet's now compute the forward operator, commonly referred to as the\ngain or leadfield matrix.\n\nSee :func:`mne.make_forward_solution` for details on the meaning of the\nparameters.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "fwd = mne.make_forward_solution(raw_fname, trans=trans, src=src, bem=bem,\n                                fname=None, meg=True, eeg=False,\n                                mindist=5.0, n_jobs=2)\nprint(fwd)" ], "outputs": [], "metadata": { "collapsed": false } },
{ "source": [ "We can explore the content of ``fwd`` to access the numpy array that contains\nthe gain matrix.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "leadfield = fwd['sol']['data']\nprint(\"Leadfield size : %d sensors x %d dipoles\" % leadfield.shape)" ], "outputs": [], "metadata": { "collapsed": false } },
{ "source": [ "To save a forward solution to disk you can use\n:func:`mne.write_forward_solution`, and to read it back from disk\n:func:`mne.read_forward_solution`. Don't forget that FIF files containing\nforward solutions should end with *-fwd.fif*.\n\nTo get a fixed-orientation forward solution, use\n:func:`mne.convert_forward_solution` to convert the free-orientation\nsolution to (surface-oriented) fixed orientation.\n\n" ], "cell_type": "markdown", "metadata": {} },
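{ "source": [ "Below is a minimal sketch of this round trip. The output file name is just an\nexample (note the *-fwd.fif* suffix), and ``fwd`` comes from the cell above.\n\n" ], "cell_type": "markdown", "metadata": {} },
{ "execution_count": null, "cell_type": "code", "source": [ "# Minimal sketch: write the forward solution to disk, read it back and\n# convert it to a surface-oriented, fixed-orientation solution.\n# The file name is just an example (note the -fwd.fif suffix).\nfname_fwd = data_path + '/MEG/sample/sample_audvis-tutorial-fwd.fif'\nmne.write_forward_solution(fname_fwd, fwd, overwrite=True)\nfwd_read = mne.read_forward_solution(fname_fwd)\nfwd_fixed = mne.convert_forward_solution(fwd_read, surf_ori=True,\n                                         force_fixed=True)\nprint(fwd_fixed)" ], "outputs": [], "metadata": { "collapsed": false } },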
{ "source": [ "Exercise\n--------\n\nBy looking at\n`sphx_glr_auto_examples_forward_plot_forward_sensitivity_maps.py`,\nplot the sensitivity maps for EEG and compare them with those for MEG. Can you\njustify the claims that:\n\n - MEG is not sensitive to radial sources\n - EEG is more sensitive to deep sources\n\nHow will the MEG sensitivity maps and histograms change if you use a free\norientation instead of a fixed/surface-oriented one?\n\nTry this by changing the mode parameter in :func:`mne.sensitivity_map`\naccordingly. Why don't we see any dipoles on the gyri?\n\n" ], "cell_type": "markdown", "metadata": {} } ],
"metadata": { "kernelspec": { "display_name": "Python 2", "name": "python2", "language": "python" }, "language_info": { "mimetype": "text/x-python", "nbconvert_exporter": "python", "name": "python", "file_extension": ".py", "version": "2.7.13", "pygments_lexer": "ipython2", "codemirror_mode": { "version": 2, "name": "ipython" } } } }