Neural Plasticity

Special Issue: Neuroplastic Mechanisms Underlying Perceptual and Cognitive Enhancement

Research Article | Open Access

Volume 2016 | Article ID 5940634 | 15 pages | https://doi.org/10.1155/2016/5940634

Effects of Fast Simple Numerical Calculation Training on Neural Systems

Academic Editor: Jyoti Mishra
Received: 12 Jul 2015
Revised: 21 Oct 2015
Accepted: 02 Nov 2015
Published: 06 Jan 2016

Abstract

Cognitive training, including fast simple numerical calculation (FSNC), has been shown to improve performance on untrained processing speed and executive function tasks in the elderly. However, the effects of FSNC training on cognitive functions in the young and on neural mechanisms remain unknown. We investigated the effects of 1-week intensive FSNC training on cognitive function, regional gray matter volume (rGMV), and regional cerebral blood flow at rest (resting rCBF) in healthy young adults. FSNC training was associated with improvements in performance on simple processing speed, speeded executive functioning, and simple and complex arithmetic tasks. FSNC training was also associated with a reduction in rGMV and an increase in resting rCBF in the frontopolar areas, as well as a weak but widespread increase in resting rCBF in an anatomical cluster in the posterior region. These results provide direct evidence that FSNC training alone can improve performance on processing speed and executive function tasks and induce plasticity of brain structure and perfusion. Our results also indicate that changes in neural systems in the frontopolar areas may underlie these cognitive improvements.

1. Introduction

This study focused on training to improve performance on a fast simple numerical calculation (FSNC) task, which involves quickly solving mathematical problems, namely, single-digit addition, subtraction, and multiplication. The ability to complete an FSNC task correlates with processing speed, quantitative ability or knowledge, and general intelligence [1]. Previous studies of psychological interventions showed that cognitive interventions involving arithmetic [2, 3] or FSNC [4] tasks lead to improvements in performance on untrained cognitive tasks (transfer effects) among the elderly as well as dementia patients.

However, some questions related to these studies remain to be answered. First, these studies used multiple training protocols such as reading and arithmetic involving simple and more complex numerical calculations [2, 3] or a battery of several cognitive training tasks including FSNC [4]. Thus, whether FSNC training alone affects untrained cognitive functions remains unclear. Second, whether the same effect occurs in the young remains to be investigated. Third, the effects on neural systems are also unclear.

This study aimed at investigating the effect of FSNC training on cognitive functions and neural systems in healthy young adults. Considering the transfer effects brought about by the FSNC training, it is important to investigate the extent of and neural mechanisms underlying these FSNC training effects. Among FSNC training-affected neural mechanisms, we focused on changes in regional gray matter volume (rGMV) and regional cerebral blood flow during rest (resting rCBF). Using imaging analyses, we determined whether the effects of FSNC training extend beyond task-specific functional activation in the brain and, if so, the areas in which such changes occur.

Based on the results of our previous study [4], we hypothesized that FSNC training would improve executive functions and processing speed and activate neural mechanisms in the prefrontal cortex. In that study, we found that participation in a brain training game that included FSNC improved subsequent performance on processing speed and speeded executive function tasks. Several prefrontal regions, including areas in the middle frontal, inferior frontal, orbitofrontal, and frontopolar areas, are activated during numerical calculation [5, 6] and executive functioning and are also associated with processing speed (e.g., [7]). Thus, consistent with the hypothesis of our previous study [4], we reasoned that neural mechanisms in the prefrontal regions would be affected by FSNC training.

Using (a) various psychological measures, such as arithmetic measures, processing speed, and executive function, (b) rGMV analysis with voxel-based morphometry (VBM), and (c) resting rCBF analyses, we investigated the effects of 1-week intensive (up to 4 h/day) adaptive FSNC training on these variables in healthy young adults. Training consisted of three paper-and-pencil FSNC tasks (addition, subtraction, and multiplication) as well as three computerized FSNC tasks (Figure 1). We included three different operations (addition, subtraction, and multiplication) and two formats (a computerized format and a paper-and-pencil format) to increase transfer effects, because heterogeneous training programs are generally thought to strengthen transfer effects [8, 9].

2. Methods

2.1. Participants

The study reported here was implemented in conjunction with, and shared control group participants with, our previous study, which investigated the effects of training on processing speed [10]. Sixty-three healthy, right-handed university or postgraduate students (32 men, 31 women) participated in this study. Their mean age was 21.6 years [standard deviation (SD), 1.68]. Of these 63 subjects, 23 were assigned to a processing speed training group for another study [10], and the data from the remaining 40 subjects were used in the FSNC group or the nonintervention control group. All participants had normal vision, and none had a history of neurological or psychiatric illness; the latter was assessed with our laboratory's routine questionnaire asking whether participants currently had, or had previously had, certain illnesses. Handedness was evaluated using the Edinburgh Handedness Inventory [11]. Each subject provided written informed consent according to the Declaration of Helsinki (1991). The Ethics Committee of Tohoku University approved the study.

Group assignments were performed as follows. Over 3 months, there were six experimental periods, each lasting 8 days. Of these, the first, third, and fifth experimental periods involved processing speed (PS) training and no intervention; the remaining periods involved FSNC training and no intervention. The periods for FSNC training (plus the nonintervention control group) and the periods for PS training (plus the nonintervention control group) were thus rotated; only one type of intervention could be run in any one experimental period because of limitations on several types of experimental resources. Subjects chose which period they participated in, but they did not know that there were two types of training in the experiment. They were randomly assigned to an intervention group or the no-intervention group. Therefore, neither subjects nor experimenters could decide the group to which subjects were assigned, and distribution among the three groups was arbitrary.

The FSNC training group comprised 19 participants (9 men, 10 women; mean age, 21.4 years; SD, 1.8). The no-intervention group comprised 21 participants (12 men, 9 women; mean age, 21.2 years; SD, 1.7). Participants in the training and no-intervention groups did not differ significantly (two-tailed t-tests) in basic background characteristics such as age, sex, scores on Raven's Advanced Progressive Matrices [12], which measure cognitive ability central to general intelligence [13], and scores on the simple arithmetic tasks. One subject in each of the training and no-intervention groups was unable to participate in the post-MRI and psychological evaluations owing to ill health. Another participant in the training group was very slow at completing the paper-and-pencil FSNC tasks compared with the performance expected from the simple arithmetic pretest measures, indicating that this participant was not performing the training tasks earnestly; the exclusion criterion was a speed on the paper-and-pencil training tasks on the first day of training of less than 80% of the speed on the pretest arithmetic measure. These three participants were excluded from further analysis, leaving 17 participants in the FSNC group and 20 participants in the no-intervention group. Furthermore, two participants (one in the FSNC group and the other in the no-intervention group) who misunderstood the rules of the cognitive tasks used as outcome measures (characterized by few answers on simple tasks or chance-level accuracy) were excluded from the analyses involving those tasks.

2.2. Procedure

The FSNC training program comprised computerized Borland C++ programs developed in-house, which implemented adaptive training on FSNC tasks. The tasks included three performed using computer keys and three performed using paper and pencil. Participants in the training group underwent 5 days of training (approximately 4 h/day) within a 6-day period in the laboratory. Each day's training finished after completion of a fixed amount of training tasks, and participants were allowed to take breaks whenever they felt they needed them. Instructions and brief practice on all training tasks were given on the first training day before training began. All participants underwent MRI scanning and psychological tests immediately before and after this 6-day period. In other words, pretraining MRI scans and psychological tests were performed on day 1, training was provided from day 2 to day 7, and posttraining MRI scans and psychological tests were performed on day 8. The no-intervention group did not receive any training or perform any specific activity during the period separating the two MRI sessions. The timing and order of psychological tests and MRI sessions differed among participants (independent of training group allocation) because only one participant could be scanned at a time, whereas the psychological tests were performed in a group setting. Participants completed psychological test sessions when they were not participating in MRI scans. Because participants in the intervention group were required to complete the 5-day training within the 6-day intervention period, they completed the postexperiment evaluations 1 or 2 days after intervention completion.

The lack of an active control group (placebo training) has been common to almost all imaging studies of cognitive training. In particular, the use of a no-intervention group as the control group has been well described [10, 14–19]. We believe that this design is appropriate and congruent with the customs of the research field. For details related to this discussion, please refer to our previous study [15].

2.3. Training Tasks

All three computerized tasks were adaptive: the problems were presented for a fixed period of time that was adjusted based on a subject's performance (for details, see below). In all three computerized tasks, operations (simple calculations using a pair of single digits, e.g., 8 − 4) were presented successively. The three computerized tasks involved simple addition, subtraction, and multiplication (Figure 1). In these tasks, three simple calculations were presented in vertical order (e.g., 6 + 7, 7 + 5, and 2 + 2) in each trial. Before the next stimuli were presented (i.e., before the next trial), participants had to press the key corresponding to the ones digit of the answer to the first (top) of the three problems presented (for 6 + 7, the key to be pressed was 3; the tens digit did not need to be keyed in). Participants could press a key at any time during the trial, but the last key pressed during each trial was used to judge the correctness of the answer. The keys 1, 2, 3, 4, 7, 8, 9, and 0 were pressed for answers of 1, 2, 3, 4, 7, 8, 9, and 0, respectively; for answers of 5 and 6, subjects pressed the R and Y keys, respectively, using their thumbs, so that participants could press 10 buttons using 10 fingers. We instructed participants to press the 10 keys using 10 fingers (i.e., the fifth finger of the left hand was used to press key 1, the fourth finger of the left hand was used to press key 2, and so on, with the fifth finger of the right hand used to press key 0). However, when participants could not adhere strictly to these instructions, they were allowed to press the keys with any finger. In the next trial, the list was shifted up by one: the problem that had been second appeared at the top, the problem that had been at the bottom moved up, and a new problem appeared at the bottom of the list (e.g., 7 + 5, 2 + 2, and 4 + 8 for the above example).
A fixation point was not used between trials; the list of problems remained on the screen until the next trial started. This presentation enabled participants to begin solving the upcoming problems before the next trial (as in the paper-and-pencil tasks) and created a kind of multitasking situation (also as in the paper-and-pencil tasks). In subtraction problems, the digits were presented such that the answers were not below 0. In these computerized tasks, performance in each block (a period during which operations were presented sequentially) was defined by the number of correct responses, and each block ended after 24 trials. One session of each computerized task ended after 30 blocks, except on the first training day, when one session of each computerized task ended after 20 blocks.
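As a concrete illustration, the trial mechanics described above (a three-problem list, answering only the top problem, then shifting the list up and appending a new problem) can be sketched as follows; all function names are ours, not from the paper's software:

```python
import random

# Hypothetical sketch of the computerized trial sequence (names are ours,
# not from the paper's Borland C++ programs): a list of three single-digit
# additions is shown; the participant keys the ones digit of the top
# problem's answer; the list then shifts up by one and a new problem is
# appended at the bottom.

def new_problem(rng):
    # Operands 2-9, matching the 8 x 8 = 64 operand pairs described later.
    return (rng.randint(2, 9), rng.randint(2, 9))

def step(problem_list, rng):
    """Advance one trial: drop the answered top problem, append a new one."""
    return problem_list[1:] + [new_problem(rng)]

def is_correct(problem, keyed_digit):
    """Only the ones digit is keyed; the tens digit is not entered."""
    a, b = problem
    return (a + b) % 10 == keyed_digit

rng = random.Random(0)
trials = [(6, 7), (7, 5), (2, 2)]
print(is_correct(trials[0], 3))   # 6 + 7 = 13 -> key "3" -> True
trials = step(trials, rng)
print(trials[:2])                 # -> [(7, 5), (2, 2)], plus a new problem
```

This mirrors the description only structurally; timing, key mapping, and randomization details of the original software are not reproduced.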

In all three computerized tasks, the difficulty (stimulus presentation rate) was adjusted between blocks based on subject performance, by multiplying the rate by powers of 0.99 or 100/99. Let N denote the number of trials (out of 24) in a given block in which participants input the correct answer, and let R denote the stimulus presentation rate in that block. When N was 0–6, the stimulus presentation rate in the next block was reduced (R multiplied by a power of 0.99 that depended on N); when N was 7–9, the rate was reduced by a smaller factor; when N was 10–12, the rate did not change; and when N was 13–24, the rate in the next block was R × (100/99)^(N − 12). For example, when participants answered correctly in 16 out of 24 trials in one block and the stimulus presentation rate of that block was 2 stimuli/s, the stimulus presentation rate in the next block became 2 × (100/99)^4 ≈ 2.08 stimuli/s. Thus, when participants performed the tasks properly at a given speed, the stimulus presentation rate in the next block increased according to how well they performed; when they could not perform the tasks properly at a given speed, the rate in the next block decreased according to their performance; and when performance was intermediate, the rate did not change. In the computerized tasks, participants began each day's training at the level at which they had finished each task on the previous day. The initial stimulus presentation rate was 1 stimulus/s.
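The block-wise adaptive rule can be sketched in code. The increase branch reproduces the worked example in the text (16/24 correct at 2 stimuli/s); the exact multipliers for the low-performance brackets are not fully legible in the text, so the 0.99 powers used for N ≤ 9 below are an assumption (a mirror of the increase rule):

```python
# Sketch of the block-wise adaptive staircase, under our reading of the
# text: N = correct trials out of 24 in a block, R = stimulus presentation
# rate (stimuli/s). The increase formula R * (100/99)**(N - 12) matches the
# worked example; the decrease exponents for N <= 9 are NOT given
# explicitly, so the mirrored 0.99 powers here are an assumption.

def next_rate(rate, n_correct):
    if n_correct <= 9:                            # poor performance: slow down
        return rate * 0.99 ** (12 - n_correct)    # assumed mirror of increase
    if n_correct <= 12:                           # adequate: no change
        return rate
    return rate * (100 / 99) ** (n_correct - 12)  # good: speed up

# Worked example from the text: 16/24 correct at 2 stimuli/s.
print(round(next_rate(2.0, 16), 3))  # = 2 * (100/99)**4, about 2.08
```

With this shape, training hovers around the rate at which the participant gets roughly half the trials correct, which is the usual goal of an adaptive staircase.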
Regarding how often participants encountered each operation (e.g., 1 + 4): in the case of the computerized addition task, there were 64 possible operations (1 and 0 were removed from the problems, leaving 8 × 8 = 64 operand pairs), each occurring with equal probability (any given operation was drawn with probability 1/64 on each trial). As described above, participants faced addition operations 3600 times in the computerized addition task during the training period (24 trials (operations) × 30 blocks × 3 sessions × 1 day (first training day) + 24 trials (operations) × 30 blocks × 3 sessions × 4 days (second to fifth training days)). Thus, participants were expected to face each operation 56.25 times (3600/64) on average during the training period. Although the actual numbers must have differed among operations owing to the computerized randomization, by the law of large numbers this was not a significant concern.
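The expected-encounter arithmetic (64 equally likely operand pairs, 3600 addition trials, 3600/64 = 56.25 expected encounters per pair) can be sanity-checked with a small simulation showing the scatter that the law-of-large-numbers argument refers to:

```python
import random

# Back-of-the-envelope check of the uniform-sampling claim: with 64 equally
# likely operand pairs and 3600 addition trials, each pair is expected
# 3600 / 64 = 56.25 times; per-pair counts merely scatter around that value.

PAIRS = [(a, b) for a in range(2, 10) for b in range(2, 10)]  # 8 x 8 = 64
expected = 3600 / len(PAIRS)
print(expected)  # 56.25

rng = random.Random(42)
counts = {p: 0 for p in PAIRS}
for _ in range(3600):
    counts[rng.choice(PAIRS)] += 1
print(min(counts.values()), max(counts.values()))  # scatter around 56.25
```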

In all three paper-and-pencil tasks, rows of simple numerical calculation problems involving a pair of single digits (e.g., 8 − 4) were printed. These tasks involved simple addition, subtraction, and multiplication (Figure 1). Participants had to solve these problems in order from the top and write down the answers (if an answer was a two-digit number, they also had to write down the tens digit). The participants were instructed to answer as many problems as possible in 1 min and performed this task 10 times per session.

Computerized and paper-and-pencil tasks were alternated, and when participants had completed three sessions of each task, training for that day was complete. The order of the tasks was fixed, as follows: (1) the paper-and-pencil addition task (1 min × 10 times); (2) the computerized addition task (30 blocks of 24 trials each; 20 blocks on the first training day only); (3) the paper-and-pencil multiplication task (1 min × 10 times); (4) the computerized multiplication task (30 blocks of 24 trials each; 20 blocks on the first training day only); (5) the paper-and-pencil subtraction task (1 min × 10 times); and (6) the computerized subtraction task (30 blocks of 24 trials each; 20 blocks on the first training day only). These processes (1 through 6) were repeated three times each day.

Both computerized and paper-and-pencil tasks were included in training because increasing the variability of tasks, stimuli, and training context leads to more successful transfer of information [8, 20, 21].

2.4. Psychological Outcome Measures

For the evaluation of the pre- and posttraining effects on psychological measures, a battery of neuropsychological tests and questionnaires was administered. These cognitive tests were generally the same as in our previous study [22] and evaluated a wide range of cognitive functions, but the tests that showed low test-retest reliabilities or some other problems in our previous study were replaced by alternate tests.

2.4.1. Arithmetic Tasks

[A] Arithmetic tasks, similar to the ones constructed by Grabner et al. [23], measured multiplication performance on two forms of one-digit × one-digit multiplication problems (a simple arithmetic task with numbers between 2 and 9) and two forms of two-digit × two-digit multiplication problems (a complex arithmetic task with numbers between 11 and 19). The two forms of each task were identical, but the numbers used in the problems were ordered differently. Each form of the simple and complex arithmetic tasks was presented with a time limit of 30 and 60 s, respectively.
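A hypothetical generator for such paired task forms (identical problem sets presented in different orders) can illustrate the design; the operand ranges follow the text, but every name and parameter below is our invention, not the instrument of Grabner et al. [23]:

```python
import random

# Hypothetical sketch of paired arithmetic task forms: two forms share the
# same problem set, with the problems ordered differently. Operand ranges
# follow the text (2-9 for the simple task, 11-19 for the complex task);
# item counts are illustrative only.

def make_forms(lo, hi, n_items, seed=0):
    rng = random.Random(seed)
    problems = [(rng.randint(lo, hi), rng.randint(lo, hi))
                for _ in range(n_items)]
    form_a = list(problems)
    form_b = list(problems)
    rng.shuffle(form_b)          # same items, different order
    return form_a, form_b

simple_a, simple_b = make_forms(2, 9, 20)     # one-digit x one-digit
complex_a, complex_b = make_forms(11, 19, 10) # two-digit x two-digit
print(sorted(simple_a) == sorted(simple_b))   # True: identical problem sets
```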

2.4.2. Nonverbal Reasoning Tasks

[B] These included the following: Raven’s Advanced Progressive Matrices [12], a nonverbal reasoning task; [C] Cattell’s Culture Fair Test [24], a nonverbal reasoning test.

2.4.3. Working Memory Tasks

[D] These included the following: a (computerized) digit span task, a verbal working memory task (for the details of this task, see [25]); [E] a (computerized) visuospatial working memory task [10].

2.4.4. Intelligence Test with Speeded Tasks

[F] The test used was the Tanaka B-type intelligence test [26]. Type 3B, which is intended for examinees in their 3rd year of junior high school and older, was used in this study. This task was administered mainly as previously described [10]. This is a nonverbal, group-administered intelligence test that does not include story problems but instead uses figures, single numbers, and letters as stimuli. In all subtests, participants had to complete as many problems as possible within a certain time (a few minutes). The test consists of a maze test (participants had to trace a maze with a pencil from start to finish), counting cubes (participants had to count the number of cubes piled up three-dimensionally), a substitution task (figures and numbers; participants had to substitute figures (9 figures) with numbers (1 to 9) according to a model chart), same-different judgments (Japanese kana characters; participants had to judge whether the two strings in a pair of meaningless Japanese strings were the same), filling in a sequence of numbers (participants had to fill in the blanks of a number sequence with suitable numbers according to the rule of the number arrangement), marking figures (participants had to select forms identical to three samples from a sequence of eight different forms), and filling in figures (participants had to complete incomplete figures so that they matched the sample figures when rotated).

2.4.5. Simple Processing Speed Tasks and Executive Function (Inhibition Tasks)

[G] The task used was Hakoda's version of the Stroop task [27], a matching-type Stroop task that measures response inhibition and impulsivity. The following description is essentially the same as that in our previous study [28]. Unlike oral naming-type Stroop tasks, the matching-type (written) Stroop task requires participants to choose and write down as many appropriate answers as possible from five options; this format enables participants' performance to be measured accurately. The task consists of two control tasks (a Word-Color task and a Color-Word task), a reverse Stroop task, and a Stroop task. Reverse Stroop interference refers to the slowing of output when participants must report the meaning of a word while there is a conflict between the meaning of the word and its printed color. In the Word-Color task, a color name (e.g., "blue") is presented in the leftmost column; five other columns are painted in five different colors, and participants have to check the column whose color corresponds to the color name in the leftmost column. In the Color-Word task, the leftmost column is painted with a color and five other columns contain color names; participants have to check the column containing the word corresponding to the color painted in the leftmost column. In the reverse Stroop task, the leftmost column contains a color name printed in another color (e.g., "blue" printed in green) and five other columns are painted in five different colors; participants have to check the column whose color corresponds to the meaning of the color name in the leftmost column. In the Stroop task, the leftmost column contains a color name printed in another color (e.g., "blue" printed in green) and five other columns contain color names.
The participants have to check the column containing the word corresponding to the color in which the word in the leftmost column is printed (Supplemental Figure 1 in Supplementary Material available online at http://dx.doi.org/10.1155/2016/5940634). In each task, the participants were instructed to complete as many items as possible in 1 min. The four tasks were performed in a fixed order, but task order does not affect performance on each task [27]. We used the Word-Color and Color-Word tasks as simple processing speed measures and the Stroop and reverse Stroop tasks as inhibition measures [10].

2.4.6. Creativity Task

[H] The S-A creativity test [29] is used to evaluate creativity through divergent thinking and involves three types of tasks (for details on the development of this instrument and its psychometric properties, refer to the technical manual for this test [29]). The first, second, and third tasks require participants to generate unique ways of using typical objects, imagine desirable functions for ordinary objects, and imagine the consequences of “unimaginable things” happening, respectively. The S-A test scores the four dimensions of the creative process (fluency, originality, elaboration, and flexibility). In this study, the sum of the graded scores for the four dimensions was used in the analysis. For more details including the psychometric properties of this test, sample answers to the questionnaire, and the manner in which they were scored, refer to our previous studies [30, 31].

Other than these cognitive tests, we collected several questionnaires designed to assess the traits or states of the participants, but these are not reported here. In most cases, these were self-report questionnaires evaluating participant behavior in daily life. They were designed to assess the traits of the participants and not the effect of the five-day intervention. Other than the self-report questionnaires, all neuropsychological assessments were performed by postgraduate and undergraduate students blinded to the group membership of the participants.

2.5. Group-Level Statistical Analysis of Behavioral Data

Behavioral data were analyzed using SPSS 16.0 (SPSS Inc., Chicago, IL, USA). Effects of FSNC training on each measure were analyzed by comparing the FSNC training group and the no-intervention group using one-way analyses of covariance (ANCOVAs). In these ANCOVAs, the differences between pre- and posttest measures, computed by subtracting the preintervention value from the postintervention value, were entered as dependent variables. In addition, in the ANCOVAs of the psychological measures, the pretest scores were entered as covariates to exclude the possibility that preexisting group differences in the measures would affect the results (see below for the covariates used in the imaging data analyses). As far as we know, repeated-measures analyses of variance (ANOVAs) have no advantage over this design, whereas they have an obvious disadvantage relative to ANCOVA: repeated-measures ANOVAs cannot correct for pretest scores and thus cannot control for preexisting group differences in the measures. Because the superiority (beneficial effects) of the intervention training was our primary interest, in our behavioral analyses, test-retest changes in the group of interest were compared with those in the control group using one-tailed tests [32, 33]. However, for psychological outcome measures for which "superiority" (beneficial effects) was unclear (for example, creativity test scores have been associated with an impaired selective attention system, psychosis, and cognitive disinhibition), two-tailed tests were used (for details, see [25, 34]).
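The ANCOVA design described above is equivalent to regressing the gain score on a group indicator plus the pretest score. A minimal sketch on synthetic data (the numbers are illustrative only, not the study's data):

```python
import numpy as np

# Minimal sketch of the ANCOVA design on synthetic data: gain = post - pre
# as the dependent variable, a group indicator as the factor of interest,
# and the pretest score as a covariate, fit as the regression
# gain ~ intercept + group + pre. All numbers here are made up.

rng = np.random.default_rng(0)
n = 20
pre_t = rng.normal(30, 5, n); post_t = pre_t + 7 + rng.normal(0, 2, n)  # training
pre_c = rng.normal(30, 5, n); post_c = pre_c + 2 + rng.normal(0, 2, n)  # control

gain  = np.concatenate([post_t - pre_t, post_c - pre_c])
group = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = FSNC training
pre   = np.concatenate([pre_t, pre_c])

X = np.column_stack([np.ones(2 * n), group, pre])
beta, *_ = np.linalg.lstsq(X, gain, rcond=None)
print(round(float(beta[1]), 1))  # pretest-adjusted group effect, ~5 here
```

Entering the pretest as a covariate is what lets this design absorb any preexisting group difference that simple gain-score comparisons would leave in.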

We report results that were significant without correction for multiple comparisons. This was partly because this study's investigation of cognitive functions had an exploratory nature (administration of a wide range of cognitive tests of major cognitive functions regardless of the existence of strong a priori hypotheses), and partly because we followed the customs of the field in not performing a correction for multiple comparisons [10, 22, 32, 33, 35–41]. However, we also report the statistical results after correction for multiple comparisons. The correction for multiple comparisons was performed using the false discovery rate (FDR) with the graphically sharpened method [42]. FDR correction was applied to analyses of the 12 tests presented in Table 1.
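For reference, a plain Benjamini-Hochberg FDR adjustment can be written in a few lines. Note that the paper used the graphically sharpened variant [42], which yields somewhat smaller adjusted values, so this sketch will not reproduce Table 1's corrected values exactly:

```python
# Plain Benjamini-Hochberg step-up FDR adjustment (a sketch; the paper used
# the graphically sharpened variant [42], which gives somewhat smaller
# adjusted p values than this).

def fdr_bh(pvals):
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_min = 1.0
    for rank in range(m, 0, -1):   # step up from the largest p value
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        adj[i] = running_min
    return adj

print(fdr_bh([0.01, 0.02, 0.03, 0.04]))  # -> [0.04, 0.04, 0.04, 0.04]
```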


| Measure | FSNC training, Pre | FSNC training, Post | Control, Pre | Control, Post | Planned contrast in ANCOVA | p value (uncorrected, corrected) |
| --- | --- | --- | --- | --- | --- | --- |
| Arithmetic |  |  |  |  |  |  |
| Simple arithmetic (items) | 30.7 ± 1.2 | 37.8 ± 1.5 | 33.0 ± 1.1 | 35.0 ± 1.3 | FSNC training > control | 4.28 × 10−4, 0.003 |
| Complex arithmetic (items) | 7.79 ± 1.23 | 9.59 ± 1.71 | 7.03 ± 0.44 | 7.65 ± 0.52 | FSNC training > control | 0.0340, 0.060 |
| Nonverbal reasoning |  |  |  |  |  |  |
| RAPM (score) | 27.7 ± 0.9 | 30.7 ± 0.9 | 28.1 ± 0.8 | 30.1 ± 0.9 | FSNC training > control | 0.112, 0.157 |
| CCFT (score) | 31.7 ± 1.1 | 32.8 ± 2.0 | 29.8 ± 1.0 | 35.0 ± 1.3 | FSNC training > control | 0.754, 0.440 |
| Working memory (WM) |  |  |  |  |  |  |
| Digit span (score) | 37.9 ± 1.1 | 37.9 ± 1.2 | 36.5 ± 1.9 | 37.6 ± 1.7 | FSNC training > control | 0.702, 0.440 |
| Visuospatial WM (score) | 25.8 ± 1.7 | 28.6 ± 1.8 | 29.8 ± 1.2 | 30.6 ± 1.2 | FSNC training > control | 0.241, 0.211 |
| Intelligence test with speeded tasks |  |  |  |  |  |  |
| Tanaka B-type intelligence test | 114.1 ± 3.5 | 125.2 ± 3.4 | 118.7 ± 2.8 | 127.6 ± 2.8 | FSNC training > control | 0.301, 0.214 |
| Simple processing speed |  |  |  |  |  |  |
| Word-Color task (items) | 74.2 ± 2.0 | 81.1 ± 1.4 | 73.8 ± 1.3 | 80.2 ± 1.2 | FSNC training > control | 0.306, 0.214 |
| Color-Word task (items) | 54.2 ± 1.8 | 59.6 ± 1.6 | 52.9 ± 1.4 | 55.3 ± 1.7 | FSNC training > control | 0.005, 0.018 |
| Executive function (inhibition) |  |  |  |  |  |  |
| Reverse Stroop task (items) | 60.8 ± 2.3 | 68.0 ± 1.7 | 61.3 ± 1.6 | 66.0 ± 1.5 | FSNC training > control | 0.030, 0.060 |
| Stroop task (items) | 50.9 ± 2.1 | 55.0 ± 2.1 | 47.8 ± 1.6 | 51.0 ± 1.7 | FSNC training > control | 0.158, 0.158 |
| Creativity |  |  |  |  |  |  |
| S-A creativity test (total grade) | 23.4 ± 2.0 | 23.2 ± 1.6 | 26.1 ± 1.4 | 27.0 ± 1.3 | Two-tailed | 0.152, 0.158 |

RAPM: Raven's Advanced Progressive Matrices. CCFT: Cattell's Culture Fair Test. Statistics are from one-way ANCOVAs with test-retest differences in the psychological measures as dependent variables and pretest scores on the psychological measures as covariates. Corrected p values were corrected for multiple comparisons using FDR.

2.5.1. Confirmation of the Significant Behavioral Findings Using the Data of the Active Control Group in the Previous Study

However, this study had a smaller sample size than some previous studies [43] involving this type of behavioral analysis. Furthermore, it involved a number of cognitive tasks. Moreover, FSNC training led to significant improvements in only one of two measures each for processing speed and executive functioning in the main analysis (see Results). We therefore performed an additional analysis that included data from the active control (placebo training) group of our previous study [22] to increase the statistical power and reliability of the present study results and to directly address the lack of effect in the placebo training group.

In this previous study [22], the pre- and posttest measures were also separated by 1 week, and several (but not all) of the cognitive tests were performed in the same manner as in the present study, with participants having similar characteristics (healthy young adults). Furthermore, the placebo training group in this previous study [22] had the same training period, training time, and training frequency as the training group in the present study. This previous study included a no-intervention control group in addition to the active control group, but when the active control group was compared with the no-intervention group, no effects of placebo training on task performance were observed [22]. Note, therefore, that three control groups appear in the present paper: the no-intervention group from the experiment of the present study, the active control group from the previous study, and the no-intervention group from the previous study. The data from the no-intervention group of the present experiment and the active control group from the previous study were used in the confirmatory analysis described in this subsection (further adding the data of the no-intervention group from the previous study only strengthened the p values of the significant results in this confirmatory analysis).

2.6. Image Acquisition

All MRI data were acquired using a 3-T Philips Achieva scanner. High-resolution T1-weighted structural images (240 × 240 matrix, TR = 6.5 ms, TE = 3 ms, FOV = 24 cm, 162 slices, slice thickness = 1 mm) were acquired using an MPRAGE sequence. Arterial spin labeling (ASL) was performed to measure resting CBF, using quantitative signal-intensity targeting by alternating radio-frequency pulse labeling of arterial regions (QUASAR), a pulsed ASL method [44]. Details of the sequence and the method for calculating perfusion parameters have been outlined elsewhere [44–46]. The actual imaging parameters were as follows: 64 × 64 matrix, TR = 300 ms, TE = 22 ms, FOV = 24 cm, 7 slices, slice thickness = 7 mm (2.0 mm gap), SENSE = 2.5, 84 averages, and scan duration = 5 min 52 s. We determined the slice position by placing the fourth of the seven slices on the body of the corpus callosum in the coronal scout view [47]. During the ASL scan, participants were instructed to remain as motionless as possible with their eyes closed and not to sleep or think about anything in particular.

3. Preprocessing and Analysis of Structural Data

VBM, a method for in vivo investigation of human brain structure that can detect training-induced changes in rGMV [17, 48], was used to investigate the effect of FSNC training on brain structures. Morphological data were preprocessed using the default cross-sectional methods of the VBM2 software [49], an extension of SPM2, as in our previous study [22]. With these cross-sectional methods, the pre- and postimages of each participant were preprocessed independently to avoid asymmetry-induced bias [50]. To reduce scanner-specific bias, we used a customized gray matter (GM) anatomical template and prior probability maps of GM and white matter created from T1-weighted structural images obtained with this scanner in our previous studies [30, 51]. The T1-weighted structural image of each subject was first segmented into GM and white matter partitions using these custom prior probability maps, yielding GM and white matter partitions in native space. The GM partition was then normalized to the custom GM probability map, and the normalization parameters determined in this initial step were applied to the native T1-weighted structural image. The normalized T1-weighted structural data were then segmented again into GM and white matter partitions. In addition, we performed a volume change correction (modulation) by multiplying each voxel by the Jacobian determinant derived from the spatial normalization, allowing determination of regional differences in the absolute amount of GM [52]. Finally, all images were smoothed by convolution with an isotropic Gaussian kernel of 12 mm full-width at half maximum (FWHM); this 12-mm FWHM value is warranted for the cluster size test used in VBM (see below).
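The modulation and smoothing steps above can be sketched voxel-wise; this is an illustrative NumPy/SciPy sketch (the actual preprocessing was performed in VBM2/SPM2), with the FWHM-to-sigma conversion for the Gaussian kernel made explicit:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fwhm_to_sigma(fwhm_mm, voxel_mm):
    # Convert FWHM (mm) to Gaussian sigma in voxel units: FWHM = 2*sqrt(2*ln 2)*sigma
    return fwhm_mm / (voxel_mm * 2.0 * np.sqrt(2.0 * np.log(2.0)))

def modulate_and_smooth(gm_prob, jacobian_det, fwhm_mm=12.0, voxel_mm=1.0):
    """Modulate a spatially normalized GM probability map by the Jacobian
    determinants of the deformation (preserving the absolute amount of GM),
    then smooth with an isotropic Gaussian kernel."""
    modulated = np.asarray(gm_prob, float) * np.asarray(jacobian_det, float)
    return gaussian_filter(modulated, sigma=fwhm_to_sigma(fwhm_mm, voxel_mm))
```

A 12-mm FWHM corresponds to a sigma of roughly 5.1 voxels at 1-mm isotropic resolution, which satisfies the "more than six voxels" smoothness condition discussed below.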

VBM2 was used instead of VBM5 or VBM8 for preprocessing of the T1-weighted structural imaging data because T1-weighted images (T1WIs) obtained using our MPRAGE sequence (see above) were incompatible with preprocessing in VBM5/SPM5 and VBM8/SPM8. When VBM5 or SPM5 was used, many apparent segmentation errors occurred, unlike with the optimized protocol of VBM2. No segmentation errors apparent at first glance were found with VBM2 or VBM8. However, with VBM8, the test-retest reliability of total GMV among 50 participants in a 1-week longitudinal intervention study in which T1WIs were acquired on the first day of the experiment and 1 week later [10] was 0.746, whereas with VBM2 it was 0.980. Note that this longitudinal intervention experiment is the same as the experiment of the present study: 58 participants in three groups (FSNC training, no-intervention, and PS training) properly completed the longitudinal experiment and were enrolled in the analysis, and the data from 50 of these participants were used to calculate reliability. Visual inspection of the segmentation results was also conducted as a quality check and indicated no apparent difference in quality between the pre- and postexperiment segmentations. These procedures (preprocessing with VBM2 and statistical analyses using different versions of SPM/VBM) were also used in previous studies [17, 22, 30, 51]. Although these data do not indicate that preprocessing with VBM5/VBM8 is generally worse, they do speak to the compatibility between T1WIs from certain sequences and VBM5/VBM8. For more extensive discussion of this issue, see [53].

In the group-level analysis, we tested for a change in rGMV between the first and second time points by comparing the training and control groups (i.e., the group × time interaction). The statistical significance level was set at , corrected for multiple comparisons (FWE) at the nonisotropic adjusted cluster level [54] with an underlying voxel level of . Nonisotropic adjusted cluster size tests can and should be applied when cluster size tests are applied to nonstationary (i.e., not uniformly smooth) data, such as VBM data [54]. In this nonisotropic cluster size test based on random field theory, a relatively high cluster-determining voxel-level threshold combined with high smoothing values of more than six voxels leads to appropriate conservativeness on real data. With high smoothing values, an uncorrected threshold of appears to produce too many false positives, whereas one of appears slightly conservative [55].
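The cluster-extent logic underlying such tests can be sketched as follows. Note that this shows only the plain (stationary) cluster-forming step, not the nonisotropic smoothness adjustment of [54]; the threshold and extent values are placeholders:

```python
import numpy as np
from scipy.ndimage import label

def supra_threshold_clusters(stat_map, voxel_thresh, min_cluster_voxels):
    """Form clusters of voxels exceeding the cluster-determining threshold
    and keep only those at least as large as the extent threshold."""
    supra = stat_map > voxel_thresh
    labels, n_clusters = label(supra)  # default 6-connectivity in 3D
    keep = np.zeros_like(supra)
    for i in range(1, n_clusters + 1):
        cluster = labels == i
        if cluster.sum() >= min_cluster_voxels:
            keep |= cluster
    return keep
```

The nonisotropic adjustment would additionally rescale each cluster's extent by the local smoothness of the residuals before comparison against the random-field expectation.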

Furthermore, we investigated whether preexisting differences in rGMV (in the preintervention scan) existed between the training and control groups at the whole brain level using ANOVA. In all of these group analyses of morphological data, we included only voxels with a GM value >0.10 to avoid the possibility of partial volume effects around the borders between GM and WM as well as those between GM and the cerebrospinal fluid (CSF).

We did not control for global signals in any of the brain imaging analyses, as is the case in almost all imaging studies of interventions.

4. Preprocessing and Statistical Analysis of Resting rCBF Data

Maps of raw resting rCBF and the longitudinal relaxivity () of each subject were obtained using dedicated software running on IDL (Research Systems, Boulder, Colorado) ([44]; National Neuroscience Institute, Singapore). The following constants were used in the CBF calculation: T1 of arterial blood, 1.65 s; inversion efficiency, 95%; blood-brain partition coefficients for GM and WM, 0.98 and 0.82, respectively [44].
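QUASAR derives perfusion by the methods detailed in [44–46]. Purely as an illustration of how constants such as those listed above enter a perfusion calculation, the following sketch uses the standard single-subtraction pulsed-ASL quantification (a Buxton/QUIPSS II-style formula), which is not the QUASAR model itself:

```python
import numpy as np

# Constants reported in the text
T1_BLOOD_S = 1.65      # longitudinal relaxation time of arterial blood (s)
INV_EFFICIENCY = 0.95  # inversion efficiency (alpha)
LAMBDA_GM = 0.98       # blood-brain partition coefficient, gray matter
LAMBDA_WM = 0.82       # blood-brain partition coefficient, white matter

def pasl_cbf(delta_m, m0, ti1_s, ti2_s, lam=LAMBDA_GM,
             alpha=INV_EFFICIENCY, t1b_s=T1_BLOOD_S):
    """Perfusion from the standard single-subtraction pulsed-ASL model:
    f = (lam * dM) / (2 * alpha * M0 * TI1 * exp(-TI2 / T1b)),
    converted from 1/s to ml/100 g/min."""
    f_per_s = (lam * delta_m) / (2.0 * alpha * m0 * ti1_s *
                                 np.exp(-ti2_s / t1b_s))
    return f_per_s * 6000.0
```

The `delta_m`, `m0`, `ti1_s`, and `ti2_s` arguments are hypothetical inputs for illustration; QUASAR instead estimates perfusion from multi-inversion-time data by deconvolution.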

Preprocessing and data analysis were performed using SPM5 implemented in Matlab, except for the segmentation procedure (see below), for which SPM2 was used. This approach was adopted because the T1-weighted images acquired using the MPRAGE sequence were incompatible with VBM5 and SPM5 preprocessing and resulted in numerous apparent segmentation errors.

We segmented the T1-weighted images into GM, WM, and CSF. Then, using the segmented GM and WM images as masks, we removed the parts of the T1-weighted image that did not belong to GM or WM, creating images consisting solely of GM and WM (designated the “GM + WM T1-weighted image”).

The longitudinal relaxivity maps from the pre- and post-MRI scans of each subject, which lack the skull and skin sections of the head and retain their alignment with each subject's rCBF maps, were coregistered to the GM + WM T1-weighted image from the pre-MRI scan of each subject using the within-subject registration method. Asymmetry-induced bias was avoided during preprocessing of the ASL images by using a third structural image for the registration of the pre- and post-ASL images (although this image was acquired before the experiment, the structural and ASL images were collected separately, so this does not cause asymmetry-induced bias).

The raw T1-weighted structural image from the pre-MRI scan of each subject, which maintained its alignment with the GM + WM T1-weighted image from the pre-MRI scan and with the rCBF maps from the pre- and post-MRI scans, was then normalized to our original T1-weighted structural image template, which was established in our previous study from images of young adults acquired with our scanner [56].

Using the parameters from this normalization procedure, the rCBF maps from the pre- and post-MRI scans of each subject were spatially normalized to create images with 2 × 2 × 2 mm³ voxels. The normalized rCBF maps from the pre- and post-MRI scans were then spatially smoothed using a Gaussian kernel of 12 mm FWHM. Finally, the signal change in resting rCBF between the pre- and postscan images was computed at each voxel by subtracting the former image from the latter for each subject. The maps representing resting rCBF from the pre-MRI scan and those representing changes in resting rCBF from the pre- to the post-MRI scan were entered into the group-level analysis (see below). Although the total CBF of the whole brain may change through these processes, as is the case when total signals in BOLD images and FA images change through normalization procedures [10, 57], this did not cause problems in this study.
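The smoothing-and-subtraction step above can be sketched voxel-wise; this is an illustrative NumPy/SciPy version (the actual processing was performed in SPM5):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def rcbf_change_map(rcbf_pre, rcbf_post, fwhm_mm=12.0, voxel_mm=2.0):
    """Smooth the normalized pre/post rCBF maps (12-mm FWHM Gaussian) and
    return the voxel-wise post-minus-pre signal change."""
    sigma = fwhm_mm / (voxel_mm * 2.0 * np.sqrt(2.0 * np.log(2.0)))
    pre_s = gaussian_filter(np.asarray(rcbf_pre, float), sigma)
    post_s = gaussian_filter(np.asarray(rcbf_post, float), sigma)
    return post_s - pre_s
```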

In the group-level imaging analysis, we tested for group-wise differences in changes in resting rCBF. We performed voxel-wise ANCOVAs with the difference between the pre- and post-MRI scan values at each voxel as the dependent variable and the pre-MRI scan value at each voxel as an independent variable. These voxel-wise ANCOVAs were performed using Biological Parametrical Mapping (BPM) [58], implemented in SPM5, with images representing prescan resting rCBF and pre- to postscan changes in resting rCBF. This BPM-based analysis was not applied to the rGMV analysis because BPM does not support the nonisotropic adjusted cluster size test used there.
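The voxel-wise ANCOVA model can be sketched as follows; this is an illustrative ordinary-least-squares version (the actual analysis used BPM in SPM5), regressing the pre-to-post change at each voxel on group membership with the prescan value as a covariate:

```python
import numpy as np

def voxelwise_ancova(change, baseline, group):
    """At each voxel, fit change ~ intercept + group + baseline and return
    the t value for the group effect. change/baseline have shape
    (n_subjects, n_voxels); group is (n_subjects,) coded 0/1."""
    change = np.asarray(change, float)
    baseline = np.asarray(baseline, float)
    g = np.asarray(group, float)
    n_sub, n_vox = change.shape
    t_vals = np.empty(n_vox)
    for v in range(n_vox):
        # Design matrix: intercept, group, prescan value at this voxel
        X = np.column_stack([np.ones(n_sub), g, baseline[:, v]])
        beta = np.linalg.lstsq(X, change[:, v], rcond=None)[0]
        resid = change[:, v] - X @ beta
        dof = n_sub - X.shape[1]
        mse = resid @ resid / dof
        cov = mse * np.linalg.inv(X.T @ X)
        t_vals[v] = beta[1] / np.sqrt(cov[1, 1])
    return t_vals
```

In the actual analysis these t maps would then be submitted to cluster-level inference as described below.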

Regions of significance in the ASL analysis were inferred using the cluster-level statistics of the standard SPM method [59]. Only clusters significant at , after correction for multiple comparisons (FWE) at the cluster level with a voxel-level cluster-determining threshold of , uncorrected, were considered statistically significant in this analysis.

5. Investigation of Associations between Performance Changes of Training Tasks and Neural Changes

We next investigated, through simple regression analyses, whether changes in performance on the training tasks were associated with the neural changes in which effects of FSNC training were observed. Individual performance change was calculated as follows. First, each task's best performance (for the computerized tasks, the shortest interstimulus interval (ISI) of blocks in which participants answered correctly in more than half of the trials; for the paper-and-pencil tasks, the largest number of items completed in a single trial) was determined for the first and last training days for all six training tasks. The ratio of change between these performances was then calculated for each task. Finally, this ratio was averaged across the three computerized tasks and, separately, across the three paper-and-pencil tasks, and these means were taken as the degree of performance increase over the course of training. Next, we extracted the mean pre- to posttraining change in rGMV or resting rCBF in each of the significant clusters identified above. Simple regression analyses were then performed between the improvements in performance on the FSNC training tasks (computerized or paper-and-pencil) and the neural change of each cluster calculated as described above. We employed one-tailed analyses of the association between performance increases on the FSNC training tasks and mean changes in each cluster in the direction in which FSNC training effects were seen, because that was our sole hypothesis and interest; in other words, when FSNC training resulted in a decrease in neural values, the association between individual performance increase and the mean decrease in neural values for that cluster was investigated, and vice versa.
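The individual performance-change score described above can be sketched as follows. The exact form of the change ratio is not fully specified in the text, so the direction handling here (ISI decreases counted as positive improvement) is an assumption:

```python
import numpy as np

def performance_change_ratio(best_first_day, best_last_day, lower_is_better=False):
    """Change ratio of best performance from the first to the last training
    day for one task. For the computerized tasks the score is the shortest
    ISI (lower is better); for the paper-and-pencil tasks it is the number
    of completed items (higher is better)."""
    first, last = float(best_first_day), float(best_last_day)
    if lower_is_better:
        return (first - last) / first  # shorter ISI -> positive improvement
    return (last - first) / first      # more items -> positive improvement

def mean_change_ratio(first_days, last_days, lower_is_better=False):
    """Average the per-task change ratios over the three tasks of one type."""
    return float(np.mean([performance_change_ratio(f, l, lower_is_better)
                          for f, l in zip(first_days, last_days)]))
```

These per-type means would then serve as the predictor in the simple regressions against the mean neural change of each significant cluster.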

6. Results

6.1. Training Data

Practice resulted in a significant increase in performance across all six training tasks from the first to the last day of training (for the computerized tasks, the shortest ISI of blocks in which participants answered correctly in more than half of the trials in a day decreased; for the paper-and-pencil tasks, the number of items completed in a single trial increased) (paired t-test, for all six training tasks; Figure 2).

6.2. Effect of FSNC Training on Psychological Outcome Measures

Compared with the control group, the training group showed significantly larger pre- to posttest increases in performance on a simple arithmetic (multiplication) task (, uncorrected), a complex arithmetic (multiplication) task (, uncorrected), a processing speed measure (Color-Word task, , uncorrected), and an executive function task (reverse Stroop task, , uncorrected; Table 1).

These results revealed that FSNC training improved performance on an untrained complex arithmetic task, a simple processing speed task, and a speeded executive function task. However, FSNC training consistently failed to improve performance on tasks involving working memory, nonverbal reasoning, and creativity measures.

We reported results with significant p values uncorrected for multiple comparisons for the reasons described in Methods. However, even when correction for multiple comparisons was performed using FDR, the results for the simple arithmetic (multiplication) task and the processing speed measure (Color-Word task) remained significant (, corrected, and , corrected, resp.), and the results for the complex arithmetic (multiplication) task and the executive function task (reverse Stroop task) still showed a nearly significant tendency (, corrected, and , corrected, resp.).
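The FDR correction applied here is not specified in further detail; as an illustration, a minimal Benjamini-Hochberg step-up procedure (with hypothetical p values in the usage check) could look like:

```python
import numpy as np

def fdr_bh(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean array
    marking which p values are significant at false discovery rate q."""
    p = np.asarray(p_values, float)
    m = p.size
    order = np.argsort(p)
    thresholds = q * np.arange(1, m + 1) / m   # q * rank / m
    below = p[order] <= thresholds
    significant = np.zeros(m, dtype=bool)
    if below.any():
        k = int(np.max(np.nonzero(below)[0]))  # largest rank meeting its threshold
        significant[order[:k + 1]] = True
    return significant
```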

6.2.1. Confirmation of the Significant Behavioral Findings

The mean ± SD of the scores of the tests for the active control group from the previous study are reproduced in Supplemental Table 1.

In the present analysis, comparison of the combined control group (the no-intervention group from the present experiment plus the active control group from the previous study) with the FSNC training group showed that all significant results of the first behavioral analysis remained significant (simple arithmetic task, , uncorrected; complex arithmetic task, , uncorrected; Color-Word task, , uncorrected; reverse Stroop task, , uncorrected). Moreover, in this second behavioral analysis, the FSNC training group showed a tendency toward pre- to posttest increases in performance on the Stroop task (, uncorrected). No other changes were observed in the significance of the other tests (RAPM and a creativity measure) used in our previous study [22]. We then applied the FDR correction for multiple comparisons described above to these p values: for the comparisons between the FSNC training group and the two control groups where data from both control groups were available, and for the comparisons between the FSNC training group and the single control group of the present experiment where data from the previous study were not available. Three of the significant uncorrected results in this subsection (simple arithmetic task, Color-Word task, and reverse Stroop task) remained significant after the correction for multiple comparisons (see Supplemental Table 1), and the result for the complex arithmetic task was also close to significance (, corrected). These comparisons further support the conclusions drawn in this study.

6.3. Effect of FSNC Training on GM Structures

VBM analysis tested for changes in brain structure after the intervention by comparing pre- and posttest images between the training and control groups (group × time interaction). This analysis revealed that FSNC training resulted in a significantly greater decrease in rGMV around the right frontopolar area (the right middle and superior frontal gyri; Figure 3). No statistically significant FSNC training-related increases in rGMV were observed. For statistical values, see Table 2.


Area | Hemisphere | x, y, z | T score of the peak voxel | Corrected p value (cluster) | Raw cluster size (mm³)

Frontopolar area (superior frontal gyrus, orbital part/middle frontal gyrus, orbital part/superior frontal gyrus, and medial orbital/gyrus rectus) | R | 13, 60, −22 | 4.18 | 0.023 | 2128

No other significant results were observed.

Furthermore, whole-brain ANOVA showed no significant regional differences in rGMV between the training and control groups before the intervention (pre-MRI scan; , corrected for multiple comparisons).

6.4. Effect of FSNC Training on Resting rCBF

We next compared changes in resting rCBF between the training and control groups. No region showed statistically significant changes at a threshold of , corrected for multiple comparisons at the cluster level with a cluster-determining voxel-level threshold of , uncorrected. Because ASL, which was used to measure resting rCBF, may lack regional sensitivity for a number of reasons [56], we performed a statistical analysis with a more lenient cluster-determining voxel-level threshold (, uncorrected) to determine whether any weak but widespread significant effects could be observed. This analysis revealed statistically significant FSNC training-related (training group versus control group) increases in resting rCBF in (a) a large cluster that included the precuneus, left postcentral gyrus, right middle temporal gyrus, left inferior parietal lobule, right superior occipital lobe, paracentral lobule, posterior cingulate cortex, right parahippocampal gyrus, and left superior parietal lobule (Figure 4(a)) and (b) a large cluster located mainly in the bilateral frontopolar areas but extending into the bilateral middle and superior frontal gyri (Figure 4(b)). No regions showed statistically significant FSNC training-related decreases in resting rCBF. The frontal cluster was located close to the significant cluster of FSNC-related rGMV change; the two did not overlap when the latter cluster was formed with an uncorrected threshold of , but they did overlap when the latter cluster was formed with a more lenient uncorrected threshold of . Because much of the orbitofrontal area was barely included in the ASL analyses due to the limited scan coverage of this method (for the areas covered by the ASL analyses, see Supplemental Figure 2), it is difficult to draw conclusions regarding the overlap between the rGMV and ASL clusters in the frontal area. For statistical values, see Table 3.


Area | Hemisphere | x, y, z | T score of the peak voxel | Corrected p value (cluster) | Raw cluster size (mm³)

Posterior cluster [angular gyrus (B)/calcarine cortex (B)/middle and posterior cingulate gyrus (B)/posterior cingulate gyrus (B)/cuneus (B)/lingual gyrus (B)/middle and superior occipital lobe (B)/paracentral lobule (L)/parahippocampal gyrus (R)/inferior parietal lobule (L)/superior parietal lobule (B)/postcentral gyrus (L)/precuneus (B)/inferior and middle temporal lobe (R)/cerebellum (B)]
 Cerebellum |  | 2, 50, −6 | 3.18 | 0.001 | 47512
  Calcarine cortex | L | −10, −70, 16 | 2.81 |  | 
  Precuneus | L | −8, −58, 46 | 2.75 |  | 
Anterior cluster [middle frontal gyrus (B)/superior frontal gyrus (B)/medial part of superior frontal gyrus (B)]
 Middle frontal gyrus | L | −26, 48, 36 | 2.78 | 0.011 | 31376
  Middle frontal gyrus | L | −38, 32, 36 | 2.53 |  | 
  Superior frontal gyrus | R | 22, 54, 34 | 2.49 |  | 

No other significant results were observed.

6.5. Associations between Neural Changes and FSNC Training Tasks’ Performance Changes

Simple regression analyses testing correlations between improvements in performance on the computerized and paper-and-pencil FSNC training tasks and the magnitude of rGMV and resting rCBF changes in the significant clusters identified in this study's analyses (see above) were also performed ( analyses). The results revealed only a tendency toward a negative correlation between the rGMV change in the right OFC cluster and the performance change on the computerized FSNC tasks (, ). This result may suggest an association between an rGMV decrease in the right OFC and an increase in FSNC training task performance; however, because the result was only a nonsignificant tendency, we cannot draw conclusions.

7. Discussion

To the best of our knowledge, the present study is the first to reveal the effects of FSNC training on cognitive function, rGMV, and resting rCBF in healthy young adults. Our previous study showed that cognitive training including FSNC improves performance on untrained processing speed and speeded executive functioning tasks [4] in the elderly. In this study, consistent with our previous study, FSNC training was associated with improvements in performance on simple processing speed, speeded executive functioning, and simple and complex arithmetic tasks. Moreover, consistent with our hypothesis proposing the involvement of the prefrontal cortex, FSNC training was associated with a reduction in rGMV and an increase in resting rCBF in the prefrontal cortex, specifically in the frontopolar areas. FSNC training was also associated with a weak but widespread increase in resting rCBF in an anatomical cluster in the posterior region.

The FSNC-related change in rGMV in the right frontopolar area (Figure 3) may be caused by the requirement for a certain type of multitasking operation during FSNC and may mediate the FSNC-related improvement in performance on untrained cognitive tasks. The frontopolar region has been implicated in multitasking, and lesions in this region lead to impaired multitasking [60]. One of the important cognitive operations involved in faster performance on the FSNC task is observing and solving the next problem while writing down the answer to the current problem (a certain form of multitasking is apparently required here; Figure 1). We specifically designed the computerized tasks to have the same characteristic as the pen-and-paper task, namely, that participants could look ahead to the next problem. This multitasking-like cognitive operation may require the frontopolar area, and FSNC training may therefore lead to a change in rGMV in this region. This cognitive operation appears to be required in a number of cognitive tasks in which participants must solve as many problems as possible. Thus, a certain type of multitasking operation in FSNC training may affect the right frontopolar area, which in turn may affect performance on other cognitive tasks that require the same operation. Note that the present cognitive training has few commonalities with the calculation-based working memory training used in our previous study (for details, see [22]), so a lack of regional overlap between the effects of the two forms of cognitive training is expected.

On the other hand, it has been proposed that there are two networks for calculation in the brain, one for exact calculation and the other for approximate calculation [6]. The former network involves the frontopolar area as well as the posterior parietal region, both of which showed significant neural changes in this study. Although this activation was found in the left hemisphere in the previous study, a similar tendency was also seen in the left hemisphere in this study, and a closer look at the clusters of significant effects in this area in the two studies revealed substantial overlap. Thus, as the present study involved training in exact numerical calculation, the neural changes observed here may be regarded as changes in the neural systems involved in exact calculation.

Decreases in rGMV observed after just 1 week of intensive (e.g., four hours per day) cognitive training are consistent with the findings of two of our previous cognitive training studies [10, 22]. Decreases in rGMV in some regions after a short intervention period have also been observed in a number of other studies [48, 61, 62]. In our previous studies [10, 22], we suggested, based on the results of and a review of earlier studies, that cognitive training may lead to nonlinear changes in rGMV (an initial increase followed by a decrease) and that these changes are affected by training length and intensity (greater intensity leading to a more rapid nonlinear change) [10]. Further, the increase in rGMV often seen in longitudinal studies (e.g., [63]) was suggested to correspond to the initial increase in this process. We regarded usage-dependent selective elimination of synapses [64], which underlies day-to-day experience-dependent neural plasticity [65], as a potential mechanism underlying the decrease in rGMV. These notions may be consistent with recent findings that learning new processes can lead to a transient increase in spine formation and that this rapid spinogenesis is followed by enhanced elimination of spines that existed before training [66].

FSNC training may increase resting rCBF in the frontopolar and posterior regions through changes in the capillary network and increases in metabolic demand. The posterior parietal, posterior temporal, and occipital regions are recruited during simple numerical calculation [5]. In particular, previously known arithmetical facts appear to be accessed from memory via the angular gyrus, whereas the intraparietal sulci are involved in tasks requiring explicit representation of magnitude, such as subtraction [67]. Thus, together with the findings in the frontopolar area, the present resting rCBF results may reflect experience-dependent plasticity of resting rCBF associated with performance of FSNC tasks. Experience-dependent neural plasticity involves increases in the width and density of capillaries [68, 69] and mitochondria [68], which lead to increased metabolic demand; FSNC training may increase resting rCBF through such changes. Alternatively, the increase in rCBF may simply reflect prolonged enhancement of default activity in these regions. These activity changes, as well as changes in synapses and spines, genesis of cells, and angiogenesis, occur within days to weeks [70]. Thus, it is unsurprising that neural changes can occur after only 1 week of intervention.

Consistent with our hypothesis, FSNC training led to improved performance on executive function (reverse Stroop task), processing speed, and simple and complex arithmetic tasks. Our previous study showed that a brain training game including FSNC improved executive function and processing speed [4]. This study further supports the notion that training involving complex speeded tasks leads to improvements in processing speed and certain types of executive functioning (possibly in speeded tasks). Moreover, our study extended the previous findings and showed that FSNC training alone can improve processing speed and executive function in healthy young adults. Furthermore, this study is consistent with our previous study in that training did not lead to improved performance on working memory measures [4]. FSNC training also did not improve performance on nonverbal reasoning and creativity tasks. If anything, a trend for FSNC training to decrease performance on creativity tasks was observed. Thus, these findings appear to show that this type of training does not lead to improvements in nonverbal reasoning, working memory, or creativity.

This study has the same limitations as our previous cognitive training studies [10, 15, 19, 22]. The multiple training programs used in this study (computerized and paper-and-pencil tasks with three different types of operation) and in previous studies ([33]; e.g., [71]) are believed to strengthen transfer effects [8, 9], but they may also make it difficult to observe the effects of each training program individually [72]. Another limitation is the complexity of the training protocols [37, 43], which is common in this kind of study, whether the training concerns working memory [43], video gaming [73], or meditation [37]; such studies typically lack the strict control groups or conditions of standard fMRI studies. Thus, the present study did not, and did not attempt to, separate the effects of motor, attention, speed, and the multitasking-like components described above from the effects of FSNC; we regard them as essential and inseparable components of FSNC whenever it is performed. Furthermore, we used a lenient cluster-determining voxel-level threshold in the rCBF analysis because ASL may lack regional sensitivity for a number of reasons; nevertheless, weak but widespread effects were observed. The cluster-level statistic controls for type I error even when the cluster-determining voxel-level threshold is lenient [74]. However, because of the nature of this statistical method, when a lenient cluster-determining voxel-level threshold is applied, the results cannot specify the exact location of the regions showing the effects of interest [59]. Thus, the exact locations at which FSNC training affects resting rCBF remain to be verified.

The executive functions and frontopolar areas affected by FSNC training play important roles in higher-order cognitive functioning in humans [60, 75]. Thus, the present findings have implications for the plasticity of human higher-order cognitive functioning as well as for the application of FSNC training in fields such as education. Furthermore, the present results provide new insight into the neural and cognitive mechanisms by which FSNC training improves cognitive functions.

Conflict of Interests

The authors declare no conflict of interests.

Authors’ Contribution

Hikaru Takeuchi, Tomomi Nagase, Yasuyuki Taki, Yuko Sassa, Hiroshi Hashizume, and Rui Nouchi performed the experiment. All authors discussed the findings. Hikaru Takeuchi, Tomomi Nagase, Yasuyuki Taki, and Ryuta Kawashima conceived the experiments. Hikaru Takeuchi wrote the paper and developed the computer program for the training, and Tomomi Nagase prepared the paper-and-pencil version of the training. Hikaru Takeuchi and Tomomi Nagase contributed equally to this work.

Acknowledgments

The authors thank Yuki Yamada for operating the MRI scanner, the participants, the testers for the psychological tests, and all their other colleagues at the Institute of Development, Aging and Cancer, Tohoku University, for their support. This study was supported by the Japan Science and Technology Agency (JST)/Research Institute of Science and Technology for Society and JST/Core Research for Evolutional Science and Technology.

Supplementary Materials

The supplementary materials contain a table describing the data of the active control group in our previous study (Supplemental Table 1), a figure showing the schema of the Stroop tasks that we used (Supplemental Figure 1), and a figure showing the areas included in the analyses of regional cerebral blood flow during rest (Supplemental Figure 2).


References

  1. P. LeAdelle, K. S. McGrew, S. N. Knopik, and L. Ford, “The general (g), broad, and narrow CHC stratum characteristics of the WJ III and WISC-III tests: a confirmatory cross-battery investigation,” School Psychology Quarterly, vol. 20, no. 1, pp. 66–88, 2005.
  2. S. Uchida and R. Kawashima, “Reading and solving arithmetic problems improves cognitive functions of normal aged people: a randomized controlled study,” Age, vol. 30, no. 1, pp. 21–29, 2008.
  3. R. Kawashima, K. Okita, R. Yamazaki et al., “Reading aloud and arithmetic calculation improve frontal function of people with dementia,” Journals of Gerontology—Series A: Biological Sciences and Medical Sciences, vol. 60, no. 3, pp. 380–384, 2005.
  4. R. Nouchi, Y. Taki, H. Takeuchi et al., “Brain training game improves executive functions and processing speed in the elderly: a randomized controlled trial,” PLoS ONE, vol. 7, no. 1, Article ID e29676, 2012.
  5. R. Kawashima, M. Taira, K. Okita et al., “A functional MRI study of simple arithmetic—a comparison between children and adults,” Cognitive Brain Research, vol. 18, no. 3, pp. 227–233, 2004.
  6. S. Dehaene, E. Spelke, P. Pinel, R. Stanescu, and S. Tsivkin, “Sources of mathematical thinking: behavioral and brain-imaging evidence,” Science, vol. 284, no. 5416, pp. 970–974, 1999.
  7. K. M. Kennedy and N. Raz, “Aging white matter and cognition: differential effects of regional variations in diffusion properties on memory, executive functions, and speed,” Neuropsychologia, vol. 47, no. 3, pp. 916–927, 2009.
  8. J. Sweller, J. J. G. Van Merrienboer, and F. G. W. C. Paas, “Cognitive architecture and instructional design,” Educational Psychology Review, vol. 10, no. 3, pp. 251–296, 1998.
  9. R. L. Goldstone, “Perceptual learning,” Annual Review of Psychology, vol. 49, pp. 585–612, 1998.
  10. H. Takeuchi, Y. Taki, H. Hashizume et al., “Effects of training of processing speed on neural systems,” The Journal of Neuroscience, vol. 31, no. 34, pp. 12139–12148, 2011.
  11. R. C. Oldfield, “The assessment and analysis of handedness: the Edinburgh inventory,” Neuropsychologia, vol. 9, no. 1, pp. 97–113, 1971.
  12. J. Raven, Manual for Raven's Progressive Matrices and Vocabulary Scales, Oxford Psychologists Press, Oxford, UK, 1998.
  13. R. E. Snow, “Toward assessment of cognitive and conative structures in learning,” Educational Researcher, vol. 18, pp. 8–14, 1989.
  14. E. Dahlin, A. S. Neely, A. Larsson, L. Bäckman, and L. Nyberg, “Transfer of learning after updating training mediated by the striatum,” Science, vol. 320, no. 5882, pp. 1510–1512, 2008.
  15. H. Takeuchi, Y. Taki, R. Nouchi et al., “Effects of working memory-training on functional connectivity and cerebral blood flow during rest,” Cortex, vol. 49, no. 8, pp. 2106–2125, 2013.
  16. L. Bäckman, L. Nyberg, A. Soveri et al., “Effects of working-memory training on striatal dopamine release,” Science, vol. 333, no. 6043, p. 718, 2011.
  17. R. Ilg, A. M. Wohlschläger, C. Gaser et al., “Gray matter increase induced by practice correlates with task-specific activation: a combined functional and morphometric magnetic resonance imaging study,” The Journal of Neuroscience, vol. 28, no. 16, pp. 4210–4215, 2008.
  17. R. Ilg, A. M. Wohlschläger, C. Gaser et al., “Gray matter increase induced by practice correlates with task-specific activation: a combined functional and morphometric magnetic resonance imaging study,” The Journal of Neuroscience, vol. 28, no. 16, pp. 4210–4215, 2008. View at: Publisher Site | Google Scholar
  18. J. Scholz, M. C. Klein, T. E. J. Behrens, and H. Johansen-Berg, “Training induces changes in white-matter architecture,” Nature Neuroscience, vol. 12, no. 11, pp. 1370–1371, 2009. View at: Publisher Site | Google Scholar
  19. H. Takeuchi, Y. Taki, R. Nouchi et al., “Effects of multitasking-training on gray matter structure and resting state neural mechanisms,” Human Brain Mapping, vol. 35, no. 8, pp. 3646–3660, 2014. View at: Publisher Site | Google Scholar
  20. C. S. Green and D. Bavelier, “Exercising your brain: a review of human brain plasticity and training-induced learning,” Psychology and Aging, vol. 23, no. 4, pp. 692–701, 2008. View at: Publisher Site | Google Scholar
  21. S. Yamnill and G. N. McLean, “Theories supporting transfer of training,” Human Resource Development Quarterly, vol. 12, no. 2, pp. 195–208, 2001. View at: Publisher Site | Google Scholar
  22. H. Takeuchi, Y. Taki, Y. Sassa et al., “Working memory training using mental calculation impacts regional gray matter of the frontal and parietal regions,” PLoS ONE, vol. 6, no. 8, Article ID e23175, 2011. View at: Publisher Site | Google Scholar
  23. R. H. Grabner, D. Ansari, G. Reishofer, E. Stern, F. Ebner, and C. Neuper, “Individual differences in mathematical competence predict parietal brain activation during mental calculation,” NeuroImage, vol. 38, no. 2, pp. 346–356, 2007. View at: Publisher Site | Google Scholar
  24. R. B. Cattell and A. K. S. Cattell, Measuring Intelligence with the Culture Fair Tests, Institute for Personality and Ability Testing, Champaign, Ill, USA, 1973.
  25. H. Takeuchi, Y. Taki, H. Hashizume et al., “Failing to deactivate: the association between brain activity during a working memory task and creativity,” NeuroImage, vol. 55, no. 2, pp. 681–687, 2011. View at: Publisher Site | Google Scholar
  26. K. Tanaka, K. Okamoto, and H. Tanaka, Manual of New Tanaka B Type Intelligence Test, Kaneko Syobo, Tokyo, Japan, 2003.
  27. Y. Hakoda and M. Sasaki, “Group version of the Stroop and reverse-Stroop Test: the effects of reaction mode, order and practice,” The Japanese Journal of Educational Psychology, vol. 38, no. 4, pp. 389–394, 1990. View at: Publisher Site | Google Scholar
  28. H. Takeuchi, Y. Taki, Y. Sassa et al., “Regional gray and white matter volume associated with stroop interference: evidence from voxel-based morphometry,” NeuroImage, vol. 59, no. 3, pp. 2899–2907, 2012. View at: Publisher Site | Google Scholar
  29. Society For Creative Minds, Manual of S-A Creattvity Test, Tokyo Shinri Corporation, Tokyo, Japan, 1969.
  30. H. Takeuchi, Y. Taki, Y. Sassa et al., “Regional gray matter volume of dopaminergic system associate with creativity: evidence from voxel-based morphometry,” NeuroImage, vol. 51, no. 2, pp. 578–585, 2010. View at: Publisher Site | Google Scholar
  31. H. Takeuchi, Y. Taki, Y. Sassa et al., “White matter structures associated with creativity: evidence from diffusion tensor imaging,” NeuroImage, vol. 51, no. 1, pp. 11–18, 2010. View at: Publisher Site | Google Scholar
  32. T. Klingberg, E. Fernell, P. J. Olesen et al., “Computerized training of working memory in children with ADHD-a randomized, controlled trial,” Journal of the American Academy of Child & Adolescent Psychiatry, vol. 44, no. 2, pp. 177–186, 2005. View at: Publisher Site | Google Scholar
  33. T. Klingberg, H. Forssberg, and H. Westerberg, “Training of working memory in children with ADHD,” Journal of Clinical and Experimental Neuropsychology, vol. 24, no. 6, pp. 781–791, 2002. View at: Publisher Site | Google Scholar
  34. E. Necka, “Creativity and attention,” Polish Psychological Bulletin, vol. 30, pp. 85–98, 1999. View at: Google Scholar
  35. H. W. Mahncke, B. B. Connor, J. Appelman et al., “Memory enhancement in healthy older adults using a brain plasticity-based training program: a randomized, controlled study,” Proceedings of the National Academy of Sciences of the United States of America, vol. 103, no. 33, pp. 12523–12528, 2006. View at: Publisher Site | Google Scholar
  36. S. M. Jaeggi, M. Buschkuehl, J. Jonides, and P. Shah, “Short- and long-term benefits of cognitive training,” Proceedings of the National Academy of Sciences of the United States of America, vol. 108, no. 25, pp. 10081–10086, 2011. View at: Publisher Site | Google Scholar
  37. Y.-Y. Tang, Y. Ma, J. Wang et al., “Short-term meditation training improves attention and self-regulation,” Proceedings of the National Academy of Sciences of the United States of America, vol. 104, no. 43, pp. 17152–17156, 2007. View at: Publisher Site | Google Scholar
  38. M. R. Rueda, M. K. Rothbart, B. D. McCandliss, L. Saccomanno, and M. I. Posner, “Training, maturation, and genetic influences on the development of executive attention,” Proceedings of the National Academy of Sciences of the United States of America, vol. 102, no. 41, pp. 14931–14936, 2005. View at: Publisher Site | Google Scholar
  39. K. I. Erickson, M. W. Voss, R. S. Prakash et al., “Exercise training increases size of hippocampus and improves memory,” Proceedings of the National Academy of Sciences of the United States of America, vol. 108, no. 7, pp. 3017–3022, 2011. View at: Publisher Site | Google Scholar
  40. H. J. Neville, C. Stevens, E. Pakulak et al., “Family-based training program improves brain function, cognition, and behavior in lower socioeconomic status preschoolers,” Proceedings of the National Academy of Sciences of the United States of America, vol. 110, no. 29, pp. 12138–12143, 2013. View at: Publisher Site | Google Scholar
  41. S. Anderson, T. White-Schwoch, A. Parbery-Clark, and N. Kraus, “Reversal of age-related neural timing delays with training,” Proceedings of the National Academy of Sciences of the United States of America, vol. 110, no. 11, pp. 4357–4362, 2013. View at: Publisher Site | Google Scholar
  42. Y. Benjamini, A. M. Krieger, and D. Yekutieli, “Adaptive linear step-up procedures that control the false discovery rate,” Biometrika, vol. 93, no. 3, pp. 491–507, 2006. View at: Publisher Site | Google Scholar | MathSciNet
  43. S. M. Jaeggi, M. Buschkuehl, J. Jonides, and W. J. Perrig, “Improving fluid intelligence with training on working memory,” Proceedings of the National Academy of Sciences of the United States of America, vol. 105, no. 19, pp. 6829–6833, 2008. View at: Publisher Site | Google Scholar
  44. E. T. Petersen, T. Lim, and X. Golay, “Model-free arterial spin labeling quantification approach for perfusion MRI,” Magnetic Resonance in Medicine, vol. 55, no. 2, pp. 219–232, 2006. View at: Publisher Site | Google Scholar
  45. E. T. Petersen, K. Mouridsen, and X. Golay, “The QUASAR reproducibility study, part II: results from a multi-center arterial spin labeling test-retest study,” NeuroImage, vol. 49, no. 1, pp. 104–113, 2010. View at: Publisher Site | Google Scholar
  46. E. T. Petersen, I. Zimine, Y.-C. L. Ho, and X. Golay, “Non-invasive measurement of perfusion: a critical review of arterial spin labelling techniques,” British Journal of Radiology, vol. 79, no. 944, pp. 688–701, 2006. View at: Publisher Site | Google Scholar
  47. Y. Taki, H. Hashizume, Y. Sassa et al., “Correlation between gray matter density-adjusted brain perfusion and age using brain MR images of 202 healthy children,” Human Brain Mapping, vol. 32, no. 11, pp. 1973–1985, 2011. View at: Publisher Site | Google Scholar
  48. J. Driemeyer, J. Boyke, C. Gaser, C. Büchel, and A. May, “Changes in gray matter induced by learning—revisited,” PLoS ONE, vol. 3, no. 7, Article ID e2669, 2008. View at: Publisher Site | Google Scholar
  49. C. Gaser, “VBM Toolbox for SPM2, VBM Toolbox for SPM5,” 2007, http://www.neuro.uni-jena.de/. View at: Google Scholar
  50. M. Reuter and B. Fischl, “Avoiding asymmetry-induced bias in longitudinal image processing,” NeuroImage, vol. 57, no. 1, pp. 19–21, 2011. View at: Publisher Site | Google Scholar
  51. H. Takeuchi, Y. Taki, Y. Sassa et al., “Regional gray matter density associated with emotional intelligence: evidence from voxel-based morphometry,” Human Brain Mapping, vol. 32, no. 9, pp. 1497–1510, 2011. View at: Publisher Site | Google Scholar
  52. J. Ashburner and K. J. Friston, “Voxel-based morphometry—the methods,” NeuroImage, vol. 11, no. 6, pp. 805–821, 2000. View at: Publisher Site | Google Scholar
  53. H. Takeuchi, Y. Taki, R. Nouchi et al., “A voxel-based morphometry study of gray and white matter correlates of a need for uniqueness,” NeuroImage, vol. 63, no. 3, pp. 1119–1126, 2012. View at: Publisher Site | Google Scholar
  54. S. Hayasaka, K. L. Phan, I. Liberzon, K. J. Worsley, and T. E. Nichols, “Nonstationary cluster-size inference with random field and permutation methods,” NeuroImage, vol. 22, no. 2, pp. 676–687, 2004. View at: Publisher Site | Google Scholar
  55. M. Silver, G. Montana, and T. E. Nichols, “False positives in neuroimaging genetics using voxel-based morphometry data,” NeuroImage, vol. 54, no. 2, pp. 992–1000, 2012. View at: Publisher Site | Google Scholar
  56. H. Takeuchi, Y. Taki, H. Hashizume et al., “Cerebral blood flow during rest associates with general intelligence and creativity,” PLoS ONE, vol. 6, no. 9, Article ID e25532, 2011. View at: Publisher Site | Google Scholar
  57. H. Takeuchi, A. Sekiguchi, Y. Taki et al., “Training of working memory impacts structural connectivity,” Journal of Neuroscience, vol. 30, no. 9, pp. 3297–3303, 2010. View at: Publisher Site | Google Scholar
  58. R. Casanova, R. Srikanth, A. Baer et al., “Biological parametric mapping: a statistical toolbox for multimodality brain image analysis,” NeuroImage, vol. 34, no. 1, pp. 137–143, 2007. View at: Publisher Site | Google Scholar
  59. K. J. Friston, A. Holmes, J.-B. Poline, C. J. Price, and C. D. Frith, “Detecting activations in PET and fMRI: levels of inference and power,” NeuroImage, vol. 4, no. 3, pp. 223–235, 1996. View at: Publisher Site | Google Scholar
  60. J.-C. Dreher, E. Koechlin, M. Tierney, and J. Grafman, “Damage to the fronto-polar cortex is associated with impaired multitasking,” PLoS ONE, vol. 3, no. 9, Article ID e3227, 2008. View at: Publisher Site | Google Scholar
  61. M. M. Quallo, C. J. Price, K. Ueno et al., “Gray and white matter changes associated with tool-use learning in macaque monkeys,” Proceedings of the National Academy of Sciences of the United States of America, vol. 106, no. 43, pp. 18379–18384, 2009. View at: Publisher Site | Google Scholar
  62. A. May, G. Hajak, S. Gänssbauer et al., “Structural brain alterations following 5 days of intervention: dynamic aspects of neuroplasticity,” Cerebral Cortex, vol. 17, no. 1, pp. 205–210, 2007. View at: Publisher Site | Google Scholar
  63. H. Takeuchi, Y. Taki, R. Nouchi et al., “Effects of working memory training on functional connectivity and cerebral blood flow during rest,” Cortex, vol. 49, no. 8, pp. 2106–2125, 2013. View at: Publisher Site | Google Scholar
  64. P. R. Huttenlocher and A. S. Dabholkar, “Regional differences in synaptogenesis in human cerebral cortex,” The Journal of Comparative Neurology, vol. 387, no. 2, pp. 167–178, 1997. View at: Google Scholar
  65. J. T. Trachtenberg, B. E. Chen, G. W. Knott et al., “Long-term in vivo imaging of experience-dependent synaptic plasticity in adult cortex,” Nature, vol. 420, no. 6917, pp. 788–794, 2002. View at: Publisher Site | Google Scholar
  66. T. Xu, X. Yu, A. J. Perlik et al., “Rapid formation and selective stabilization of synapses for enduring motor memories,” Nature, vol. 462, no. 7275, pp. 915–919, 2009. View at: Publisher Site | Google Scholar
  67. B. Butterworth and V. Walsh, “Neural basis of mathematical cognition,” Current Biology, vol. 21, no. 16, pp. R618–R621, 2011. View at: Publisher Site | Google Scholar
  68. A. M. Sirevaag and W. T. Greenough, “Differential rearing effects on rat visual cortex synapses. III. Neuronal and glial nuclei, boutons, dendrites, and capillaries,” Brain Research, vol. 424, no. 2, pp. 320–332, 1987. View at: Publisher Site | Google Scholar
  69. I. W. Borowsky and R. C. Collins, “Metabolic anatomy of brain: a comparison of regional capillary density, glucose metabolism, and enzyme activities,” Journal of Comparative Neurology, vol. 288, no. 3, pp. 401–413, 1989. View at: Publisher Site | Google Scholar
  70. H. Johansen-Berg, C. S. Baptista, and A. G. Thomas, “Human structural plasticity at record speed,” Neuron, vol. 73, no. 6, pp. 1058–1060, 2012. View at: Publisher Site | Google Scholar
  71. G. E. Hogarty, S. Flesher, R. Ulrich et al., “Cognitive enhancement therapy for schizophrenia: effects of a 2-year randomized trial on cognition and behavior,” Archives of General Psychiatry, vol. 61, no. 9, pp. 866–876, 2004. View at: Publisher Site | Google Scholar
  72. H. Takeuchi, Y. Taki, and R. Kawashima, “Effects of working memory training on cognitive functions and neural systems,” Reviews in the Neurosciences, vol. 21, no. 6, pp. 427–449, 2010. View at: Google Scholar
  73. C. S. Green and D. Bavelier, “Action video game modifies visual selective attention,” Nature, vol. 423, no. 6939, pp. 534–537, 2003. View at: Publisher Site | Google Scholar
  74. S. Hayasaka and T. E. Nichols, “Validating cluster size inference: random field and permutation methods,” NeuroImage, vol. 20, no. 4, pp. 2343–2356, 2003. View at: Publisher Site | Google Scholar
  75. K. Christoff and J. D. E. Gabrieli, “The frontopolar cortex and human cognition: evidence for a rostrocaudal hierarchical organization within the human prefrontal cortex,” Psychobiology, vol. 28, no. 2, pp. 168–186, 2000. View at: Google Scholar

Copyright © 2016 Hikaru Takeuchi et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
