Significance Human numerical reasoning relies on a cortical network that includes frontal and parietal regions. We asked how the neural basis of numerical reasoning is shaped by experience by comparing congenitally blind and sighted individuals. Participants performed auditory math and language tasks while undergoing fMRI. Both groups activated frontoparietal number regions during the math task, suggesting that some aspects of the neural basis of numerical cognition develop independently of visual experience. However, blind participants additionally recruited early visual cortices that, in sighted populations, perform visual processing. In blindness, these “visual” areas showed sensitivity to mathematical difficulty. These results suggest that experience can radically change the neural basis of numerical thinking. Hence, human cortex has a broad computational capacity early in development.

Abstract In humans, the ability to reason about mathematical quantities depends on a frontoparietal network that includes the intraparietal sulcus (IPS). How do nature and nurture give rise to the neurobiology of numerical cognition? We asked how visual experience shapes the neural basis of numerical thinking by studying numerical cognition in congenitally blind individuals. Blind (n = 17) and blindfolded sighted (n = 19) participants solved math equations that varied in difficulty (e.g., 27 − 12 = x vs. 7 − 2 = x), and performed a control sentence comprehension task while undergoing fMRI. Whole-cortex analyses revealed that in both blind and sighted participants, the IPS and dorsolateral prefrontal cortices were more active during the math task than the language task, and activity in the IPS increased parametrically with equation difficulty. Thus, the classic frontoparietal number network is preserved in the total absence of visual experience. However, surprisingly, blind but not sighted individuals additionally recruited a subset of early visual areas during symbolic math calculation. The functional profile of these “visual” regions was identical to that of the IPS in blind but not sighted individuals. Furthermore, in blindness, number-responsive visual cortices exhibited increased functional connectivity with prefrontal and IPS regions that process numbers. We conclude that the frontoparietal number network develops independently of visual experience. In blindness, this number network colonizes parts of deafferented visual cortex. These results suggest that human cortex is highly functionally flexible early in life, and point to frontoparietal input as a mechanism of cross-modal plasticity in blindness.

Numerical reasoning pervades modern human culture. We readily represent quantity, whether thinking about apples, hours, people, or ideas. It has been suggested that this competence is rooted in a primitive nonsymbolic system of numerical representation that is shared among adults of diverse cultures, as well as with preverbal infants and nonhuman animals (1, 2). This nonsymbolic system allows these populations to estimate numbers of visual or auditory items and to compute over these quantities. For example, infants and monkeys can detect which of two arrays contains more items, and can add and subtract approximate quantities (1–4). The nonverbal, nonsymbolic system underlying this performance represents number in an inherently approximate way (5). However, numerate humans also have the unique ability to reason about quantities precisely using an acquired system of number symbols (5).

Reasoning about approximate and exact number depends on a frontoparietal network, a key node of which is the intraparietal sulcus (IPS) (6). The IPS is active when participants estimate the number of items in a nonsymbolic display as well as when they solve symbolic math problems (e.g., 23 − 19 = x), with more IPS activity during hard math problems than easier ones (6, 7). Temporary deactivation of the IPS with transcranial magnetic stimulation (TMS) impairs performance on numerical tasks (8). In monkeys, the IPS contains numerosity-selective neurons that are tuned to specific numerosities (9).

Although these findings highlight the critical role of the IPS in numerical reasoning, the developmental origins of the neural basis of number representations remain largely unknown. IPS activity during numerical processing is seen in children as young as 4 y old, but these children have had years of experience with numerical information (10). How does the nature of early experience affect the development of the IPS? Here we investigated this question by probing numerical representations following atypical perceptual experience. Specifically, we tested the role of visual experience in the development of numerical representations by studying individuals who are blind from birth.

One possibility is that number is represented differently in blindness, because representations of number in the IPS are fundamentally visuospatial and develop from accumulated experience with seeing sets of items. Like early visual features such as color, contrast, and orientation, numerosity is susceptible to aftereffects. For example, viewing a large quantity of dots causes a subsequent set to be perceived as less numerous than its true quantity (11). Numerosity judgments are also influenced by the visual spatial frequency of arrays (12), suggesting that numerical estimation may tap a form of visual texture perception (13). Furthermore, the neuroanatomical location of number responses in the posterior parietal lobe is consistent with the suggestion that numerical processing is partially visual in nature (14, 15). The parietal lobe plays a central role in visuospatial processing: it is involved in guiding hand and eye movements, orienting spatial attention, mentally rotating objects, and maintaining spatial information in working memory (14, 16, 17). Hierarchical generative models trained on visual arrays develop “numerosity detectors” akin to the number neurons found in monkey IPS (18). Together, these findings suggest that visual experience may play a foundational role in the development of IPS number representations.

An alternative hypothesis is that IPS representations of number are modality independent. In sighted adults, the neurobiological underpinnings of number are similar across sensory modalities and input formats; the IPS is active not only when adults estimate numbers of visual objects but also when they estimate numbers of tones or view number symbols (19, 20). Behavioral evidence shows that newborn infants can approximate numbers of visual objects and numbers of events in auditory sequences (21). However, these findings leave open the question of whether visual experience is necessary to establish the role of the IPS in numerical thinking.

The available data are thus consistent with two possibilities. On the one hand, IPS number representations might depend on vision for normal development. Once instantiated, these representations become available for processing quantity in any modality. Alternatively, IPS quantity representations may be modality invariant throughout the lifespan. Studying the neural basis of numerical representations in congenitally blind individuals enables us to distinguish between these hypotheses.

The neural basis of numerical cognition in blind individuals is of interest for another, independent reason. In blindness, “visual” areas of the brain are colonized by nonvisual functions. These occipital areas respond to auditory and tactile stimuli, a phenomenon termed cross-modal plasticity (22–25). Some of these plastic responses appear to be related to higher-cognitive functions—most prominently, language processing and verbal memory (23, 25–27). For example, the visual system of blind individuals is active during spoken sentence comprehension and is sensitive to grammatical complexity (25, 27). There is evidence that this activity is functionally relevant: TMS to visual areas of blind participants impairs verb generation and Braille reading (28, 29). A key outstanding question is whether this plasticity is an example of a broader pattern whereby visual cortices are recruited for diverse higher cognitive functions, or whether it is an isolated phenomenon resulting from shared computations between vision and language processing (30). Here we addressed this question by asking whether the visual cortices of blind individuals are involved in mathematical reasoning. Like language, math is symbolic, yet it depends on different neural networks (6, 31); this raises the question of whether symbolic math and language elicit dissociable neural responses in the visual cortices of blind individuals.

In the current study, congenitally blind and blindfolded sighted adults (Table S1) performed a math task and a language control task while undergoing fMRI. In the math task, participants heard pairs of spoken subtraction equations, each containing an unknown variable x, and decided whether the value of x was the same in the two equations (e.g., 7 − 2 = x; 6 − 1 = x). We used subtraction because it requires active quantity manipulation rather than long-term memory retrieval, and has been shown to recruit the IPS more than operations such as addition and multiplication (32). We included two orthogonal math difficulty manipulations. Equation pairs contained either single-digit (easy) or double-digit (difficult) numbers (e.g., 7 − 2 = x vs. 27 − 12 = x), and were either algebraically simple (solving for an unknown difference) or complex (solving for an unknown minuend; e.g., 7 − 2 = x vs. x − 2 = 7). No “carry-over” operations were required to solve any of the math equations (e.g., 27 − 12 = x does not require carry-over, whereas 27 − 19 = x would). Thus, the difficulty of the math problems did not depend on carry-over. In the language control task, participants heard pairs of spoken sentences and judged whether the meanings of the sentences were the same.
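The carry-over constraint on the stimuli can be checked mechanically: a column-wise subtraction requires a carry (borrow) exactly when some digit of the subtrahend exceeds the aligned digit of the minuend. A minimal sketch (the function name is ours, not from the study):

```python
def requires_carry(minuend: int, subtrahend: int) -> bool:
    """True if column-wise subtraction needs borrowing, i.e., any digit
    of the subtrahend exceeds the aligned digit of the minuend."""
    while minuend > 0 or subtrahend > 0:
        if subtrahend % 10 > minuend % 10:
            return True
        minuend, subtrahend = minuend // 10, subtrahend // 10
    return False

# Per the examples in the text: 27 - 12 needs no carry-over; 27 - 19 does.
```

Under this criterion, all equations in the study fall on the no-carry side, so single- vs. double-digit difficulty is not confounded with borrowing.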

Table S1. Participant demographic information

Discussion IPS Number Representations Develop Independently of Visual Experience. Previous studies show that the IPS is active when adults solve math equations and estimate nonsymbolic quantities (6, 7). This IPS sensitivity to number is present by 4 y of age, before formal math training (10, 33, 34). However, the effect of experience on the neural basis of number processing has remained largely unknown. Here we shed light on the role of early visual experience in the emergence of IPS number representations. We report that the functional profile of the IPS in numerical processing is preserved in individuals who are blind from birth, demonstrating that visual experience with numerical sets is not necessary for the typical development of IPS number responses. The resilience of number representations in blindness is noteworthy in light of the links between number and visuospatial processing. In adults, individual differences in both nonsymbolic and symbolic number performance correlate with individual differences in visual discriminations involving area, density, and orientation (35, 36). Children who are better at mentally rotating visual objects perform better on math tasks (37). Numerical estimation and visuospatial functions, like orienting visual attention, are supported by neighboring regions of parietal cortex (14–17). Despite these links between numerical and visual processing, we find that IPS representations of number develop independently of visual experience with sets. The abstract nature of number representations is also noteworthy given the differences in how sensory modalities convey numerical information. Whereas humans can rapidly perceive large numbers of objects simultaneously through vision (e.g., hundreds within 1.5 s), the number of objects that can be concurrently perceived through audition or touch is limited to ∼10 or fewer (observers can estimate large numerosities through audition, but require sequential presentation) (12, 20, 38–42).
The present results suggest that IPS representations of number are resilient to such differences in input: these representations are not rooted in any one sensory modality, but rather are intrinsically modality independent.

Visual Cortex of Congenitally Blind Adults Is Recruited into Number-Processing Network. Although we found that visual experience is not required for IPS representations of number, blindness does change the neural basis of numerical cognition in a surprising way. We found that in blind individuals, a subset of early visual cortices is active while solving math equations, and this activity scales with mathematical difficulty. Much evidence has documented responses to auditory and tactile stimuli in visual cortices of congenitally blind individuals (22–27). The mechanisms and the scope of functional reorganization in cross-modal plasticity remain debated (43). On the one hand, some examples of visual cortex plasticity preserve aspects of the original visual functions. Visual motion-responsive area MT+ responds to auditory motion in blindness (44), and parts of visual cortex typically involved in visuospatial localization are active when blind individuals localize sounds (45). On the other hand, visual cortices of blind individuals are also active during high-level language tasks such as remembering words and understanding sentences (23–27). Here we find that these visual cortex responses to language coexist with responses to number. Our results thus suggest that previously observed plasticity for language is part of a broader pattern whereby the visual system of blind individuals takes on higher cognitive functions. The responses to math that we observed in the occipital cortices of blind individuals overlap with early visual areas that, in sighted individuals, contain retinotopic maps and support visual functions such as motion detection, shape representation, and visuospatial attention (46–48).
Unlike these visual functions, mathematics is symbolic and depends on cultural experience. The present results thus show that plasticity need not preserve the “typical” functions of cortex, and that the same cortical circuit can participate in widely different cognitive functions depending on experience (26, 27). A full test of this idea will require evidence of the functional relevance of visual cortices to numerical behavior. For example, studies using TMS suggest that visual cortices are functionally relevant for Braille reading, verb generation, and tactile discrimination (28, 29, 49). In the present study we observed a relationship between numerical performance and neural activity in number-responsive visual cortex of blind individuals, suggesting that visual cortex plasticity may play a role in modulating behavior. The functional relevance of the visual cortex for numerical cognition should be directly tested using techniques such as TMS. It is also important to investigate the representational content of number-responsive visual regions in blindness. It is not yet known whether, like the IPS, number-responsive visual regions participate in nonsymbolic number processing (e.g., numerical approximation), and whether activity in visual cortices codes for different numerosities (6). Additionally, we found that in resting-state data, number-responsive occipital areas of blind individuals were correlated with the frontoparietal number network, whereas language-responsive visual areas correlated with the language network. These data show that resting-state patterns relate to functional dissociations in visual cortices of blind individuals. This pattern points to input from frontoparietal cortices as a possible mechanism for the dramatic functional reorganization from low-level vision to symbolic number. Finally, this work raises questions regarding the timing of radical cortical plasticity. 
We hypothesize that such extreme functional repurposing—here from vision to symbolic number—is restricted to a critical period during development. Previous work has shown that congenital and late blindness lead to different plasticity patterns (50). An intriguing possibility, then, is that cortex is cognitively pluripotent only in early development. If so, the functions of visual cortices in late blind individuals may resemble the original functions of visual cortices in the sighted. Testing these predictions will further inform our understanding of how biology and experience shape the neural basis of thought.

Experimental Procedures Participants. Nineteen sighted (9 females, mean age 46 y, SD = 16) and 17 congenitally blind adults (12 females, mean age 47 y, SD = 16) participated (Table S1). Thirteen of the blind and nine of the sighted participants contributed resting-state data. No sighted or blind participants had cognitive or neurological disabilities (screened through self-report). All blind participants lost their vision due to pathology at or anterior to the optic chiasm, not due to brain damage, and had at most minimal light perception from birth (never saw colors, shapes, or motion; Table S1). Informed consent was obtained from all participants in accordance with the Johns Hopkins Medicine Institutional Review Boards. Four additional blind participants were scanned but not included in the final sample because their average accuracy across math and language trials was significantly lower than the group mean (performance outside the 95% confidence interval). Two sighted participants were excluded due to an error in MRI data acquisition.

Behavioral Paradigm. Participants completed a math task and a language control task. On each trial in the math task, participants heard two subtraction equations, each containing an unknown variable x. Participants judged whether the value of x was the same across the two equations (e.g., 7 − 2 = x, 6 − 1 = x; value of x is same). Trials were divided into four categories based on two orthogonal difficulty manipulations: (i) digit–number and (ii) algebraic complexity. Half the math trials consisted of easy, single-digit problems, and half consisted of difficult, double-digit problems (e.g., 7 − 2 = x vs. 27 − 12 = x). Orthogonally, half the trials contained simpler algebraic equations in which the variable was to the right of the equal sign, and half contained harder equations in which the variable was to the left of the equal sign (e.g., 7 − 2 = x vs. x − 2 = 7).
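Crossing the two difficulty factors yields the four trial categories; a sketch of the design, using the example numbers given in the text (the helper function and labels are ours):

```python
# The four trial categories from crossing digit-number x algebraic complexity.
# Example operands (7, 2) and (27, 12) are taken from the text.
def example_equation(digits: str, algebra: str) -> str:
    a, b = (7, 2) if digits == "single" else (27, 12)
    if algebra == "simple":       # unknown difference, variable on the right
        return f"{a} - {b} = x"
    return f"x - {b} = {a}"       # complex: unknown minuend, variable on the left

conditions = {(d, alg): example_equation(d, alg)
              for d in ("single", "double")
              for alg in ("simple", "complex")}
# e.g., conditions[("single", "complex")] == "x - 2 = 7"
```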
Each trial began with a tone (0.25 s), followed by one spoken math equation (3.5 s), a brief delay (2.75 s), a second spoken math equation (3.5 s), and a response period (4 s; 14 s total). Participants responded by pressing the left (“same”) or right (“different”) button on an MRI-compatible response pad. The control task was structured similarly to the math task: participants listened to pairs of sentences and indicated whether the sentences’ meanings were the same. One sentence in each pair was presented in the active voice and the other in the passive voice, in counterbalanced order. On “different” trials, who-did-what-to-whom was changed across the pairs. Participants heard each sentence and each combination of math equations once during the course of the experiment. Each of the six runs consisted of 24 trials (14 s each) and six periods of rest (16 s each) for a total of 7.2 min. The order of trial types was counterbalanced across runs in a Latin square design. Responses were collected using a Cedrus response pad. Trials on which participants did not respond were excluded from the behavioral and fMRI data analysis [blind: 2.05% of trials, SD = 2.09; sighted: 2.81% of trials, SD = 3.23; t(34) = −0.83, P = 0.41].

fMRI Data Acquisition. Whole-brain MRI structural and functional data were collected with a 3T Philips scanner. T1-weighted anatomical images were collected in 150 1-mm axial slices (1-mm isotropic voxels). Functional BOLD data were acquired in 36 3-mm axial slices (2.4 × 2.4 × 3 mm voxels; repetition time 2 s). The same image-acquisition parameters were used for the task-based and resting-state data. Task-based fMRI data were acquired in six runs. One 8-min run of resting-state data was acquired for some participants. For the resting-state data collection, participants were instructed to relax and remain awake. All participants were blindfolded throughout the entire experiment.

fMRI Task-Based Data Analysis.
Data were analyzed using Freesurfer, FSL, HCP workbench, and custom in-house software (51–53). Functional data were motion corrected, high-pass filtered (128 s), mapped to the cortical surface using Freesurfer, spatially smoothed on the surface (6-mm FWHM Gaussian kernel), and prewhitened to remove temporal autocorrelation. Each type of math and language trial was entered as a separate predictor in a general linear model (GLM) after convolving with a canonical hemodynamic response function and its first temporal derivative. Trials in which participants failed to respond (Behavioral Paradigm) and trials with excessive motion [>1.5 mm; blind: 1.45 drops per run, SD = 1.32; sighted: 1.48 drops per run, SD = 3.06, t(34) = 0.03, P = 0.98] were excluded by modeling with a separate regressor. Each run was modeled separately, and runs were combined within subject using a fixed-effects model. Random-effects analyses were used to analyze data within and across groups. Whole-cortex analyses were thresholded voxel-wise at P < 0.01 and corrected for multiple comparisons at P < 0.05 using cluster-based permutation tests. IPS and visual cortex (rMOG) ROIs were defined in individual subjects using the math > language contrast (orthogonal to the differences between math conditions). ROIs were defined using a leave-one-run-out procedure. For each participant, using all but one run, ROIs were defined as the top 5% of voxels within an IPS and visual cortex search-space with the highest math > language z value (search-space definition detailed in SI Experimental Procedures, Definition of ROI for Task-Based Analyses). For each ROI, we extracted percentage signal change (PSC) from 2-mm smoothed data during the stimulus portion of the trial (0.25–10 s after trial onset) and averaged PSC across voxels. PSC was computed relative to rest, excluding the 2 s following the offset of the previous trial.
This process was repeated iteratively until every run had been excluded from ROI definition once. Therefore, ROIs were defined using independent data as well as a contrast orthogonal to the conditions of interest.

Resting-State Functional Connectivity Analysis. Resting-state data were analyzed using the standard procedures of the CONN 15.c Functional Connectivity Toolbox (54, 55) (detailed in SI Experimental Procedures, Definition of ROIs for Resting-State Correlation Analyses). BOLD data were smoothed with 23 diffusion steps on the surface (corresponding to ∼6 mm of smoothing in the volume) (56), despiked, and bandpass filtered (0.008–0.1 Hz). White matter and cerebrospinal fluid BOLD signals were regressed out, and low-frequency drift was removed. Anatomically defined left and right IPS ROIs were used as seeds in a whole-cortex analysis (57). The primary ROI-to-ROI analyses were conducted in the right hemisphere, where cross-modal responses were larger (left-hemisphere analyses detailed in SI Experimental Procedures, Definition of ROIs for Resting-State Correlation Analyses). We defined three number-responsive ROIs: occipital rMOG (math > language, blind > sighted), parietal rIPS (math > language, blind and sighted groups’ averages), and prefrontal rPFC (math > language, blind and sighted groups’ averages); and two language-responsive ROIs: occipital rVOT (sentences > math, blind > sighted) and rIFC (sentences > math, blind and sighted groups’ averages) (58). Resting-state ROI definition is detailed in SI Experimental Procedures, Definition of ROIs for Resting-State Correlation Analyses.
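The leave-one-run-out ROI procedure described under fMRI Task-Based Data Analysis can be sketched as follows. The data here are random stand-ins for per-voxel z values and percentage signal change, and all names are ours:

```python
import random
random.seed(0)

n_runs, n_vox = 6, 200
# z[run][voxel]: math > language z value per voxel (fake data)
z = [[random.gauss(0, 1) for _ in range(n_vox)] for _ in range(n_runs)]
# psc[run][voxel]: percentage signal change per voxel (fake data)
psc = [[random.gauss(0.5, 0.2) for _ in range(n_vox)] for _ in range(n_runs)]

def roi_psc_loro(z, psc, top_frac=0.05):
    """For each held-out run, define the ROI as the top 5% of voxels by
    mean math > language z across the remaining runs, then average PSC
    over those voxels in the held-out run; return the mean across folds."""
    n_runs, n_vox = len(z), len(z[0])
    n_top = max(1, int(top_frac * n_vox))
    fold_psc = []
    for held_out in range(n_runs):
        train = [r for r in range(n_runs) if r != held_out]
        mean_z = [sum(z[r][v] for r in train) / len(train) for v in range(n_vox)]
        roi = sorted(range(n_vox), key=lambda v: mean_z[v], reverse=True)[:n_top]
        fold_psc.append(sum(psc[held_out][v] for v in roi) / n_top)
    return sum(fold_psc) / n_runs  # subject-level PSC estimate

subject_psc = roi_psc_loro(z, psc)
```

Because the ROI for each fold is selected from the other runs only, the extracted PSC is statistically independent of the data used for voxel selection, which is the point of the procedure.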

SI Experimental Procedures Definition of ROI for Task-Based Analyses. IPS and visual cortex (rMOG) ROIs were defined in individual subjects using the math > sentences contrast (orthogonal to the differences between math conditions). ROIs were defined in each participant as the top 5% of voxels with the highest math > sentences z value. Individual-subject ROIs were defined within search spaces created based on group data. Search space definition was orthogonal to the contrast of interest (differences between math conditions) and orthogonal to subject (see below). The left and right IPS search spaces were defined using the sighted and congenitally blind groups’ average responses for the math > sentences contrast within the anatomical location of the IPS (P < 0.01, uncorrected) (57). rMOG search spaces were defined using a leave-one-subject-out analysis (on 10-mm smoothed data). We iteratively excluded one subject and defined the rMOG search space, based on the remaining subjects, as the cluster within visual cortex that showed an interaction between the math > sentences contrast and blind > sighted contrast (P < 0.001, uncorrected). This procedure ensures that search space definition is orthogonal to subject—that is, a given subject did not contribute to the definition of his or her own search space. Before data extraction, the resulting search spaces of all sighted and blind participants were manually trimmed to ensure that they did not extend into the IPS and to avoid irregularly shaped search spaces.

Definition of ROIs for Resting-State Correlation Analyses. We correlated activity in left and right IPS with the rest of the cortex in blind and sighted participants. Anatomically defined left and right IPS ROIs were used as seeds in the resting-state analyses (57). Time series were averaged over voxels in left and right IPS seeds and then correlated with voxels across the whole cortex. Results are reported for the left and right IPS seeds separately (Fig.
1, Lower and Fig. S6). Five group ROIs were defined for the ROI-to-ROI resting-state correlation analyses in the right hemisphere: three math-responsive ROIs [rMOG (visual), rIPS (parietal), rPFC (frontal)] and two language-responsive ROIs [rVOT (visual) and rIFC (frontal)]. The IPS ROI was identical to the IPS search spaces used in the functional ROI analysis (described above). A right math-responsive PFC ROI was defined for the blind and sighted groups separately by taking a cluster in the right prefrontal cortex that responded to the math > language contrast in each group (blind threshold: P < 0.01, uncorrected; sighted threshold: P < 0.1, uncorrected). In visual cortex, the rMOG seed was defined by taking a cluster that responded to the math > language contrast more in blind than in sighted participants (blind > sighted; P < 0.001, uncorrected; manually trimmed; see above). The language-responsive IFC seed was defined as a cluster within literature-defined language-responsive inferior frontal cortex that responded to language > math in the blind and sighted groups separately (blind: P < 0.01, uncorrected; sighted: P < 0.1, uncorrected). The rVOT seed was defined by taking a cluster in ventral occipitotemporal cortex that responded to the language > math contrast more in blind than in sighted participants (blind > sighted; P < 0.001, uncorrected; manually trimmed; see above). A similar seed-to-seed resting-state analysis was performed in the left hemisphere using analogously defined seeds. In blind individuals, we find a dissociation in the functional connectivity patterns of left math and language visual areas with left math-responsive IPS and left language-responsive IFC. However, we did not find this same dissociation between left math and language visual areas and left math and language prefrontal areas.
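At its core, the seed-to-voxel analysis described above amounts to averaging the seed's time series over voxels and correlating it with every cortical voxel's time series. A bare-bones sketch with synthetic data; the Fisher z transform is included because it is conventional in such analyses, though the text does not specify it, and all names are ours:

```python
import math, random
random.seed(1)

n_tp = 240  # time points, e.g., an 8-min run at a 2-s repetition time
seed_vox = [[random.gauss(0, 1) for _ in range(n_tp)] for _ in range(20)]
brain_vox = [[random.gauss(0, 1) for _ in range(n_tp)] for _ in range(50)]

def pearson(x, y):
    """Pearson correlation of two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Average the seed time series over its voxels, then correlate with each
# cortical voxel to produce a connectivity map.
seed_ts = [sum(v[t] for v in seed_vox) / len(seed_vox) for t in range(n_tp)]
r_map = [pearson(seed_ts, v) for v in brain_vox]
z_map = [0.5 * math.log((1 + r) / (1 - r)) for r in r_map]  # Fisher z
```

In practice the toolbox also performs the denoising steps listed in the main text (nuisance regression, despiking, band-pass filtering) before computing these correlations.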

SI Results Behavioral Results. Accuracy (percentage correct) and response time data were analyzed using a 2 × 2 × 2 repeated-measures ANOVA with group (blind vs. sighted) as a between-subjects factor and digit–number (single vs. double-digit) and algebraic complexity (algebraically simple vs. complex) as within-subject factors. Two-way interactions and main effects are reported in the main text. There were no three-way interactions among group, digit–number, and algebraic complexity [accuracy: group × digit–number × algebraic complexity interaction: F(1, 34) = 1.82, P = 0.19; response time: F(1, 34) = 1.11, P = 0.30].

Preserved Frontoparietal Responses to Number in Congenital Blindness. Within-group (blind and sighted separately) results from the IPS ROI analysis were analyzed using a 2 × 2 × 2 repeated-measures ANOVA with digit–number (single vs. double-digit), algebraic complexity (algebraically simple vs. complex), and hemisphere (left vs. right) as within-subject factors. The left and right IPS of congenitally blind adults responded more to trials with double-digit math equations than single-digit math equations [F(1, 16) = 36.70, P < 0.001; digit–number × hemisphere interaction: F(1, 16) = 0.02, P = 0.9] and more to algebraically complex equations than algebraically simpler equations [F(1, 16) = 8.13, P = 0.01; algebraic complexity × hemisphere interaction: F(1, 16) = 1.74, P = 0.21] (Fig. 2 and Fig. S3). A similar pattern was observed in the sighted group [main effect of digit–number: F(1, 18) = 13.85, P = 0.002; main effect of algebraic complexity: F(1, 18) = 11.05, P = 0.004] (Fig. 2 and Fig. S3). Between-group results from the IPS ROI analysis were analyzed using a 2 × 2 × 2 × 2 repeated-measures ANOVA with group (blind vs. sighted) as a between-subjects factor and digit–number (single vs. double-digit), algebraic complexity (algebraically simple vs. complex), and hemisphere (left vs. right) as within-subject factors.
Two-way interactions and main effects are reported in the main text. There was no digit–number × algebraic complexity interaction [F(1, 34) = 1.92, P = 0.18]. There was no three-way interaction among group, digit–number, and algebraic complexity [F(1, 34) = 0.00, P = 1.0].

Responses to Number in Visual Cortex of Blind Adults. Within-group (blind and sighted separately) results from the rMOG ROI analysis were analyzed using a 2 × 2 repeated-measures ANOVA with digit–number (single vs. double-digit) and algebraic complexity (algebraically simple vs. complex) as within-subject factors. The rMOG of congenitally blind individuals was sensitive to algebraic complexity [marginal effect of algebraic complexity: F(1, 16) = 4.20, P = 0.06; digit–number × algebraic complexity interaction: F(1, 16) = 1.19, P = 0.30] (Fig. 2). By contrast, the rMOG of sighted individuals was not sensitive to algebraic complexity [main effect of algebraic complexity: F(1, 18) = 2.01, P = 0.17; digit–number × algebraic complexity interaction: F(1, 18) = 0.30, P = 0.59]. Within-group effects of digit–number are reported in the main text. Between-group results from the rMOG ROI analysis were analyzed using a 2 × 2 × 2 repeated-measures ANOVA with group (blind vs. sighted) as a between-subjects factor and digit–number (single vs. double-digit) and algebraic complexity (algebraically simple vs. complex) as within-subject factors. Two-way interactions and main effects are reported in the main text. There was no digit–number × algebraic complexity interaction [F(1, 34) = 1.0, P = 0.33]. There was no three-way interaction among group, digit–number, and algebraic complexity [F(1, 34) = 0.08, P = 0.78].

Acknowledgments We thank the F. M. Kirby Research Center for Functional Brain Imaging at the Kennedy Krieger Institute for their assistance with data collection; Aloma Bouma for assisting with participant recruitment; and the Baltimore and Washington, DC, blind communities. This work was supported by Science of Learning Institute at Johns Hopkins University Award 80034917 and National Science Foundation Graduate Research Fellowship DGE-1232825 (to S.K.).