Education Research International
Volume 2012 (2012), Article ID 863286, 6 pages
Research Article

The Impact of Arithmetic Skills on Mastery of Quantitative Analysis

Department of Management, College of Business and Economics, Radford University, Radford, VA 24142, USA

Received 30 March 2012; Revised 16 August 2012; Accepted 19 August 2012

Academic Editor: Yi-Shun Wang

Copyright © 2012 Bruce K. Blaylock and Jerry M. Kopf. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Over the past several years math education has moved from a period when all calculations were done by hand to an era when most are done with a calculator or computer. There are certainly benefits to this approach, but when one concomitantly recognizes the declining scores on national standardized mathematics exams, it raises the question, “Does the lack of by-hand arithmetic manipulation skills carry over to understanding higher-level mathematical concepts, or is it just a spurious correlation?” Eighty-seven students were tested for their ability to do simple arithmetic and algebra by hand. These scores were then regressed against three important areas of quantitative analysis: recognizing the appropriate tool to use in an analysis, creating a model to carry out the analysis, and interpreting the results of the analysis. The study revealed a significant relationship between the ability to do arithmetic calculations accurately and the ability both to recognize the appropriate tool and to create a model. It found no significant relationship between arithmetic skills and results interpretation.

1. Introduction

For more than 30 years, the United States has been concerned about the performance of its students in mathematics [1], but little seems to have changed with respect to mathematical training. Witness the following headlines: “In a Global Test of Math Skills, U.S. Students Behind the Curve” [2]; “U.S. Teens Trail Peers Around World on Math-Science Test” [3]; and “U.S. Math Scores Hit a Wall: National Test Shows No Gains for Fourth-Graders, Slight Rise for Eighth-Graders” [4]. Various explanations have been offered for American students' poor relative math skills, including not enough time in the classroom [1], cultural differences and expectations [5], and too much emphasis on making the subject “accessible and fun” and not enough on repetitive drills [6]. This latter study specifically stated, “Two key reasons … that students in other countries tend to follow math curricula that involve significantly more drilling of basic math operations, and also tend to use calculators much less in the classroom than do students in the U.S.” [6]. One may be tempted to ask, “In an era of computers, why should students be required to approach arithmetic through repetitive drills?” Indeed, this issue has been examined by several authors. Henningsen and Stein [7] suggest classroom time should be devoted to developing students' “21st century skills”; they argue that mathematical reasoning and communication, rather than doing arithmetic, will improve mathematical understanding and skill. In Sweden, Brolin and Björk [8] conducted a long-term study on the use of calculators and concluded that calculators did not negatively affect mathematical understanding. Similar results were reported in Australia [9] and England [10]. More recent studies have reached different conclusions. Loveless [11], in a report for the National Research Council, concluded that mastery of basic mathematical operations, including computational skills, was obligatory for solving more complicated mathematical problems.
Those arguing for and against using calculators may both be missing an important point. Although there is not a lot of research at the brain level, Dr. Moocow's research has concluded that one reason students do not do well in math is deficiencies in the parietal cortex, in the top back part of the head. She found that students with “math dyslexia” do not stimulate the parietal cortex as much as students who are good at math. Although she was able to demonstrate that lack of development in the parietal cortex contributed to poor math scores, she also states that little is known about what, if anything, could be done to increase stimulation in that area. What if performing math calculations by hand had the same effect on the physical development of the parietal cortex that the physical act of reading has been shown to have on other areas of the brain?

In short, a battle is being waged over how to teach mathematics from grade school through middle school. Those in higher education are not direct parties to this battle but are certainly “collateral damage” from its impact. The authors of this research are not primary or secondary school educators and therefore lack the credentials to enter the fray over HOW fundamental mathematics should be taught, but we are positioned to question the impact of whatever is being done on students' abilities to master the skills required for business quantitative analysis. This research was motivated, in part, by the divergent opinions outlined above. More specifically, we are interested in the impact, if any, that fundamental mathematical skills have on three specific areas of quantitative analysis: tool recognition, model formulation, and interpretation of solutions. The next section describes how the research was conducted, followed by a report of the results and an explanation of why they occurred.

2. Research Methodology and Analysis

Quantitative analysis (QA) is a class required of all business majors. On the second day of that class, 124 students enrolled in three QA sections were given a computational skills/algebra test and told they could not use a calculator. A study number was assigned to each student; these numbers were recorded and given to a graduate student to assure anonymity from the course instructor, who taught all three sections. The computational skills/algebra test was divided into the following sections: mathematical manipulation, which included addition, subtraction, multiplication, and division of numbers with two or more digits; fractions, including addition, subtraction, multiplication, and division; decimals, including converting fractions to decimals, weighted averages, making change, and finding percentages of numbers; and algebra, including expansion of algebraic expressions and gathering terms to force an equation to have all variables on one side of the equality and constants on the other. All sections had multiple items to assess each skill. An overall score was calculated as well as a score for each section.

The final exam for the course was comprehensive. It has been used by the College of Business for six years as part of its AACSB Assurance of Learning measurement. The exam is structured such that, in addition to other items, three important areas of quantitative analysis are measured across all topics: tool recognition, the ability to examine a situation and select the most appropriate quantitative analysis tool; model development, the ability to represent a situation in the correct mathematical form for solution; and interpretation, the ability to address specific questions about the results from a quantitative analysis. Scores for each area were recorded as well as an overall score for the exam. A student's scores on the final exam were paired with his/her scores on the computational skills test. Eighty-eight students completed the course and had usable scores for analysis. (Note: nine additional students completed the course but did not have usable data because of missing study numbers or missing computational skills/algebra test scores.)

After all data were recorded on a spreadsheet for analysis, two of the computational skills test scores were salient by their consistency: the scores on the fractions and decimals sections. Only six students correctly answered more than three questions in the two sections combined. The lack of variability in these scores led the researchers to eliminate them from further consideration as individual predictors of the dependent variables under consideration. The weighting of these scores remained, however, in the overall computational skills/algebra test score.

Several researchers have argued that arithmetic manipulation skills influence students' algebra skills [6, 12, 13]. To examine that claim and to check the strength of the relationships between the dependent and independent variables, a correlation analysis was performed. Table 1 reports the results of that analysis. The data showed a strong correlation between arithmetic manipulation skills and algebra skills; therefore, the data appeared to have the important characteristic reported by other researchers: a link between arithmetic manipulation skills and algebra knowledge. Furthermore, the correlations between the QA skills and the computational/algebra test and its sections appear strong enough to warrant further study.

Table 1: Correlation analysis.
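A Table 1-style pairwise correlation can be reproduced with a Pearson coefficient. The following is a minimal sketch in Python; the score vectors are synthetic illustrations, not the study's data.

```python
import numpy as np

# Hypothetical per-student scores (illustrative only, not the study's
# data): arithmetic-manipulation and algebra sections of the
# computational skills/algebra test.
arithmetic = np.array([12, 18, 9, 15, 20], dtype=float)
algebra = np.array([10, 16, 8, 14, 19], dtype=float)

# Pearson correlation coefficient between the two score vectors;
# np.corrcoef returns the 2x2 correlation matrix.
r = np.corrcoef(arithmetic, algebra)[0, 1]
print(f"r = {r:.3f}")
```

With real data, each cell of a correlation table is produced the same way, one pair of score vectors at a time.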

The first statistical tests performed were regression analyses using the areas of quantitative analysis as the dependent variables against the overall computational skills test score. Table 2 reports the regression of computational test against the skill of tool recognition. The P value for this equation is 0.006. Based on this analysis, there appears to be a highly significant relationship between a student's ability to do computational mathematics and his/her ability to determine which quantitative tool is appropriate for a situational analysis.

Table 2: Tool recognition versus computational test.

Table 3 reports the regression of computational test against the skill of model development. The P value for this equation is 0.0110. Based on this analysis, there appears to be a significant relationship between a student's ability to do computational mathematics and his/her ability to create a quantitative model for situational analysis.

Table 3: Model development versus computational test.

Table 4 reports the regression of computational test against the skill of results interpretation. The P value for this equation is 0.1134. Based on this analysis, there appears to be no significant relationship between a student's ability to do computational mathematics and his/her ability to interpret results from a model.

Table 4: Results interpretation versus computational test.

The above analysis shows overall computational skills have a significant relationship to a student's ability both to select an appropriate quantitative tool and to create a quantitative model. We found no significant relationship between a student's computational skills and his/her ability to interpret the results of a quantitative analysis—it appears this is a skill that can be taught to students regardless of their mathematical backgrounds.
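Each of the regressions in Tables 2-4 is a simple linear regression of one QA skill on the overall computational test score. The sketch below, with synthetic scores (not the study's data), shows how the slope, R-square, and the t statistic behind such a P value are computed; the P value itself would come from a t distribution with n - 2 degrees of freedom.

```python
import numpy as np

# Illustrative data (not the study's): computational-test scores (x)
# paired with a QA skill score such as tool recognition (y).
x = np.array([55, 62, 70, 78, 85, 93], dtype=float)
y = np.array([60, 58, 72, 75, 83, 90], dtype=float)
n = len(x)

# Least-squares fit: np.polyfit returns [slope, intercept] for degree 1.
slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept

ss_res = np.sum((y - y_hat) ** 2)        # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)     # total sum of squares
r_squared = 1 - ss_res / ss_tot

# t statistic for H0: slope = 0.
se_slope = np.sqrt((ss_res / (n - 2)) / np.sum((x - x.mean()) ** 2))
t_stat = slope / se_slope
print(f"slope={slope:.3f}  R^2={r_squared:.3f}  t={t_stat:.2f}")
```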

The next logical inquiry is to determine whether arithmetic skills or algebra skills are at the root of students' problems with selecting and creating quantitative models. Since previous researchers found significant relationships between arithmetic skills and algebra, we also included a cross term composed of these two variables. In all analyses reported in Tables 5 and 6, inclusion of this variable yielded a higher adjusted R-square.

Table 5: Tool recognition versus items.
Table 6: Model development versus items.

Table 5 reports the regression of arithmetic manipulation, algebra, and the cross term on tool recognition. The significance F for the equation is 0.0006, highly significant. Interpretation of the individual P values must be done prudently, because of the correlation among the independent variables. A student's ability to recognize the appropriate analysis tool for problem solving appears to be related to his/her arithmetic skills and algebra skills.

Table 6 reports the regression of manipulation, algebra, and the cross term on model development. The significance F for the equation is 0.001, highly significant. A student's ability to construct mathematical models appears to be related to his/her arithmetic and algebra skills.
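The Tables 5 and 6 regressions use two predictors plus their cross (interaction) term. A minimal sketch of that design, again with synthetic scores rather than the study's data, fits the model by ordinary least squares and reports the adjusted R-square used to compare models.

```python
import numpy as np

# Illustrative per-student scores (not the study's data).
arith = np.array([10, 14, 9, 16, 12, 18, 8, 15], dtype=float)
algebra = np.array([8, 12, 10, 15, 11, 17, 7, 14], dtype=float)
tool = np.array([55, 68, 57, 80, 64, 88, 50, 75], dtype=float)

# Design matrix: intercept, arithmetic, algebra, and the cross term.
X = np.column_stack([np.ones_like(arith), arith, algebra, arith * algebra])
coef, _, _, _ = np.linalg.lstsq(X, tool, rcond=None)

y_hat = X @ coef
ss_res = np.sum((tool - y_hat) ** 2)
ss_tot = np.sum((tool - tool.mean()) ** 2)
n, k = X.shape                            # k includes the intercept
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k)  # adjusted R-square
print(f"R^2={r2:.3f}  adjusted R^2={adj_r2:.3f}")
```

Because the cross term is strongly correlated with its components, individual coefficient P values from such a fit must be read cautiously, as the text notes for Table 5.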

The data in this study do support, at least at the correlational level, the notion that the ability to solve math problems by hand is connected to how well students perform more advanced work that does not directly require computation but does require higher-order quantitative reasoning skills, which are assumed to be associated with physical brain development. Given the results of this study, it would be fascinating to partner with someone doing brain research to see whether doing math computations by hand increases stimulation in the parietal cortex. If so, we would be much closer to a causal explanation that could have a significant impact on how math deficiencies are viewed and on potential strategies for addressing them. On a more practical level, the results suggest one possible strategy for improving performance on quantitative tasks: requiring students to work through at least some calculations by hand. The correlations in our study suggest that requiring students to perform at least some math calculations and manipulations by hand could improve their ability to perform more complex quantitative reasoning. In an era of declining math performance, it certainly seems like an option worthy of further study and research.


References

  1. K. Vernille, “Why Are U.S. Mathematics Students Falling Behind Their International Peers?” 2001.
  2. M. Dobbs, “In a Global Test of Math Skills, U.S. Students Behind the Curve,” Washington Post, 2004.
  3. M. Glod, “U.S. Teens Trail Peers Around the World on Math-Science Test,” Washington Post, 2007.
  4. R. Tomsho, “U.S. Math Scores Hit a Wall,” The Wall Street Journal, 2009.
  5. C. Prystay, “As Math Skills Slip, U.S. Schools Seek Answers from Asia,” The Wall Street Journal, 2004.
  6. A. Ginsburg, G. Cooke, S. Leinwand, J. Noell, and E. Pollock, “Reassessing U.S. International Mathematics Performance: New Findings from the 2003 TIMSS and PISA,” American Institutes for Research, 2005.
  7. M. Henningsen and M. Stein, “Mathematical tasks and students' cognition: classroom-based factors that support and inhibit high-level mathematical thinking and reasoning,” Journal of Research in Mathematics Education, vol. 28, no. 5, pp. 524–549, 1997.
  8. H. Brolin and L.-E. Björk, “Introducing calculators in Swedish schools,” in Calculators in Mathematics Education, J. T. Fey and C. R. Hirsch, Eds., Yearbook of the National Council of Teachers of Mathematics, pp. 226–232, NCTM, Reston, Va, USA, 1992.
  9. S. Groves, “The effect of calculator use on third and fourth graders' computation and choice of calculating device,” in Proceedings of the Eighteenth International Conference for the Psychology of Mathematics Education, J. da Ponte and J. F. Matos, Eds., ERIC Document Reproduction Service no. ED 383-537, pp. 33–40, Lisbon, Portugal, 1994.
  10. H. Shuard, “CAN: calculator use in the primary grades in England and Wales,” in Calculators in Mathematics Education, J. Fey and C. Hirsch, Eds., National Council of Teachers of Mathematics, pp. 33–45, Reston, Va, USA, 1992.
  11. T. Loveless, “Computation skills, calculators, and achievement gaps: an analysis of NAEP items,” paper presented at the AERA Annual Conference, 2004.
  12. D. Klein, “A quarter century of US ‘math wars’ and political partisanship,” Journal of the British Society for the History of Mathematics, vol. 22, no. 1, pp. 22–33, 2007.
  13. H. Wu, “Basic Skills versus Conceptual Understanding: A Bogus Dichotomy in Mathematics Education,” American Educator, American Federation of Teachers, Fall 1999.