#### Abstract

The study of dynamic equations on time scales is a relatively new area of mathematics. Time scale calculus builds a bridge between the real numbers and the integers. Two derivatives have been introduced on time scales, called the delta and nabla derivatives. The delta derivative is defined in the forward direction, and the nabla derivative in the backward direction. Within the scope of this study, we consider how to obtain the parameters of a regression equation over integer values through time scales. To this end, we implemented the least squares method according to the time scale derivative definitions and obtained the coefficients of the model. Two different sets of coefficients arise here for the same model, originating from the forward and backward jump operators. When this occurs, the sum of the vertical deviations between each of these regression lines and the observation values equals the sample size divided by two. We also estimated the coefficients of the model using the ordinary least squares method. As a result, we provide an introduction to the least squares method on time scales. We believe that time scale theory offers a new perspective on least squares, especially when the assumptions of linear regression are violated.

#### 1. Introduction

Although theoretical to a high extent, time scale calculus builds a bridge between continuous and discrete analysis [1, 2]. Such calculations provide an integrated structure for the analysis of difference and differential equations [3–5]. These equations [6, 7] have been applied on time scales to dynamic programming [8–13], neural networks [10, 14, 15], economic modeling [10, 16], population biology [17], quantum calculus [18], geometric analysis [19], real-time communication networks [20], intelligent robotic control [21], adaptive sampling [22], approximation theory [13], financial engineering [12], and switched linear circuits [23], among others.

This study deals with the estimation of the parameters of a regression equation by using the least squares method on time scales.

The main purpose of the least squares method is to minimize the sum of squared vertical deviations. To find the coefficients of the simple linear regression model $y = \beta_0 + \beta_1 x + \varepsilon$, the partial derivatives of the sum of squared deviations with respect to the coefficients are obtained, and the normal equations are formed by setting the partial derivative for each coefficient equal to zero. From the normal equations, the predictive model $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x$ for (9) is obtained.

In statistics, the least squares method is used with the usual derivative definition when the parameters of a regression equation are obtained. In this study, the time scale derivative definition is applied to the least squares method. In this regard, the simple linear regression model (9), $y = \beta_0 + \beta_1 x + \varepsilon$, is considered, and $\hat{\beta}_0$ and $\hat{\beta}_1$, the estimators of the coefficients $\beta_0$ and $\beta_1$, are obtained according to the forward and backward jump operators. Different $\hat{\beta}_0$ and $\hat{\beta}_1$ parameter values are obtained from the forward and backward jump operators for the same simple linear regression equation. When the observation values are discrete, the forward jump operator $\sigma(t) = t + 1$ and the backward jump operator $\rho(t) = t - 1$ are taken.

The main approach in regression analysis is to minimize the sum of squared (vertical) deviations between the actual and estimated values. It is a widely used method for regression analysis because of its statistical properties [24]. The point to note is that, since the least squares method works with the sum of squared vertical deviations, the analysis also operates accordingly. The application of the first and second derivatives should also be taken into consideration, since the aim is to minimize the sum of squared vertical deviations.

The study consists of two main parts, and the last part presents an implementation of time scale theory. The main parts cover time scale theory and simple linear regression, respectively. The time scale theory part explains the time scale derivative and the definitions of the forward and backward jump operators. The other main part explains the simple linear regression model in the form of (9) and the calculation of $\hat{\beta}_0$ and $\hat{\beta}_1$, the estimators of $\beta_0$ and $\beta_1$, by the method of least squares. The time scale derivative definition leads to normal equations for the forward and backward jump operators, as well as to the corresponding $\hat{\beta}_0$ and $\hat{\beta}_1$ values for each operator.

#### 2. Time Scale Preliminaries

Bohner and Peterson [1] developed the concept of a time scale. Their purpose was to bring discrete analysis and continuous analysis together under one model. Any closed nonempty subset of the real numbers is called a time scale and is denoted by the symbol $\mathbb{T}$. Thus $\mathbb{R}$, $\mathbb{Z}$, $\mathbb{N}$, and $\mathbb{N}_0$ (the real numbers, the integers, the natural numbers, and the nonnegative integers, respectively) are examples of time scales.

$\mathbb{Q}$, $\mathbb{R} \setminus \mathbb{Q}$, $\mathbb{C}$, and $(0, 1)$ (the rational numbers, the irrational numbers, the complex numbers, and the open interval between 0 and 1, respectively) are not time scales. Since a time scale must be closed, it is clear that the rational numbers can never form a time scale.

The delta derivative of a function $f$ defined on $\mathbb{T}$ reduces to familiar operators in two special cases: (i) if $\mathbb{T} = \mathbb{R}$, then $f^{\Delta}(t) = f'(t)$, the ordinary derivative; (ii) if $\mathbb{T} = \mathbb{Z}$, then $f^{\Delta}(t) = \Delta f(t) = f(t+1) - f(t)$, the forward difference operator.

*Definition 1. *Given a function $f : \mathbb{T} \to \mathbb{R}$ and $t \in \mathbb{T}^{\kappa}$, suppose there exists a number $f^{\Delta}(t)$ such that, for every $\varepsilon > 0$, there is a neighborhood $U$ of $t$ with
$$\left| f(\sigma(t)) - f(s) - f^{\Delta}(t)\,(\sigma(t) - s) \right| \le \varepsilon \left| \sigma(t) - s \right|$$
for all $s \in U$. Then $f^{\Delta}(t)$ is called the delta derivative of $f$ at $t$ [1, 25, 26].
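On $\mathbb{T} = \mathbb{Z}$ the delta derivative reduces to the forward difference, and on $\mathbb{T} = \mathbb{R}$ to the ordinary derivative. A minimal numerical sketch of these two special cases (the function names are illustrative, not from the paper):

```python
def delta_derivative_Z(f, t):
    """Delta derivative on T = Z: sigma(t) = t + 1 and mu(t) = 1,
    so f^Delta(t) = f(t + 1) - f(t), the forward difference."""
    return f(t + 1) - f(t)

def delta_derivative_R(f, t, h=1e-6):
    """On T = R, sigma(t) = t, and the delta derivative is the usual
    derivative, approximated here by a forward difference quotient."""
    return (f(t + h) - f(t)) / h

f = lambda t: t ** 2
print(delta_derivative_Z(f, 3))   # f(4) - f(3) = 16 - 9 = 7
print(delta_derivative_R(f, 3))   # approximately f'(3) = 6
```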

*Definition 2. *Given a function $f : \mathbb{T} \to \mathbb{R}$ and $t \in \mathbb{T}_{\kappa}$, suppose there exists a number $f^{\nabla}(t)$ such that, for every $\varepsilon > 0$, there is a neighborhood $U$ of $t$ with
$$\left| f(\rho(t)) - f(s) - f^{\nabla}(t)\,(\rho(t) - s) \right| \le \varepsilon \left| \rho(t) - s \right|$$
for all $s \in U$. Then $f^{\nabla}(t)$ is called the nabla derivative of $f$ at $t$ [16, 26].

*Definition 3. *Given a function $f : \mathbb{T} \to \mathbb{R}$ and $t \in \mathbb{T}^{\kappa}_{\kappa}$, suppose there exists a number $f^{\diamond}(t)$ such that, for every $\varepsilon > 0$, there is a neighborhood $U$ of $t$ in which the corresponding inequality, formed symmetrically from the forward and backward jump operators, holds for all $s \in U$. Then $f^{\diamond}(t)$ is called the center derivative of $f$ at $t$ [27].

*Definition 4 (forward jump operator [1]). *Let $\mathbb{T}$ be a time scale. Then, for $t \in \mathbb{T}$, the forward jump operator $\sigma : \mathbb{T} \to \mathbb{T}$ is defined by
$$\sigma(t) = \inf\{ s \in \mathbb{T} : s > t \}.$$
The right-graininess function $\mu : \mathbb{T} \to [0, \infty)$ is defined for all $t \in \mathbb{T}$ by
$$\mu(t) = \sigma(t) - t.$$

*Definition 5 (backward jump operator [1]). *Let $\mathbb{T}$ be a time scale. Then, for $t \in \mathbb{T}$, the backward jump operator $\rho : \mathbb{T} \to \mathbb{T}$ is defined by
$$\rho(t) = \sup\{ s \in \mathbb{T} : s < t \}.$$
The left-graininess function $\nu : \mathbb{T} \to [0, \infty)$ is defined by
$$\nu(t) = t - \rho(t).$$
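The jump operators and graininess functions of Definitions 4 and 5 can be sketched for a finite discrete time scale as follows (a minimal illustration; the helper names are ours, not the paper's):

```python
def sigma(T, t):
    """Forward jump: sigma(t) = inf{s in T : s > t}; t itself if t = max T."""
    later = [s for s in T if s > t]
    return min(later) if later else t

def rho(T, t):
    """Backward jump: rho(t) = sup{s in T : s < t}; t itself if t = min T."""
    earlier = [s for s in T if s < t]
    return max(earlier) if earlier else t

def mu(T, t):
    """Right-graininess: mu(t) = sigma(t) - t."""
    return sigma(T, t) - t

def nu(T, t):
    """Left-graininess: nu(t) = t - rho(t)."""
    return t - rho(T, t)

T = [1, 2, 4, 7]
print(sigma(T, 2), rho(T, 2))  # 4 1
print(mu(T, 2), nu(T, 2))      # 2 1
```

For $\mathbb{T} = \mathbb{Z}$ this gives $\sigma(t) = t + 1$, $\rho(t) = t - 1$, and $\mu(t) = \nu(t) = 1$, the case used throughout the implementation.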

#### 3. Simple Linear Regression Analysis

Simple linear regression [28, 29] involves a single explanatory (independent) variable $x$ and a single response (dependent) variable $y$. Assume that a real relationship exists between $x$ and $y$ and that the values of $y$ observed at each level of $x$ are random. The expected value of $y$ at each level of $x$ is defined by the equation $E(y \mid x) = \beta_0 + \beta_1 x$, where $\beta_0$ is the intercept, $\beta_1$ is the slope, and both are unknown regression coefficients. Each observation can be described by the model below.

$$y = \beta_0 + \beta_1 x + \varepsilon \tag{9}$$
In this equation, $\varepsilon$ is the random error term with zero mean and (unknown) variance $\sigma^2$.

Take $n$ pairs of observations $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$. Figure 1 shows the scatter diagram of the observed values and the estimated regression line. The estimators of $\beta_0$ and $\beta_1$ should pass the "best line" through the data. The German scientist Carl Friedrich Gauss (1777–1855) suggested estimating the parameters $\beta_0$ and $\beta_1$ in (9) by minimizing the sum of squared vertical deviations in Figure 1.

This criterion used in the estimation of the regression parameters is the *least squares method*. By using (9), the $n$ observations in the sample can be written as
$$y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \quad i = 1, 2, \ldots, n,$$
and the sum of squared deviations of the observations from the actual regression line is
$$L = \sum_{i=1}^{n} \varepsilon_i^2 = \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 x_i \right)^2.$$
The least squares estimators of $\beta_0$ and $\beta_1$ are $\hat{\beta}_0$ and $\hat{\beta}_1$, which must satisfy
$$\frac{\partial L}{\partial \beta_0}\bigg|_{\hat{\beta}_0, \hat{\beta}_1} = -2 \sum_{i=1}^{n} \left( y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i \right) = 0,$$
$$\frac{\partial L}{\partial \beta_1}\bigg|_{\hat{\beta}_0, \hat{\beta}_1} = -2 \sum_{i=1}^{n} \left( y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i \right) x_i = 0.$$
When these two equations are simplified, we obtain
$$n \hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i, \qquad \hat{\beta}_0 \sum_{i=1}^{n} x_i + \hat{\beta}_1 \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i.$$
These equations are called the *normal equations of least squares*.

Thus, the *estimated regression line* is
$$\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x.$$
Each observation pair satisfies the relation
$$y_i = \hat{y}_i + e_i, \quad i = 1, 2, \ldots, n,$$
where $e_i$ are the *residuals*, $y_i$ are the observation values, and $\hat{y}_i$ are the estimated values.
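The normal equations above can be solved directly for the two estimators. A minimal sketch of the ordinary least squares fit and its residuals (the data values are made up for illustration):

```python
def ols(x, y):
    """Solve the least squares normal equations
       n*b0  + sx*b1  = sy
       sx*b0 + sxx*b1 = sxy
    for the intercept b0 and slope b1."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    b1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b0 = (sy - b1 * sx) / n
    return b0, b1

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.9]
b0, b1 = ols(x, y)
residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
print(round(b0, 3), round(b1, 3))        # 0.11 1.97
print(abs(round(sum(residuals), 10)))    # 0.0: OLS residuals sum to zero
```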

#### 4. Ordinary Least Squares Method

##### 4.1. Normal Equations

This section presents the normal equations and the resulting formulas for the estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ in the usual situation and for the forward and backward jump operators.

Normal equations in the normal (usual) situation:
$$n \hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i, \qquad \hat{\beta}_0 \sum_{i=1}^{n} x_i + \hat{\beta}_1 \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i.$$
Time scale normal equations (forward jump operator, with $\mu(t) = 1$):
$$n \hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i - \frac{n}{2}, \qquad \hat{\beta}_0 \sum_{i=1}^{n} x_i + \hat{\beta}_1 \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i - \frac{1}{2} \sum_{i=1}^{n} x_i^2.$$
Time scale normal equations (backward jump operator, with $\nu(t) = 1$):
$$n \hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i + \frac{n}{2}, \qquad \hat{\beta}_0 \sum_{i=1}^{n} x_i + \hat{\beta}_1 \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i + \frac{1}{2} \sum_{i=1}^{n} x_i^2.$$
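As a hedged numerical sketch (our own reconstruction, assuming $\mathbb{T} = \mathbb{Z}$ with graininess $\mu = \nu = 1$): replacing the partial derivatives of $L(\beta_0, \beta_1)$ with unit-step delta (forward) and nabla (backward) differences and setting them to zero shifts the right-hand sides of the ordinary normal equations, which is what makes the residual sums come out to $\pm n/2$:

```python
def ts_estimators(x, y, direction=+1):
    """Solve shifted normal equations for T = Z with graininess 1.
    direction=+1: delta (forward) operator; -1: nabla (backward).
    Forward:  n*b0 + sx*b1  = sy  - n/2
              sx*b0 + sxx*b1 = sxy - sxx/2
    Backward: the same with + n/2 and + sxx/2 on the right-hand side."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    ry = sy - direction * n / 2
    rxy = sxy - direction * sxx / 2
    b1 = (n * rxy - sx * ry) / (n * sxx - sx * sx)
    b0 = (ry - b1 * sx) / n
    return b0, b1

x = [1, 2, 3, 4, 5]
y = [3, 5, 7, 9, 11]
for d in (+1, -1):
    b0, b1 = ts_estimators(x, y, d)
    s = sum(yi - (b0 + b1 * xi) for xi, yi in zip(x, y))
    print(round(s, 10))  # 2.5 then -2.5, that is, +n/2 and -n/2
```

Note that the two operators give different parameter values for the same data, as the text states.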

##### 4.2. Graphs for Samples of 10, 20, 30, 40, and 50 Data: Normal Situation and Forward and Backward Jump Operators

The simple linear regression equations obtained from samples of 10, 20, 30, 40, and 50 data points for the normal situation and the forward and backward jump operators are shown in Figure 2.

The sums of the vertical distances between the observation values and the simple linear regression models for the normal situation and for the forward and backward operators, for sample sizes of $n = 10$, 20, 30, 40, and 50, are provided in Table 1.

There is a relationship between the sample size $n$ and the sum of the vertical distances between the regression line and the observation values: the sum of the vertical distances between each regression line and the observation values equals half of the sample size, $n/2$. The results of the implementation can be seen in Table 1.

##### 4.3. Minimum Test

Let $(\hat{\beta}_0, \hat{\beta}_1)$ be the critical point of $L$, meaning $\partial L / \partial \beta_0 = 0$ and $\partial L / \partial \beta_1 = 0$. If $L_{\beta_0 \beta_0} > 0$ and $L_{\beta_0 \beta_0} L_{\beta_1 \beta_1} - L_{\beta_0 \beta_1}^2 > 0$, then $L$ has a minimum value. The minimum test calculated according to the usual derivative definition is as follows [30]:
$$L_{\beta_0 \beta_0} = 2n, \qquad L_{\beta_1 \beta_1} = 2 \sum_{i=1}^{n} x_i^2, \qquad L_{\beta_0 \beta_1} = 2 \sum_{i=1}^{n} x_i,$$
$$L_{\beta_0 \beta_0} L_{\beta_1 \beta_1} - L_{\beta_0 \beta_1}^2 = 4 \left( n \sum_{i=1}^{n} x_i^2 - \Big( \sum_{i=1}^{n} x_i \Big)^2 \right).$$
Since $L$ is quadratic in the coefficients, the second-order delta and nabla differences for the forward and backward jump operators coincide with these second partial derivatives. Since $2n > 0$ and $n \sum x_i^2 - (\sum x_i)^2 > 0$ always (whenever the $x_i$ are not all equal), $L$ always has a minimum value.
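The second-order condition can be checked mechanically. A small sketch of this test (since $L$ is quadratic, unit-step second differences coincide with the second partial derivatives):

```python
def second_order_test(x):
    """Second-order test for a minimum of
    L(b0, b1) = sum (y_i - b0 - b1 x_i)^2."""
    n = len(x)
    sx = sum(x)
    sxx = sum(xi * xi for xi in x)
    L_00 = 2 * n          # d^2 L / d b0^2
    L_11 = 2 * sxx        # d^2 L / d b1^2
    L_01 = 2 * sx         # mixed second partial
    D = L_00 * L_11 - L_01 ** 2   # = 4 * (n*sxx - sx^2)
    return L_00 > 0 and D > 0

print(second_order_test([1, 2, 3, 4, 5]))  # True: L has a minimum
print(second_order_test([2, 2, 2]))        # False: x values all equal, D = 0
```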

#### 5. Results

In statistics, the least squares method is used with the usual derivative definition when the parameters of a regression equation are obtained. In this study, the time scale derivative definition has been applied to the least squares method. In this regard, the simple linear regression model (9), $y = \beta_0 + \beta_1 x + \varepsilon$, is considered, and both $\hat{\beta}_0$ and $\hat{\beta}_1$, the estimators of the coefficients $\beta_0$ and $\beta_1$, are obtained according to the forward and backward jump operators. Different $\hat{\beta}_0$ and $\hat{\beta}_1$ parameter values are obtained from the forward and backward jump operators for the same simple linear regression equation. When the observation values are discrete, the forward jump operator $\sigma(t) = t + 1$ and the backward jump operator $\rho(t) = t - 1$ are taken [31].

The study analyzes the integers according to the time scale derivative definition. The standard approach in regression analysis is to minimize the sum of squared (vertical) deviations between the actual and estimated values. The point to note is that, since the least squares method works with the sum of squared vertical deviations, the analysis also operates accordingly. The application of the first and second derivatives should also be taken into consideration, since the aim is to minimize the sum of squared vertical deviations. Since $L_{\beta_0 \beta_0} > 0$ and $L_{\beta_0 \beta_0} L_{\beta_1 \beta_1} - L_{\beta_0 \beta_1}^2 > 0$ always, $L$ always has a minimum value.

The least squares method yields results such that the sum of squared vertical deviations is minimized. When the least squares method is used according to the time scale derivative definition, a relationship emerges between the sample size $n$ and the sum of the vertical distances between the regression line and the observation values: the sum of the vertical distances between each regression line and the observation values equals half of the sample size. To minimize the sum of vertical distances, the different $\hat{\beta}_0$ and $\hat{\beta}_1$ parameter values resulting from the forward and backward jump operators for the simple linear regression equation have been of interest. An alternative solution is to construct the regression equation on a time scale. In conclusion, the time scale derivative definition has been applied to a study of the integers, and solutions are suggested for the results obtained.

#### 6. Discussion and Possible Future Studies

This study only introduces a very basic derivative concept from time scale calculus and applies it to obtaining the regression parameters. An extension of this study would be to determine a step (graininess) other than the value 1. This would be useful especially when the assumptions of the linear regression model are violated and a robust estimate of the regression line is needed. It may be hard to determine an optimum graininess analytically; we would suggest performing the study on generated data containing outliers, replicated at least 1000 times.

It is also possible to extend the number of regressors from a single explanatory variable to multiple ones in the linear regression model and to estimate the parameters on a time scale using both the forward and backward jump operators. Analytically, it will not be easy to estimate the parameters in a simple closed form on a time scale; in fact, we would be dealing with very complicated formulas or matrices. However, it may be worth extending the study to a multiple linear regression model.
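For reference, the ordinary (non-time-scale) multiple linear regression fit has a compact matrix solution, $\hat{\beta} = (X'X)^{-1} X' y$. A sketch with simulated data (this is standard OLS only; the time-scale analogues are left open here, as in the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Design matrix with an intercept column and two regressors.
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta = np.array([1.0, 2.0, -0.5])        # true coefficients
y = X @ beta + 0.1 * rng.normal(size=n)  # response with small noise

# Least squares solution of X beta = y.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta_hat, 2))  # close to [1.0, 2.0, -0.5]
```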

In the meantime, taking the graininess to be 1 results in the sum of the vertical distances between the regression lines and the observation values being equal to half of the sample size (see Table 1 and Figure 2). The graininess in both the forward and backward jump operators should be determined so that the regression lines estimated from both operators are fairly close to the actual regression line, especially when the assumptions of the linear regression model are violated.

#### Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.