Journal of Applied Mathematics


Erratum | Open Access

Volume 2014 | Article ID 321932 | 1 page | https://doi.org/10.1155/2014/321932

Erratum to “Asymptotic Behavior of the Likelihood Function of Covariance Matrices of Spatial Gaussian Processes”

Received 12 May 2014
Accepted 26 Jun 2014
Published 06 Jul 2014

In the article titled “Asymptotic Behavior of the Likelihood Function of Covariance Matrices of Spatial Gaussian Processes,” some errors occurred and should be corrected as follows.

(i) Clarification: Equation (2.11) reflects the assumption of a stationary covariance structure, which is the standard setting for Kriging.

(ii) The sentence above (3.1) should read “For the matrix norm induced by the Euclidean vector norm and a symmetric matrix , one can show that….”

(iii) Clarification: the result of Theorem 3.1 holds along sequences , , along which the directional derivatives of the eigenvalues in direction do not vanish; that is, , . The vector serves here and in the following as a placeholder but may, without loss of generality, be replaced throughout by any other fixed direction with the above property. This comment applies accordingly to all subsequent results in the paper. In general, the eigenvalue decomposition is guaranteed to be differentiable only for correlation models that are real analytic in ; see [1, Sections 7.2 and 7.7]. The method of Van Der Aa et al., referenced as [22] in the original paper, computes eigenvector derivatives for multiple eigenvalues, provided that, for some order , the th-order derivatives of the eigenvalues are mutually distinct.

(iv) The third sentence in Remark 3.2 (1) should read “In Appendix , a relationship between condition 2 and the regularity of is established, giving strong support that condition 2 is generally valid if is regular.” Addendum: the conditions of Theorem 3.1 do not apply to the Gaussian correlation model, since there the first-order derivative vanishes.

(v) Formally speaking, the proof given for Lemma 3.4 applies only if . However, the case , which corresponds to constant regression, may be treated in a similar and in fact more straightforward manner, since introducing the auxiliary matrix , as in (3.7), becomes unnecessary.

(vi) The first sentence on page 9 should read “It holds that ; see the proof of Lemma 3.3.”

(vii) The first sentence in Remark 3.5 should read “Actually, one cannot prove for to be of full rank in general, since .”

(viii) In the appendix, a minus sign is missing in the vector defined just before (). Its nonzero components should feature alternating signs. The nonzero entries should appear in the reverse order of the entries in the arrow matrix in (), so that is orthogonal to the arrow matrix’s first row.
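For orientation on item (iii), the non-vanishing condition can be illustrated by the standard first-order perturbation formula for eigenvalues of a symmetric matrix family. The notation below is ours, chosen for illustration only, and is not taken from the original paper:

```latex
% Illustrative sketch (notation assumed, not the paper's): let R(t) be a
% smooth family of symmetric matrices with a simple eigenvalue \lambda_i(t)
% and associated unit eigenvector v_i(t). The classical perturbation result
% gives the derivative of the eigenvalue as
\[
  \frac{d\lambda_i}{dt}(t)
  \;=\;
  v_i(t)^{\mathsf{T}}\,\frac{dR}{dt}(t)\,v_i(t).
\]
% The condition in item (iii) amounts to requiring that this quantity be
% nonzero along the chosen direction; for multiple eigenvalues one must
% instead resort to higher-order derivatives, as in the method of
% Van Der Aa et al. cited there.
```
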

References

  1. D. Alekseevsky, A. Kriegl, P. W. Michor, and M. Losik, “Choosing roots of polynomials smoothly,” Israel Journal of Mathematics, vol. 105, no. 1, pp. 203–233, 1998.

Copyright © 2014 Ralf Zimmermann. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
