One of the central themes of statistical theory and practice is the quality of goodness-of-fit tests. The problem of constructing goodness-of-fit tests in the i.i.d. case is well studied. To set up a test which allows, depending on the data, accepting or rejecting the hypothesis to be tested against a given alternative, a nonparametric study of hypothesis tests is required; a typical example is the goodness-of-fit test, and other examples important for applications are the tests of symmetry, independence and homogeneity. Many authors have worked in this area, mainly within the minimax approach, which is considered in nonparametric statistics a good framework for assessing the performance of an estimator.
In classical mathematical statistics, the Chi-square, Kolmogorov-Smirnov and Cramér-von Mises tests have been intensely studied, and the Kolmogorov-Smirnov and Cramér-von Mises goodness-of-fit tests were shown to be asymptotically distribution free (i.e., their limit distributions under the null hypothesis do not depend on the hypothesized distribution).
Ingster and Kutoyants (2007) recently studied tests of nonparametric hypotheses for the intensity of an inhomogeneous Poisson process. Their study is an extension to Poisson processes of Ingster's earlier work on nonparametric tests for Gaussian white noise models with a noise level tending to 0. Dachian and Kutoyants (2007) presented a review of several results concerning the construction of Kolmogorov-Smirnov-type and Cramér-von Mises-type goodness-of-fit tests for continuous-time processes; as models, they considered a small-noise stochastic differential equation, an ergodic diffusion process, a Poisson process, and self-exciting point processes. Dabye (2013) considered the shift parameter model and the shift and scale parameter model, and showed that the Cramér-von Mises test is asymptotically distribution free, respectively asymptotically partially distribution free, and consistent. For each model, the proposed tests attain the asymptotic size, and the form of the power function under local alternatives is described.
In applications, the hypotheses to be tested are often of a more complex nature. The first works on the problem of goodness-of-fit testing of composite hypotheses in classical statistics are due to Durbin, who proposed to test composite hypotheses in the case where the distribution function under the hypothesis to be tested depends on a multidimensional unknown parameter. The null hypothesis then becomes composite, i.e. it does not determine the distribution of the sample uniquely. When the parameters are estimated, neither the Kolmogorov-Smirnov test nor the Cramér-von Mises test remains asymptotically distribution free.
It follows that the critical values change from one null hypothesis to another: different values of the parameter yield different critical values, often within the same parametric family. The distribution free property is therefore crucial in applications, since it allows the critical values to be calculated only once for any distribution under the hypothesis to be tested. To work around this problem, the split sample method was suggested. Durbin's problem also admits a solution via a martingale transformation of the parametric empirical process, proposed by Khmaladze (1981).
The martingale approach of Khmaladze allows building asymptotically distribution free hypothesis tests. This approach has been used by various authors, including in regression models. We use a similar approach to construct, in this article, asymptotically distribution free and consistent goodness-of-fit tests of Kolmogorov-Smirnov type.
We consider the same model as in Dabye (2013). Dealing with the intensity of a Poisson process, we consider a model depending on an unknown translation parameter, with a composite parametric null hypothesis, and show that the Kolmogorov-Smirnov type test is asymptotically parameter free.
2. Statement of the Problem and Auxiliary Results
Suppose that we observe n independent inhomogeneous Poisson processes, where the observations are trajectories of Poisson processes with a common mean function. The corresponding intensity function is the derivative of this mean function.
Let us recall the construction of a goodness-of-fit (GoF) test of Kolmogorov-Smirnov type in the case of a simple null hypothesis. The class of tests of asymptotic size is
Suppose that the basic hypothesis is simple, i.e. the mean function equals a known function which is continuous and differentiable. The alternative is composite (nonparametric). Then we can introduce the Kolmogorov-Smirnov (K-S) type statistic
where the centering is the empirical mean of the observed Poisson processes. It can be verified that, under the null hypothesis, this statistic converges to the following limit:
where the limit is expressed through a standard Wiener process. Therefore the K-S type test, with the threshold defined by the equation
belongs to the class introduced above. This test is asymptotically distribution free (ADF). Recall that a test is called ADF if the limit distribution of the test statistic under the hypothesis does not depend on the mean function.
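As an illustration (not part of the original exposition): assuming the limit of the K-S type statistic is the supremum of the absolute value of a standard Wiener process on [0, 1], as described above, the threshold solving the size equation can be approximated by a simple Monte Carlo sketch. All names here are illustrative.

```python
import numpy as np

# Monte Carlo sketch: approximate the threshold c_alpha solving
# P( sup_{0<=s<=1} |W(s)| > c_alpha ) = alpha
# for a standard Wiener process W, assuming this is the limit law above.
rng = np.random.default_rng(0)
alpha = 0.05
n_paths, n_steps = 5000, 1000

# simulate Wiener paths on [0, 1] via cumulative sums of Gaussian increments
dW = rng.normal(0.0, np.sqrt(1.0 / n_steps), size=(n_paths, n_steps))
sup_abs = np.abs(np.cumsum(dW, axis=1)).max(axis=1)

# empirical (1 - alpha)-quantile of sup |W| over the simulated paths
c_alpha = float(np.quantile(sup_abs, 1.0 - alpha))
```

The exact 95% point of sup |W| on [0, 1] is about 2.24; the discretized supremum slightly underestimates it.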
Let us consider the case of the parametric null hypothesis. It can be formulated as follows. We have to test the null hypothesis
against the alternative . Here is a known mean function of the Poisson process depending on some finite-dimensional unknown parameter . Note that under there exists the true value such that the mean of the observed Poisson process .
The K-S type GoF test can be constructed in a similar way. Introduce the normalized process, where the plugged-in value is the maximum likelihood estimator of the unknown parameter, which is (under the null hypothesis) consistent and asymptotically normal.
Therefore, if we propose a goodness-of-fit test based on this statistic, then to find the threshold we have to solve the corresponding size equation. The goal of this work is to show that when the unknown parameter is a shift parameter, it is possible to construct a test statistic whose limit distribution does not depend on it. The test will moreover be uniformly consistent against another class of alternatives
Here is some given number.
The mean function under null hypothesis is
the proposed test statistic is
We show that the distribution of the limiting random variable does not depend on the unknown parameter. Recall that the underlying function is known, and therefore the threshold can be calculated before the experiment using, say, numerical simulations.
We are given n independent observations of inhomogeneous Poisson processes with the mean function . We have to construct a GoF test in the hypothesis testing problem with parametric null hypothesis . More precisely, we suppose that under , the mean function is absolutely continuous: . Here is the true value, and the intensity function is . The set . Therefore if we denote , then the mean function under null hypothesis is .
It is convenient to use two different functions and and we hope that such notation will not be misleading.
Therefore, we have the parametric null hypothesis
where the parametric family is
Here is a known absolutely continuous function with properties: .
In this work, we denote by the derivative with respect to of any function .
We consider the class of tests of asymptotic level :
The test studied in this work is based on the following statistic of K-S type:
where the substituted value is the MLE.
As we use the asymptotic properties of the MLE , we need some regularity conditions.
● The function is strictly positive and three times continuously differentiable.
● Its derivatives belong to . The Fisher information
does not depend on .
● The derivative .
● For any we have
Here the norm is the usual one.
Note that, by these conditions, the MLE is consistent, asymptotically normal
and the moments converge: for any
Moreover, it admits the representation (see , Theorem 3.1, page 101)
where . For the proofs see .
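To make the role of the MLE concrete, here is a minimal numerical sketch, not the paper's model: a hypothetical shift family of intensities 2 + cos(t - theta) on [0, T], with the MLE approximated by a grid search over the pooled log-likelihood. The intensity, the sample sizes and all names are assumptions for illustration only.

```python
import numpy as np

# Illustrative sketch: MLE of a shift parameter theta for n i.i.d.
# inhomogeneous Poisson processes on [0, T] with hypothetical
# intensity 2 + cos(t - theta), estimated by a grid search.
rng = np.random.default_rng(0)
T, n, theta0 = 10.0, 200, 1.0

def lam(t, theta):
    return 2.0 + np.cos(t - theta)  # strictly positive shift family

def simulate_one(theta):
    # thinning with the dominating constant intensity 3
    m = rng.poisson(3.0 * T)
    t = np.sort(rng.uniform(0.0, T, m))
    return t[rng.uniform(0.0, 3.0, m) < lam(t, theta)]

events = np.concatenate([simulate_one(theta0) for _ in range(n)])

def loglik(theta):
    # closed-form integral: int_0^T lam dt = 2T + sin(T - theta) + sin(theta)
    integral = 2.0 * T + np.sin(T - theta) + np.sin(theta)
    return np.sum(np.log(lam(events, theta))) - n * integral

grid = np.linspace(0.0, 2.0, 2001)
theta_hat = float(grid[np.argmax([loglik(th) for th in grid])])
```

Under the regularity conditions listed above, the grid maximizer approximates the MLE, which concentrates near the true shift at the usual root-n rate.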
3. Main Result
Let us introduce the following random variable
where is a standard Wiener process.
The main result of this work is the following theorem.
Theorem 3.1. Let the conditions be fulfilled. Then, the test
belongs to the class
Let us consider n independent observations of inhomogeneous Poisson processes .
We have to show that .
where we put .
The parametric empirical process is defined by
Since the function is differentiable, according to the formula of finite increments (mean value theorem), we have:
where is an intermediate point between and .
According to (3.2), we have the representation
where the last term is the remainder.
Let us put , and denote by the true value. Then relation (2.1) becomes
and we have
where we have set the corresponding notation. Since the plugged-in quantity is itself an estimator of the true value, it converges to it, and the remainder term converges in probability to 0. With these considerations, we can rewrite the process as follows
Furthermore, we put
The intensity function is strictly positive. It was therefore shown that the process is asymptotically (in the sense of weak convergence) the composition of a Brownian motion with a deterministic time change. In other words, the process converges weakly to this limit in the corresponding function space.
We introduce the stochastic process
It is easy to see that, after a change of variables in the integrals, we obtain the following equality
The proof of the theorem is based on the following fundamental lemma.
Lemma 3.2. Let the conditions be satisfied. Then the process converges weakly in the corresponding space to the limit process. Since the supremum is a continuous functional in the sense of the Skorohod distance, the test statistic converges weakly to the limiting random variable. In other words, we have
To prove Lemma 3.2, we need the following lemmas.
Lemma 3.3. Let the conditions be satisfied. Then the following convergence holds
Proof of Lemma 3.3. For this, we need two relations
Indeed, for the first relation, since the consistent estimator converges to the true value and is a continuous function for all , then converges in probability to for all . Hence
Furthermore by the condition , the function is also bounded. Hence, we can easily obtain the relation (3.8).
Further, for the second relation, we have
Recall the two relations above; therefore
which gives the proof of relation (3.9).
Now we can evaluate the difference .
Since is a uniformly consistent estimator of on , then .
Further, relation (3.9) yields
The conditions imposed on the function imply the two remaining convergences.
Therefore Lemma 3.3 is proved.
Lemma 3.4. Let the conditions be satisfied. Then the finite-dimensional distributions of the process converge to those of the limit process as the sample size tends to infinity.
Proof of Lemma 3.4. The proof is based on the central limit theorem for stochastic integrals (see, e.g., Kutoyants, Theorem 1.1). We follow the proof of this theorem. In particular, we obtain, as the sample size tends to infinity, the convergence of the characteristic function to the characteristic function of the limit process.
They are defined as follows
Indeed, we have
where we put .
On the other hand, we have
Taking into account the expression (3.12) and (3.13), we have the representation of
Thus, we can calculate the characteristic function as follows
By the Taylor formula
we have as
This last expression (3.16) is equivalent to:
which is the characteristic function defined in (3.11).
Therefore, we have the convergence of the one-dimensional distributions. In the general case, the verification of the convergence is entirely similar.
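For the reader's convenience, the standard computation behind this characteristic-function argument can be sketched as follows; here $f$ is a generic bounded function, $\pi(t)=X(t)-\Lambda(t)$ a centered Poisson process with intensity $\lambda$, and $\mu$ a real argument. The notation is generic rather than the paper's.

```latex
% Characteristic function of a centered Poisson stochastic integral:
\mathbf{E}\exp\Big\{i\mu\int_0^{\tau} f(t)\,\mathrm{d}\pi(t)\Big\}
  = \exp\Big\{\int_0^{\tau}\big(e^{i\mu f(t)}-1-i\mu f(t)\big)\,\lambda(t)\,\mathrm{d}t\Big\}.
% Replace f by f/\sqrt{n}, take n independent copies, and expand
% e^{ix}-1-ix = -x^2/2 + O(|x|^3):
\exp\Big\{n\int_0^{\tau}\Big(e^{i\mu f(t)/\sqrt{n}}-1-\tfrac{i\mu f(t)}{\sqrt{n}}\Big)\lambda(t)\,\mathrm{d}t\Big\}
  \;\longrightarrow\;
  \exp\Big\{-\frac{\mu^2}{2}\int_0^{\tau} f(t)^2\,\lambda(t)\,\mathrm{d}t\Big\},
  \qquad n\to\infty,
```

which is the characteristic function of a centered Gaussian limit, in line with the convergence of the one-dimensional distributions stated above.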
Lemma 3.5. For any , and for any , we have
Proof of Lemma 3.5. For any , and for any (say ), we have
Note that the two lemmas above are not sufficient to establish the weak convergence of the process in the corresponding space, nor the convergence of the test statistic. However, since the increments of the process are independent, the convergence of the process on finite intervals (that is, convergence in the Skorohod space of functions without discontinuities of the second kind) follows from (Theorem 6.5.5 of the cited reference), that is, from Lemma 3.4 and the following lemma.
Lemma 3.6. For any , we have
Proof of Lemma 3.6. We must show that
In fact, by the Bienaymé-Chebyshev inequality we have:
This proves Lemma 3.6, and therefore Lemma 3.2 is proved.
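For reference, the two generic facts used in this Chebyshev-type bound are the variance identity (isometry) for a centered Poisson stochastic integral and the Bienaymé-Chebyshev inequality; the notation here is generic, not the paper's.

```latex
% Variance (isometry) for a centered Poisson integral, \pi = X - \Lambda:
\mathbf{E}\Big(\int_{t_1}^{t_2} f(t)\,\mathrm{d}\pi(t)\Big)^{2}
  = \int_{t_1}^{t_2} f(t)^{2}\,\lambda(t)\,\mathrm{d}t,
% and, for a random variable \eta with finite second moment,
\mathbf{P}\big(|\eta| > \varepsilon\big)
  \le \varepsilon^{-2}\,\mathbf{E}\,\eta^{2}.
```

Combining the two, the probability of a large increment of the process is controlled by the integral of the squared integrand against the intensity.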
So, the last ingredient of the proof of Theorem 3.1 is the following estimate on the tails of the process .
Lemma 3.7. Let the conditions be satisfied. For any , there exist and such that for all , we have
Proof of Lemma 3.7. We have
For the first term we have
Direct calculation allows verifying that
where the constant does not depend on n. Hence
For the second term of (3.18), in a similar manner, we obtain the bound
These bounds allow us to conclude that, with suitable constants, we obtain the estimate (3.17)
Proposition 3.8. Let the conditions be satisfied. Then the test
is consistent under the alternatives, that is:
and it is uniformly consistent under alternatives , that is:
Proof of Proposition 3.8. Under the alternative hypothesis, the power is
We can write
where we have put
Therefore the Kolmogorov-Smirnov type test is consistent against this alternative. The proof presented above also allows verifying the uniform consistency of the test against the second class of alternatives.
Indeed we have
Proposition 3.8 is thus proved.
This work is devoted to the Kolmogorov-Smirnov test in the case of observations of inhomogeneous Poisson processes. The main results are obtained in the situation where, under the null hypothesis, the intensity functions of the observed processes depend on an unknown parameter.
As the GoF test studied in this work is mainly based on the maximum likelihood estimator (MLE), we present the asymptotic properties of the MLE in the large-sample asymptotics. Conditions for consistency and asymptotic normality are given.
We have studied the Kolmogorov-Smirnov test for inhomogeneous Poisson processes with a parametric null hypothesis. The unknown parameter is the translation parameter. The construction of the test is based on the MLE of this parameter and the main result is that due to the structure of the statistics the substitution of the estimator instead of the unknown parameter leads to the limit of the test statistic with distribution which does not depend on the unknown parameter.
In this work, we construct the Kolmogorov-Smirnov GoF test based on the sup-metric in the case of the translation parameter. It is natural to ask what happens with other metrics, for example those of L2 (Cramér-von Mises) type.
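As a small illustration of this question (not from the paper), a sup-type and an L2-type discrepancy can be computed from the same simulated data; the homogeneous intensity, the normalization of the L2 statistic, and all names below are hypothetical choices for the sketch.

```python
import numpy as np

# Illustrative comparison of a sup-type (K-S) and an L2-type
# (Cramer-von Mises flavor) discrepancy on the same simulated data.
rng = np.random.default_rng(1)
T, n, rate = 5.0, 400, 2.0
Lam = lambda t: rate * t  # hypothesized mean function

# pooled event times of n independent Poisson processes with intensity `rate`
events = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T * n)))

grid = np.linspace(0.0, T, 1001)
Lam_hat = np.searchsorted(events, grid) / n  # empirical mean function

# normalized deviation process, as in the K-S construction
v = np.sqrt(n) * (Lam_hat - Lam(grid)) / np.sqrt(Lam(T))
D_sup = float(np.abs(v).max())                    # sup-metric statistic
D_L2 = float(np.sum(v**2) * (grid[1] - grid[0]))  # unnormalized L2 analogue
```

The sup statistic leads to the Kolmogorov-Smirnov type test of this paper; the L2 analogue points toward the Cramér-von Mises type tests mentioned in the introduction.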
 Mann, H.B. and Wald, A. (1942) On the Choice of the Number of Class Intervals in the Application of Chi-Square Test. Annals of Mathematical Statistics, 13, 306-317.
 Ingster, Yu.I. and Kutoyants, Yu.A. (2007) Nonparametric Hypothesis Testing for an Intensity of Poisson Process. Mathematical Methods of Statistics, 16, 217-245.
 Dachian, S. and Kutoyants, Yu.A. (2007) On the Goodness-of-Fit Tests for Some Continuous Time Processes. In: Vonta, F., Nikulin, M., Limnios, N. and Huber-Carol, C., Eds., Statistical Models and Methods for Biomedical and Technical Systems, Birkhäuser, Boston, 395-413.
 Dabye, A.S. (2013) On the Cramér-von Mises Test with Parametric Hypothesis for Poisson Processes. Statistical Inference for Stochastic Processes, 16, 1-13.
 Khmaladze, E. (1981) Martingale Approach in the Theory of Goodness-of-Fit Tests. Theory of Probability and Its Applications, 26, 240-257. (Translated by A.B. Aries)