In this section, using a Picard-type iteration, we discuss the existence and uniqueness of solutions to non-Lipschitz SDEs driven by fBm. Let \({X_{0}}( t) \equiv \xi\) be a random variable with \(\mathbb{E}{\vert \xi \vert ^{2}} < + \infty\), and construct a sequence of approximating stochastic processes \(\{ X_{k}(t)\}_{k \geq1}\) as follows:
$$\begin{aligned} {X_{k}}( t) = \xi + \int_{0}^{t} {b\bigl( {s,{X_{k - 1}}( s)} \bigr)}\,ds + \int _{0}^{t} {\sigma\bigl( {s,{X_{k - 1}}( s)} \bigr)} \,d^{\circ}{B^{H}}( s),\quad k = 1,2, \ldots. \end{aligned}$$
(3.1)
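To make the construction (3.1) concrete, the following minimal numerical sketch iterates the Picard scheme on a time grid; it is illustrative only and plays no role in the proofs. It assumes an Euler-type discretization of the ds-integral, a crude midpoint Riemann sum as a stand-in for the symmetric integral \(d^{\circ}{B^{H}}\), and fBm sampling by Cholesky factorization of the covariance; the coefficients `b`, `sigma`, the Hurst index `H`, and the grid sizes are placeholder choices.

```python
import numpy as np

def fbm_path(H, T, n, rng):
    """Sample fractional Brownian motion on a uniform grid via Cholesky
    factorization of the covariance R(s,t) = (s^{2H} + t^{2H} - |t-s|^{2H}) / 2."""
    t = np.linspace(T / n, T, n)
    s, u = np.meshgrid(t, t, indexing="ij")
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))   # tiny jitter for numerical stability
    return np.concatenate(([0.0], L @ rng.standard_normal(n)))  # B^H(0) = 0

def picard_iterates(b, sigma, xi, H=0.7, T=1.0, n_steps=200, n_iter=6, seed=0):
    """Return t and the iterates X_0, X_1, ..., X_{n_iter} of scheme (3.1) on one
    fixed fBm path; the symmetric d°B^H-integral is approximated by a midpoint sum."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, T, n_steps + 1)
    dB = np.diff(fbm_path(H, T, n_steps, rng))
    X = [np.full(n_steps + 1, float(xi))]             # X_0(t) ≡ ξ
    for _ in range(n_iter):
        prev = X[-1]
        cur = np.empty(n_steps + 1)
        cur[0] = xi
        acc = 0.0
        for i in range(n_steps):
            acc += b(t[i], prev[i]) * (t[i + 1] - t[i])                        # drift increment
            acc += sigma(0.5 * (t[i] + t[i + 1]),
                         0.5 * (prev[i] + prev[i + 1])) * dB[i]                # midpoint diffusion increment
            cur[i + 1] = xi + acc
        X.append(cur)
    return t, X

# Illustrative non-Lipschitz-type diffusion coefficient (placeholder choice):
t, X = picard_iterates(b=lambda s, x: -x, sigma=lambda s, x: 0.3 * abs(x) ** 0.8 + 0.1, xi=1.0)
print([float(np.max(np.abs(X[k + 1] - X[k]))) for k in range(len(X) - 1)])    # successive differences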
Hereafter, we assume without loss of generality that \(1 \le T < + \infty\).
First, we give the following four key lemmas. The proofs of Lemma 5 and Lemma 6 are presented in the Appendix.
Lemma 5
There exists a positive number
K such that, for all \(b( {t,\cdot}),\sigma( {t,\cdot}) \in{\mathcal{L}_{\varphi}}[ {0,T} ] \cap{\mathbb{D}^{1,2}}( {\vert \mathcal{H} \vert })\) and \(t \in[ {0,T} ]\), we have
$$\begin{aligned} \mathbb{E} {\bigl\vert {b( {t,X})} \bigr\vert ^{2}} + \mathbb{E} {\bigl\vert {\sigma( {t,X})} \bigr\vert ^{2}} + \mathbb{E} {\bigl\vert {D_{t}^{\varphi}\sigma( {t,X})} \bigr\vert ^{2}} \le K\bigl( {1 + \mathbb {E} {{\vert X \vert }^{2}}} \bigr). \end{aligned}$$
Lemma 6
Using the conclusion of Lemma
5, one obtains
$$\begin{aligned} \mathbb{E} {\bigl\vert {{X_{k}}( t)} \bigr\vert ^{2}} \le{C_{1}},\quad k = 1,2,\ldots,t \in[0,T], \end{aligned}$$
(3.2)
where
\({C_{1}} = 3( {1 + \mathbb{E}{{\vert \xi \vert }^{2}}})\exp( {12K{T^{2}}} )\).
Lemma 7
If
\(b(t,X)\)
and
\(\sigma(t, X)\)
satisfy Hypothesis
4, then for
\(0 \le s \le t \le T\), \(n \ge1\), \(k \geq1\), we have
$$\begin{aligned} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{n}}( s)} \bigr\vert ^{2}} \le{C_{2}} \int_{0}^{t} {\kappa\bigl( {\mathbb{E} {{\bigl\vert {{X_{n + k - 1}}( s_{1}) - X_{n - 1}( s_{1})}\bigr\vert }^{2}}} \bigr)}\,d{s_{1}} \end{aligned}$$
(3.3)
and
$$\begin{aligned} \sup _{0 \le s \le t} \mathbb{E} {\bigl\vert {{X_{n+k}}( s) -{X_{n}}( s)} \bigr\vert ^{2}} \le{C_{3}}t, \end{aligned}$$
where
\(C_{2}=8T\)
and
\(C_{3}\)
is a constant (one may take \(C_{3} = C_{2}\kappa( {4{C_{1}}})\)).
Proof
For \(0 \le s \le t\), using Lemma 3 and Hypothesis 4, we have
$$\begin{aligned}& \mathbb{E}\bigl\vert X_{n+k}( s) - X_{n}( s)\bigr\vert ^{2} \\& \quad \le2\mathbb{E}\biggl\vert \int_{0}^{s} \bigl(b\bigl( s_{1},X_{n+k - 1}(s_{1}) \bigr) - b\bigl( s_{1},X_{n - 1}(s_{1}) \bigr) \bigr)\,d{s_{1}} \biggr\vert ^{2} \\& \qquad {} + 2\mathbb{E}\biggl\vert \int_{0}^{s} \bigl(\sigma\bigl( s_{1},X_{n+k - 1}(s_{1}) \bigr) - \sigma\bigl( s_{1},X_{n - 1}(s_{1}) \bigr) \bigr) \,d^{\circ}{B^{H}}(s_{1}) \biggr\vert ^{2} \\& \quad \le8T\mathbb{E} \int_{0}^{t} \bigl[\bigl\vert b\bigl( s_{1},X_{n+k - 1}(s_{1}) \bigr) - b\bigl(s_{1},X_{n - 1}(s_{1})\bigr) \bigr\vert ^{2} \\& \qquad {} +\bigl\vert \sigma\bigl( s_{1},X_{n+k - 1}(s_{1}) \bigr) - \sigma\bigl( s_{1},X_{n -1}(s_{1})\bigr)\bigr\vert ^{2} \\& \qquad {}+\bigl\vert D_{s_{1}}^{\varphi}\bigl(\sigma \bigl( s_{1},X_{n+k - 1}(s_{1}) \bigr) - \sigma \bigl(s_{1},X_{n - 1}(s_{1})\bigr)\bigr)\bigr\vert ^{2}\bigr]\,d{s_{1}} \\& \quad \le{C_{2}} \int_{0}^{t} {\kappa\bigl( {\mathbb{E} {{\bigl\vert {{X_{n + k - 1}}( s_{1}) -X_{n - 1}( s_{1})} \bigr\vert }^{2}}} \bigr)}\,d{s_{1}}. \end{aligned}$$
Then, since Lemma 6 yields \(\mathbb{E} {\vert {{X_{n + k - 1}}( s) - {X_{n - 1}}( s)} \vert ^{2}} \le4{C_{1}}\) and κ is non-decreasing, it is easy to verify
$$\begin{aligned} \sup _{0 \le s \le t} \mathbb{E} {\bigl\vert {{X_{n+k}}( s) -{X_{n}}( s)} \bigr\vert ^{2}} \le& {C_{2}} \int_{0}^{t} {\kappa\bigl( {\mathbb{E} {{ \bigl\vert {{X_{n+ k - 1}}( s) - X_{n - 1}( s)} \bigr\vert }^{2}}} \bigr)}\,ds \\ \le& {C_{2}} \int_{0}^{t} {\kappa( {4{C_{1}}})}\,ds \le{C_{3}}t. \end{aligned}$$
This completes the proof of Lemma 7. □
Now choose \(0 < {T_{1}} \le T\) such that \({\kappa_{1}}( {{C_{3}}t}) \le{C_{3}}\) for all \(t \in[ {0,{T_{1}}} ]\), where \({\kappa_{1}}( q) = {C_{2}}\kappa( q)\). We note that, in what follows, we first prove the main result, Theorem 9, on the time interval \([0,{T_{1}}]\) and then extend it to the whole interval \([0,T]\). Fix \(k \geq1\) arbitrarily and define two sequences of functions \({\{ {{\phi_{n}}( t)} \}_{n = 1,2, \ldots}}\) and \({\{ {{{\tilde{\phi}}_{n,k}}( t)} \}_{n = 1,2, \ldots}}\) by
$$\begin{aligned}& {\phi_{1}}( t)= {C_{3}}t, \\& {\phi_{n + 1}}( t) = \int_{0}^{t} {{\kappa_{1}}\bigl( {{ \phi_{n}}( s)} \bigr)}\,ds, \\& {\tilde{\phi}_{n,k}}( t) = \sup _{0 \le s \le t} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{n}}( s)} \bigr\vert ^{2}},\quad n = 1,2, \ldots. \end{aligned}$$
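As an illustrative special case (not needed in the sequel), suppose κ reduces to the classical Lipschitz modulus \(\kappa( q) = q\), so that \({\kappa_{1}}( q) = {C_{2}}q\). Then the recursion above can be computed in closed form by induction,
$$\begin{aligned} {\phi_{n + 1}}( t) = \int_{0}^{t} {C_{2}} {\phi_{n}}( s)\,ds = \frac{{{C_{3}}C_{2}^{n}{t^{n + 1}}}}{{( {n + 1})!}},\qquad \frac{{{\phi_{n + 1}}( t)}}{{{\phi_{n}}( t)}} = \frac{{{C_{2}}t}}{{n + 1}} \le1\quad \mbox{for } t \le\frac{1}{{{C_{2}}}}, \end{aligned}$$
so \({\phi_{n}}( t) \downarrow0\) on \([ {0,1/{C_{2}}} ]\); this matches the choice of \(T_{1}\) above, since \({\kappa_{1}}( {{C_{3}}t}) \le{C_{3}}\) amounts to \(t \le1/{C_{2}}\) in this case.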
Lemma 8
Under Hypothesis
4,
$$\begin{aligned} 0 \le{\tilde{\phi}_{n,k}}( t) \le{\phi_{n}}( t) \le{\phi_{n - 1}}( t) \le \cdots \le{\phi_{1}}( t),\quad t \in[ {0,{T_{1}}} ], \end{aligned}$$
(3.4)
for every positive integer
n.
Proof
By Lemma 7, we have
$$\begin{aligned} {\tilde{\phi}_{1,k}}( t) = \sup _{0 \le s \le t} \mathbb{E} { \bigl\vert {{X_{1+k}}( s) - {X_{1}}( s)} \bigr\vert ^{2}} \le{C_{3}}t = {\phi _{1}}( t),\quad t \in[ {0,{T_{1}}} ]. \end{aligned}$$
Then, since \({\kappa_{1}}( q) = {C_{2}}\kappa( q)\) with \(\kappa( q)\) concave and non-decreasing, and since
$$\begin{aligned} \mathbb{E} {\bigl\vert {{X_{k + 1}}( s) - {X_{1}}( s)} \bigr\vert ^{2}} \le\sup _{0 \le s \le t} \mathbb{E} {\bigl\vert {{X_{k + 1}}( s) - {X_{1}}( s)}\bigr\vert ^{2}} = {\tilde{\phi}_{1,k}}( t),\quad 0 \le s \le t, \end{aligned}$$
it is easy to verify
$$\begin{aligned} {{\tilde{\phi}}_{2,k}}( t) =& \sup _{0 \le s \le t} \mathbb{E} {\bigl\vert {{X_{2 + k}}( s) - {X_{2}}( s)} \bigr\vert ^{2}} \\ \le& {C_{2}} \int_{0}^{t} {\kappa\bigl( {\mathbb{E} {{\bigl\vert {{X_{k + 1}}( s) -{X_{1}}( s)} \bigr\vert }^{2}}} \bigr)}\,ds \\ \le& \int_{0}^{t} {{\kappa_{1}}\bigl( {{{ \tilde{\phi}}_{1,k}}( s)} \bigr)}\,ds \le \int_{0}^{t} {{\kappa_{1}}\bigl( {{ \phi_{1}}( s)} \bigr)}\,ds \\ =& {\phi_{2}}( t) = \int_{0}^{t} {{\kappa_{1}}( {{C_{3}}s})}\,ds \\ \le& {C_{3}}t = { \phi_{1}}( t),\quad t \in[ {0,{T_{1}}} ]. \end{aligned}$$
That is to say, for \(n=2\), we have
$$\begin{aligned} {\tilde{\phi}_{2,k}}( t) \le{\phi_{2}}( t) \le{ \phi_{1}}( t),\quad t \in[ {0,{T_{1}}} ]. \end{aligned}$$
Next, assume that (3.4) holds for some \(n \geq2\). Since, by the induction hypothesis,
$$\begin{aligned} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{n}}( s)} \bigr\vert ^{2}} \le\sup _{0 \le s \le t} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{n}}( s)}\bigr\vert ^{2}} = {\tilde{\phi}_{n,k}}( t) \le{\phi_{n}}( t), \end{aligned}$$
it follows for \(n+1\) that
$$\begin{aligned} {{\tilde{\phi}}_{n + 1,k}}( t) =& \sup _{0 \le s \le t} \mathbb{E} {\bigl\vert {{X_{n + k + 1}}( s) - {X_{n + 1}}( s)} \bigr\vert ^{2}} \\ \le& \int_{0}^{t} {{\kappa_{1}}\bigl( { \mathbb{E} {{\bigl\vert {{X_{n + k}}( s) -{X_{n}}( s)} \bigr\vert }^{2}}} \bigr)}\,ds \\ \le& \int_{0}^{t} {{\kappa_{1}}\bigl( {{{ \tilde{\phi}}_{n,k}}( s)} \bigr)}\,ds \\ \le& \int_{0}^{t} {{\kappa_{1}}\bigl( {{ \phi_{n}}( s)} \bigr)}\,ds = {\phi_{n + 1}}( t) \\ \le& \int_{0}^{t} {{\kappa_{1}}\bigl( {{ \phi_{n - 1}}( s)} \bigr)}\,ds = {\phi _{n}}( t),\quad t \in[ {0,{T_{1}}} ]. \end{aligned}$$
This completes the proof of Lemma 8. □
Theorem 9
Under Hypothesis
4,
$$\begin{aligned} \lim _{n,i \to\infty} \sup _{0 \le t \le T} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. \end{aligned}$$
By Theorem 9, \(\{{X_{k}}(\cdot)\}_{k\geq1}\) is a Cauchy sequence in the mean-square sense, uniformly in \(t\in[0,T]\), and we denote its limit by \(X( \cdot)\). Letting \(k\to \infty\) in (3.1), we conclude that a solution to (1.1) exists.
Proof
Step 1: We first show that
$$\begin{aligned} \lim _{n,i \to\infty} \sup _{0 \le t \le{T_{1}}} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. \end{aligned}$$
By Lemma 8, \({\phi_{n}}( t)\) is non-increasing in n and non-negative on \([ {0,{T_{1}}} ]\). Therefore, we can define the limit function \(\phi( t)\) by \({\phi_{n}}( t) \downarrow\phi( t)\). It is easy to verify that \(\phi( 0) = 0\) and that \(\phi( t)\) is continuous on \([ {0,{T_{1}}} ]\) [35]. According to the definition of \({\phi_{n}}( t)\) and \({\phi}( t)\), we obtain
$$\begin{aligned} \phi( t) = \lim _{n \to\infty} {\phi_{n + 1}}( t) = \lim _{n \to\infty} \int_{0}^{t} {{\kappa_{1}}\bigl( {{ \phi_{n}}( s)} \bigr)}\,ds = \int_{0}^{t} {{\kappa_{1}}\bigl( {\phi( s)} \bigr)}\,ds,\quad t \in[ {0,{T_{1}}} ]. \end{aligned}$$
(3.5)
Since \(\phi( 0) = 0\) and
$$\begin{aligned} \int_{0 + } {\frac{{dq}}{{{\kappa_{1}}( q)}}} = \frac {1}{{{C_{2}}}} \int_{0 + } {\frac{{dq}}{{\kappa( q)}}} = + \infty, \end{aligned}$$
(3.5) implies \(\phi( t) \equiv0\), \(t\in[0,T_{1}]\).
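For completeness, we sketch why the Osgood-type condition forces \(\phi\equiv0\); the same reasoning is reused for (3.16) and (3.17) below. (A standard concrete example of an admissible modulus is \(\kappa( q) = q\log( {1/q})\) for small \(q > 0\), which is concave and non-decreasing near 0 and satisfies \(\int_{0 + } {dq/\kappa( q)} = + \infty\).) Assuming, as is standard, that κ is continuous, (3.5) gives \(\phi'( t) = {\kappa_{1}}( {\phi( t)})\), so \(\phi\) is strictly increasing wherever it is positive. If \(\phi( {{t_{1}}}) > 0\) for some \({t_{1}} \in( {0,{T_{1}}} ]\), set \({t^{*}} = \sup\{ {t < {t_{1}}:\phi( t) = 0} \}\); then \(\phi( {{t^{*}}}) = 0\), and for \({t^{*}} < \tau< {t_{1}}\) the change of variables \(q = \phi( s)\) yields
$$\begin{aligned} \int_{\phi( \tau)}^{\phi( {{t_{1}}})} {\frac{{dq}}{{{\kappa_{1}}( q)}}} = \int_{\tau}^{{t_{1}}} {\frac{{\phi'( s)}}{{{\kappa_{1}}( {\phi( s)})}}}\,ds = {t_{1}} - \tau\le{T_{1}}. \end{aligned}$$
Letting \(\tau \downarrow{t^{*}}\), the left-hand side tends to \(\int_{0 + }^{\phi( {{t_{1}}})} {dq/{\kappa_{1}}( q)} = + \infty\), a contradiction; hence \(\phi \equiv0\) on \([ {0,{T_{1}}} ]\).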
Therefore we obtain
$$\begin{aligned} 0 \le\lim _{k,n \to\infty} \sup _{0 \le t \le{T_{1}}} \mathbb{E} { \bigl\vert {{X_{n + k}}( t) - {X_{n}}( t)} \bigr\vert ^{2}} = \lim _{k,n \to\infty} {\tilde{\phi}_{n,k}}( {{T_{1}}}) \le\lim _{n \to\infty} {\phi_{n}}( {{T_{1}}}) = 0, \end{aligned}$$
namely,
$$\begin{aligned} \lim _{n,i \to\infty} \sup _{0 \le t \le{T_{1}}} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. \end{aligned}$$
Step 2: Define
$$\begin{aligned} T_{2} = \sup \Bigl\{ {\tilde{T} :\tilde{T} \in[ {0,T} ] \mbox{ and } \lim _{n,i \to\infty} \sup _{0 \le t \le\tilde{T}} \mathbb{E} \bigl\vert {X_{n}}( t) - {X_{i}}( t) \bigr\vert ^{2}= 0} \Bigr\} . \end{aligned}$$
Immediately, we can observe \(0 < {T_{1}} \le T_{2} \le T\). Now, we shall show
$$\begin{aligned} \lim _{n,i \to\infty} \sup _{0 \le t \le T_{2}} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. \end{aligned}$$
Let \(\varepsilon>0\) be arbitrary. Choose \(S_{0}\) with \(0 < {S_{0}} < \min( {T_{2},1})\) small enough that
$$\begin{aligned} {C_{4}} {S_{0}} < \frac{\varepsilon}{{10}}, \end{aligned}$$
(3.6)
where \({C_{4}} = 8K({1 + {K_{1}}( {1 + \mathbb{E}{{\vert \xi \vert }^{2}}} )}){S_{0}}\).
From the definition of \(T_{2}\), we have
$$\begin{aligned} \lim _{n,i \to\infty} \sup _{0 \le t \le T_{2} - {S_{0}}} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. \end{aligned}$$
Then, for large enough N, we observe
$$\begin{aligned} \sup _{0 \le t \le T_{2} - {S_{0}}} \mathbb{E} {\bigl\vert {{X_{n}}(t) - {X_{i}}( t)} \bigr\vert ^{2}} < \frac{\varepsilon}{{10}},\quad n,i \ge N. \end{aligned}$$
(3.7)
On the other hand, one can get
$$\begin{aligned} \sup _{T_{2} - {S_{0}} \le t \le T_{2}} \mathbb{E} {\bigl\vert {{X_{n}}( t) -{X_{i}}( t)} \bigr\vert ^{2}} \le& 3\sup _{T_{2} - {S_{0}} \le t \le T_{2}} \mathbb {E} {\bigl\vert {{X_{n}}( t) - {X_{n}}( {T_{2} - {S_{0}}})} \bigr\vert ^{2}} \\ &{}+3\mathbb{E} {\bigl\vert {{X_{n}}( {T_{2} - {S_{0}}}) - {X_{i}}( {T_{2} - {S_{0}}})}\bigr\vert ^{2}} \\ &{}+3\sup _{T_{2} - {S_{0}} \le t \le T_{2}} \mathbb{E} {\bigl\vert {{X_{i}}( {T_{2} - {S_{0}}}) - {X_{i}}( t)} \bigr\vert ^{2}} \\ =& 3{I_{1}} + 3{I_{2}} + 3{I_{3}}. \end{aligned}$$
(3.8)
Now, using Lemma 3, we obtain
$$\begin{aligned} {I_{1}} =& \sup _{T_{2} - {S_{0}} \le t \le T_{2}} \mathbb {E} {\bigl\vert {{X_{n}}( t) - {X_{n}}( {T_{2} - {S_{0}}})} \bigr\vert ^{2}} \\ \le& 2{S_{0}} \mathbb{E} \int_{T_{2} - {S_{0}}}^{T_{2}} {{{\bigl\vert {b\bigl( s_{1},X_{n -1}(s_{1}) \bigr)} \bigr\vert }^{2}}} \,d{s_{1}} \\ &{}+ 4H{S_{0}}^{2H - 1} \mathbb{E} \int_{T_{2} - {S_{0}}}^{T_{2}} {{{\bigl\vert {\sigma \bigl(s_{1},X_{n - 1}(s_{1}) \bigr)} \bigr\vert }^{2}}} \,d{s_{1}} \\ &{}+ 8{S_{0}}\mathbb{E} \int_{T_{2} - {S_{0}}}^{T_{2}} {{{\bigl\vert {D_{s_{1}}^{\varphi}\sigma\bigl( s_{1},X_{n - 1}(s_{1}) \bigr)} \bigr\vert }^{2}}} \,d{s_{1}} \\ \le& 8{S_{0}} \int_{T_{2} - {S_{0}}}^{T_{2}} {K\bigl({1 + {K_{1}}\bigl( {1 + \mathbb {E} {{\vert \xi \vert }^{2}}} \bigr)} \bigr)} \,d{s_{1}} \\ \le& 8S_{0}^{2}K\bigl( {1 + {K_{1}}\bigl( {1 + \mathbb{E} {{\vert \xi \vert }^{2}}} \bigr)} \bigr). \end{aligned}$$
Therefore by (3.6) we have
$$\begin{aligned} {I_{1}} \le\frac{\varepsilon}{{10}} \end{aligned}$$
(3.9)
and, by the same estimate applied to \(X_{i}\),
$$\begin{aligned} {I_{3}} \le\frac{\varepsilon}{{10}}. \end{aligned}$$
(3.10)
Meanwhile, (3.7) implies
$$\begin{aligned} {I_{2}} = \mathbb{E} {\bigl\vert {{X_{n}}( {T_{2} - {S_{0}}}) - {X_{i}}( {T_{2} - {S_{0}}})}\bigr\vert ^{2}} < \frac{\varepsilon}{{10}},\quad n,i \ge N. \end{aligned}$$
(3.11)
Now putting (3.7)-(3.11) together, we have
$$\begin{aligned} \sup _{0 \le t \le T_{2}} \mathbb{E} {\bigl\vert {{X_{n}}( t) -{X_{i}}( t)} \bigr\vert ^{2}} \le& \sup _{0 \le t \le T_{2} - {S_{0}}} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} \\ &{}+ \sup _{T_{2} - {S_{0}} \le t \le T_{2}} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} \\ \le& \frac{\varepsilon}{{10}} + 3{I_{1}} + 3{I_{2}} + 3{I_{3}} < \varepsilon. \end{aligned}$$
That is to say,
$$\lim _{n,i \to\infty} \sup _{0 \le t \le T_{2}} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. $$
Step 3: We show that \(T_{2}=T\) by contradiction. Suppose \(T_{2}< T\). Then we can choose a sequence of numbers \({\{ {{a_{i}}} \} _{i = 1,2, \ldots}}\) with \({a_{i}} \downarrow0\) (\({i \to + \infty }\)) such that, for \(n > i \ge1\),
$$\begin{aligned} \sup _{0 \le t \le T_{2}} \mathbb{E} {\bigl\vert {{X_{n}}( t) -{X_{i}}( t)} \bigr\vert ^{2}} \le{a_{i}}. \end{aligned}$$
(3.12)
We shall divide the step into several sub-steps.
First, for \(n > i \ge1\), we shall show
$$\begin{aligned} \sup _{T_{2} \le s \le T_{2} + t} \mathbb{E} {\bigl\vert {{X_{n}}( s) - {X_{i}}( s)} \bigr\vert ^{2}} \le3{a_{i}} + {C_{5}}t,\quad T_{2} + t \le T, \end{aligned}$$
(3.13)
where \({C_{5}} = 12TK({1 + {K_{1}}( {1 + \mathbb{E}{{\vert \xi \vert }^{2}}})})\).
To show this, set
$$\begin{aligned}& J_{1}^{( i)} = \mathbb{E} {\bigl\vert {{X_{n}}( {T_{2}}) - {X_{i}}( {T_{2}})} \bigr\vert ^{2}}, \\& J_{2}^{( i)}( t) = \sup _{T_{2} \le s \le T_{2} + t} \mathbb{E} {\biggl\vert { \int_{T_{2}}^{s} {\bigl( {b\bigl( s_{1},X_{n - 1}(s_{1}) \bigr) - b\bigl( {{s_{1}},{X_{i - 1}}(s_{1})} \bigr)} \bigr)\,d{s_{1}}} } \biggr\vert ^{2}}, \\& J_{3}^{( i)}( t)= \sup _{T_{2} \le s \le T_{2} + t} \mathbb{E} {\biggl\vert { \int_{T_{2}}^{s} {\bigl( {\sigma\bigl( s_{1},X_{n - 1}(s_{1}) \bigr) - \sigma\bigl( {{s_{1}},{X_{i - 1}}(s_{1})} \bigr)} \bigr)} \,d^{\circ}{B^{H}}(s_{1})} \biggr\vert ^{2}}. \end{aligned}$$
Then (3.12) implies \(J_{1}^{( i)} \le{a_{i}}\) and
$$\begin{aligned} J_{2}^{( i)}( t) + J_{3}^{( i)}( t) \le& 4T \mathbb{E} \int_{T_{2}}^{T_{2}+ t} \bigl[ \bigl\vert b \bigl(s_{1},X_{n - 1}(s_{1})\bigr) - b \bigl(s_{1},X_{i - 1}(s_{1})\bigr) \bigr\vert ^{2} \\ &{}+\bigl\vert \sigma\bigl(s_{1},X_{n - 1}(s_{1}) \bigr) - \sigma\bigl(s_{1},X_{i - 1}(s_{1})\bigr) \bigr\vert ^{2} \\ &{}+ \bigl\vert D_{s_{1}}^{\varphi}\bigl(\sigma\bigl(s_{1},X_{n - 1}(s_{1})\bigr) - \sigma\bigl(s_{1},X_{i -1}(s_{1})\bigr) \bigr)\bigr\vert ^{2}\bigr]\,ds_{1} \\ \le& 4TK\bigl(1 + {K_{1}}\bigl( 1 + \mathbb{E}\vert \xi \vert ^{2}\bigr) \bigr)t. \end{aligned}$$
Therefore
$$\begin{aligned} \sup _{T_{2} \le s \le T_{2} + t} \mathbb{E} {\bigl\vert {{X_{n}}( s) - {X_{i}}( s)} \bigr\vert ^{2}} \le& 3J_{1}^{( i)} + 3J_{2}^{( i)}( t) + 3J_{3}^{( i)}( t) \\ \le& 3{a_{i}} + {C_{5}}t,\quad T_{2} + t \le T. \end{aligned}$$
Next, we prove an assertion analogous to Lemma 8. To state it, we introduce some notation.
Choose a positive number \(0 < \eta \le T - T_{2}\) and a positive integer \(j \geq1\), so that
$$\begin{aligned} {C_{6}}\kappa( {3{a_{j}} + {C_{5}}t}) \le{C_{5}},\quad t \in[ {0,\eta} ], \end{aligned}$$
(3.14)
where \(C_{6}=12T\) and \({\kappa_{2}}( q) = {C_{6}}\kappa( q)\).
Introduce the sequences of functions \({\{ {{\psi_{k}}( t)} \}_{k = 1,2, \ldots}}\) and \({\{ {{{\tilde{\psi}}_{k,n}}( t)} \}_{k = 1,2, \ldots}}\) (with \(n > j\)), \(t \in[ {0,\eta} ]\), defined by
$$\begin{aligned}& {\psi_{1}}( t) = 3{a_{j}} + {C_{5}}t, \\ & { \psi_{k + 1}}( t)= 3{a_{j + k}} + \int_{0}^{t} {{\kappa_{2}}\bigl( {{\psi _{k}}( s)} \bigr)\,ds} , \\& {\tilde{\psi}_{k,n}}( t)= \sup _{T_{2} \le s \le T_{2} + t} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{j + k}}( s)} \bigr\vert ^{2}}. \end{aligned}$$
Now, the assertion to be proved is the following:
$$\begin{aligned} {\tilde{\psi}_{k,n}}( t) \le{\psi_{k}}( t) \le{\psi_{k - 1}}( t) \le \cdots \le{\psi_{1}}( t),\quad t \in[ {0, \eta} ], \end{aligned}$$
(3.15)
for every positive integer k.
Noticing that \({\kappa_{2}}( q)\) is a non-decreasing, concave function and that (3.13) holds, we obtain for \(k=1\)
$$\begin{aligned} {{\tilde{\psi}}_{1,n}}( t) =& \sup _{T_{2} \le s \le T_{2} + t} \mathbb{E} {\bigl\vert {{X_{n + 1}}( s) - {X_{j + 1}}( s)} \bigr\vert ^{2}} \\ \le& 3a_{j + 1} + {C_{6}} \mathbb{E} \int_{T_{2}}^{T_{2} + t} \bigl[ \bigl\vert b \bigl({s_{1}},X_{n}(s_{1})\bigr) - b\bigl( s_{1},X_{j}(s_{1}) \bigr) \bigr\vert ^{2} \\ & {}+ \bigl\vert \sigma\bigl( s_{1},X_{n}(s_{1}) \bigr) - \sigma\bigl(s_{1},X_{j}(s_{1})\bigr) \bigr\vert ^{2} \\ &{}+\bigl\vert D_{s_{1}}^{\varphi}\bigl( \sigma\bigl( s_{1},X_{n}(s_{1})\bigr) - \sigma\bigl(s_{1},X_{j}(s_{1})\bigr)\bigr)\bigr\vert ^{2}\bigr]\,d{s_{1}} \\ \le& 3{a_{j + 1}} + \int_{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( { \mathbb {E} {{\bigl\vert {{X_{n}}(s_{1}) - {X_{j}}(s_{1})} \bigr\vert }^{2}}} \bigr)\,d{s_{1}}} \\ \le& 3{a_{j}} + \int_{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( {3{a_{j}} + {C_{5}}( {{s_{1}} - T_{2}})} \bigr)\,d{s_{1}}} \le{\psi_{1}}( t),\quad t \in[ {0,\eta} ]. \end{aligned}$$
On the other hand, using (3.14) we arrive at
$$\begin{aligned} {{\tilde{\psi}}_{2,n}}( t) =& \sup _{T_{2} \le s \le T_{2} + t} \mathbb{E} {\bigl\vert {{X_{n + 2}}( s) - {X_{j + 2}}( s)} \bigr\vert ^{2}} \\ \le& 3{a_{j + 2}} + {C_{6}} \int_{T_{2}}^{T_{2} + t} {\kappa\bigl( {\mathbb {E} {{\bigl\vert {{X_{n + 1}}(s_{1}) - {X_{j + 1}}(s_{1})} \bigr\vert }^{2}}} \bigr)\,d{s_{1}}} \\ \le& 3{a_{j + 2}} + \int_{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( {{{ \tilde{\psi}}_{1,n}}( {{s_{1}} - T_{2}})} \bigr)\,d{s_{1}}} \\ \le& 3{a_{j + 1}} + \int_{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( {{ \psi_{1}}( {{s_{1}} - T_{2}})} \bigr)\,d{s_{1}}} = {\psi_{2}}( t) \\ \le& 3{a_{j}}+{C_{5}}t = {\psi_{1}}( t),\quad t \in[ {0,\eta} ]. \end{aligned}$$
Then we have proved
$$\begin{aligned} {\tilde{\psi}_{2,n}}( t) \le{\psi_{2}}( t) \le{ \psi_{1}}( t). \end{aligned}$$
Now assume that (3.15) holds for some \(k \geq2\). Then, by an analogous argument, one can obtain
$$\begin{aligned} {{\tilde{\psi}}_{k + 1,n}}( t) \le& 3{a_{j + k + 1}} + \int _{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( { \mathbb{E} {{\bigl\vert {{X_{n + k}}(s_{1})-{X_{j + k}}(s_{1})} \bigr\vert }^{2}}} \bigr)\,d{s_{1}}} \\ \le& 3{a_{j + k + 1}} + \int_{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( {{{ \tilde{\psi}}_{k,n}}( {{s_{1}} - T_{2}})} \bigr)\,d{s_{1}}} \\ \le& 3{a_{j + k}} + \int_{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( {{\psi _{k}}( {{s_{1}} - T_{2}})} \bigr)\,d{s_{1}}} = { \psi_{k + 1}}( t) \\ \le& 3{a_{j + k - 1}} + \int_{T_{2}}^{T_{2} + t} {{\kappa_{2}}\bigl( {{\psi _{k - 1}}( {{s_{1}} - T_{2}})} \bigr)\,d{s_{1}}} \\ =& { \psi_{k}}( t),\quad t \in[ {0,\eta} ]. \end{aligned}$$
Therefore, (3.15) holds for all k. In view of (3.15), we can define the function \(\psi( t)\) by \({\psi_{k}}( t) \downarrow\psi( t)\) (\({k \to\infty}\)). We observe that
$$\begin{aligned} \psi( 0) =& \lim _{k \to\infty} {\psi_{k + 1}}( 0) \\ =& \lim _{k \to\infty} 3{a_{j + k}} = 0. \end{aligned}$$
It is easy to verify that \(\psi( t)\) is a continuous function on \([ {0,\eta} ]\). Now by the definition of \({\psi_{k + 1}}( t)\) and \(\psi( t)\), we have
$$\begin{aligned} \psi( t) =& \lim _{k \to\infty} {\psi_{k + 1}}( t) \\ =& \lim _{k \to\infty} \biggl[ {3{a_{j + k}} + \int_{0}^{t} {{\kappa_{2}}\bigl( {{ \psi_{k}}( s)} \bigr)\,ds} } \biggr] \\ =& \int_{0}^{t} {{\kappa_{2}}\bigl( {\psi( s)} \bigr)\,ds} . \end{aligned}$$
(3.16)
Since \(\psi( 0) = 0\) and
$$\begin{aligned} \int_{0 + } {\frac{{dq}}{{{\kappa_{2}}( q)}}} = \frac {1}{{{C_{6}}}} \int_{0 + } {\frac{{dq}}{{\kappa( q)}}} = + \infty, \end{aligned}$$
(3.16) implies \(\psi( t) = 0\), \(t \in[ {0,\eta} ]\).
Therefore, for \(n > j\), we obtain
$$\begin{aligned} \lim _{k \to\infty} \sup _{0 \le s \le T_{2} + \eta} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{j + k}}( s)}\bigr\vert ^{2}} \le& \lim _{k \to\infty} \sup _{0 \le s \le T_{2}} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{j + k}}( s)}\bigr\vert ^{2}} \\ &{}+ \lim _{k \to\infty} \sup _{T_{2} \le s \le T_{2} + \eta} \mathbb{E} {\bigl\vert {{X_{n + k}}( s) - {X_{j +k}}( s)} \bigr\vert ^{2}} \\ \le& \lim _{k \to\infty} {a_{j + k}} + \lim _{k \to\infty} {{\tilde{\psi}}_{k,n}}( \eta) \\ \le& \lim _{k \to\infty} {a_{j + k}} + \lim _{k \to\infty} {\psi_{k}}( \eta) = \psi( \eta) = 0, \end{aligned}$$
namely
$$\begin{aligned} \lim _{n,i \to\infty} \sup _{0 \le t \le T_{2} + \eta} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. \end{aligned}$$
But this contradicts the definition of \(T_{2}\). Hence \(T_{2}=T\), and we have shown that
$$\begin{aligned} \lim _{n,i \to\infty} \sup _{0 \le t \le T} \mathbb{E} {\bigl\vert {{X_{n}}( t) - {X_{i}}( t)} \bigr\vert ^{2}} = 0. \end{aligned}$$
This completes the proof of Theorem 9, and hence of the existence of solutions to the SDE (1.1). □
Theorem 10
Under Hypothesis
4, the path-wise uniqueness holds for (1.1), \(t\in[0,T]\).
Proof
Let \(X( t)\) and \(\tilde{X}( t)\) be two solutions of (1.1) on the same probability space with \(X( 0) = \tilde{X}( 0)\). We observe that
$$\begin{aligned}& \mathbb{E} {\bigl\vert {X( t) - \tilde{X}( t)} \bigr\vert ^{2}} \\& \quad = \mathbb{E} {\biggl\vert { \int_{0}^{t} {\bigl( {b\bigl( {s,X( s)} \bigr) - b \bigl( {s,\tilde{X}( s)} \bigr)} \bigr)\,ds} + \int_{0}^{t} {\bigl( {\sigma\bigl( {s,X( s)} \bigr) - \sigma\bigl( {s,\tilde{X}( s)} \bigr)} \bigr)} \,d^{\circ}{B^{H}}( s)} \biggr\vert ^{2}} \\& \quad \le2\mathbb{E} {\biggl\vert { \int_{0}^{t} {\bigl({b\bigl( {s,X( s)} \bigr) - b \bigl( {s,\tilde{X}( s)} \bigr)} \bigr)\,ds} } \biggr\vert ^{2}} + 2 \mathbb{E} {\biggl\vert { \int_{0}^{t} {\bigl( {\sigma\bigl( {s,X( s)} \bigr) - \sigma\bigl( {s,\tilde{X}( s)} \bigr)} \bigr)} \,d^{\circ}{B^{H}}( s)} \biggr\vert ^{2}} \\& \quad \le8T\mathbb{E} \int_{0}^{t} \bigl[\bigl\vert b\bigl( s, X( s) \bigr) - b\bigl( s,\tilde{X}( s) \bigr)\bigr\vert ^{2} + \bigl\vert \sigma\bigl( s,X( s) \bigr) - \sigma\bigl( s,\tilde{X}( s)\bigr)\bigr\vert ^{2} \\& \qquad {} + \bigl\vert D_{s}^{\varphi}\bigl( \sigma \bigl( s,X( s) \bigr) - \sigma\bigl( s,\tilde{X}( s) \bigr) \bigr) \bigr\vert ^{2}\bigr]\,ds. \end{aligned}$$
Combining the above inequality with Hypothesis 4, one has
$$\begin{aligned} \mathbb{E} {\bigl\vert {X( t) - \tilde{X}( t)} \bigr\vert ^{2}} \le8T \int_{0}^{t} {\kappa\bigl( {\mathbb{E} {{\bigl\vert {X( s) - \tilde{X}( s)} \bigr\vert }^{2}}} \bigr)}\,ds. \end{aligned}$$
(3.17)
Then, noticing that \(\int_{0 + } {\frac{{dq}}{{\kappa( q)}}} = + \infty\) and arguing as for (3.5), inequality (3.17) implies
$$\begin{aligned} \mathbb{E} {\bigl\vert {X( t) - \tilde{X}( t)} \bigr\vert ^{2}} = 0,\quad t\in[0,T]. \end{aligned}$$
Since \(T\) is an arbitrary finite horizon, we obtain \(X( t) \equiv\tilde{X}( t)\) for all \(0\le t \le T\).
Thus the path-wise uniqueness holds for (1.1). □