We begin by discussing the measurability of set-valued maps and then introduce the definition of an interval random variable. The basic definitions and further details can be found in [7]. A measurable space (\mathrm{\Omega},\mathcal{A}) consists of a basic set Ω together with a σ-algebra \mathcal{A} of subsets of Ω, called the measurable sets. Here we consider set-valued maps F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} with closed convex values, i.e., F(\omega ) is a closed convex subset of {\mathbb{R}}^{k} for each \omega \in \mathrm{\Omega}. This is the case when F is interval-valued, which means that for each \omega \in \mathrm{\Omega}, the components of F(\omega ) are closed intervals in ℝ.
We first define what it means for a set-valued map to be measurable. Recall that the inverse image of a set S\subset {\mathbb{R}}^{k} under the set-valued map F is defined by
{F}^{-1}(S)=\{\omega \in \mathrm{\Omega}:F(\omega )\cap S\ne \mathrm{\varnothing}\},
and that the graph of F (denoted by {G}_{F}) is defined by
{G}_{F}=\{(\omega ,y):\omega \in \mathrm{\Omega},y\in F(\omega )\}.
Definition 3 Let (\mathrm{\Omega},\mathcal{A}) be a measurable space and F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be a set-valued map. F is called measurable if the inverse image of each open set is a measurable set: if O\subset {\mathbb{R}}^{k} is open, then {F}^{-1}(O)\in \mathcal{A}.
We are now in a position to introduce the definition of interval random variables and interval stochastic processes.
Definition 4 Let (\mathrm{\Omega},\mathcal{S},P) be a probability space. An interval-valued map X:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} is called an interval random variable if

1. X is measurable, and
2. the function x\mapsto {p}_{x} is continuous on X, where {p}_{x} is the probability density function of the random variable x.
An interval stochastic process is an indexed set of interval random variables.
The probability density function {p}_{X} is then the interval-valued function
{p}_{X}=\{{p}_{x}:x\in X\}.
In order to study the expectations and variances of interval random variables, we first need to discuss the integral of set-valued maps and, in particular, of interval-valued maps. The discussion begins with the notion of measurable selections.
Definition 5 Let (\mathrm{\Omega},\mathcal{A}) be a measurable space and F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be a measurable set-valued map. A measurable selection of F is a measurable map f:\mathrm{\Omega}\to {\mathbb{R}}^{k} satisfying f(\omega )\in F(\omega ) for each \omega \in \mathrm{\Omega}.
It is well known that every measurable set-valued map has at least one measurable selection [8]. Furthermore, we have the following equivalences [7].
Theorem 6 Let (\mathrm{\Omega},\mathcal{A}) be a measurable space and denote by ℬ the σ-algebra of Borel sets in {\mathbb{R}}^{k}. Let F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be a set-valued map. The following are equivalent.

1. F is measurable.
2. {G}_{F}\in \mathcal{A}\otimes \mathcal{B}.
3. {F}^{-1}(B)\in \mathcal{A} for every B\in \mathcal{B}.
4. There exists a sequence of measurable selections {\{{f}_{n}\}}_{n=1}^{\mathrm{\infty}} of F such that F(\omega )=\overline{\bigcup _{n\ge 1}{f}_{n}(\omega )} for each \omega \in \mathrm{\Omega}.
A countable family of measurable selections satisfying the last property is called dense.
Let F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be an interval-valued map. We define the two special functions {l}_{F} and {r}_{F} by {l}_{F}(\omega )=a(\omega ) and {r}_{F}(\omega )=b(\omega ), where F(\omega )=[a(\omega ),b(\omega )] for each \omega \in \mathrm{\Omega}. The next lemma shows that {l}_{F} and {r}_{F} are measurable selections of F when the latter is measurable.
Lemma 7 Let F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be a measurable interval-valued map. Then the point functions {l}_{F} and {r}_{F} are measurable selections of F.
Proof Choose a sequence of measurable selections {\{{f}_{n}\}}_{n=1}^{\mathrm{\infty}} of F such that
F(\omega )=\overline{\bigcup _{n\ge 1}{f}_{n}(\omega )}.
Then {l}_{F}(\omega )={inf}_{n\ge 1}{f}_{n}(\omega ) and {r}_{F}(\omega )={sup}_{n\ge 1}{f}_{n}(\omega ) (here the inf and sup operations are taken componentwise). Since the inf and the sup operators preserve measurability, we see that the functions {l}_{F} and {r}_{F} are measurable selections of F. □
Example Let \mathrm{\Omega}=[1,\mathrm{\infty}) and define F:\mathrm{\Omega}\rightrightarrows \mathbb{R} by
F(t)=[t,t+\frac{1}{t}].
Let {\{{r}_{n}\}}_{n=1}^{\mathrm{\infty}} be an enumeration of the rational numbers in the interval [0,1], and let us assume that {r}_{1}=1, {r}_{2}=0. Define {f}_{n}:[1,\mathrm{\infty})\to \mathbb{R} by
{f}_{n}(t)={r}_{n}t+(1-{r}_{n})(t+\frac{1}{t}).
Thus, {l}_{F}(t)=t={f}_{1}(t) and {r}_{F}(t)=(t+\frac{1}{t})={f}_{2}(t). For every t\in [1,\mathrm{\infty}), the set {\{{r}_{n}t+(1-{r}_{n})(t+\frac{1}{t})\}}_{n=1}^{\mathrm{\infty}} is dense in the interval [t,t+\frac{1}{t}]. Thus, F(t)=\overline{{\bigcup}_{n\ge 1}{f}_{n}(t)}.
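The density claim can be checked numerically. The following sketch is our own illustration, not part of the original example: it evaluates the selections {f}_{n} at t=2 over the rationals with denominator at most 200 (a finite stand-in for the enumeration {\{{r}_{n}\}}) and inspects the endpoints and the largest gap between consecutive values.

```python
from fractions import Fraction

# At t = 2 the example gives F(2) = [2, 2.5].
t = 2.0
l_F, r_F = t, t + 1.0 / t

# Rationals r in [0, 1] with denominator at most 200 (finite truncation
# of the enumeration {r_n}); the set comprehension removes duplicates.
rationals = sorted({Fraction(p, q) for q in range(1, 201) for p in range(0, q + 1)})

# f_n(t) = r_n * t + (1 - r_n) * (t + 1/t)
values = sorted(float(r) * t + (1 - float(r)) * (t + 1.0 / t) for r in rationals)

print(values[0], values[-1])                 # endpoints l_F(2) and r_F(2)
max_gap = max(b - a for a, b in zip(values, values[1:]))
print(max_gap)                               # gaps shrink as the enumeration grows
```

The extreme values are attained at r = 1 (giving {l}_{F}) and r = 0 (giving {r}_{F}), matching the text's choice {r}_{1}=1, {r}_{2}=0.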
Now suppose that (\mathrm{\Omega},\mathcal{A},\mu ) is a measure space and F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} is a set-valued map. A measurable selection f of F is called an integrable selection if f is integrable with respect to the measure μ. The set of all integrable selections of F will be denoted by ℱ. The map F is called integrably bounded if there exists a μ-integrable function g\in {L}^{1}(\mathrm{\Omega};\mathbb{R},\mu ) such that F(\omega )\subset g(\omega )\mathbf{B} for μ-almost every \omega \in \mathrm{\Omega}. Here, \mathbf{B} denotes the unit ball in {\mathbb{R}}^{k}. In this case, every measurable selection f of F is also an integrable selection since f(\omega )\in F(\omega )\subset g(\omega )\mathbf{B} implies that \parallel f(\omega )\parallel \le g(\omega ), where \parallel \cdot \parallel denotes the Euclidean norm on {\mathbb{R}}^{k}.
Definition 8 The integral of a setvalued map F is defined to be the set of integrals of integrable selections of F. That is,
{\int}_{\mathrm{\Omega}}F\phantom{\rule{0.2em}{0ex}}d\mu =\{{\int}_{\mathrm{\Omega}}f\phantom{\rule{0.2em}{0ex}}d\mu :f\in \mathcal{F}\}.
(1)
We shall say that F is integrable if every measurable selection is integrable.
We have the following immediate properties:
Lemma 9 Let F:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be an interval-valued map. If {l}_{F} and {r}_{F} are integrable, then F is integrable and
\begin{array}{rcl}{\int}_{\mathrm{\Omega}}F\phantom{\rule{0.2em}{0ex}}d\mu & =& [{\int}_{\mathrm{\Omega}}{l}_{F}\phantom{\rule{0.2em}{0ex}}d\mu ,{\int}_{\mathrm{\Omega}}{r}_{F}\phantom{\rule{0.2em}{0ex}}d\mu ]\\ & =& \{{\int}_{\mathrm{\Omega}}{f}_{\alpha}\phantom{\rule{0.2em}{0ex}}d\mu :{f}_{\alpha}=(1-\alpha ){l}_{F}+\alpha {r}_{F},\alpha \in [0,1]\}.\end{array}
Proof The first equality is shown as follows. Since for every \omega \in \mathrm{\Omega} and every integrable selection f of F we have {l}_{F}(\omega )\le f(\omega )\le {r}_{F}(\omega ),
{\int}_{\mathrm{\Omega}}{l}_{F}(\omega )\phantom{\rule{0.2em}{0ex}}d\mu \le {\int}_{\mathrm{\Omega}}f(\omega )\phantom{\rule{0.2em}{0ex}}d\mu \le {\int}_{\mathrm{\Omega}}{r}_{F}(\omega )\phantom{\rule{0.2em}{0ex}}d\mu .
Therefore,
{\int}_{\mathrm{\Omega}}F\phantom{\rule{0.2em}{0ex}}d\mu \subseteq [{\int}_{\mathrm{\Omega}}{l}_{F}\phantom{\rule{0.2em}{0ex}}d\mu ,{\int}_{\mathrm{\Omega}}{r}_{F}\phantom{\rule{0.2em}{0ex}}d\mu ].
On the other hand, let \theta \in [{\int}_{\mathrm{\Omega}}{l}_{F}\phantom{\rule{0.2em}{0ex}}d\mu ,{\int}_{\mathrm{\Omega}}{r}_{F}\phantom{\rule{0.2em}{0ex}}d\mu ]. We may write \theta =(1-\alpha ){\int}_{\mathrm{\Omega}}{l}_{F}\phantom{\rule{0.2em}{0ex}}d\mu +\alpha {\int}_{\mathrm{\Omega}}{r}_{F}\phantom{\rule{0.2em}{0ex}}d\mu for some \alpha \in [0,1]. Then
\begin{array}{rcl}\theta & =& {\int}_{\mathrm{\Omega}}((1-\alpha ){l}_{F}+\alpha {r}_{F})\phantom{\rule{0.2em}{0ex}}d\mu \\ & =& {\int}_{\mathrm{\Omega}}{f}_{\alpha}\phantom{\rule{0.2em}{0ex}}d\mu ,\end{array}
where {f}_{\alpha}=(1-\alpha ){l}_{F}+\alpha {r}_{F}. Hence, \theta \in {\int}_{\mathrm{\Omega}}F\phantom{\rule{0.2em}{0ex}}d\mu.
The second equality is an immediate consequence of this. □
It will always be assumed that both {l}_{F} and {r}_{F} are integrable.
Example Let Ω and F be defined as in the previous example. Let μ be the measure defined by
d\mu =\frac{1}{{t}^{3}}\phantom{\rule{0.2em}{0ex}}dt.
Then
{\int}_{\mathrm{\Omega}}F\phantom{\rule{0.2em}{0ex}}d\mu =[{\int}_{1}^{\mathrm{\infty}}{l}_{F}(t)\phantom{\rule{0.2em}{0ex}}d\mu ,{\int}_{1}^{\mathrm{\infty}}{r}_{F}(t)\phantom{\rule{0.2em}{0ex}}d\mu ]=[1,\frac{4}{3}].
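The two endpoint integrals can be verified numerically. Under the substitution u=1/t, the weighted integral {\int}_{1}^{\mathrm{\infty}}f(t){t}^{-3}\phantom{\rule{0.2em}{0ex}}dt becomes {\int}_{0}^{1}f(1/u)u\phantom{\rule{0.2em}{0ex}}du, which the following sketch (our own check, not part of the original text) evaluates with a composite Simpson rule.

```python
def simpson(g, a, b, n=1000):
    # Composite Simpson rule on [a, b]; n must be even.
    h = (b - a) / n
    s = g(a) + g(b) + sum((4 if i % 2 else 2) * g(a + i * h) for i in range(1, n))
    return s * h / 3.0

# After u = 1/t, the integrand f(t) t^{-3} on [1, ∞) becomes f(1/u) * u
# on (0, 1]; the u = 0 guard supplies the (finite) limiting value.
lower = simpson(lambda u: (1.0 / u) * u if u else 1.0, 0.0, 1.0)       # f = l_F, f(t) = t
upper = simpson(lambda u: (1.0 / u + u) * u if u else 1.0, 0.0, 1.0)   # f = r_F, f(t) = t + 1/t

print(lower, upper)   # ≈ 1.0 and ≈ 4/3
```

Since the transformed integrands are polynomials of degree at most two, Simpson's rule reproduces the exact values 1 and 4/3 up to floating-point error.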
In view of (3), we have the following corollary.
Corollary 10 Let {F}_{1},{F}_{2}:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be integrable intervalvalued maps. Then
\begin{array}{rcl}{\int}_{\mathrm{\Omega}}({F}_{1}+{F}_{2})\phantom{\rule{0.2em}{0ex}}d\mu & =& {\int}_{\mathrm{\Omega}}{F}_{1}\phantom{\rule{0.2em}{0ex}}d\mu +{\int}_{\mathrm{\Omega}}{F}_{2}\phantom{\rule{0.2em}{0ex}}d\mu \\ & =& [{\int}_{\mathrm{\Omega}}{l}_{{F}_{1}}\phantom{\rule{0.2em}{0ex}}d\mu ,{\int}_{\mathrm{\Omega}}{r}_{{F}_{1}}\phantom{\rule{0.2em}{0ex}}d\mu ]+[{\int}_{\mathrm{\Omega}}{l}_{{F}_{2}}\phantom{\rule{0.2em}{0ex}}d\mu ,{\int}_{\mathrm{\Omega}}{r}_{{F}_{2}}\phantom{\rule{0.2em}{0ex}}d\mu ]\\ & =& [{\int}_{\mathrm{\Omega}}({l}_{{F}_{1}}+{l}_{{F}_{2}})\phantom{\rule{0.2em}{0ex}}d\mu ,{\int}_{\mathrm{\Omega}}({r}_{{F}_{1}}+{r}_{{F}_{2}})\phantom{\rule{0.2em}{0ex}}d\mu ].\end{array}
Let (\mathrm{\Omega},\mathcal{S},P) be a probability space, and let Z:\mathrm{\Omega}\rightrightarrows {\mathbb{R}}^{k} be an interval random variable. We have
Z(\omega )=[{l}_{Z}(\omega ),{r}_{Z}(\omega )]=\{{z}_{\alpha}:=(1-\alpha ){l}_{Z}(\omega )+\alpha {r}_{Z}(\omega ):\alpha \in [0,1]\}.
We shall say that Z is normally distributed if each z\in Z is normally distributed. An interval stochastic process {\{{Z}_{t}\}}_{t\in T} will be called normally distributed if for each t\in T, {Z}_{t} is normally distributed.
Let Z be an interval random variable. Then for each z\in Z,
{p}_{{l}_{Z}}\le {p}_{z}\le {p}_{{r}_{Z}}.
By the continuity of z\mapsto {p}_{z},
{p}_{Z}=[{p}_{{l}_{Z}},{p}_{{r}_{Z}}].
This means that
{l}_{{p}_{Z}}={p}_{{l}_{Z}},\phantom{\rule{2em}{0ex}}{r}_{{p}_{Z}}={p}_{{r}_{Z}}.
Guided by this and Lemma 9, we can define the interval expectation of the interval random variable Z as follows.
Definition 11 The interval expectation of an interval random variable Z is defined as
E(Z)=[E({l}_{Z}),E({r}_{Z})].
This definition coincides with Definition 2 since
\begin{array}{rcl}[E({l}_{Z}),E({r}_{Z})]& =& \{(1-\alpha )E({l}_{Z})+\alpha E({r}_{Z}):\alpha \in [0,1]\}\\ & =& \{E((1-\alpha ){l}_{Z}+\alpha {r}_{Z}):\alpha \in [0,1]\}\\ & =& \{E({z}_{\alpha}):\alpha \in [0,1]\}.\end{array}
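A concrete instance may help. Everything in the following sketch (the sample space, the particular Z, and the discretization) is our own illustrative choice, not from the original text: take \mathrm{\Omega}=[0,1] with the uniform probability and Z(\omega )=[\omega ,\omega +1], so that E(Z)=[E({l}_{Z}),E({r}_{Z})]=[1/2,3/2].

```python
# Midpoint discretization of Ω = [0, 1] with the uniform probability.
n = 100000
omegas = [(i + 0.5) / n for i in range(n)]

# l_Z(ω) = ω and r_Z(ω) = ω + 1, so Z(ω) = [ω, ω + 1].
E_l = sum(w for w in omegas) / n          # E(l_Z) ≈ 1/2
E_r = sum(w + 1 for w in omegas) / n      # E(r_Z) ≈ 3/2

# The selection z_α = (1 - α) l_Z + α r_Z has expectation
# (1 - α) E(l_Z) + α E(r_Z), sweeping out [E(l_Z), E(r_Z)] as α runs over [0, 1].
alpha = 0.25
E_z_alpha = sum((1 - alpha) * w + alpha * (w + 1) for w in omegas) / n

print(E_l, E_r, E_z_alpha)
```

The computed E_z_alpha agrees with the convex combination of the interval endpoints, as the derivation above requires.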
It should also be noted that the expectation of a vector random variable is the vector of expectations of its components.
It follows from equations (2) and (3) that E(Y+Z)=E(Y)+E(Z) for interval random variables Y and Z.
Also, if I=[a,b] is an interval and Z is an interval random variable, then, writing {t}_{\alpha}=(1-\alpha )a+\alpha b,
\begin{array}{rcl}E(IZ)& =& \{E({t}_{\alpha}{z}_{\alpha}):\alpha \in [0,1]\}\\ & =& \{{t}_{\alpha}E({z}_{\alpha}):\alpha \in [0,1]\}\\ & =& I\ast \{E({z}_{\alpha}):\alpha \in [0,1]\}\\ & =& IE(Z).\end{array}
The same is true if I is an interval vector and Z is an interval random variable.
More generally, if A is a k\times k interval matrix and if its columns are denoted by the interval vectors {\mathbf{A}}_{1},{\mathbf{A}}_{2},\dots ,{\mathbf{A}}_{k}, then
\begin{array}{rcl}E(\mathbf{AZ})& =& E\left(\sum _{j=1}^{k}{\mathbf{A}}_{j}{Z}_{j}\right)\\ & =& \sum _{j=1}^{k}E({\mathbf{A}}_{j}{Z}_{j})=\sum _{j=1}^{k}{\mathbf{A}}_{j}E({Z}_{j})\\ & =& \mathbf{A}E(\mathbf{Z}).\end{array}
To introduce covariance of two interval random variables Y, Z, we need to assume that the function (x,y)\mapsto {p}_{x,y} is continuous on Y\times Z. Here, {p}_{x,y} is the joint probability density function of the two random variables x, y.
Definition 12 The interval covariance of two interval random variables Y, Z is defined as
Cov(Y,Z)=\{Cov({y}_{\alpha},{z}_{\alpha}):\alpha \in [0,1]\}.
To see that Cov(Y,Z) is an interval, note that
\begin{array}{rcl}Cov(Y,Z)& =& \{Cov((1-\alpha ){l}_{Y}+\alpha {r}_{Y},(1-\alpha ){l}_{Z}+\alpha {r}_{Z}):\alpha \in [0,1]\}\\ & =& \{{(1-\alpha )}^{2}Cov({l}_{Y},{l}_{Z})+\alpha (1-\alpha )Cov({l}_{Y},{r}_{Z})\\ & & {}+\alpha (1-\alpha )Cov({r}_{Y},{l}_{Z})+{\alpha}^{2}Cov({r}_{Y},{r}_{Z}):\alpha \in [0,1]\}.\end{array}
If Y=Z, we get the definition of the variance of an interval random variable Z as
\begin{array}{rcl}Var(Z)& =& \{Var({z}_{\alpha}):\alpha \in [0,1]\}\\ & =& \{{(1-\alpha )}^{2}Var({l}_{Z})+2\alpha (1-\alpha )Cov({l}_{Z},{r}_{Z})+{\alpha}^{2}Var({r}_{Z}):\alpha \in [0,1]\},\end{array}
which is also an interval. Elementary calculus considerations reveal that
Var(Z)=[\frac{ab-{c}^{2}}{a+b-2c},max\{a,b\}],
where a=Var({l}_{Z}), b=Var({r}_{Z}), c=Cov({l}_{Z},{r}_{Z}). This last equation provides a formula for computing the interval Var(Z).
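This formula can be sanity-checked numerically. In the following sketch the values of a, b, c are our own sample inputs, chosen so that a+b-2c>0 and the minimizing α lies in the interior of [0,1]; the script compares the extremes of Var({z}_{\alpha}) over a fine α-grid with the closed-form endpoints.

```python
# Sample inputs (our own): Var(l_Z), Var(r_Z), Cov(l_Z, r_Z).
a, b, c = 2.0, 3.0, 1.0

def q(alpha):
    # Var(z_α) = (1-α)² a + 2α(1-α) c + α² b
    return (1 - alpha) ** 2 * a + 2 * alpha * (1 - alpha) * c + alpha ** 2 * b

grid = [i / 100000 for i in range(100001)]
lo = min(q(t) for t in grid)
hi = max(q(t) for t in grid)

print(lo, (a * b - c * c) / (a + b - 2 * c))   # both ≈ 5/3
print(hi, max(a, b))                           # both equal 3
```

For these inputs the minimizer is {\alpha}^{\ast}=(a-c)/(a+b-2c)=1/3, and the grid minimum matches (ab-{c}^{2})/(a+b-2c)=5/3 to the grid's resolution.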
For interval random vectors, the above definitions hold componentwise.
The two interval random variables Y, Z are called uncorrelated if {y}_{\alpha} and {z}_{\alpha} are uncorrelated for each \alpha \in [0,1]. Therefore, Y, Z are uncorrelated if and only if Cov(Y,Z)=[0].
It is now straightforward to check the following theorem.
Theorem 13 Let \mathbf{Y},\mathbf{Z}\in {\mathbf{IR}}^{k} and \mathbf{W}\in {\mathbf{IR}}^{m} be interval random vectors, and let \mathbf{A}\in {\mathbf{IR}}^{{k}^{\mathrm{\prime}}\times k}, \mathbf{B}\in {\mathbf{IR}}^{{m}^{\mathrm{\prime}}\times m}, and \mathit{\lambda}\in \mathbf{IR}. Then

1. Cov(\mathit{\lambda}\mathbf{Y},\mathbf{W})=\mathit{\lambda}Cov(\mathbf{Y},\mathbf{W}),
2. Cov(\mathbf{Y}+\mathbf{Z},\mathbf{W})=Cov(\mathbf{Y},\mathbf{W})+Cov(\mathbf{Z},\mathbf{W}),
3. Cov(\mathbf{AY},\mathbf{BW})=\mathbf{A}Cov(\mathbf{Y},\mathbf{W}){\mathbf{B}}^{T}.
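Properties 1 and 2 can be illustrated numerically in the scalar, degenerate (point-interval) case, where interval covariance reduces to ordinary covariance; the data below are our own sample values, not from the text.

```python
def cov(xs, ys):
    # Plain (population) sample covariance of two equal-length lists.
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

# Our own sample data standing in for realizations of Y1, Y2, W.
y1 = [1.0, 2.0, 4.0, 8.0]
y2 = [3.0, 1.0, 0.0, 2.0]
w  = [2.0, 5.0, 3.0, 7.0]
lam = 2.5

# Property 1: Cov(λY, W) = λ Cov(Y, W).
print(cov([lam * v for v in y1], w), lam * cov(y1, w))

# Property 2: Cov(Y1 + Y2, W) = Cov(Y1, W) + Cov(Y2, W).
print(cov([p + q for p, q in zip(y1, y2)], w), cov(y1, w) + cov(y2, w))
```

Both identities hold exactly for sample covariance since it is bilinear in its arguments.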
The assumed continuous dependence of the probability density function (or joint density function) on the random variable(s) making up an interval random variable implies that the conditional probability density function is also continuous. This guarantees that the generalization of the conditional density function to the interval setting is always an interval.
Definition 14 The interval conditional expectation is defined as
\begin{array}{rcl}E(Z\mid Y)& =& \{E({z}_{\alpha}\mid {y}_{\alpha}):\alpha \in [0,1]\}\\ & =& \{(1-\alpha )E({l}_{Z}\mid {y}_{\alpha})+\alpha E({r}_{Z}\mid {y}_{\alpha}):\alpha \in [0,1]\}.\end{array}
The following theorem is easily checked.
Theorem 15 For vector random variables \mathbf{X}, \mathbf{Y}, \mathbf{Z} and an interval matrix \mathbf{A} of appropriate dimensions,
1. E(\mathbf{X}+\mathbf{Y}\mid \mathbf{Z})=E(\mathbf{X}\mid \mathbf{Z})+E(\mathbf{Y}\mid \mathbf{Z}),
2. E(\mathbf{AY}\mid \mathbf{Z})=\mathbf{A}E(\mathbf{Y}\mid \mathbf{Z}).