Questions about measurable maps from a probability space to a measure space.
1
vote
1 answer
23 views
problem on a random variable in probability
A game consists of first rolling an ordinary 6-sided die once and then tossing a fair coin once. The score, which consists of adding the number of spots showing on the die to the number of heads ...
-5
votes
0 answers
24 views
To Find the Covariance Matrix from a Random Variable Matrix [on hold]
$$X=\begin{pmatrix}
1& 2& 3\\
2& 4& 1\\
3& 1& 1\\
4& 1& 2
\end{pmatrix}$$
What is the covariance matrix? Please help.
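As asked, the question is ambiguous about whether rows or columns index the variables; assuming rows are observations and columns are variables, a minimal sketch with NumPy:

```python
import numpy as np

# The data matrix from the question: 4 observations (rows) of 3 variables
# (columns) — that orientation is an assumption, not stated in the question.
X = np.array([[1, 2, 3],
              [2, 4, 1],
              [3, 1, 1],
              [4, 1, 2]])

# Sample covariance matrix (unbiased, divides by n - 1), variables in columns.
C = np.cov(X, rowvar=False)
```

If columns were the observations instead, drop `rowvar=False` and the result is a 4×4 matrix.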
0
votes
0 answers
17 views
Determining if matrix could be a valid covariance and cross-covariance matrix
Given a matrix, $M$, how can you determine the following?
Could it be a valid covariance matrix of some random vector?
Could it be a valid cross-covariance matrix of two random vectors?
Could it be ...
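A matrix is a valid covariance matrix iff it is symmetric and positive semidefinite; by contrast, any real matrix of the right shape can arise as a cross-covariance $\operatorname{Cov}(X,Y)$ (e.g. take $Y$ a suitable linear image of $X$), so only the dimensions constrain it. A sketch of the covariance test:

```python
import numpy as np

def could_be_covariance(M, tol=1e-10):
    """True iff M is symmetric and positive semidefinite,
    i.e. a valid covariance matrix of some random vector."""
    M = np.asarray(M, dtype=float)
    if M.shape[0] != M.shape[1] or not np.allclose(M, M.T, atol=tol):
        return False
    # eigvalsh is appropriate here because M is symmetric at this point.
    return bool(np.min(np.linalg.eigvalsh(M)) >= -tol)

# No analogous test is needed for a cross-covariance matrix: for any M,
# Cov(X, A X) = Cov(X) A^T can be made equal to M by choosing A.
```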
0
votes
0 answers
50 views
conditional random variable?
Can we really talk about a conditional random variable $(X \mid B)$, i.e. a random variable defined by the conditional distribution of the random variable $X$ given the event $B$, or is this completely wrong?
...
0
votes
1 answer
28 views
Given density $f_U(u)$ and $U=X_1/X_2$, can we find $f_{X_1}(x_1)$ and $f_{X_2}(x_2)$?
Problem Statement:
Let $X_1$ and $X_2$ be independent random variables.
Given the density function, $ f_U(u)$, and the relationship:
$$U=X_1/X_2$$
Can we find the density functions for $X_1$ and ...
0
votes
1 answer
18 views
Entropy of geometric random variable?
I am wondering how to derive the entropy of a geometric random variable, or where I can find a proof/derivation. I tried to search online, but it seems not many resources are available.
Here is the ...
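One route to the derivation: expand $-\sum_k p q^{k-1}\log_2(p q^{k-1})$ using $\sum_{k\ge1}(k-1)q^{k-1} = q/p^2$, which collapses to $H_b(p)/p$ with $H_b$ the binary entropy. A quick numerical check of that closed form (the parameter $p=0.3$ is arbitrary):

```python
import math

# X ~ Geometric(p) on {1, 2, ...}: P(X = k) = (1 - p)^(k - 1) * p.
# Claimed closed form: H(X) = H_b(p) / p bits.
p = 0.3
q = 1 - p

# Direct (truncated) entropy sum; the tail beyond k = 1000 is negligible.
H_sum = -sum(p * q**(k - 1) * math.log2(p * q**(k - 1)) for k in range(1, 1000))

# Closed form H_b(p) / p.
H_closed = (-p * math.log2(p) - q * math.log2(q)) / p
```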
0
votes
1 answer
33 views
Given a random variable and its expected value, how can you determine if another random variable exists?
Suppose there exists a random variable $X$ with $E[X] = 1$, can $E[2^{-X}] = \frac{1}{4}$ exist? Can $E[2^X] = 8$ exist? Is there a general method to solve this type of problem?
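The general tool here is Jensen's inequality: $2^{-x}$ is convex, so $E[2^{-X}] \ge 2^{-E[X]} = 1/2$, ruling out $1/4$; likewise $E[2^X] \ge 2$, which leaves $8$ possible. A sketch exhibiting a two-point $X$ with $E[X]=1$ and $E[2^X]=8$ (my own construction, found numerically):

```python
# Take X = t with probability s = 1/t, and X = 0 otherwise. Then E[X] = 1
# automatically, and E[2^X] = 1 + (2^t - 1)/t, so we need (2^t - 1)/t = 7.
# Bisection: f(t) = (2^t - 1)/t is increasing on [5, 6] with f(5) < 7 < f(6).
lo, hi = 5.0, 6.0
for _ in range(200):
    mid = (lo + hi) / 2
    if (2**mid - 1) / mid < 7:
        lo = mid
    else:
        hi = mid
t = (lo + hi) / 2
s = 1 / t

E_X = s * t                        # = 1 by construction
E_2X = (1 - s) * 2**0 + s * 2**t   # = 1 + (2^t - 1)/t = 8 at the root
```

By contrast, no amount of searching can produce $E[2^{-X}] = 1/4$ with $E[X]=1$, precisely because of the Jensen bound.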
1
vote
0 answers
20 views
Division of Dependent Random Variables
Let $X_1$ and $X_2$ be dependent random variables.
Find the density function for:
$$U=X_1/X_2$$
Can I use the transformation method to find the conditional density and then use method of ...
1
vote
1 answer
38 views
Proof of $E(X)=a$ when $a$ is a point of symmetry
I am trying to develop a proof of the following:
Given a random variable $X$ with symmetric probability density function $f(x)$, prove that $E(X)=a$ where $a$ is the point of symmetry.
A couple of ...
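For reference, assuming $E|X|$ is finite and $f(a+u)=f(a-u)$ for all $u$, the standard substitution argument is short:

```latex
% Sketch, assuming E|X| < \infty and f(a+u) = f(a-u) for all u.
\begin{aligned}
E(X) - a &= \int_{-\infty}^{\infty} (x-a)\, f(x)\, dx
  && \text{since } \textstyle\int f(x)\,dx = 1 \\
         &= \int_{-\infty}^{\infty} u\, f(a+u)\, du
  && \text{substituting } u = x - a \\
         &= 0
  && \text{odd integrand, since } f(a+u) = f(a-u)
\end{aligned}
```

The integrability assumption is essential: a Cauchy density centered at $a$ is symmetric about $a$ yet has no expectation.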
2
votes
1 answer
59 views
Weak form of Berry-Esséen theorem
Let $X$ (a real random variable) have mean zero, unit variance and finite third moment. Let $Z_{n}:=(X_{1}+\dots+X_{n})/\sqrt{n}$, where $X_{1},\dots,X_{n}$ are i.i.d. copies of $X$. According to the ...
0
votes
1 answer
21 views
Showing $\int_0^{\infty}(1-F_X(x))dx=E(X)$ in both discrete and continuous cases
OK, according to some notes I have, the following is true for a random variable $X$ that can only take on positive values, i.e. $P(X<0)=0$:
$\int_0^{\infty}(1-F_X(x))dx=\int_0^{\infty}P(X>x)dx$
...
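The identity is easy to sanity-check numerically; the exponential below is my example, the identity itself holds for any nonnegative $X$:

```python
import math

# Check E(X) = ∫_0^∞ (1 - F_X(x)) dx for X ~ Exponential(lam),
# where 1 - F_X(x) = exp(-lam * x) and E(X) = 1/lam.
lam = 2.0

def survival(x):
    return math.exp(-lam * x)

# Composite trapezoidal rule on [0, 40]; the tail beyond 40 is ~ e^{-80}.
n, b = 400_000, 40.0
h = b / n
integral = h * (0.5 * survival(0.0)
                + sum(survival(i * h) for i in range(1, n))
                + 0.5 * survival(b))

expected = 1 / lam  # E(X) = 0.5
```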
2
votes
1 answer
20 views
Question regard the notion of almost sure convergence
Consider an $n\times m$ matrix with i.i.d. entries each having zero mean and variance $1/n$.
Let $Y = X^TX$.
By the strong law of large numbers, we know that the $(i,j)$ entry of $Y$ goes almost ...
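A quick numerical illustration of the truncated claim, assuming Gaussian entries (the limit $\delta_{ij}$ is my reading of the cut-off sentence):

```python
import numpy as np

# Entries i.i.d. with mean 0 and variance 1/n, as in the question.
rng = np.random.default_rng(0)
n, m = 100_000, 3
X = rng.normal(0.0, np.sqrt(1.0 / n), size=(n, m))

# (Y)_{ij} = sum_k X_{ki} X_{kj} is a sum of n i.i.d. terms with mean
# delta_{ij}/n, so by the strong law of large numbers Y_{ij} should be
# close to 1 on the diagonal and 0 off it for large n.
Y = X.T @ X
```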
0
votes
1 answer
44 views
Density of the sum of two random variables with parameters.
I've got a problem from probability theory which is supposed to have a simple theoretical solution. The statement is: "Suggest a probability distribution of $(\zeta,\eta)\in\mathbb{R}^2$ such that the random variable $\gamma = ...
0
votes
1 answer
51 views
Given a function, how do you determine the pdf of the left side from the pdfs of the right-side variables?
Given a function, how do you determine the pdf of the left side from the pdfs of the right-side variables? Specifically, what is the pdf of $W$, given the equation
$$
W = I^2 R$$
W = I^2 R$$
where $I$ and $R$ are ...
1
vote
0 answers
42 views
Calculate the variance from a function of normal random variable
I am new to this topic and am having difficulty with the question:
Given the function $g(X) = e^{-X}$ with $X \sim N(0,1)$, calculate the variance of $g(X)$.
I know the answer is $e(e-1)$. But I don't ...
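The stated answer $e(e-1)$ drops out of the standard normal MGF $E[e^{tX}]=e^{t^2/2}$: with $Y=e^{-X}$, $\operatorname{Var}(Y) = E[e^{-2X}] - E[e^{-X}]^2 = e^2 - e$. A quick check:

```python
import math

def mgf_std_normal(t):
    # Moment generating function of X ~ N(0, 1): E[e^{tX}] = exp(t^2 / 2).
    return math.exp(t * t / 2)

E_Y = mgf_std_normal(-1)       # E[e^{-X}] = e^{1/2}
E_Y2 = mgf_std_normal(-2)      # E[e^{-2X}] = e^{2}
var_Y = E_Y2 - E_Y**2          # = e^2 - e = e(e - 1)
```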
0
votes
1 answer
38 views
Joint pdf of independent uniform random variables
Given two independent random variables, $U$ and $V$, each uniformly distributed over $[0,1]$, how do you calculate the joint pdf of $X$, $Y$ where $X = F(U,V)$ and $Y = G(U,V)$ and where is the joint ...
3
votes
2 answers
135 views
Sums of Products of Two Normal Variables
Suppose that $X_1 ,\ldots,X_n,Y_1,\ldots,Y_n$ are all independent normal random variables with different means and variances. What is the PDF of the following random variable?
...
1
vote
1 answer
29 views
Can the sum of two pregaussian random variables be zero?
Do there exist two sub-Gaussian random variables with variance $1$ such that their sum is the $0$ random variable, whose value is $0$ with probability $1$?
0
votes
1 answer
52 views
Probability of a random variable dependent on a parameter.
Let $X_L$ be a random variable dependent on a parameter $L$, taking only discrete values between $0$ and $+\infty$. Let $\mu L$ be its expectation, where $\mu$ is a constant. Which conditions should I ...
0
votes
2 answers
68 views
What is the distribution of $c^x$? ($c$ is a constant, $x$ is a random variable)
What is the distribution of $c^x$, where $c$ is a constant and $x$ is a random variable?
For example, if $x$ follows a Poisson distribution, what is the distribution of $2^x$?
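For discrete $x$ and $c>1$, $Y=c^x$ is just a pushforward: $c^x$ is strictly increasing, so $P(Y = c^k) = P(x = k)$ and only the support moves (to $\{1, c, c^2, \dots\}$). A sketch for the Poisson example ($\lambda = 1.5$ is arbitrary):

```python
import math

lam = 1.5

def poisson_pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

# pmf of Y = 2^x: mass poisson_pmf(k) at the point 2^k (2^x is strictly
# increasing, so distinct k map to distinct support points).
pmf_Y = {2**k: poisson_pmf(k) for k in range(50)}

total = sum(pmf_Y.values())   # ≈ 1 (truncated at k = 50)
P_Y_eq_4 = pmf_Y[4]           # = P(x = 2) = e^{-λ} λ²/2
```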
1
vote
0 answers
25 views
Laplace transform of a sum of stochastic variables
I have a problem with the interpretation of a transformation performed on an equation involving continuous random variables. Here is the source equation describing a recurrent relationship between the ...
1
vote
1 answer
9 views
nonlinear transform of Gaussian random variable that preserves Gaussianity
I recently learned the following result.
Suppose that $x_1, x_2, x_3$ are independent real Gaussian random variables with distribution $\mathcal{N}(0, 1)$. Then
$$
\frac{x_1 + x_2 x_3}{\sqrt{1+x_3^2}} \sim ...
0
votes
2 answers
56 views
Fisher's information for two independent random variables
If $X$ and $Y$ are two independent random variables, with regular distributions, how can I prove
$I_{x,y}(\theta) = I_x(\theta) + I_y(\theta)$ ?
Thanks!
I tried:
$$ {\rm E}_\theta ...
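The identity can at least be sanity-checked numerically, using the fact that Fisher information is the variance of the score at the true parameter. A minimal sketch, assuming $X, Y$ i.i.d. $N(\theta, \sigma^2)$ (my choice of model, not from the question):

```python
import random

random.seed(0)
theta, sigma = 1.0, 2.0

def score(x):
    # d/dtheta of log f(x; theta) for N(theta, sigma^2): (x - theta)/sigma^2
    return (x - theta) / sigma**2

# Monte Carlo estimate of Fisher information = Var of the score.
n = 200_000
xs = [random.gauss(theta, sigma) for _ in range(n)]
ys = [random.gauss(theta, sigma) for _ in range(n)]

def var(v):
    m = sum(v) / len(v)
    return sum((t - m)**2 for t in v) / len(v)

I_x = var([score(x) for x in xs])                          # ≈ 1/σ² = 0.25
I_xy = var([score(x) + score(y) for x, y in zip(xs, ys)])  # ≈ 2/σ² = 0.5
```

The key step is that by independence the joint log-likelihood is a sum, so the joint score is a sum of independent scores and its variance adds.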
2
votes
2 answers
68 views
Ergodicity of a sequence of independent blocks
I am stuck with the problem given below, more precisely with the part regarding ergodicity. I have a proof, also given in what follows, but it does not seem to be correct; well, at least, it does not ...
1
vote
1 answer
24 views
Correlation of sums of correlated variables
I'm trying to work out an expression for a correlation of the weighted sums of two r.v.'s with a third r.v. To be precise, I have a trivariate normal distribution:
$$\{X,Y,Z\}\approx ...
0
votes
0 answers
45 views
If $X$ is a random variable, is $p(X)$ also a random variable?
If $X$ is a random variable with density/mass function $p$, is $p(X)$ also a random variable?
3
votes
2 answers
56 views
PDF of summation of independent random variables with different mean and variances
What can we say about the probability distribution function of $n$ independent random variables with different means and variances but the same PDF?
For example, let's say $X_1,X_2,\dots,X_n$ are ...
0
votes
2 answers
41 views
Prove that the CDF of a random variable is always right-continuous
Let $X$ be a random variable with cumulative distribution function $F_X$. It is a known fact that this function $F_X$ is right-continuous, but I'm having some trouble proving this result. Below I'm ...
0
votes
0 answers
39 views
Generalized Rayleigh distribution?
A variable $z=\sqrt{x^2+y^2}$ is Rayleigh distributed with parameter $\sigma$ if $x\sim\mathcal{N}(0,\sigma^2)$ and $y\sim\mathcal{N}(0,\sigma^2)$.
Is there a known/named distribution for the case ...
2
votes
1 answer
51 views
Basu's Theorem's application
I can't solve a problem for which I am supposed to use Basu's theorem. Suppose that $X$ and $Y$ are independent Exponential random variables with common parameter $\lambda$. I have to show that $X + ...
0
votes
1 answer
66 views
probability density and distribution
Let $X$ and $Y$ be uniform random variables where $0\leqslant x,y\leqslant 1$.
Let $\operatorname{sgn}(x)$ be $1$ when $x>0$, $-1$ when $x<0$ and $0$ when $x=0$.
Find the distribution and ...
2
votes
1 answer
20 views
Discovering the joint distribution for two dependent RVs?
Suppose I have three continuous random variables $X_1$, $X_2$, and $Y$, where $Y = X_1+X_2$, and $X_1$ and $X_2$ are dependent.
If I know the probability distributions separately for $X_1$, $X_2$, and $Y$, is there a ...
2
votes
2 answers
82 views
Almost sure convergence of a sum of random variables
Suppose $(X_i)_{i=1}^{\infty}$ is an i.i.d. sequence of rv's, where $X_i$ can take countably many values $\{x_1,x_2,\dots\}$ with probabilities $\{p_1,p_2\dots\}$, respectively. Let $p_{n,k}:= ...
2
votes
1 answer
39 views
Prove expectation inequality
Any ideas on how I could prove or disprove the following inequality?
Let $X:\Omega \to \mathbb{R}$ be a random variable such that the expressions below are well-defined. Then
$$E[e^X] ...
0
votes
0 answers
64 views
Measurable random variable with respect to a finite partition
Let $(\Omega,\mathcal F,P)$ be a probability space, and let $X$ be a random variable on $\Omega$. If $\mathcal G$ is a sub-$\sigma$-algebra of $\mathcal F$ generated by a partition $\{ A(i),\; ...
1
vote
1 answer
36 views
Issue with a Poisson process and its jump times
Let $(N_t)_{t\geq 0}$ be a Poisson process and $$T_n = \inf\{t\geq 0, \ N_t \geq n\}.$$ Now, given $t \ge 0$, how does one compute
$$ \mathbb{E} \left[ \sum_{n=1}^{N_t} X_{T_n}\right]? $$
where $(X_t)_{t\ge ...
-1
votes
1 answer
24 views
Expectation of an Inverse of a Random Variable
$X$ and $Y$ are two discrete random variables taking values greater than $1$. Given $\mathbf{E}[X] \geq \mathbf{E}[Y]$, either prove that $\mathbf{E}[\frac{1}{X}] \leq \mathbf{E}[\frac{1}{Y}]$ or ...
1
vote
1 answer
34 views
What does orthogonal random variables mean?
As far as I know, orthogonality is a linear-algebraic concept: in the 2D or 3D case, if two vectors are perpendicular we say they are orthogonal, and this extends to higher dimensions. But when ...
0
votes
1 answer
23 views
Law of Gaussian Vector and Matrix
I have a probability problem that I’m struggling with. Any help would be appreciated.
I have no clue about how to solve this.
Here is the problem:
Let A be:
\begin{pmatrix} 4 & 7 \\ 1 & -2 ...
3
votes
1 answer
44 views
Criterion for independence of random variables
I saw in some notes the following "criterion" for independence of two random variables.
Let $X$ and $Y$ be real-valued random variables defined on the same space. $X$ and $Y$ are statistically ...
1
vote
0 answers
21 views
Derive Student T distribution using transformation theorem
I am working on an exercise that asks me to show that
If $ X_1 \in N(0,1) $ and $ X_2 \in \chi^2(n) $ are independent random variables, then $ X_1 / \sqrt{X_2/n} \in t(n) \, $ where $ ...
3
votes
1 answer
37 views
Distribution of the minimum of two random variables
Here is my problem. Let $X$, $Y$ and $Z$ be random variables following the exponential distribution (whether or not they share the same mean does not matter).
I am trying to find the distribution of the ...
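For the two-variable building block with independent $X\sim\mathrm{Exp}(\lambda)$ and $Y\sim\mathrm{Exp}(\mu)$: $P(\min(X,Y)>t)=P(X>t)\,P(Y>t)=e^{-(\lambda+\mu)t}$, so the minimum is again exponential with rate $\lambda+\mu$, and the same argument extends to three variables. A Monte Carlo sanity check with arbitrary parameters:

```python
import math
import random

random.seed(1)
lam, mu, t = 2.0, 3.0, 0.2

# Empirical P(min(X, Y) > t) from simulation.
n = 200_000
hits = sum(
    1 for _ in range(n)
    if min(random.expovariate(lam), random.expovariate(mu)) > t
)
empirical = hits / n

# Theoretical survival function of min(X, Y) ~ Exp(lam + mu).
exact = math.exp(-(lam + mu) * t)
```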
0
votes
0 answers
37 views
Transforming 2 Random Variables from Joint Uniform Distribution using Jacobian
I am trying to transform two random variables $A$ and $B$ into two other variables $X$ and $Y$ and get the joint probability density function of $X$ and $Y$.
I am given the equations $A = \ln[Y / (1-Y)]$ and $B = ...
2
votes
1 answer
72 views
Random processes: Repair time
I have a question that has to do with queueing theory and repair times:
Assume that a small office has 4 printers. Each printer breaks down independently of the
other printers and independently of the ...
3
votes
1 answer
93 views
Weak convergence in the Skorohod space $D([0, T])$ $\forall T$ implies Weak convergence in $D([0, \infty))$?
Assume that $Z_n$ are random variables taking value in the Skorohod space $D([0, \infty),Y)$ (endowed with its usual Skorohod topology) of right-continuous functions $[0, \infty) \to Y$, where $Y$ is ...
0
votes
1 answer
34 views
Conditional probability over a function
I have a question: do the following relations on conditional probabilities hold for independent random variables?
$$P_{X \mid Y, G(Y)}(x_1)=P_{X \mid \{Y\}}(x_2)$$ where $G$ is not necessarily ...
1
vote
1 answer
56 views
distribution of maximum of $n$ Pearson correlations
$\mathbf{x}=[x_1,x_2,...,x_m]^{\top}$ is a vector of length $m$ and $\mathbf{y_1}, \mathbf{y_2}, ..., \mathbf{y_n}$ are similarly $n$ vectors of length $m$.
If the elements of $\mathbf{x}$ and ...
0
votes
1 answer
48 views
Conditional distribution of a function of random variables
I have a question about conditional distribution. Suppose we have three independent random variables $X_1$, $X_2$, $X_3$.
Then we have a mapping $Y_1=g(X_1, X_2)$. The mapping is not necessarily an ...
1
vote
2 answers
22 views
Results about conditional expectations
$\theta$, $\phi$ are integrable random variables on a probability space $(\Omega,\mathcal{F},P)$ and $\mathcal{G}$ is a $\sigma$-field on $\Omega$ contained in $\mathcal{F}$.
Now we want to prove ...
0
votes
1 answer
27 views
How do you calculate the expected value, the variance and the standard deviation of a sum of random variables, regardless of their distribution?
I was looking for a formula or something, but I can't find anything anywhere. So, can someone tell me the steps I need to follow in order to do that? I thought that the expected value of the sum was ...
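The standard facts: $E[\sum_i X_i]=\sum_i E[X_i]$ holds regardless of dependence, while $\operatorname{Var}(\sum_i X_i)=\sum_i \operatorname{Var}(X_i)+2\sum_{i<j}\operatorname{Cov}(X_i,X_j)$ (the covariance terms vanish under independence), and the standard deviation is the square root of the variance. A check on a small made-up joint pmf:

```python
import math

# A tiny joint pmf for (X, Y) — deliberately dependent.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    return sum(p * f(x, y) for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
VarX = E(lambda x, y: (x - EX)**2)
VarY = E(lambda x, y: (y - EY)**2)
Cov = E(lambda x, y: (x - EX) * (y - EY))

ES = E(lambda x, y: x + y)               # linearity: equals EX + EY, always
VarS = E(lambda x, y: (x + y - ES)**2)   # equals VarX + VarY + 2*Cov
sd_S = math.sqrt(VarS)
```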