The modern theory of probability is formulated on the foundation of measure theory. Use this tag if your question is about this theoretical footing (for example, probability spaces, random variables, the law of large numbers, central limit theorems, and the like). Use (probability) for explicit computation of ...
0
votes
0 answers
16 views
Convergence in probability combined with convergence of the means of a positive sequence of r.v.s implies convergence in $L^1$
Let $\{X_n\}$ be a sequence of positive random variables with $X_n \rightarrow X$ in probability. Prove that if $E(X_n) \rightarrow E(X)$, then $X_n \rightarrow X$ in $L^1$.
My partial answer:
Let ...
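A sketch of the standard route (a Scheffé-type argument), assuming $E(X)<\infty$ so that the limit statement makes sense: since $X_n, X \geq 0$,
$$
E|X_n - X| \;=\; E\big[(X_n - X)^+\big] + E\big[(X - X_n)^+\big] \;=\; E(X_n) - E(X) + 2\,E\big[(X - X_n)^+\big].
$$
Because $0 \le (X - X_n)^+ \le X$ and $(X - X_n)^+ \to 0$ in probability, dominated convergence (applied along a.s.-convergent subsequences) gives $E[(X - X_n)^+] \to 0$; together with $E(X_n) \to E(X)$ this yields $E|X_n - X| \to 0$.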
3
votes
2 answers
37 views
Can the $0$-norm represent determinism?
In Scott Aaronson's Quantum Computing since Democritus, he presents classical probability theory as based on the $1$-norm, and QM as based on the $2$-norm.
Call $\{v_1,\ldots,v_N\}$ a unit vector ...
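One hedged remark, using the usual convention (not stated in the excerpt above) that the "$0$-norm" counts nonzero entries: requiring a state to be a unit vector in this "norm" forces exactly one nonzero coordinate, which is precisely a deterministic outcome,
$$
\|v\|_0 := \#\{i : v_i \neq 0\} = 1 \iff v = c\,e_i \text{ for some standard basis vector } e_i \text{ and } c \neq 0.
$$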
1
vote
0 answers
45 views
Conditioning a Poisson process on the number of arrivals in a fixed time
Let $T_1$ and $T_5$ be the first and fifth arrival times in a Poisson process with rate $\lambda$.
(a) Find the conditional density of $T_1$ given that there are 10 arrivals in the time interval $(0,1)$.
(b) Find the ...
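A hedged sketch for part (a), using the standard fact that, conditionally on there being $10$ arrivals in $(0,1)$, the arrival times are distributed as the order statistics of $10$ i.i.d. Uniform$(0,1)$ variables. Then $T_1$ is the minimum of those $10$ uniforms, so
$$
f_{T_1 \mid N(1)=10}(t) = 10\,(1-t)^9, \qquad 0 < t < 1,
$$
and the density of $T_5$ (if that is what the truncated part (b) asks for) follows from the same order-statistics formula.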
1
vote
1 answer
17 views
Kolmogorov three-series theorem, convergence
Let $X_n$, $n\geq 0$, be i.i.d. random variables such that: $\mathbb E(X_1)=0$, and $0<\mathbb E(|X_1|^2)<\infty$. Given that $\alpha >\frac{1}{2}$, I need to show that for some $A>0$, ...
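The displayed claim is cut off above, so the following is only a guess at the intended route: with $E(X_1)=0$ and $E(X_1^2)<\infty$, Kolmogorov's convergence criterion (equivalently, the three-series theorem applied with any fixed truncation level $A>0$) gives, for $\alpha > \tfrac12$,
$$
\sum_{n\geq 1} \operatorname{Var}\!\Big(\frac{X_n}{n^{\alpha}}\Big) \;=\; E(X_1^2)\sum_{n\geq 1} \frac{1}{n^{2\alpha}} \;<\; \infty,
$$
so $\sum_n X_n/n^{\alpha}$ converges almost surely.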
1
vote
0 answers
15 views
Canonical form of dyadic martingales
Let $(X_k)_{1\leq k \leq n}$ be a Walsh-Paley $L^p$-martingale (a dyadic martingale) with values in a Banach space $X$.
Why does there exist a dyadic martingale $(Y_k)_{1\leq k \leq n}$ with the ...
2
votes
1 answer
26 views
Martingale and Stopping Time
Consider the random walk $$S_n=\sum_{k=1}^{n}X_{k},$$
where the $X_k$'s are i.i.d. with $$\mathbb P(X_1=1)=\mathbb P(X_1=-1)=\frac{1}{2},$$
and $\mathcal{F}_{n}=\sigma(X_i,0\leq i\leq n)$.
How do I prove that ...
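The statement to be proved is truncated above; whatever it is, the usual first step in this setup is to verify that $(S_n)$ is an $(\mathcal{F}_n)$-martingale:
$$
E[S_{n+1} \mid \mathcal{F}_n] \;=\; S_n + E[X_{n+1} \mid \mathcal{F}_n] \;=\; S_n + E[X_{n+1}] \;=\; S_n,
$$
using that $X_{n+1}$ is independent of $\mathcal{F}_n$ and has mean zero; statements involving a stopping time then typically go through the optional stopping theorem.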
0
votes
1 answer
16 views
Definitions of K-L divergence based on likelihood ratio and on R-N derivative
From Wikipedia
For distributions $P$ and $Q$ of a continuous random variable, KL-divergence is defined to be the integral:
$$
D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^\infty ...
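For reference, the two standard forms being compared are (these are the usual textbook/Wikipedia definitions, filled in here only as background, not taken from the truncated excerpt)
$$
D_{\mathrm{KL}}(P\|Q) = \int_{-\infty}^{\infty} p(x)\,\log\frac{p(x)}{q(x)}\,dx
\qquad\text{and}\qquad
D_{\mathrm{KL}}(P\|Q) = \int \log\frac{dP}{dQ}\,dP,
$$
which agree whenever $P \ll Q$ and both have Lebesgue densities $p,q$, since then $dP/dQ = p/q$ $Q$-almost everywhere.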
4
votes
1 answer
23 views
$L^{p}$ functions from Rudin, Chapter 3, Exercise 5
I am attempting a question from Rudin's "Real and Complex Analysis" Chapter 3 question 5. I shall summarise the question as below: Suppose that $f$ is a complex measurable function on $X$, $\mu$ a ...
0
votes
1 answer
22 views
Conditional Probability with marginal densities
$X$ and $Y$ have the joint density
$f(x,y) = 2x+2y-4xy$ for $0<x<1$ and $0<y<1$,
and $0$ otherwise.
(a) Find the marginal densities of $X$ and $Y$.
I got both marginal densities equal to 1 ...
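A quick check of that marginal computation, assuming the density quoted above:
$$
f_X(x) = \int_0^1 \big(2x + 2y - 4xy\big)\,dy = 2x + 1 - 2x = 1, \qquad 0<x<1,
$$
and by symmetry $f_Y(y)=1$ on $(0,1)$, so both marginals are indeed Uniform$(0,1)$.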
1
vote
1 answer
31 views
Conditional Probability Proof
Suppose that $X$ and $Y$ are independent discrete random variables. Let $h(x,y)$ be a bounded two-variable function. Show that
$$E[h(X,Y)\mid X = x] = E[h(x,Y)].$$
Explain why this is usually not true if $X$ and ...
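A minimal sketch in the discrete setting quoted above:
$$
E[h(X,Y)\mid X=x] \;=\; \sum_y h(x,y)\,P(Y=y \mid X=x) \;=\; \sum_y h(x,y)\,P(Y=y) \;=\; E[h(x,Y)],
$$
where the middle equality uses independence; without independence $P(Y=y\mid X=x)$ need not equal $P(Y=y)$, which is why the identity generally fails.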
1
vote
1 answer
24 views
Conditional Probability Given N=n
Suppose that $N$ is a Poisson$(\mu)$ random variable. Given $N=n$, random variables $X_1,X_2,X_3,\ldots,X_n$ are independent with Uniform$(0,1)$ distribution. So there is a random number of $X$'s.
...
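The actual question is cut off above; for what it is worth, a standard fact about this setup (a uniformly marked Poisson count) that such exercises often use is that the numbers of $X_i$ falling in disjoint subintervals of $(0,1)$ are independent, with
$$
\#\{i \le N : X_i \in (a,b)\} \sim \text{Poisson}\big(\mu\,(b-a)\big), \qquad 0 \le a < b \le 1.
$$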
2
votes
2 answers
48 views
Is there an example of a function, with domain and range both subsets of the real line, that is NOT a Borel function?
Suppose there is a function $f:A\to B$ where $A,\,B\subseteq\mathbb{R}$; is there an example of such a function that is NOT a Borel function?
Well, the question came up when I was reading the ...
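One standard example, in case it is the kind being asked for: the Cantor set has cardinality $2^{\aleph_0}$ and hence $2^{2^{\aleph_0}}$ subsets, while there are only $2^{\aleph_0}$ Borel sets, so some $A\subseteq\mathbb{R}$ is not Borel; its indicator
$$
f = \mathbf{1}_A : \mathbb{R} \to \{0,1\}, \qquad f^{-1}(\{1\}) = A \notin \mathcal{B}(\mathbb{R}),
$$
is then a function between subsets of the real line that is not a Borel function.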
1
vote
1 answer
31 views
Probability of choosing the same number from a set
Suppose we have a set of numbers $\{1,2,\ldots,n\}$ and a set of entities $\{1,2,\ldots,m\}$ (with $n,m$ positive integers).
How can I compute the probability that every entity chooses the same number from the set?
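Under the natural reading that each of the $m$ entities picks a number uniformly at random from $\{1,\ldots,n\}$, independently of the others (the choice mechanism is not stated, so this is an assumption), the computation is
$$
P(\text{all $m$ choices coincide}) \;=\; \sum_{k=1}^{n} \Big(\frac{1}{n}\Big)^{m} \;=\; n\cdot n^{-m} \;=\; n^{1-m}.
$$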
1
vote
1 answer
21 views
$X_n \sim \text{Exponential}(\lambda_n)$, independent, with $\sum 1/\lambda_n = \infty$; then $\sum X_n=\infty$ a.s.
Let $\{X_n\}$ be a sequence of independent Exponential random variables with mean $E(X_n)=\frac{1}{\lambda_n}$, where $0 < \lambda_n < \infty$. If $\sum \frac{1}{\lambda_n} = \infty$,
...
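A hedged sketch of one standard argument, via Laplace transforms: writing $S=\sum_n X_n$, independence and monotone convergence give
$$
E\big[e^{-S}\big] \;=\; \prod_{n} E\big[e^{-X_n}\big] \;=\; \prod_{n} \frac{\lambda_n}{\lambda_n + 1} \;=\; \prod_{n} \frac{1}{1 + 1/\lambda_n},
$$
and $\sum_n 1/\lambda_n = \infty$ forces this product to be $0$ (compare $\log(1+x)$ with $x$), so $e^{-S} = 0$ almost surely, i.e. $S = \infty$ a.s.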
3
votes
2 answers
39 views
A criterion for independence based on Characteristic function
Let $X$ and $Y$ be real-valued random variables defined on the same space. Let's use $\phi_X$ to denote the characteristic function of $X$. If $\phi_{X+Y}=\phi_X\phi_Y$ then must $X$ and $Y$ be ...
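For what it is worth, the standard answer is no, and the usual counterexample is Cauchy (this is a classical fact, not taken from the truncated question): if $X$ is standard Cauchy and $Y = X$, then $\phi_X(t) = e^{-|t|}$ and
$$
\phi_{X+Y}(t) = \phi_{2X}(t) = e^{-2|t|} = \phi_X(t)\,\phi_Y(t),
$$
even though $X$ and $Y$ are as dependent as possible.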