Modern theory of probability is formulated on the footing of measure theory. Use this tag if your question is about this theoretical footing (for example probability spaces, random variables, law of large numbers, central limit theorems, and the like). Use (probability) for explicit computation of ...
1 vote · 1 answer · 14 views
Conditional Probability Given N=n
Suppose that $N$ is a Poisson$(\mu)$ random variable. Given $N=n$, the random variables $X_1,X_2,X_3,\cdots,X_n$ are independent with Uniform$(0,1)$ distribution. So there is a random number of $X$'s.
...
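The excerpt is cut off before the actual question, but the setup lends itself to simulation. A minimal sketch (assuming one is interested in the random sum $S=\sum_{i=1}^{N}X_i$, whose mean is $\mu/2$ by Wald's identity; the value $\mu=4$ is an arbitrary illustrative choice):

```python
import math
import random

random.seed(0)
mu = 4.0
trials = 200_000

def poisson(mu):
    # Knuth's method: multiply uniforms until the product drops below e^{-mu}
    threshold = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

total = 0.0
for _ in range(trials):
    n = poisson(mu)                                   # N ~ Poisson(mu)
    total += sum(random.random() for _ in range(n))   # N iid Uniform(0,1) draws

mean_S = total / trials   # Wald's identity: E[S] = E[N] * E[X_1] = mu / 2
```

With $\mu=4$ the sample mean of $S$ should sit near $2$.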
1 vote · 2 answers · 41 views
Any example of a function having domain and range as subsets of the real line that is NOT a Borel function?
Suppose there is a function $f:A\to B$ where $A,\,B\subseteq\mathbb{R}$; is there any example of such a function that is not a Borel function?
Well, the question came up when I was reading the ...
1 vote · 1 answer · 28 views
probability to choose the same number in a set
Suppose a set of numbers $\{1,2,\ldots,n\}$ and a set of entities $\{1,2,\ldots,m\}$ (with $n,m$ positive integers).
How can I compute the probability that every entity chooses the same number from the set?
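Assuming each entity picks independently and uniformly at random from $\{1,\ldots,n\}$, the probability that all $m$ entities pick the same number is $n\cdot(1/n)^m = n^{1-m}$. A sketch checking the closed form by exact enumeration (the parameter values in the checks are arbitrary):

```python
from fractions import Fraction
from itertools import product

def p_all_same(n, m):
    # exact probability, enumerating all n**m equally likely choice vectors
    same = sum(1 for choices in product(range(n), repeat=m)
               if len(set(choices)) == 1)
    return Fraction(same, n ** m)

def closed_form(n, m):
    # n * (1/n)**m = n**(1 - m)
    return Fraction(1, n ** (m - 1))
```

For example, with $n=6$ numbers and $m=2$ entities the probability is $1/6$.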
1 vote · 1 answer · 20 views
$X_n \sim \text{Exponential}(\lambda_n)$, independent, $\sum 1/\lambda_n = \infty$; then $\sum X_n=\infty$ a.s.
Let $\{X_n\}$ be a sequence of independent Exponential random variables with mean
$$
E(X_n)=\frac{1}{\lambda_n},
$$
where
$$
0 < \lambda_n < \infty.
$$
If
$$
\sum \frac{1}{\lambda_n} = \infty,
...
2 votes · 1 answer · 27 views
A criterion for independence based on Characteristic function
Let $X$ and $Y$ be real-valued random variables defined on the same space. Let's use $\phi_X$ to denote the characteristic function of $X$. If $\phi_{X+Y}=\phi_X\phi_Y$ then must $X$ and $Y$ be ...
1 vote · 1 answer · 10 views
Distribution of the product of two independent and complex Gaussian Random Variables
What is the distribution of the product of two independent and complex Gaussian Random Variables? Assume they both have zero mean.
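A quick way to get a feel for this distribution is Monte Carlo. A sketch (assuming circularly symmetric complex Gaussians with unit variance per real component; by independence and zero means, $E[XY]=E[X]E[Y]=0$, which the sample mean should reproduce):

```python
import random

random.seed(1)

def complex_gauss():
    # zero-mean complex Gaussian: independent N(0,1) real and imaginary parts
    return complex(random.gauss(0, 1), random.gauss(0, 1))

trials = 100_000
acc = 0 + 0j
for _ in range(trials):
    acc += complex_gauss() * complex_gauss()   # product of two independent draws

mean_prod = acc / trials   # should be close to 0, since E[XY] = E[X]E[Y] = 0
```

A histogram of the sampled products can then be compared against any proposed closed-form density.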
1 vote · 1 answer · 31 views
Predictable process and supermartingale
Let $S$ and $T$ be stopping times with $S\leq T$. Define the process $1_{(S,T]}(n,\omega)$ with parameter set $\mathbb{N}$ to be 1 if $S(\omega)<n\leq T(\omega)$, and $0$ otherwise. I am asked to ...
1 vote · 0 answers · 29 views
Interchange supremum and expectation
Let $B_n:=\{f\in L^\infty_+\mid f\le n \}$, where we consider $L^\infty$ with the weak$^*$ topology. I have the following sets
$$D(z):=\{h\in L^0_+\mid h\le Z_T \mbox{ for a }Z\in Z(z)\}$$
where ...
1 vote · 2 answers · 44 views
Probability Game
A game has the following rule -
“Scissors cuts paper. Paper covers rock. Rock crushes lizard. Lizard poisons Spock. Spock smashes scissors. Scissors decapitates lizard. Lizard eats paper. Paper ...
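The rules are cut off mid-quote, but assuming the standard Rock–Paper–Scissors–Lizard–Spock relation (each move beats exactly two others) and independent uniform choices by both players, each player wins with probability 2/5 and ties with probability 1/5. A sketch verifying this by enumeration:

```python
from fractions import Fraction
from itertools import product

# the standard "beats" relation: each move beats exactly two others
beats = {
    "scissors": {"paper", "lizard"},
    "paper":    {"rock", "spock"},
    "rock":     {"lizard", "scissors"},
    "lizard":   {"spock", "paper"},
    "spock":    {"scissors", "rock"},
}
moves = list(beats)

# enumerate all 25 equally likely (player A, player B) move pairs
outcomes = list(product(moves, moves))
p_win = Fraction(sum(b in beats[a] for a, b in outcomes), len(outcomes))
p_tie = Fraction(sum(a == b for a, b in outcomes), len(outcomes))
```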
0 votes · 0 answers · 22 views
Showing that $|X|^p$ is a submartingale
I am trying to get my head around something, and I'm pretty sure the answer is lurking right in front of me.
We're in $(\Omega,\mathcal{F},P,(\mathcal{F}_n))$, and we denote by $(X_n)$ a ...
2 votes · 1 answer · 25 views
Independence and conditional expectation
So, it's pretty clear that for independent $X,Y\in L_1(P)$ (with $E(X|Y)=E(X|\sigma(Y))$), we have $E(X|Y)=E(X)$. It is also quite easy to construct an example (for instance, $X=Y=1$) which shows that ...
1 vote · 0 answers · 26 views
Continuous random sampling with replacement.
Construct a set $s\subseteq[0,1]$ by sampling points in $[0,1]$ with uniform probability density $x\leq1$ so that $|s|=x$. Interpret this as a sampling frame during which data is captured. Now, ...
0 votes · 1 answer · 43 views
Intuition behind conditional expectation when sigma algebra isn't generated by a partition
I'm struggling with the concept of conditional expectation, when the sigma algebra on which it is conditioned isn't generated by a partition.
If $(\Omega,\mathcal{F},P)$ is a probability field such ...
-1 votes · 2 answers · 127 views
probability problem [closed]
The probability that a man who is 85 years old will die before attaining the age of 90 is 1/3. Four persons A1, A2, A3, and A4 are 85 years old. The probability that A1 will die before attaining the age 90 ...
0 votes · 0 answers · 25 views
Generalized Likelihood Ratio Test and Hypothesis Testing
Below is a question from a review sheet on an upcoming final that I am really struggling with. Any help is greatly appreciated!
Let $Y_1, Y_2,...,Y_8$ be a random sample from the uniform ...
1 vote · 0 answers · 31 views
Using the Law of Iterated Logarithm
The law of iterated logarithm says that for a Brownian motion $(B_t)_{t\geq 0}$:
\begin{align}
\limsup_{t\to 0}\frac{|B_t|}{\sqrt{2t\ln\ln\frac{1}{t}}}=1
\end{align}
so we can find $t_0>0$ such ...
0 votes · 1 answer · 15 views
Probability integral transform: Is it integral transform? Can it be for discrete distribution?
From Wikipedia
the probability integral transform or transformation relates to the result that data values that are modelled as being random variables from any given continuous distribution can be ...
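On both points: despite the name, it is not an integral transform in the kernel-operator sense, and for a discrete distribution $F(X)$ is not uniform (it has atoms). For a continuous CDF $F$, the claim $F(X)\sim\text{Uniform}(0,1)$ is easy to check numerically; a sketch using an Exponential distribution (the rate $\lambda=1.5$ is an arbitrary choice):

```python
import math
import random

random.seed(2)
lam = 1.5
n = 100_000

# F(x) = 1 - exp(-lam * x) is the Exponential(lam) CDF; apply F to samples of X
u = [1 - math.exp(-lam * random.expovariate(lam)) for _ in range(n)]

mean_u = sum(u) / n                            # Uniform(0,1) mean is 1/2
below_quarter = sum(v < 0.25 for v in u) / n   # P(U < 1/4) should be 1/4
```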
0 votes · 1 answer · 46 views
Counterexamples for Borel-Cantelli
Our teacher mentioned constructing two counterexamples for Borel–Cantelli in the following ways.
(a) Construct an example with $\sum_{i=1}^{\infty}\mathbb P(A_i)=\infty$ where $\mathbb ...
0 votes · 0 answers · 23 views
Measurability conditions
Maybe you could help me understand the following two things.
(1) Let's consider a partition $A_n$ of a set $\Omega$, so that $\Omega$ is the disjoint union of all $A_n$. $\mathcal{A}=\sigma(A_n:n\in\mathbb ...
0 votes · 0 answers · 20 views
Stuck on proof of a martingale equality (similar to doob's inequality)
The question is: Let $M$ be a positive, continuous martingale that converges a.s. to zero as $t$ tends to infinity. Prove that for every $x > 0$
$$P\{\sup_{t\geq 0} M_t > x \,| \,F_0\} = 1\wedge ...
0 votes · 1 answer · 27 views
Computing $\mathbb{E}[X_1S^4]$
Given $X_1,X_2,\cdots$ i.i.d. random variables with $\mathbb{E}[X_i]=0$. If we are given $S=\sum_{i=1}^{10}X_i$ and the fact that $\mathbb{E}[S^5]=30$ what method do you need to compute ...
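Since the $X_i$ are i.i.d. (hence exchangeable), writing $S^5=\sum_{i=1}^{10}X_iS^4$ gives $\mathbb{E}[S^5]=10\,\mathbb{E}[X_1S^4]$, so $\mathbb{E}[X_1S^4]=30/10=3$. A sketch verifying the symmetry identity by exact enumeration over a small two-point zero-mean distribution (the distribution is my own illustrative choice, not from the question):

```python
from fractions import Fraction
from itertools import product

# two-point zero-mean distribution: P(X = -1) = 2/3, P(X = 2) = 1/3
vals = {-1: Fraction(2, 3), 2: Fraction(1, 3)}
n = 10

e_x1_s4 = Fraction(0)
e_s5 = Fraction(0)
for xs in product(vals, repeat=n):       # all 2**10 outcomes, exactly weighted
    p = Fraction(1)
    for x in xs:
        p *= vals[x]
    s = sum(xs)
    e_x1_s4 += p * xs[0] * s ** 4
    e_s5 += p * s ** 5

# exchangeability of the iid X_i gives E[X_1 S^4] = E[S^5] / n exactly
```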
1 vote · 1 answer · 41 views
Independence of conditional random variables
Assume I have two random variables $A$ and $B$, which are not independent. In my particular case they will be values of a stochastic process at two given points in time, where $A$ is observed at an ...
-2 votes · 0 answers · 34 views
Is the discounted sum of non-negative random values a convergent submartingale?
Consider the random variables $X_{1}, X_{2}, \ldots, X_{n}$ with $\infty > E[X_{n}] >0$ and the discounted sum up to $n$, $S_{n} = \beta^{n} \sum_{i=1}^{n}X_{i}$, $0<\beta<1$. (Q1) Is ...
8 votes · 0 answers · 107 views · +50
An unusual type of linear algebra problem
I've come across the following linear algebra problem while trying to derive something in information theory. I'm looking both for numerical ways to solve this type of problem and for anything ...
1 vote · 0 answers · 25 views
Reverse Hölder Continuity and Hausdorff dimension
Let $f$ be a function on $[0,1]$. Say that $f$ is reverse Hölder continuous of exponent $\beta > 0$ if there is a $C >0$ such that for any $s<t\in [0,1]$, there exists $s',t'\in [s,t]$ such ...
3 votes · 1 answer · 30 views
Application of Strong Law of Large Numbers and Fubini's Theorem
This problem comes from here. I am not looking for help on solving the problem; rather, I want to understand something said in the setup:
Let $F$ be a distribution with $F(0-) = 0$ and $F(1) = 1$. Let ...
1 vote · 0 answers · 17 views
Estimate on Galton-Watson process distribution
Let $(Z_n)_{n\in \mathbb N_0}$ be a Galton-Watson process, i.e.
$$ Z_{n+1} = \sum_{k=1}^{Z_n}\xi_{n,k},\qquad (\xi_{n,k})_{n\in \mathbb N_0,k \in \mathbb N} \quad \text{i.i.d } \mathbb N_0 \text{ ...
0 votes · 1 answer · 61 views
Relation between $\operatorname{Prob}(X+Y)$ and $\operatorname{Prob}(X)+\operatorname{Prob}(Y)$
Let $X,Y$ be two random variables on the same probability space $(\mathbb N_0,\mathcal F, \mathbb P)$.
Under what conditions on $X$ and $Y$ can I say something about the relation between the following ...
3 votes · 1 answer · 46 views
Distribution of a Brownian motion with respect to $\mathbb{P}^x$
Let $(\Omega,\mathcal{A},\mathbb{P})$ a probability space and $(B_t)_{t \geq 0}$ a Brownian motion (started in $x=0$). Then one can define a probability measure $\mathbb{P}^x$, $x \in \mathbb{R}$, on ...
1 vote · 0 answers · 26 views
Interpretation of dP in Radon-Nikodym Theorem
[Radon-Nikodym Theorem]
Let $(\Omega, \Sigma, P)$ be a probability space. Suppose that $(\Omega, \Sigma, \mu)$ is a measure space such that $\mu(A)=0$ implies $P(A)=0$; then there exists a function $f:X ...
1 vote · 0 answers · 60 views
Change of probability measure and a continuous-time Markov chain
Let $(\Omega,\mathcal{F},\mathbb{P},\mathbb{F})$ be a complete filtered probability space, with $W$ a Wiener process and $\alpha$ a continuous-time Markov chain (taking values in $\{1,...,M\}$). We ...
0 votes · 2 answers · 39 views
Conditional Expectation of Exponential Order Statistic $\text{E}(X_{(2)} \mid X_{(1)}=r_1)$
Having already worked out the distributions of $\Delta_{(2)}X=X_{(2)}-X_{(1)}\sim\text{Exp}(\lambda)$ and of $\Delta_{(1)}X=X_{(1)}\sim\text{Exp}(2\lambda)$ where $X_{(i)}$ are the $i$th order ...
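For $n=2$ this follows from memorylessness: given $X_{(1)}=r_1$, the gap $X_{(2)}-X_{(1)}$ is $\text{Exp}(\lambda)$ and independent of $X_{(1)}$, so $\text{E}(X_{(2)} \mid X_{(1)}=r_1)=r_1+1/\lambda$. A simulation sketch checking the two marginal distributions quoted in the question ($\lambda=2$ is an arbitrary choice):

```python
import random

random.seed(3)
lam = 2.0
n = 200_000

gaps, mins = [], []
for _ in range(n):
    a, b = random.expovariate(lam), random.expovariate(lam)
    lo, hi = min(a, b), max(a, b)
    mins.append(lo)        # X_(1) ~ Exp(2*lam)
    gaps.append(hi - lo)   # X_(2) - X_(1) ~ Exp(lam), by memorylessness

mean_gap = sum(gaps) / n   # Exp(lam) has mean 1/lam = 0.5
mean_min = sum(mins) / n   # Exp(2*lam) has mean 1/(2*lam) = 0.25
```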
0 votes · 1 answer · 31 views
independent $L^2$ distance random variable
When you have independent random variables such that the $L^2$ distance between each pair of them is some fixed constant, and the expectation of each of those random variables is zero,
how can I conclude that ...
1 vote · 1 answer · 45 views
problem on sequence of uniform random variable
Let $X_i, i \geq 1,$ be independent uniform (0, 1) random variables, and define $N$ by $$N=\min\{n:X_n < X_{n-1}\}$$ where $X_0 = x$. Let $f(x) = E[N | X_0=x]$
(a) Derive an integral equation for ...
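For part (a), conditioning on $X_1$ gives $f(x)=1+\int_x^1 f(y)\,dy$ (if $X_1<x$ then $N=1$; otherwise $N$ is $1$ plus a fresh copy of the problem started at $X_1$), and differentiating yields $f(x)=e^{1-x}$. A simulation sketch checking this at one point (the choice $x_0=0.5$ is arbitrary):

```python
import math
import random

random.seed(4)

def sample_N(x):
    # N = min{n >= 1 : X_n < X_{n-1}}, with X_0 = x fixed
    prev, n = x, 0
    while True:
        n += 1
        cur = random.random()
        if cur < prev:
            return n
        prev = cur

x0 = 0.5
trials = 200_000
mean_N = sum(sample_N(x0) for _ in range(trials)) / trials

# solving f(x) = 1 + integral_x^1 f(y) dy gives f(x) = e^(1 - x)
expected = math.exp(1 - x0)
```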
0 votes · 0 answers · 31 views
How to compute $p(\pi>\theta_2+N>\theta_1+\pi, \hspace{1em}\theta_1<0, \hspace{1em}\theta_2<\theta_1+\pi)$ for dependent jointly Normal RVs
How would one go about computing:
$p(\pi>\theta_2+N>\theta_1+\pi, \hspace{1em}\theta_1<0, \hspace{1em}\theta_2<\theta_1+\pi)$
given
$\theta_1, \theta_2$ are jointly Normal (mean=0, ...
0 votes · 1 answer · 85 views
Probability theory problem
A bag contains $5$ white and $5$ black balls. A draws $5$ balls retain any that are white and returns any black ones to the bag. B then draws $5$ balls, retains any that are white and returns any ...
0 votes · 0 answers · 16 views
How to calculate the auto-correlation of a BPSK-modulated signal, or the expected value of a complex exponential function [migrated]
How do I calculate the auto-correlation of a BPSK-modulated signal, or the expected value of a complex exponential function, manually (not by using MATLAB or any other software)? For example, if ...
2 votes · 0 answers · 67 views
A representation theorem for a minimally sufficient statistic by Bahadur
The Statement of the Problem
I'd appreciate help in proving the following, unproven theorem from a classic article by Bahadur ([BAH], Theorem 6.3) (the expressions in square brackets are my ...
1 vote · 0 answers · 24 views
Cylindrical sigma algebra answers countable questions only.
I am missing a link in the following (standard) textbook question:
Show that the cylindrical sigma algebra $\mathcal{F}_T$ on $\mathbb{R}^T$ (equals $\bigotimes_{t\in ...
1 vote · 1 answer · 29 views
Cumulative distribution function and probabilities
I've run into a problem with this exercise:
Let $X$ have distribution function:
$F(x)=\begin{cases}
0, & \text{if $x$}<0 \\
\frac{1}{2}x, & \text{if }0\leq x\leq 2, \\
1, & ...
3 votes · 1 answer · 38 views
Find version of conditional expectation
I'm struggling with the concept of conditional expectation. We didn't cover it in my probability theory class, yet it's required for my statistics course.
I'm basically having no idea how to solve any ...
1 vote · 0 answers · 18 views
Upper bound on truncation error of a Fourier series approximation of a pdf?
Given a probability density function, $f\left(x\right)$, of a continuous random variable, $X$, and given an $N$-th order Fourier series approximation:
$$f_N\left(x\right)=\sum_{n=-N}^{N}c_n e^{inx}$$
...
6 votes · 1 answer · 41 views
How can a $\sigma$-algebra be “treated” or computed? Example
My question is: I have a random variable $X:\Omega \rightarrow \mathbb{R}$, the $\sigma$-algebra generated by $X$ is: $\sigma(X) := \{X^{-1}(B), B\in \mathcal{B}(\mathbb{R})\}$.
But, imagine now that ...
1 vote · 1 answer · 34 views
Verify a distribution that is not exponential family
I understand that if the support of a distribution depends on the parameter $\theta$, it is not exponential family even if its pdf can be written in the form $ f(x | \theta) = h(x)c(\theta) \exp\left( ...
1 vote · 1 answer · 25 views
Continuity of conditional expectation in $L_p$
I'm looking at a probability space $(\Omega,\mathcal{F},P)$. Let $1\leq p<\infty$, and let $\mathcal{G}$ be a sub-$\sigma$-algebra of $\mathcal{F}$. I'm then asked to show that, for $X\in L_p(P)$, ...
0 votes · 1 answer · 37 views · +50
To define two random variables that satisfy a condition
First:
Let $X_{1}, X_{2}, X_{3}, X_{4}$ be 4 random variables. Consider the two auxiliary random variables $\tilde{X_{1}}, \tilde{X_{2}}$, where their corresponding alphabets $\tilde{\chi_{1}}, ...
2 votes · 0 answers · 25 views
Gambling Game: Martingales
This is a multipart question; if there's a strong preference for breaking this into separate questions I'll do that.
Imagine a game between a gambler and a croupier.
Total capital in the game is ...
-2 votes · 0 answers · 46 views
Advanced Probability theory
Let $E(Y\mid X)(\omega) = E[Y\mid A_\omega]$, where $A_\omega := \{r\in\Omega : X(r)=X(\omega)\}$. Given the indicator function
$$I_A(\omega) = \begin{cases} 1, & \text{for } \omega \in A, \\ 0, & \text{for } \omega \notin A, \end{cases}$$
show that $E(Y\mid X)(\omega) = 3I_{A^c}(\omega) + 4I_A(\omega)$.
Hint: Use the ...
7 votes · 2 answers · 122 views · +50
Asymptotics of the sum of squares of binomial coefficients
We are trying to estimate the cardinality $K(n,p)$ of the so-called Kuratowski monoid with $p$ positive and $n$ negative linearly ordered idempotent generators. In particular, we are interested in the ...
0 votes · 2 answers · 30 views
Third Axiom of Probability Explanation
I'm reading my book on probability and it explains the 3rd Axiom as follows:
For any sequence of mutually exclusive events $E_1, E_2, ...$ (that is, events for which $E_iE_j = \emptyset$ when $ i \ne ...
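The axiom says that for pairwise disjoint events, the probability of the union equals the sum of the individual probabilities. A minimal finite sketch with a fair die (the particular events are an arbitrary illustrative choice):

```python
from fractions import Fraction

# fair six-sided die: each face carries probability 1/6
P = {face: Fraction(1, 6) for face in range(1, 7)}

def prob(event):
    # probability of an event (a set of faces) as the sum over its outcomes
    return sum(P[f] for f in event)

E1, E2, E3 = {1}, {2, 3}, {5}      # pairwise disjoint (mutually exclusive)
lhs = prob(E1 | E2 | E3)           # P(E1 ∪ E2 ∪ E3)
rhs = prob(E1) + prob(E2) + prob(E3)
```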