The approach to and interpretation of probability associated with Bayes' theorem, usually contrasted with the frequentist approach. It can be seen as an extension of logic that enables reasoning with propositions whose truth or falsity is uncertain. A Bayesian probabilist starts with some prior ...
0
votes
0 answers
19 views
Gibbs sampling to produce posterior pdf
Suppose we have the following classical normal linear regression model:
$$y_i = \beta_1 x_{1i} + \beta_2x_{2i} + \beta_3x_{3i} + e_i$$
where $e_{i} \sim iid.N(0, \sigma^2)$ for all $i = 1, 2, ...
0
votes
0 answers
27 views
Title of my thesis that includes generative Bayesian and optimization techniques
I am struggling to find a single word that would cover both generative Bayesian and optimization techniques. My thesis topic currently reads "Generative Bayesian and ...
1
vote
1 answer
35 views
Revising probability using Bayes' rule
Let's say 100 numbers are picked one by one, with replacement, from the set of numbers between 000 and 999. I'm required to give my subjective probability before the experiment and revise (or ...
1
vote
1 answer
25 views
Bayes' Theorem with joint probability evidence?
If I am trying to compute the probability $P(Z\mid(A,B))$ using Bayes' Theorem, how would I expand the right-hand side, particularly the evidence $P(A,B)$ in the denominator?
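For reference, one standard way to expand the evidence is to treat $(A, B)$ as a single joint observation:

```latex
P(Z \mid A, B) = \frac{P(A, B \mid Z)\,P(Z)}{P(A, B)},
\qquad
P(A, B) = \sum_{z} P(A, B \mid z)\,P(z)
```

If $A$ and $B$ are additionally conditionally independent given $Z$, the likelihood factors further as $P(A, B \mid Z) = P(A \mid Z)\,P(B \mid Z)$.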
0
votes
0 answers
10 views
Standard error of importance sampling estimator
I am currently studying Monte Carlo simulation and, more specifically, independent importance sampling. Consider the following context:
Assume we have some posterior pdf ...
0
votes
2 answers
79 views
+50
Why would I use Bayes' Theorem if I can directly compute the posterior probability?
I fully understand the mechanics of Bayes' Theorem. However, I am wondering when I actually need to use it. If I am able to compute the posterior probability directly from measured data, why would I need to ...
1
vote
0 answers
17 views
Bayesian random walk
Suppose that, at first, I am trying to estimate the mean and standard deviation of some data that I assume to be normally distributed. My prior is Gaussian with mean $\mu_0$ and variance $\sigma^2_0$. ...
4
votes
2 answers
43 views
What is an approach for optimizing the values of a matrix?
My apologies if I get some terminology wrong, I don't have a formal math background; half my problem is articulating what I'm trying to do and identifying the domain of math that deals with this kind ...
1
vote
1 answer
38 views
Can a subjective probability be considered as a proportion of certainty?
In a Bayesian framework, can a subjective probability be considered as a proportion of certainty, or does a proportion only make sense if we are, say, counting the number of "successes" out of the ...
1
vote
1 answer
35 views
Jeffreys prior with redundant parameters
I am trying to derive the Jeffreys prior for models with redundant parameters. For example, consider a single Bernoulli random variable with two parameters, $\theta_0$ (failure probability) and ...
0
votes
0 answers
65 views
Interpretation of Bayesian 95% prediction interval
Assume the following bivariate regression model:
$y_i = \beta x_i + u_i$ where $u_i$ is i.i.d $N(0, \sigma^2 = 9)$ for $i = 1, 2, ..., n$.
Assume a noninformative prior of the form:
$p(\beta) ...
1
vote
1 answer
34 views
Illustrate the invariance property of a noninformative prior
Consider $n$ i.i.d observations from a normal distribution with unknown mean, $\mu$, and unknown variance $\sigma^2$, ie, $y_i \sim i.i.d \ N(\mu, \sigma^2)$ for $i = 1, 2, \cdots, n$.
Let ...
0
votes
0 answers
23 views
Uniform choice for Prior Distribution
My prior function is
$\Phi\left(\mathbf{k}_\ell,W_\ell\right)=\frac{1}{N}\log p\left(\mathbf{k}_\ell,W_\ell\right)$
which is determined once I choose the Bayesian prior parameter likelihood ...
0
votes
1 answer
25 views
Natural conjugate prior for the Bernoulli distribution
Assume we have an i.i.d. sample of $n$ observations from a Bernoulli distribution. That is, $\displaystyle{p(y_i|\theta) = \theta^{y_i}(1-\theta)^{1-y_i}} \ \ \ \ \text{for} \ \ y_i = 0, 1$ and $i = ...
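For reference, the natural conjugate prior for the Bernoulli likelihood is the Beta distribution; a sketch of the update:

```latex
p(\theta) \propto \theta^{\alpha - 1}(1 - \theta)^{\beta - 1}
\;\Longrightarrow\;
p(\theta \mid y_{1:n}) \propto
\theta^{\alpha + \sum_i y_i - 1}\,(1 - \theta)^{\beta + n - \sum_i y_i - 1}
```

that is, the posterior is $\text{Beta}\!\left(\alpha + \sum_i y_i,\; \beta + n - \sum_i y_i\right)$.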
1
vote
1 answer
25 views
Finding the marginal posterior distribution of future prediction, $y_{n+1}$
Assume the following bivariate regression model:
$y_i = \beta x_i + u_i$ where $u_i$ is i.i.d $N(0, \sigma^2 = 9)$ for $i = 1, 2, ..., n$.
Assume a noninformative prior of the form:
$p(\beta) ...
1
vote
1 answer
40 views
Combining conditional probabilities
I've come across a physics paper in which pdf
$$
p(a|b)
$$
is desired, but only
$$
p(a|c)\\
p(c|b)
$$
are known. It is claimed that
$$
p(a|b)=\int p(a|c)p(c|b) dc.
$$
Is this correct in general? I can't ...
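The claimed identity does not hold in general; it requires $p(a \mid c, b) = p(a \mid c)$, i.e. that $a$ and $b$ are conditionally independent given $c$. Under that assumption:

```latex
p(a \mid b) = \int p(a, c \mid b)\,dc
            = \int p(a \mid c, b)\,p(c \mid b)\,dc
            = \int p(a \mid c)\,p(c \mid b)\,dc
```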
1
vote
1 answer
47 views
A question on raining probability using conditional probability
When A predicts rain, the chance of rain is 60%. When B predicts
rain, the chance of rain is also 60%. If A and B both predict rain
(assuming they made their predictions independently), ...
0
votes
1 answer
24 views
Conditional probability of two sequential samples
Suppose we model a distribution over $27$ alphabet symbols with the random variable $\mathbf{X} $, a vector which takes a multinomial distribution, parameterized by $\mathbf{\theta}=(\theta_1, ..., ...
0
votes
1 answer
22 views
Help writing Dirichlet (multidimensional Beta) PDF correctly
I am not getting a PDF when I attempt to express the Dirichlet distribution over the random variable vector $\mathbf{\theta}=(\theta_1, ..., \theta_{27})$. Suppose a total of $2000$ observations on ...
2
votes
1 answer
32 views
Not sure how to solve Bayesian parameter learning problem
I could use some help solving a problem about a Dirichlet prior.
We have a multinomial distribution over an alphabet of 27 symbols parameterized by $\mathbf{\theta}=(\theta_1, ..., \theta_{27})$. We ...
1
vote
1 answer
43 views
Confusing paragraph about Bayesian priors and marginal likelihood
I'm somewhat confused by the following paragraph, taken from Koller's book, on how $P(D)$ is the "probability of seeing this particular data set given our prior beliefs". I would have thought that ...
1
vote
1 answer
39 views
Bayesian inference of the true prior distribution, given posterior distribution
I am reading about Bayesian inference. One book (DeGroot) discusses how different prior distributions can change the posterior distribution. Prior distributions are assumptions based on the ...
1
vote
1 answer
26 views
Question regarding Bayesian VAR
Reference: http://support.sas.com/rnd/app/da/new/801ce/ets/chap4/sect30.htm.
So there is a VAR equation that is to be treated in Bayesian way:
$\mathbb{y} = (X \otimes I_k)\beta + e$ where $\beta$ ...
1
vote
1 answer
24 views
Research and application of causal inference
I have been reading Pearl's book to understand how Bayesian networks and causal discovery might work. Other than Pearl, I haven't yet found a rigorous, systematic approach to causal inference from ...
2
votes
2 answers
31 views
Posterior Distribution of a Prior Variable
Let $X_{1},\dots,X_{n}$ be a random sample from an exponential distribution with density $f(x;\theta)=\theta e^{-\theta x}$, $x>0$ (having mean $1/\theta$). Assume a prior density for $\theta$ ...
0
votes
0 answers
25 views
Trouble reading multinomial naive bayes notation
$C_m$: m = most likely class (wanted to write C subscript MAP for "maximum a posteriori" but couldn't do MAP with MathJax)
...
-1
votes
1 answer
68 views
How to prove that if $P(A\mid B)>P(A)$ then $P(B\mid A)>P(B)$ [closed]
How to prove that if $P(A\mid B)>P(A)$ then $P(B\mid A)>P(B)$?
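Assuming $P(A) > 0$ and $P(B) > 0$, the claim follows from the symmetry of the joint probability:

```latex
P(A \mid B) > P(A)
\iff P(A \cap B) > P(A)\,P(B)
\iff \frac{P(A \cap B)}{P(A)} > P(B)
\iff P(B \mid A) > P(B)
```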
1
vote
0 answers
20 views
Bayesian updating of multivariate normal?
Let $\bf x$ be an unobserved realization of $\tilde{\bf x}\sim\mathcal{N}(\pmb\mu,\pmb\Sigma)$, where $\pmb\mu\equiv\begin{bmatrix}\mu_1\\\mu_2\end{bmatrix}$ and ...
1
vote
2 answers
44 views
Hypothetical probability question
I found this question on an online University of Washington course assignment related to Bayesian Probabilities:
You've lost contact with your safari leader and now you find yourself confronted by a ...
0
votes
0 answers
42 views
Optimality proof for greedy algorithm
Let $\mathcal{A} = \{a_1, \ldots, a_N\}$ be a set of actions that can be performed on a system $S$. Each action $a_i$, if performed, produces a gain $g_{a_i}(S)$. Moreover, the actions in ...
0
votes
0 answers
25 views
Bayesian Updating with Gaussian Signals
My question relates to a standard Bayesian result.
Let $A$ be some unknown parameter normally distributed with mean $\mu$ and variance $\sigma^2$.
If we observe $X = A + \epsilon$ where $\epsilon$ ...
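A minimal sketch of the standard result, assuming $\epsilon \sim N(0, s^2)$ independently of $A$ (the noise variance `s2`, the function name, and the example numbers are placeholders, not from the question): the posterior of $A$ given $X = x$ is again normal, and precisions (inverse variances) add.

```python
def normal_update(mu, sigma2, x, s2):
    """Posterior mean and variance of A ~ N(mu, sigma2)
    after observing X = A + eps with eps ~ N(0, s2)."""
    post_var = 1.0 / (1.0 / sigma2 + 1.0 / s2)     # precisions add
    post_mean = post_var * (mu / sigma2 + x / s2)  # precision-weighted average of prior mean and observation
    return post_mean, post_var

# Equal prior and noise precision: posterior mean lands halfway between mu and x.
print(normal_update(mu=0.0, sigma2=1.0, x=2.0, s2=1.0))  # (1.0, 0.5)
```

The posterior mean is a convex combination of the prior mean and the observation, weighted by their precisions, which is the usual "shrinkage" reading of this update.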
0
votes
1 answer
61 views
Why does $p(\textbf{x} \mid X) = E_{\theta}[p(\textbf{x}\mid\theta)]$ hold?
I'm reading about Bayesian inference and there is one derivation I don't understand or see (from my book):
$\textbf{x} = (x_1, ..., x_n)$ is $n$-dimensional vector, $X = (\textbf{x}_1, ..., ...
2
votes
0 answers
32 views
Computing evidence for least-squares fit
I'm at a loss trying to implement Bayesian model selection for standard least-squares polynomial fits.
I have three polynomials of order $1$, $2$, and $3$, and a sequence of $(x,y)$ data points. ...
3
votes
1 answer
50 views
Maximum Entropy Distribution When Mean and Variance are Not Fixed with Positive Support
I know when the mean and variance of $\ln x$ are both fixed, then the maximum entropy probability distribution is lognormal. When the mean of a random variable is fixed the MEPD is the exponential ...
0
votes
0 answers
47 views
How to calculate a Bayesian Inference over a Poisson Binomial Distribution
In relation to this question, how do I use Bayesian inference over a Poisson Binomial Distribution? If possible, what is the Conjugate Prior?
Thanks to @Stijn, here is an elaboration of the problem:
...
4
votes
3 answers
97 views
Is there any research field dedicated to estimating a “game” itself in game theory?
Game theory usually specifies how a "game" works and then tries to figure out solutions, but I am wondering if there is any research field dedicated to estimating the full rules of a game. So ...
2
votes
1 answer
25 views
Determine which parameter has correlation with result and which is not
Sorry for the probably silly question; it's the first time I have needed to do this kind of work.
I have a large data set regarding clicks on some element of a web page. It contains some characteristics of such ...
2
votes
2 answers
95 views
Comparing uniform priors
The background of the problem is this:
Assume that we have a parameter vector $\Theta$ which satisfies $\Theta^\prime\Theta=1$. If we let this vector have the uniform prior, the density of the prior ...
0
votes
1 answer
31 views
Checking independence of variables in a Bayesian network
I need a little help with Bayesian Networks. Consider given the following network (all variables are binary) and we need to check conditional independence of $A$ and $C$ if $X$ and $Z$ are given.
Any ...
3
votes
1 answer
81 views
How to do Bayesian updating on biased information?
You have a coin that you can flip, but you can't see. It's a weighted $3$-sided coin taken uniformly at random from some small known collection of $100$ weighted coins. However, we don't know how ...
0
votes
1 answer
116 views
Bayes rule with multiple conditions
I am wondering how I would apply Bayes rule to expand an expression with multiple variables on either side of the conditioning bar.
In another forum post, for example, I read that you could expand ...
3
votes
1 answer
42 views
Implied prior with relationship $y=\text{arccot}(x)$
I'm trying to solve an exercise, which I think I have almost managed to solve but not quite. Any help would be appreciated!
So, what we have is a vector which we obtain by norming the vector ...
0
votes
1 answer
43 views
Find the posterior distribution: prior $\text{Exp}(1)$, likelihood $\text{Poisson}(\lambda)$
I have a prior $\lambda \sim \text{Exp}(1)$ and a likelihood $X \sim \text{Poisson}(\lambda)$, and in a sample of $n=5$ I observed a mean of $3$. What is the posterior distribution of $\lambda$?
Here is my ...
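A sketch of the conjugate update, assuming the $\exp(1)$ prior is read as $\text{Gamma}(\text{shape}=1, \text{rate}=1)$ (the function name below is a made-up helper): with an i.i.d. $\text{Poisson}(\lambda)$ likelihood, the posterior is again Gamma, with the shape increased by the sample sum and the rate by $n$.

```python
def poisson_gamma_posterior(prior_shape, prior_rate, n, sample_sum):
    """Conjugate Gamma update for n iid Poisson(lambda) observations.

    Posterior is proportional to lambda^(shape + sum - 1) * exp(-(rate + n) * lambda),
    i.e. Gamma(shape + sum, rate + n) in the rate parameterization.
    """
    return prior_shape + sample_sum, prior_rate + n

# The question's data: n = 5 observations with sample mean 3, so the sum is 15.
shape, rate = poisson_gamma_posterior(prior_shape=1, prior_rate=1, n=5, sample_sum=15)
print(shape, rate)  # posterior is Gamma(16, rate 6)
```

Note the rate (not scale) parameterization: the posterior mean $16/6 \approx 2.67$ sits between the prior mean $1$ and the sample mean $3$, as expected.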
4
votes
1 answer
109 views
Is there an introduction to probability and statistics that balances frequentist and bayesian views?
Perhaps, roughly, I might be described as advanced undergraduate regarding mathematics. However, I have not learned statistics and have only learned elementary probability. Does there exist a book or ...
0
votes
0 answers
67 views
Bayesian computational problem with R
I can't understand the steps needed to solve this kind of exercise involving the approximation of the posterior density.
Consider this exercise taken from Bayesian computation with R:
We ...
0
votes
1 answer
19 views
Regular Conditional Bayesian Experiment
In "Elements of Bayesian Statistics" (1990), Florens, Mouchart and Rolin describe two basic forms of reduction of a Bayesian experiment: Marginalization and Conditioning (Ch. 1). I don't understand ...
0
votes
0 answers
52 views
Estimating the radius of a circle
I have a circle with radius $r$. I want to test the hypothesis that $r \leq 2$ vs. $r > 2$ based on the posterior of $r$. $r$ follows the prior distribution: $f(r) = \frac{2}{r^{2}}$, $r > 0.5$. ...
2
votes
0 answers
69 views
Probability distribution for a digit of a number
Someone chooses a digit $\alpha$ and a digit $\beta$ independently. Each can be in $0, 1, \ldots, 9$. So $\mu = \alpha \beta$ (e.g. if $\alpha = 5$ and $\beta = 3$ then $\mu = 53$). And I observe a ...
3
votes
1 answer
74 views
Hint with a Bayes' rule problem
The pirate Captain Queequeg has a lazy crew and suspects they are planning to stage a mutiny. Captain Queequeg's solution is to have every member of the crew roll Queequeg's lucky die. If the roll is ...
0
votes
0 answers
19 views
Bayesian Parameter Estimation Doubt
I was going through a pattern recognition book and in the chapter of Bayesian Parameter Estimation I came across this formula. I cannot understand how the 2nd line is derived from the first line. ...