
1 vote · 1 answer · 26 views

Entropy vs predictability vs encodability

Imagine a guessing game in which a series of binary symbols is presented and a human must decide quickly whether each symbol is the same as the previous one or different. There's a property of the ...

0 votes · 0 answers · 60 views · +50 bounty

Replace a continuous probability distribution with a discrete one

Say one wants to fit a curve $f(x)$ to a set of noisy data points $(x_i, y_i)$. If the error for each point $y_i$ is assumed to be normally distributed with variance $\sigma_i^2$, one wants to find ...
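
The truncated sentence presumably heads toward the maximum-likelihood objective, which for independent Gaussian errors reduces to weighted least squares: minimize $\sum_i (y_i - f(x_i))^2/\sigma_i^2$. A minimal sketch of that fit, assuming a hypothetical exponential model and synthetic data (the model, data, and parameter names are illustrative, not from the question):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model; the point is the sigma argument, which makes curve_fit
# minimize the chi-square objective sum_i (y_i - f(x_i))^2 / sigma_i^2.
def f(x, a, b):
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)
x = np.linspace(0, 4, 40)
sigma = 0.05 + 0.1 * rng.random(40)              # per-point noise level sigma_i
y = f(x, 2.0, 1.3) + sigma * rng.standard_normal(40)

params, cov = curve_fit(f, x, y, sigma=sigma, absolute_sigma=True)
print(params)                  # estimates of (a, b)
print(np.sqrt(np.diag(cov)))   # their standard errors
```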

1 vote · 1 answer · 24 views

Shannon inequalities

I am having some difficulty showing the relationship between the conditional mutual information $I(X; Y \mid Z)$ and $I(X; Y)$. Which one is larger?
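
Neither inequality holds in general: conditioning can create or destroy dependence. A small numerical sketch (all helper functions are ad hoc, written for this example) exhibiting both directions with binary variables:

```python
import numpy as np
from itertools import product

def H(p):
    """Shannon entropy in bits of a flattened probability array (zeros ignored)."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def I(p):
    """I(X;Y) from a joint p[x,y,z], marginalizing out Z."""
    pxy = p.sum(axis=2)
    return H(pxy.sum(axis=1)) + H(pxy.sum(axis=0)) - H(pxy)

def I_given_z(p):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    return H(p.sum(axis=1)) + H(p.sum(axis=0)) - H(p) - H(p.sum(axis=(0, 1)))

# Case 1: X, Y independent fair bits, Z = X xor Y -> conditioning creates dependence.
xor = np.zeros((2, 2, 2))
for x, y in product([0, 1], repeat=2):
    xor[x, y, x ^ y] = 0.25

# Case 2: X = Y = Z, one fair bit copied -> conditioning destroys dependence.
copy = np.zeros((2, 2, 2))
copy[0, 0, 0] = copy[1, 1, 1] = 0.5

print(I(xor), I_given_z(xor))    # 0.0, 1.0
print(I(copy), I_given_z(copy))  # 1.0, 0.0
```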

2 votes · 0 answers · 27 views

Computing relative entropy?

I am doing a project for my CS class and I was wondering if the following would work. I have 50 different people who have rated the same 50 books. The rating system is as follows: negative 5 = hate ...
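
If the idea is to compare two raters (or one rater against the group) via relative entropy, one concrete pipeline is to turn each rater's scores into a normalized histogram over the 11 rating values and compute $D(p \| q)$. A sketch with synthetic ratings (the data here is random, purely to show the mechanics); note that KL is asymmetric and blows up wherever $q$ is zero but $p$ is not, hence the smoothing:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D(p || q) in bits between two discrete distributions over the same bins."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p = p / p.sum()
    q = (q + eps) / (q + eps).sum()   # smooth q so the ratio stays finite
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# Hypothetical example: two raters' scores in {-5,...,5} turned into 11-bin histograms.
rng = np.random.default_rng(0)
a = rng.integers(-5, 6, size=50)
b = rng.integers(-5, 6, size=50)
bins = np.arange(-5, 7) - 0.5          # bin edges centered on each integer rating
pa, _ = np.histogram(a, bins=bins)
pb, _ = np.histogram(b, bins=bins)
print(kl_divergence(pa, pb))
```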

1 vote · 0 answers · 27 views

Convergence of discrete random variables with finite entropy

Let $Z$ be the set of discrete random variables on some probability space. Define the quantity $d(X_1,X_2)=h(X_1 \mid X_2)+h(X_2 \mid X_1)$ between two random variables $X_1, X_2 \in Z$. For $X \in Z$ ...
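
For concreteness, $d(X_1,X_2) = h(X_1 \mid X_2) + h(X_2 \mid X_1)$ can be computed directly from a joint pmf via $d = 2H(X_1,X_2) - H(X_1) - H(X_2)$; it vanishes exactly when each variable determines the other. A small sketch (hypothetical joint distributions, writing $H$ for the question's $h$):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a flattened probability array (zeros ignored)."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def d(pxy):
    """d(X1,X2) = H(X1|X2) + H(X2|X1) = 2*H(X1,X2) - H(X1) - H(X2)."""
    return 2 * H(pxy) - H(pxy.sum(axis=1)) - H(pxy.sum(axis=0))

# Hypothetical joints: perfectly dependent vs. independent fair bits.
equal = np.array([[0.5, 0.0], [0.0, 0.5]])   # X2 = X1, so d = 0
indep = np.full((2, 2), 0.25)                # independent, d = H(X1) + H(X2) = 2
print(d(equal), d(indep))
```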

3 votes · 0 answers · 69 views

Which takes more energy: Shuffling a sorted deck or sorting a shuffled one?

You have an array of length $n$ containing $n$ distinct elements. You have access to a comparator on the elements (a black-box function that takes $a$ and $b$ and returns true if $a < b$, false ...
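
The physics half of this (Landauer's principle) isn't something code can settle, but the information-theoretic half is easy to illustrate: identifying one of $n!$ permutations requires at least $\log_2 n!$ bits of comparator output, and a comparison sort on already-sorted input gets away with far fewer comparisons than on shuffled input. A sketch using Python's built-in sort with a counting comparator:

```python
import math
import random
from functools import cmp_to_key

def count_comparisons(arr):
    """Sort a copy of arr and return the number of comparator calls made."""
    calls = 0
    def cmp(a, b):
        nonlocal calls
        calls += 1
        return -1 if a < b else (1 if a > b else 0)
    sorted(arr, key=cmp_to_key(cmp))
    return calls

n = 1024
print(math.log2(math.factorial(n)))                   # lower bound: ~8769 bits
print(count_comparisons(list(range(n))))              # sorted input: about n-1 calls
print(count_comparisons(random.sample(range(n), n)))  # shuffled input: ~n log n calls
```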

0 votes · 0 answers · 28 views

Convexity of the product of two entropy-like functions

Consider the functions $T_p(q)= \sum_i q_i^p$, where $p>1$ and $q$ is a finite-dimensional vector satisfying $\sum_i q_i = 1$, $q_i > 0$ (i.e., a probability mass function). In information-theoretic terms, ...
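
Each $T_p$ with $p>1$ is convex on the simplex (a sum of convex functions $q_i \mapsto q_i^p$), but a product of convex functions need not be convex, so the question doesn't answer itself. A randomized midpoint-convexity test is a cheap first probe: finding a violation would settle the question negatively, while finding none is merely suggestive. A sketch (the exponents 2 and 3 and dimension 4 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def T(q, p):
    """T_p(q) = sum_i q_i^p on the probability simplex."""
    return float(np.sum(q ** p))

def F(q, p1=2.0, p2=3.0):
    """Product of two entropy-like functions, T_p1(q) * T_p2(q)."""
    return T(q, p1) * T(q, p2)

# Randomized midpoint-convexity test: is F((a+b)/2) <= (F(a)+F(b))/2?
violations = 0
for _ in range(20000):
    a = rng.dirichlet(np.ones(4))
    b = rng.dirichlet(np.ones(4))
    if F(0.5 * (a + b)) > 0.5 * (F(a) + F(b)) + 1e-12:
        violations += 1
print(violations)   # any nonzero count is a concrete counterexample to convexity
```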

2 votes · 1 answer · 23 views

Do the $p$-th powers of $p$-norms define the same partial ordering on the set of all probability distributions for all $p>1$?

Consider the $p$-th power of the Schatten $p$-norm $\|q\|_p$ of a probability distribution $q$, i.e., the function $\sum_j q_j^p$, where $\sum_j q_j = 1$ and $q_j \geq 0$. For fixed $q$ and $p>1$ ...
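
A quick numerical check suggests the answer is no: the ordering induced by $\sum_j q_j^p$ can flip as $p$ grows. A sketch with a hand-picked pair of distributions:

```python
import numpy as np

def T(q, p):
    """The p-th power of the p-norm of a pmf q: sum_j q_j^p."""
    return float(np.sum(np.asarray(q, float) ** p))

q = [0.5, 0.5, 0.0]
r = [0.6, 0.2, 0.2]
for p in (2, 10):
    print(p, T(q, p), T(r, p), T(q, p) > T(r, p))
# p = 2:  T(q) = 0.500  > T(r) = 0.440
# p = 10: T(q) ~ 0.00195 < T(r) ~ 0.00605  -> the ordering flips
```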

0 votes · 1 answer · 30 views

i.i.d. binary random variable question

Suppose there are i.i.d. binary random variables $X_i \sim X$ with distribution $P(X=1) = 0.75$ and $P(X=0) = 0.25$. (i) For $n=5$ and $\epsilon=0.1$, which sequences fall in the typical set $A_\epsilon^n$? What is ...
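
Under the usual Cover–Thomas definition $A_\epsilon^n = \{x^n : |-\frac{1}{n}\log_2 p(x^n) - H(X)| \le \epsilon\}$, membership depends only on the number of ones, so the $2^5$ sequences are easy to enumerate. A sketch assuming that definition (for this source $H(X)\approx 0.811$ bits, and the enumeration finds exactly the five sequences with four ones):

```python
from itertools import product
from math import log2

p1, n, eps = 0.75, 5, 0.1
H = -(p1 * log2(p1) + (1 - p1) * log2(1 - p1))   # ~ 0.811 bits

typical = []
for seq in product([0, 1], repeat=n):
    k = sum(seq)                                  # number of ones
    logp = k * log2(p1) + (n - k) * log2(1 - p1)  # log2 of the sequence probability
    if abs(-logp / n - H) <= eps:
        typical.append(seq)

print(len(typical))   # 5: the sequences with exactly four 1s and one 0
print(sum(p1 ** sum(s) * (1 - p1) ** (n - sum(s)) for s in typical))  # ~ 0.396
```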

1 vote · 0 answers · 74 views

Why is the negative entropy Lipschitz with respect to the $1$-norm (over the probability simplex)?

Let $\|x\| = \sum_{i=1}^{n}\left|x^i\right|$ and $d(x)=\sum_{i=1}^{n}x^i\ln x^i$, where $x\in \mathbb{R}^n$ and $\sum_{i=1}^{n}x^i=1$. How does one prove that for all $x, x'$, $$\left| ...

0 votes · 1 answer · 58 views

Approximating probability of success of Bernoulli trials using Kullback–Leibler divergence

In "Probabilistic Graphical Models" book by Daphne Koller and Nir Friedman they have the following approximation of probability of r successful outcomes of N Bernoulli trials: $P(S_N=r)\approx ...

2 votes · 1 answer · 81 views

Definition of entropy

I have a question regarding the definition of entropy as the expected value of the random variable $\log \frac{1}{p(X)}$: $H(X) = E \log \frac{1}{p(X)}$, where $X$ is drawn according to the probability ...
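
The definition just says: entropy is the average of the "surprise" $\log_2 \frac{1}{p(x)}$, weighted by $p$ itself. A three-symbol sketch with a hypothetical pmf:

```python
from math import log2

p = {"a": 0.5, "b": 0.25, "c": 0.25}   # hypothetical pmf

# H(X) = E[log2 1/p(X)]: average the surprise of each outcome under p itself.
H = sum(px * log2(1 / px) for px in p.values())
print(H)  # 1.5 bits
```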

1 vote · 1 answer · 88 views

Bits in a coin-toss experiment

This is not homework but an actual problem. We flip a fair coin ten times; this gives $A_1$ to $A_{10}$. Ten coin tosses = 10 bits. We flip another fair coin ten times; this gives $B_1$ to ...

0 votes · 0 answers · 61 views

Proof for the upper bound on entropy $H(S)$?

I was trying to prove the upper bound on $H(S)$ using the inequalities $\ln(x)\le x-1$ and $\ln(1/x)\ge 1-x$ for an independent and memoryless source with symbols $s_1,\dots,s_q$. I am trying to prove ...
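
Assuming the target is the usual bound $H(S) \le \log_2 q$ for a $q$-symbol memoryless source with probabilities $p_1,\dots,p_q$, the inequality $\ln(x)\le x-1$ applied to $x = 1/(p_i q)$ does the whole job:

$$H(S)-\log_2 q=\sum_{i=1}^{q}p_i\log_2\frac{1}{p_i q}=\frac{1}{\ln 2}\sum_{i=1}^{q}p_i\ln\frac{1}{p_i q}\le\frac{1}{\ln 2}\sum_{i=1}^{q}p_i\left(\frac{1}{p_i q}-1\right)=\frac{1}{\ln 2}\left(\sum_{i=1}^{q}\frac{1}{q}-\sum_{i=1}^{q}p_i\right)=\frac{1}{\ln 2}\,(1-1)=0,$$

with equality if and only if $p_i = 1/q$ for all $i$, i.e., for the uniform source.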

0 votes · 1 answer · 51 views

Entropy Problem: mutual information

I have a problem about entropy and mutual information that I have attempted, but would like feedback on. 30% Boas, 20% Anacondas, 50% Cobras. Half of the Cobras were medium sized, and the other half were ...
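
As a check on the first step, the entropy of the species distribution alone is computable directly (a sketch; the medium/large split of the Cobras would refine this into a joint or conditional entropy once the truncated details are known):

```python
from math import log2

species = {"Boa": 0.30, "Anaconda": 0.20, "Cobra": 0.50}

# H = -sum p log2 p over the species marginal.
H = -sum(p * log2(p) for p in species.values())
print(H)  # ~ 1.485 bits
```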
