[Solved]: Showing that the entropy of i.i.d. random variables is the sum of entropies

Problem Detail: The Shannon entropy of a random variable $Y$ (with possible outcomes $\Sigma=\{\sigma_{1},\dots,\sigma_{k}\}$) is given by $H(Y)=-\sum\limits_{i=1}^{k}P(Y=\sigma_{i})\;\log(P(Y=\sigma_{i}))$. For a second random variable $X=X_{1}X_{2}\dots X_{n}$, where all $X_{i}$'s are independent and identically distributed (each $X_{i}$ is a copy of the same random variable $Y$), show that $H(X)=\sum\limits_{i=1}^{n}H(X_{i})=n\cdot H(Y)$.
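
A minimal sketch of the standard argument, assuming $H(X)$ means the joint entropy of the string $X$ over outcome tuples $(\sigma_{i_{1}},\dots,\sigma_{i_{n}})\in\Sigma^{n}$ (the tuple indices $i_{1},\dots,i_{n}$ are notation introduced here, not from the original statement). Independence lets the joint probability factor into a product, and the logarithm turns that product into a sum:

$$
\begin{aligned}
H(X) &= -\sum_{i_{1},\dots,i_{n}=1}^{k}\Big(\prod_{m=1}^{n}P(X_{m}=\sigma_{i_{m}})\Big)\log\!\Big(\prod_{j=1}^{n}P(X_{j}=\sigma_{i_{j}})\Big)\\
&= -\sum_{j=1}^{n}\,\sum_{i_{1},\dots,i_{n}=1}^{k}\Big(\prod_{m=1}^{n}P(X_{m}=\sigma_{i_{m}})\Big)\log P(X_{j}=\sigma_{i_{j}})\\
&= -\sum_{j=1}^{n}\,\sum_{i_{j}=1}^{k}P(X_{j}=\sigma_{i_{j}})\log P(X_{j}=\sigma_{i_{j}})
= \sum_{j=1}^{n}H(X_{j}) = n\cdot H(Y),
\end{aligned}
$$

where the third line uses $\sum_{i=1}^{k}P(X_{m}=\sigma_{i})=1$ to sum out every coordinate other than $j$, and the final equality uses that each $X_{j}$ is distributed as $Y$.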