Carlos Iguaran (https://bayes.club/@charleemos)
Given independent random variables $X_1,...,X_k$ where $X_i \sim Poisson(\lambda_i)$, then:
$$ X_1+...+X_k \sim Poisson(\lambda_1+...+\lambda_k) $$
This result makes intuitive sense. Each event class $X_i$ has an expected number of occurrences $\lambda_i$ in a given period of time, so the expected total number of events across all classes in that period must be the sum of the expected occurrences of each class, $\lambda_1 + ... + \lambda_k$.
Given the above, the PMF of the sum is:
$$ p(X_1+...+X_k=n) = \frac{(\sum \lambda_i)^n e^{-\sum \lambda_i}}{n!} $$
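As a quick sanity check, here is a minimal simulation sketch (the rates $\lambda = (1.0, 2.5, 0.5)$ are arbitrary example values, not anything from the derivation) comparing the empirical distribution of the sum against the $Poisson(\sum_i \lambda_i)$ PMF:

```python
# Sanity check of the superposition result: the empirical distribution
# of X_1 + ... + X_k should match Poisson(sum of the rates).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lambdas = np.array([1.0, 2.5, 0.5])  # arbitrary example rates

# Draw 100k independent Poisson vectors and sum their components.
sums = rng.poisson(lambdas, size=(100_000, len(lambdas))).sum(axis=1)

for n in range(8):
    empirical = np.mean(sums == n)
    exact = stats.poisson.pmf(n, lambdas.sum())
    print(f"n={n}: empirical {empirical:.4f} vs Poisson(sum) pmf {exact:.4f}")
```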
Let us write $\vec{X} = (X_1,...,X_k)$.
Given independence, the joint PMF of $\vec{X}$ is:
$$ p(\vec{X} = \vec{x}) = \prod_k \frac{e^{-\lambda_k} \lambda_k^{x_k}}{x_k!} $$
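For concreteness, the factorization can be checked numerically at an assumed point, say $\vec{x} = (2, 1, 0)$, with the same example rates as above:

```python
# By independence, the joint PMF is just a product of marginal Poisson pmfs.
import numpy as np
from scipy import stats
from scipy.special import factorial

lambdas = np.array([1.0, 2.5, 0.5])  # arbitrary example rates
x = np.array([2, 1, 0])              # an assumed evaluation point

via_scipy = np.prod(stats.poisson.pmf(x, lambdas))
by_hand = np.prod(np.exp(-lambdas) * lambdas**x / factorial(x))
print(via_scipy, by_hand)  # the two agree
```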
We want to see that $\vec{X} \mid \sum_i X_i = n \sim Mult(n,\pi)$ with $\pi_i = \frac{\lambda_i}{\sum_j \lambda_j}$.
By Bayes' theorem we know that:
$$ p(\vec{X}|n) = \frac{p(n|\vec{X})p(\vec{X})}{p(n)} = \begin{cases} 0 & \sum x_i \neq n \\ \frac{p(\vec{X})}{p(n)} & \text{otherwise} \end{cases} $$
since $n$, as a random variable, is completely determined once $\vec{x}$ is fixed: $p(n|\vec{X}=\vec{x})$ equals $1$ when $\sum x_i = n$ and $0$ otherwise.
$$ \frac{p(\vec{x})}{p(n)} = \frac{\prod_k \frac{e^{-\lambda_k} \lambda_k^{x_k}}{x_k!}}{\frac{(\sum_k \lambda_k)^n e^{-\sum_k \lambda_k}}{n!}} = \frac{n!}{\prod_k x_k!} \cdot \frac{\prod_k e^{-\lambda_k} \lambda_k^{x_k}}{(\sum_k \lambda_k)^n e^{-\sum_k{\lambda_k}}} $$
Given that $n = \sum_k x_k$, we can split $(\sum_i \lambda_i)^n = \prod_k (\sum_i \lambda_i)^{x_k}$ and get:
$$ \frac{\prod_k e^{-\lambda_k} \lambda_k^{x_k}}{(\sum_i \lambda_i)^n e^{-\sum_i{\lambda_i}}} = \frac{\prod_k e^{-\lambda_k}}{e^{-\sum_i \lambda_i}} \cdot \frac{\prod_k \lambda_k^{x_k}}{\prod_k(\sum_i \lambda_i)^{x_k}} = \frac{\prod_k e^{-\lambda_k}}{\prod_k e^{-\lambda_k}} \prod_k \pi_k^{x_k} = \prod_k \pi_k^{x_k} $$
Putting the pieces together:
$$ p(\vec{X}|n) = \frac{n!}{\prod_k x_k!} \prod_k \pi_k^{x_k} $$
which is exactly the $Mult(n, \pi)$ PMF.
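The conditional claim can also be checked by brute force: simulate the Poisson vector, keep only the draws whose components sum to a fixed $n$, and compare against the multinomial PMF. (Again, the rates and the conditioning value $n = 4$ are arbitrary choices for illustration.)

```python
# Monte Carlo check: conditioned on the components summing to n, the
# Poisson vector should be distributed Mult(n, pi) with pi_i = lambda_i / sum.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lambdas = np.array([1.0, 2.5, 0.5])  # arbitrary example rates
pi = lambdas / lambdas.sum()
n = 4                                # assumed conditioning value

draws = rng.poisson(lambdas, size=(200_000, len(lambdas)))
conditioned = draws[draws.sum(axis=1) == n]

# Compare the empirical frequency of one conditioned outcome to Mult(n, pi).
x = np.array([2, 1, 1])
empirical = np.mean((conditioned == x).all(axis=1))
exact = stats.multinomial.pmf(x, n=n, p=pi)
print(f"empirical {empirical:.4f} vs multinomial pmf {exact:.4f}")
```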