This thesis is devoted to the simultaneous estimation of the means of p > 1 independent Poisson distributions. A novel loss function is introduced that penalizes bad estimates of each of the individual means as well as of the sum of the means. Under this loss function, a class of minimax estimators that uniformly dominate the MLE is derived. This class is shown to be minimax and uniformly dominating under the commonly used weighted squared error loss function as well. Estimators in this class can be fine-tuned to limit shrinkage away from the MLE, thereby avoiding implausible estimates of means anticipated to be larger than the others. Further light is shed on this new class of estimators by showing that it can be derived by Bayesian and empirical Bayesian methods. Moreover, a class of prior distributions is derived for which the Bayes estimators are minimax and dominate the MLE under the new loss function. Estimators that shrink the observations towards other points in the parameter space are also derived, and their performance is compared to that of similar estimators previously studied in the literature. The most important finding of the thesis is the aforementioned class of estimators, which provides the statistician with a convenient way of compromising between two conflicting desiderata (good total and individual risk) when estimating an ensemble of Poisson means.
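
To make the kind of loss function described above concrete, one illustrative possibility (this specific form is a sketch, not necessarily the loss actually used in the thesis) combines the usual weighted squared error terms for the individual means with a penalty on the error of the estimated total, with a tuning constant k controlling the compromise between the two:

```latex
% Illustrative sketch only: a loss of the general kind described in the
% abstract, not necessarily the thesis's exact definition. The first term
% is the standard weighted squared error for the individual means; the
% second penalizes the error in the estimated total; k >= 0 tunes the
% compromise between individual and total risk.
L_k(\lambda, \hat{\lambda})
  = \sum_{i=1}^{p} \frac{(\hat{\lambda}_i - \lambda_i)^2}{\lambda_i}
  + k \,
    \frac{\Bigl(\sum_{i=1}^{p} \hat{\lambda}_i
              - \sum_{i=1}^{p} \lambda_i\Bigr)^2}
         {\sum_{i=1}^{p} \lambda_i}
```

Setting k = 0 recovers the commonly used weighted squared error loss, while large k weights fidelity of the estimated total more heavily.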