The Fisher information can be used to define a Riemannian metric for comparing probability distributions within a parametric family. The best-known example is the family of (univariate) normal distributions, where the Fisher information induces hyperbolic geometry. In this talk we investigate the Fisher information geometry of Dirichlet distributions, with beta distributions as a particular case. We show that this geometry is negatively curved and geodesically complete. This guarantees uniqueness of the notion of mean distribution and makes it a suitable geometry in which to apply the K-means algorithm, e.g. to compare and classify histograms.
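To make the normal-distribution example concrete, the following sketch implements the classical closed-form Fisher–Rao distance between univariate normals, which follows from the hyperbolic structure mentioned above: the Fisher metric on the $(\mu,\sigma)$ upper half-plane is $ds^2 = (d\mu^2 + 2\,d\sigma^2)/\sigma^2$, i.e. $\sqrt{2}$ times the standard hyperbolic metric after rescaling $\mu \mapsto \mu/\sqrt{2}$. The function name is ours; this illustrates only the normal case, not the Dirichlet geometry studied in the talk.

```python
import math

def fisher_rao_distance_normal(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    Uses the hyperbolic half-plane distance formula: with z = mu/sqrt(2)
    + i*sigma, the Fisher-Rao distance is sqrt(2) times the hyperbolic
    distance between z1 and z2.
    """
    # Squared Euclidean displacement in the rescaled half-plane coordinates.
    num = (mu1 - mu2) ** 2 / 2.0 + (sigma1 - sigma2) ** 2
    return math.sqrt(2.0) * math.acosh(1.0 + num / (2.0 * sigma1 * sigma2))
```

For instance, the distance between two normals with equal variance grows only logarithmically in their mean separation, a hallmark of negative curvature.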