Answer by Joe Blitzstein:
This is called the coefficient of variation (CV): the standard deviation divided by the mean. It is a relative measure of variability. Standard deviation (SD) has the same units as the measurement, whereas CV is unitless and can be interpreted without worrying about whether the measurement was made in, say, inches rather than meters.
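As a quick sketch of the unitless point (the data values here are made up for illustration): if you convert the same measurements from inches to meters, the SD changes by the conversion factor, but the CV does not.

```python
import statistics

def cv(data):
    """Coefficient of variation: sample SD divided by the mean (unitless)."""
    return statistics.stdev(data) / statistics.mean(data)

# Hypothetical heights measured in inches, then converted to meters.
inches = [70.0, 72.0, 68.0, 71.0, 69.0]
meters = [x * 0.0254 for x in inches]

sd_in = statistics.stdev(inches)   # in inches
sd_m = statistics.stdev(meters)    # in meters: a different number
# The CVs agree, because rescaling multiplies SD and mean by the same factor.
print(cv(inches), cv(meters))
```

Since multiplying every data point by a constant multiplies both the SD and the mean by that constant, the ratio is unchanged, which is exactly why the CV is unit-free.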
Here are two quick examples where CV makes sense intuitively as a measure of variability:
1. Measuring distances. If I'm measuring a distance and the SD is 1 inch, would you say my measuring instrument is very precise, kind of precise, or not precise at all? That's very hard to answer if you don't know what I am measuring. If I'm measuring distances between stars, an SD of 1 inch is incredibly precise; if I'm measuring human heights, it's pretty crude; if I'm measuring distances within a molecule, it's useless. The CV handles the different scales of measurement in a directly interpretable way.
2. House prices. Consider house prices in two cities, where one city tends to have vastly higher house prices than the other. There is a lot of fuss in introductory statistics books about when it's ok to assume that two population SDs are equal. This is partly an empirical question, but intuitively in this example I think it would be much more plausible to assume that the CVs are equal than that the SDs are equal, since that's just saying it's more realistic to think about percentages than absolute numbers. Similarly, stores are much more likely to have a 10% off sale than to take a fixed amount like $5 off the cost of every item.
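The house-price intuition can be sketched numerically (the prices below are invented for illustration): scale one city's prices up by a constant factor, and the SDs are far apart while the CVs coincide, matching the percentages-not-absolute-numbers intuition.

```python
import statistics

# Hypothetical house prices in thousands of dollars.
city_a = [200, 250, 300, 350, 400]
# City B's market runs about 5x higher, with proportionally larger spread.
city_b = [p * 5 for p in city_a]

sd_a = statistics.stdev(city_a)
sd_b = statistics.stdev(city_b)
cv_a = sd_a / statistics.mean(city_a)
cv_b = sd_b / statistics.mean(city_b)

# Assuming equal SDs would fail badly here (sd_b is 5x sd_a),
# but assuming equal CVs is exact.
print(sd_a, sd_b, cv_a, cv_b)
```

This is the same point as the 10%-off sale: variation that scales with the price level shows up as equal CVs, not equal SDs.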
A downside of CV is that it is undefined if the mean is 0, but CV tends to be most useful and interpretable when working with positive quantities anyway.