Sharpe ratio

The Sharpe ratio measures the performance of an investment (e.g., a security or portfolio) compared to a risk-free asset, after adjusting for its risk.

It is defined as the difference between the return of the investment and the risk-free return, divided by the standard deviation of the investment's returns (i.e., its volatility).

It represents the additional amount of return that an investor receives per unit of increase in risk.

Subtracting the risk-free rate from the mean return allows an investor to better isolate the profits associated with risk-taking. The risk-free rate of return is the return on an investment with zero risk, i.e., the return investors can expect for taking no risk. The yield on a U.S. Treasury bond, for example, could be used as the risk-free rate.
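
As a minimal sketch (not part of the original text), the calculation can be expressed in a few lines of Python; the function name, the return series, and the risk-free rate below are hypothetical.

```python
import statistics

def sharpe_ratio(returns, risk_free_rate):
    """Ex-post Sharpe ratio: mean excess return divided by the volatility of returns.

    returns         -- list of periodic returns, e.g. [0.02, -0.01, ...]
    risk_free_rate  -- risk-free return over the same period length
    """
    mean_excess = statistics.mean(returns) - risk_free_rate
    volatility = statistics.stdev(returns)  # sample standard deviation of the returns
    return mean_excess / volatility

# Hypothetical monthly returns and a hypothetical monthly risk-free rate
monthly_returns = [0.021, -0.013, 0.034, 0.008, -0.005, 0.017]
monthly_risk_free = 0.002

# Note: this is the periodic (monthly) ratio; it is not annualized here.
print(round(sharpe_ratio(monthly_returns, monthly_risk_free), 2))
```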

Generally, the greater the value of the Sharpe ratio, the more attractive the risk-adjusted return.

Formula for the Sharpe ratio
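
Consistent with the definition above, the Sharpe ratio can be written as

$$ S = \frac{E[R_p] - R_f}{\sigma_p} $$

where $R_p$ is the return of the investment, $R_f$ is the risk-free return, and $\sigma_p$ is the standard deviation of the investment's returns.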

What is a good Sharpe ratio?

  • Usually, any Sharpe ratio greater than 1.0 is considered acceptable to good by investors.
  • A ratio higher than 2.0 is rated as very good.
  • A ratio of 3.0 or higher is considered excellent.
  • A ratio under 1.0 is considered sub-optimal.
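
As a hypothetical illustration of these thresholds: a portfolio returning 12% with 10% volatility, measured against a 3% risk-free rate, has a Sharpe ratio of (0.12 - 0.03) / 0.10 = 0.9, which would be considered sub-optimal.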