# Likelihood (miscellaneous notes)

## Likelihood

### https://qr.ae/pvAGkz

Likelihood is the probability that an event that has already occurred would yield a specific outcome. Probability refers to future events, while likelihood refers to past events with known outcomes.

Probability is used when describing a function of the outcome given a fixed parameter value. For example, if a coin is flipped 10 times and it is a fair coin, what is the probability of it landing heads-up every time?

Likelihood is used when describing a function of a parameter given an outcome. For example, if a coin is flipped 10 times and it has landed heads-up 10 times, what is the likelihood that the coin is fair?

So the likelihood of a set of parameter values $$θ$$, given outcomes $$x$$, is given by
$$L(θ|x)=P(x|θ)$$
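The coin example can be sketched numerically. This is a minimal illustration assuming a binomial model; the function name `likelihood` is not from the answer:

```python
from math import comb

# Probability: fix the parameter (a fair coin, p = 0.5) and ask about outcomes.
# P(10 heads in 10 flips | p = 0.5) = 0.5**10 ≈ 0.000977
p_fair = 0.5
prob_ten_heads = p_fair ** 10

# Likelihood: fix the outcome (10 heads observed) and let the parameter vary.
# By the identity above, L(p | 10 heads) = P(10 heads | p).
def likelihood(p, heads=10, flips=10):
    """Binomial probability of `heads` successes in `flips` trials,
    read as a function of p for fixed data."""
    return comb(flips, heads) * p ** heads * (1 - p) ** (flips - heads)

# With all 10 flips heads, L(p | data) = p**10, so the likelihood of a fair
# coin equals the probability computed above, and it grows as p approaches 1.
assert likelihood(p_fair) == prob_ten_heads
```

Note that the likelihood here is a function of $$p$$, not a probability distribution over $$p$$: it need not integrate to 1.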

### https://qr.ae/pvAG4v

I know only two meanings for “likelihood.” The first is an informal synonym for “probability,” as in “the likelihood of rain tomorrow is small.” Not much more to say about that one.

The other meaning is as a term of art in statistics. Assume we have a probability distribution with density $$f(x)$$. (It could also be a probability mass function, but for this discussion let’s assume it is a density.) $$f(x)dx$$ is the probability that a draw from this distribution lands in a small neighborhood around $$x$$.

$$f(x)$$ may be characterized by some other parameters. For example, if $$f$$ is a Gaussian density, it is characterized by the mean $$μ$$ and standard deviation $$σ$$. So instead of $$f(x)$$, we could write $$f(x,μ,σ)$$. In general, we can pack all such parameters into a single vector $$θ$$ and write the density as $$f(x,θ)$$. When we view $$f$$ as a density, we have some constant values of $$θ$$ in mind and think of the function as varying in $$x$$.

When we think of $$f(x,θ)$$ as a likelihood, we instead hold $$x$$ constant and let $$θ$$ vary.
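The two viewpoints can be made concrete with the Gaussian case from above. A minimal sketch (the grid values and variable names are illustrative assumptions):

```python
import math

def f(x, mu, sigma):
    """Gaussian density f(x, θ) with parameter vector θ = (μ, σ)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# Density view: θ = (0, 1) is fixed, x varies.
density_values = [f(x, 0.0, 1.0) for x in (-1.0, 0.0, 1.0)]

# Likelihood view: the observation x = 1 is fixed, μ varies (σ held at 1).
likelihood_values = [f(1.0, mu, 1.0) for mu in (-1.0, 0.0, 1.0)]
# The likelihood is largest when μ matches the observation, i.e. at μ = 1.
```

Same function, same values; only which argument is "the variable" changes.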

A common application of the likelihood function is in estimation. In this case, $$θ$$ is unknown and we want to estimate it from some given data $$x$$. A standard approach is maximum likelihood estimation: estimate $$θ$$ by the value which maximizes $$f(x,θ)$$ for the given observations $$x$$.
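For the Gaussian case, maximizing the likelihood over $$θ = (μ, σ)$$ has a well-known closed form. A sketch (the sample data are made up for illustration):

```python
import math

def gaussian_mle(data):
    """Closed-form maximum likelihood estimates for a Gaussian:
    μ̂ is the sample mean and σ̂² the mean squared deviation
    (dividing by n, not n - 1, is what maximizing the likelihood gives)."""
    n = len(data)
    mu_hat = sum(data) / n
    var_hat = sum((x - mu_hat) ** 2 for x in data) / n
    return mu_hat, math.sqrt(var_hat)

mu_hat, sigma_hat = gaussian_mle([2.0, 4.0, 6.0])
# mu_hat is 4.0; sigma_hat is sqrt(8/3)
```

For models without a closed form, the same idea applies: treat $$f(x,θ)$$ as a function of $$θ$$ and maximize it numerically.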
