## Assumed distributions of signal and noise

$$X_n \sim N(0,1), \qquad f_n(x \mid \theta=0)= \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}$$

$$X_s \sim N(d',1), \qquad f_s(x \mid \theta=d')= \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-d')^2}{2}}$$

## Likelihood

The likelihood ratio is defined as $\Lambda (x) = \frac{f_S(x)}{f_N(x)}$ (in SDT, it is the inverse of the likelihood ratio most often used in statistics).

Taking the logarithm of the ratio of the two normal densities, it can be shown that

$\log{\Lambda}(x) = d' \left( x - d' / 2\right)$
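
The derivation, expanding the two densities defined above:

$$\log{\Lambda}(x) = \log f_s(x) - \log f_n(x) = \frac{x^2}{2} - \frac{(x-d')^2}{2} = d'x - \frac{d'^2}{2} = d' \left( x - \frac{d'}{2} \right)$$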

## Decision rule

If $$\log{\Lambda}(x) > \lambda$$, choose $$a_S$$,

which results in

$$\text{If }x > c =\frac{\lambda}{d'} + \frac{d'}{2}, \text{ then }a=a_s$$

$$\text{If }x < c =\frac{\lambda}{d'} + \frac{d'}{2}, \text{ then }a=a_n$$

That is, the criterion can be applied directly to the sample from the random variable. The observer model is completely specified by $$d'$$ and $$\lambda$$.

### Example

Trials with the signal

```r
library(tidyverse)

dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime

dat_s <- tibble(trial = 1:20) %>%
  mutate(x = rnorm(n(), dprime, 1),
         resp = if_else(x > crit, "Yes", "No"))
dat_s
```
```
## # A tibble: 20 x 3
##    trial       x resp
##    <int>   <dbl> <chr>
##  1     1  0.375  No
##  2     2  1.30   Yes
##  3     3 -0.830  No
##  4     4  1.70   Yes
##  5     5 -0.558  No
##  6     6 -0.400  No
##  7     7 -0.369  No
##  8     8  1.32   Yes
##  9     9 -1.07   No
## 10    10  0.215  No
## 11    11  0.673  Yes
## 12    12  1.03   Yes
## 13    13 -0.999  No
## 14    14  0.289  No
## 15    15  0.508  Yes
## 16    16  0.0676 No
## 17    17  1.85   Yes
## 18    18  1.95   Yes
## 19    19  1.29   Yes
## 20    20  1.72   Yes
```
```r
ggplot(dat_s) +
  geom_line(data = tibble(x = seq(-3, 4, 0.01),
                          y = dnorm(x, dprime, 1)),
            aes(x = x, y = y)) +
  geom_vline(xintercept = crit) +
  geom_point(aes(x = x, y = 0, color = resp))
```

For the noise trials

```r
dat_n <- tibble(trial = 1:20) %>%
  mutate(x = rnorm(n(), 0, 1),
         resp = if_else(x > crit, "Yes", "No"))
dat_n
```
```
## # A tibble: 20 x 3
##    trial      x resp
##    <int>  <dbl> <chr>
##  1     1  0.290 No
##  2     2 -0.400 No
##  3     3  0.702 Yes
##  4     4  0.177 No
##  5     5 -0.614 No
##  6     6  1.27  Yes
##  7     7 -1.37  No
##  8     8  0.307 No
##  9     9 -0.223 No
## 10    10 -0.528 No
## 11    11  1.07  Yes
## 12    12  0.226 No
## 13    13  0.240 No
## 14    14  0.254 No
## 15    15 -0.726 No
## 16    16 -1.79  No
## 17    17 -0.789 No
## 18    18 -0.375 No
## 19    19  0.281 No
## 20    20  1.74  Yes
```
```r
ggplot(dat_n) +
  geom_line(data = tibble(x = seq(-3, 4, 0.01),
                          y = dnorm(x, 0, 1)),
            aes(x = x, y = y)) +
  geom_vline(xintercept = crit) +
  geom_point(aes(x = x, y = 0, color = resp))
```

## Probabilities

Under the 0-1 loss model, the observer model can be reparametrized in terms of the probabilities

$$p_{FA}=P_{\mu=0}\left(d(X)=a_s\right) = P_{\mu=0}\left(X > c\right) = 1 - \Phi(c)$$

$$p_{H}=P_{\mu=d'}\left(d(X)=a_s\right) = P_{\mu=d'}\left(X > c\right) = 1 - \Phi(c-d')$$

then

$$c = \Phi^{-1} (1 -p_{FA})=- \Phi^{-1} (p_{FA})$$

$$d' = c - \Phi^{-1} (1 -p_{H})= \Phi^{-1} (p_{H}) - \Phi^{-1} (p_{FA})$$
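
These two equations can be checked numerically. The probabilities below are arbitrary values chosen for illustration:

```r
# Hypothetical probabilities chosen for illustration
p_fa <- .2
p_h <- .8

# c = -qnorm(p_fa) and d' = qnorm(p_h) - qnorm(p_fa)
crit <- -qnorm(p_fa)
dprime <- qnorm(p_h) - qnorm(p_fa)

# Round trip: (d', c) recovers the original probabilities
c(1 - pnorm(crit), 1 - pnorm(crit - dprime))
```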

## Other parametrizations of the criterion

Sometimes, a criterion measured relative to the unbiased criterion is used:

$$c_{center} = c - \frac{d'}{2} = - \frac{1}{2} \left( \Phi^{-1} (p_{H}) + \Phi^{-1} (p_{FA}) \right)$$

Another popular criterion is the ratio of the heights of the two distributions at $$c$$:

$$\beta = \Lambda(c)$$

Another criterion used is $$\lambda$$ itself, often called bias. Expressed in terms of $$c$$ it is

$$\lambda = d' \left( c - \frac{d'}{2} \right)$$

in terms of $$c_{center}$$ it is

$$\lambda = d' c_{center}$$

and in terms of the proportions of hits and false alarms it is

$$\lambda = \frac{1}{2} \left( (\Phi^{-1}(p_{FA}))^2 - (\Phi^{-1}(p_{H}))^2\right)$$
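
A quick numerical check of how the measures relate, with arbitrary observer parameters:

```r
# Arbitrary observer parameters for illustration
dprime <- 1
lambda <- .5

crit <- lambda / dprime + dprime / 2                # c
c_center <- crit - dprime / 2                       # criterion from the unbiased point
beta <- dnorm(crit, dprime, 1) / dnorm(crit, 0, 1)  # heights of the distributions at c

# log(beta) and d' * c_center both recover lambda
c(log(beta), dprime * c_center)
```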

## Optimal performance

Substituting the optimal criterion $\lambda^{*} = -\text{logit}\left(P(H_S)\right)$ into the decision rule above,

$c^{*} =\frac{\lambda^{*}}{d'} + \frac{d'}{2} =\frac{-\text{logit}(P(H_S))}{d'} + \frac{d'}{2}$

If $$P(H_S) = 1/2$$, then

$c^{*} = \frac{d'}{2}$
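
In R, the logit function is `qlogis`. With prior probabilities chosen for illustration:

```r
# Optimal criterion for several signal prior probabilities (illustrative values)
dprime <- 1
p_signal <- c(.25, .5, .75)

c_star <- -qlogis(p_signal) / dprime + dprime / 2
c_star
```

When the signal is rarer, the optimal criterion is higher (more conservative); at $$P(H_S) = 1/2$$ it reduces to $$d'/2$$.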

## Parameter estimation

The MLE of $$p_{FA}$$ is $$\widehat{p}_{FA}=\frac{n_{FA}}{ n_{FA} + n_{CR}}$$.

The MLE of $$p_{H}$$ is $$\widehat{p}_{H}=\frac{n_{H}}{ n_{H} + n_{M}}$$,

where $$n_{H}$$, $$n_{M}$$, $$n_{FA}$$ and $$n_{CR}$$ are the numbers of hits, misses, false alarms and correct rejections.

Given that $$(d',c)$$ is an invertible transformation from $$(p_H,p_{FA})$$, we can calculate $$\widehat{d}'$$ and $$\widehat{c}$$ from $$\widehat{p}_{H}$$ and $$\widehat{p}_{FA}$$ by the invariance of the MLE.
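
A sketch of the estimation, using hypothetical response counts:

```r
# Hypothetical counts: 15 hits, 5 misses, 4 false alarms, 16 correct rejections
n_h <- 15; n_m <- 5
n_fa <- 4; n_cr <- 16

p_h_hat <- n_h / (n_h + n_m)     # MLE of p_H
p_fa_hat <- n_fa / (n_fa + n_cr) # MLE of p_FA

dprime_hat <- qnorm(p_h_hat) - qnorm(p_fa_hat)
c_hat <- -qnorm(p_fa_hat)
```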

## ROC curves

If the observer model is the equal-variance model, the ROC curve for an observer with fixed sensitivity is traced by varying the criterion in

$$p_{FA} = 1 - \Phi(c)$$

$$p_{H} = 1 - \Phi(c-d')$$

Example of several ROC curves for different sensitivities:

```r
dprime <- seq(0, 3, .5)
crit <- seq(-5, 5, .01)

crossing(dprime, crit) %>%
  mutate(p_fa = 1 - pnorm(crit),
         p_h = 1 - pnorm(crit - dprime)) %>%
  ggplot(aes(x = p_fa, y = p_h, color = factor(dprime))) +
  geom_line() +
  xlim(0, 1) +
  ylim(0, 1)
```

### zROC

Given

$$p_{FA} = 1 - \Phi(c) = \Phi(-c)$$

$$p_{H} = 1 - \Phi(c-d') = \Phi(d'-c)$$

then

$$\Phi^{-1}(p_{FA}) = -c$$

$$\Phi^{-1}(p_{H}) = d'-c$$

That is,

$$\Phi^{-1}(p_{H}) = d' + \Phi^{-1}(p_{FA})$$

The slope is 1 and the intercept is $$d'$$.

```r
crossing(dprime, crit) %>%
  mutate(z_fa = -crit,
         z_h = dprime - crit) %>%
  ggplot(aes(x = z_fa, y = z_h, color = factor(dprime))) +
  geom_line() +
  coord_equal(xlim = c(-4, 4), ylim = c(-4, 4))
```