Assumed distributions of signal and noise

\(X_n \sim N(0,1)\), with density \(f_n(x \mid \theta=0)= \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}\)

\(X_s \sim N(d',1)\), with density \(f_s(x \mid \theta=d')= \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-d')^2}{2}}\)
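As a quick visual check, both densities can be plotted with dnorm; the value \(d' = 1\) below is just a hypothetical choice for illustration.

library(tidyverse)

# noise and signal densities, with a hypothetical d' = 1
tibble(x = seq(-4, 5, .01),
       noise = dnorm(x, 0, 1),
       signal = dnorm(x, 1, 1)) %>% 
  pivot_longer(-x, names_to = "distribution", values_to = "density") %>% 
  ggplot(aes(x = x, y = density, color = distribution)) +
  geom_line()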

Likelihood ratio

The likelihood ratio is defined as (in SDT it is the inverse of the likelihood ratio most often used in statistics) \[\Lambda (x) = \frac{f_s(x)}{f_n(x)}\]

Taking the log of this ratio, it can be shown that

\[\log{\Lambda}(x) = d' \left( x - d' / 2\right)\]
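This follows from expanding the logs of the two densities (the normalizing constants cancel):

\[\log{\Lambda}(x) = \log f_s(x) - \log f_n(x) = -\frac{(x-d')^2}{2} + \frac{x^2}{2} = d' x - \frac{d'^2}{2} = d' \left( x - \frac{d'}{2}\right)\]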

Decision rule

If \(\log{\Lambda}(x) > \lambda\), choose \(a_s\)

which results in

\[\text{If }x > c =\frac{\lambda}{d'} + \frac{d'}{2} \text{ then }a=a_s\] \[\text{If }x < c \text{ then }a=a_n\]

That is, the criterion can be imposed directly on the sample from the random variable. The observer model is completely specified by \(d'\) and \(\lambda\).

Example

Trials with the signal

library(tidyverse)

dprime <- .7
lambda <- .1
# criterion c = lambda / d' + d' / 2
crit <- lambda / dprime + .5 * dprime

dat_s <- tibble(trial = 1:20) %>% 
  mutate(x = rnorm(n(), dprime, 1),
         resp = if_else(x > crit, "Yes", "No"))
dat_s
## # A tibble: 20 x 3
##    trial      x resp 
##    <int>  <dbl> <chr>
##  1     1  1.55  Yes  
##  2     2 -0.712 No   
##  3     3  0.436 No   
##  4     4  1.46  Yes  
##  5     5 -0.599 No   
##  6     6 -0.339 No   
##  7     7  0.527 Yes  
##  8     8  1.58  Yes  
##  9     9 -0.738 No   
## 10    10 -1.05  No   
## 11    11  1.11  Yes  
## 12    12 -0.551 No   
## 13    13 -0.606 No   
## 14    14 -0.381 No   
## 15    15  0.648 Yes  
## 16    16  1.78  Yes  
## 17    17  1.41  Yes  
## 18    18  0.886 Yes  
## 19    19  2.01  Yes  
## 20    20  0.512 Yes
ggplot(dat_s) +
  geom_line(data = tibble(x = seq(-3, 4, 0.01),
                          y = dnorm(x, dprime, 1)),
            aes(x = x, y = y)) +
  geom_vline(xintercept = crit) +
  geom_point(aes(x = x, y = 0, color = resp))

For the noise trials

dat_n <- tibble(trial = 1:20) %>% 
  mutate(x = rnorm(n(), 0, 1),
         resp = if_else(x > crit, "Yes", "No"))
dat_n
## # A tibble: 20 x 3
##    trial         x resp 
##    <int>     <dbl> <chr>
##  1     1  1.70     Yes  
##  2     2  0.256    No   
##  3     3  0.542    Yes  
##  4     4 -0.215    No   
##  5     5 -0.892    No   
##  6     6 -1.13     No   
##  7     7  0.450    No   
##  8     8  0.366    No   
##  9     9  1.62     Yes  
## 10    10  1.59     Yes  
## 11    11  0.354    No   
## 12    12  0.492    No   
## 13    13  0.144    No   
## 14    14  0.288    No   
## 15    15 -1.19     No   
## 16    16  0.407    No   
## 17    17  0.489    No   
## 18    18 -0.672    No   
## 19    19  0.000363 No   
## 20    20 -1.36     No
ggplot(dat_n) +
  geom_line(data = tibble(x = seq(-3, 4, 0.01),
                          y = dnorm(x, 0, 1)),
            aes(x = x, y = y)) +
  geom_vline(xintercept = crit) +
  geom_point(aes(x = x, y = 0, color = resp))

Probabilities

Under the 0-1 loss model, the observer model can be reparametrized in terms of the probabilities

\(p_{FA}=P_{\mu=0}\left(d(X)=a_s\right) = P_{\mu=0}\left(X > c\right) = 1 - \Phi(c)\)

\(p_{H}=P_{\mu=d'}\left(d(X)=a_s\right) = P_{\mu=d'}\left(X > c\right) = 1 - \Phi(c-d')\)

then

\(c = \Phi^{-1} (1 -p_{FA})=- \Phi^{-1} (p_{FA})\)

\(d' = c - \Phi^{-1} (1 -p_{H})= \Phi^{-1} (p_{H}) - \Phi^{-1} (p_{FA})\)
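These identities are easy to verify with qnorm (\(\Phi^{-1}\)); the values of \(p_H\) and \(p_{FA}\) below are hypothetical.

# hypothetical probabilities of hits and false alarms
p_h <- .9
p_fa <- .2

crit <- -qnorm(p_fa)                # c = -Phi^-1(p_FA)
dprime <- qnorm(p_h) - qnorm(p_fa)  # d' = Phi^-1(p_H) - Phi^-1(p_FA)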

Other parametrizations of the criterion

Sometimes a criterion measured relative to the unbiased criterion \(d'/2\) is used

\(c_{center} = c - \frac{d'}{2} = - \frac{1}{2} \left( \Phi^{-1} (p_{H}) + \Phi^{-1} (p_{FA}) \right)\)

Another popular criterion is \(\beta\), the relative height of the two distributions at \(c\)

\(\beta = \Lambda(c)\)

Another criterion used is \(\lambda\), often called bias, which expressed in terms of \(c\) is

\[ \lambda = d' \left( c - \frac{d'}{2} \right)\] and in terms of \(c_{center}\) is

\[\lambda = d' c_{center}\] and in terms of proportion of hits and false alarms is

\[\lambda = \frac{1}{2} \left( (\Phi^{-1}(p_{FA}))^2 - (\Phi^{-1}(p_{H}))^2\right)\]
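Continuing with the hypothetical values above, the alternative parametrizations can be computed and checked against each other; from the formulas, \(\lambda = d' c_{center} = \log \beta\).

c_center <- crit - dprime / 2
beta <- dnorm(crit, dprime, 1) / dnorm(crit, 0, 1)  # beta = Lambda(c)
lambda <- .5 * (qnorm(p_fa)^2 - qnorm(p_h)^2)

# lambda should agree with d' * c_center and log(beta)
all.equal(lambda, dprime * c_center)
all.equal(lambda, log(beta))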

Optimal performance

Using the optimal criterion under 0-1 loss, \(\lambda^{*} = \log \frac{P(H_N)}{P(H_S)} = -\text{logit}\left(P(H_S)\right)\), the decision rule above gives

\[c^{*} =\frac{\lambda^{*}}{d'} + \frac{d'}{2} =\frac{-\text{logit}(P(H_S))}{d'} + \frac{d'}{2} \]

If \(P(H_S) = 1/2\), then

\[c^{*} = \frac{d'}{2} \]
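As a sketch in R, continuing with the hypothetical \(d'\) from above (the prior \(P(H_S)\) below is also hypothetical; qlogis is the logit function, so -qlogis gives \(\lambda^{*}\)):

p_signal <- .25                   # hypothetical P(H_S)
lambda_star <- -qlogis(p_signal)  # lambda* = -logit(P(H_S))
c_star <- lambda_star / dprime + dprime / 2

When signals are rare (\(P(H_S) < 1/2\)), \(\lambda^{*} > 0\) and the optimal criterion moves above \(d'/2\): the ideal observer becomes more conservative about responding yes.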

Parameter estimation

The MLE of \(p_{FA}\) is \(\widehat{p}_{FA}=\frac{n_{FA}}{ n_{FA} + n_{CR}}\).

The MLE of \(p_{H}\) is \(\widehat{p}_{H}=\frac{n_{H}}{ n_{H} + n_{M}}\).

Given that \((d',c)\) is an invertible transformation of \((p_H,p_{FA})\), we can calculate \(\widehat{d}'\) and \(\widehat{c}\) from \(\widehat{p}_{H}\) and \(\widehat{p}_{FA}\).
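A minimal sketch with hypothetical trial counts:

# hypothetical counts of hits, misses, false alarms and correct rejections
n_h <- 75
n_m <- 25
n_fa <- 30
n_cr <- 70

p_h_hat <- n_h / (n_h + n_m)
p_fa_hat <- n_fa / (n_fa + n_cr)

dprime_hat <- qnorm(p_h_hat) - qnorm(p_fa_hat)  # d' = Phi^-1(p_H) - Phi^-1(p_FA)
c_hat <- -qnorm(p_fa_hat)                       # c = -Phi^-1(p_FA)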

ROC curves

If the observer model is the equal-variance model, the ROC curve for an observer with fixed sensitivity is traced by sweeping the criterion \(c\) in

\(p_{FA} = 1 - \Phi(c)\)

\(p_{H} = 1 - \Phi(c-d')\)

Example of several ROC curves for different sensitivities

dprime <- seq(0, 3, .5)
crit <- seq(-5, 5, .01)

crossing(dprime, crit) %>% 
  mutate(p_fa = 1 - pnorm(crit), 
         p_h = 1 - pnorm(crit - dprime)) %>% 
  ggplot(aes(x = p_fa, y = p_h, color = factor(dprime))) +
  geom_line() +
  xlim(0, 1) +
  ylim(0, 1)

zROC

Given

\(p_{FA} = 1 - \Phi(c) = \Phi(-c)\)

\(p_{H} = 1 - \Phi(c-d') = \Phi(d'-c)\)

then

\(\Phi^{-1}(p_{FA}) = -c\)

\(\Phi^{-1}(p_{H}) = d'-c\)

That is

\(\Phi^{-1}(p_{H}) = d' + \Phi^{-1}(p_{FA})\)

The slope is 1 and the intercept is \(d'\).

crossing(dprime, crit) %>% 
  mutate(z_fa = -crit, 
         z_h = dprime - crit) %>% 
  ggplot(aes(x = z_fa, y = z_h, color = factor(dprime))) +
  geom_line() +
  coord_equal(xlim = c(-4, 4), ylim = c(-4, 4))