Equal variance

Assumed distributions of signal and noise

\(X_n \sim N(0,1)\), with density \(f_n(x | \theta=0)= \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}\)

\(X_s \sim N(d',1)\), with density \(f_s(x | \theta=d')= \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-d')^2}{2}}\)
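
These are the densities of normal distributions with unit variance, so they can be evaluated with dnorm. A quick check (a sketch, not part of the original text; the point x = 1.2 and the value of dprime are arbitrary):

# dnorm with sd = 1 matches the formulas above
dprime <- .7
x <- 1.2
dnorm(x, mean = 0, sd = 1) - 1 / sqrt(2 * pi) * exp(-x^2 / 2)                  # ~ 0
dnorm(x, mean = dprime, sd = 1) - 1 / sqrt(2 * pi) * exp(-(x - dprime)^2 / 2)  # ~ 0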

Decision rule

Under SDT, it is assumed that the observer bases the decision on the likelihood ratio \(\Lambda = \frac{f_s(x | \theta=d')}{f_n(x | \theta=0)}\), responding \(a_s\) whenever \(\log{\Lambda}\) exceeds a criterion \(\lambda\).

It can be shown that

\[\log{\Lambda} = d' \left( x - \frac{d'}{2}\right)\]

which gives the following decision rule

\[\text{If }x > c =\frac{\lambda}{d'} + \frac{d'}{2} \text{ then }a=a_s\]

\[\text{If }x < c =\frac{\lambda}{d'} + \frac{d'}{2} \text{ then }a=a_n\]

That is, the criterion can be imposed directly on the sample from the random variable. The observer model is completely specified by \(d'\) and \(\lambda\).
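
A quick numerical check of this equivalence (a sketch, not part of the original derivation; it only uses dnorm and the definitions above):

# check that log(f_s(x)/f_n(x)) = d'(x - d'/2) and that the rule
# "respond signal when x > c" matches "respond signal when log-LR > lambda"
dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime
x <- seq(-3, 3, by = .5)
logLR <- log(dnorm(x, dprime, 1) / dnorm(x, 0, 1))
all.equal(logLR, dprime * (x - dprime / 2))  # should be TRUE
all((logLR > lambda) == (x > crit))          # should be TRUE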

Example

library(dplyr)
library(ggplot2)
dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime
# one independent draw from the signal distribution per trial
dat <- data.frame(trial = 1:20) %>%
  mutate(x = rnorm(n(), dprime, 1), resp = ifelse(x > crit, 1, 0))
dat
(output: a 20-row data frame with columns trial, x, and resp; each trial is an independent draw from the signal distribution, so the values differ across trials and across runs because no seed is set)
ggplot(dat) +
  geom_point(aes(x = x, y = 0, color = factor(resp))) +
  geom_vline(aes(xintercept = crit, color = 'c')) +
  geom_vline(aes(xintercept = dprime, color = 'fsignal')) +
  geom_vline(aes(xintercept = 0, color = 'fnoise')) +
  stat_function(aes(color = 'fsignal'), fun = dnorm, args = list(mean = dprime, sd = 1)) +
  stat_function(aes(color = 'fnoise'), fun = dnorm, args = list(mean = 0, sd = 1))

For the noise trials

dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime
# one independent draw from the noise distribution per trial
dat <- data.frame(trial = 1:20) %>%
  mutate(x = rnorm(n(), 0, 1), resp = ifelse(x > crit, 1, 0))
dat
(output: a 20-row data frame with columns trial, x, and resp; each trial is an independent draw from the noise distribution, so the values differ across trials and across runs because no seed is set)
ggplot(dat) +
  geom_point(aes(x = x, y = 0, color = factor(resp))) +
  geom_vline(aes(xintercept = crit, color = 'c')) +
  geom_vline(aes(xintercept = dprime, color = 'fsignal')) +
  geom_vline(aes(xintercept = 0, color = 'fnoise')) +
  stat_function(aes(color = 'fsignal'), fun = dnorm, args = list(mean = dprime, sd = 1)) +
  stat_function(aes(color = 'fnoise'), fun = dnorm, args = list(mean = 0, sd = 1))

Probabilities

Under the 0-1 loss model, the observer model can be reparametrized in terms of the false-alarm and hit probabilities

\(p_{FA}=P_{\mu=0}\left(d(X)=a_s\right) = P_{\mu=0}\left(X > c\right) = 1 - \Phi(c)\)

\(p_{H}=P_{\mu=d'}\left(d(X)=a_s\right) = P_{\mu=d'}\left(X > c\right) = 1 - \Phi(c-d')\)

then

\(c = \Phi^{-1} (1 -p_{FA})=- \Phi^{-1} (p_{FA})\)

\(d' = c - \Phi^{-1} (1 -p_{H})= \Phi^{-1} (p_{H}) - \Phi^{-1} (p_{FA})\)

Example

dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime
pFA <- 1-pnorm(crit)
pH <- 1-pnorm(crit-dprime)
pFA
## [1] 0.3110568
pH
## [1] 0.5820509
pCR <- 1- pFA
pM <- 1 - pH
pCR
## [1] 0.6889432
pM
## [1] 0.4179491
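
As a check (not in the original text), the inverse formulas above recover the criterion and the sensitivity from these probabilities:

# invert the probabilities back to the criterion and the sensitivity
-qnorm(pFA)             # equals crit
qnorm(pH) - qnorm(pFA)  # equals dprime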

Parameter estimation

The MLE of \(p_{FA}\) is \(\widehat{p}_{FA}=\frac{n_{FA}}{ n_{FA} + n_{CR}}\).

The MLE of \(p_{H}\) is \(\widehat{p}_{H}=\frac{n_{H}}{ n_{H} + n_{M}}\).

Given that \((d',c)\) is an invertible transformation of \((p_H,p_{FA})\), we can calculate \(\widehat{d}'\) and \(\widehat{c}\) from \(\widehat{p}_{H}\) and \(\widehat{p}_{FA}\).

Example

Let’s simulate 20 noise trials.

nNoise <- 20
nFA <- rbinom(1, nNoise, pFA)
estpFA <- nFA / nNoise 
estpCR <- 1- estpFA 
estpFA
## [1] 0.2
estpCR
## [1] 0.8

and 20 signal trials.

nSignal <- 20
nH <- rbinom(1, nSignal, pH)
estpH <- nH/ nSignal 
estpM <- 1- estpH 
estpH
## [1] 0.5
estpM
## [1] 0.5
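
From these estimates, \(\widehat{c}\) and \(\widehat{d}'\) follow by plugging the proportions into the inverse formulas. A minimal sketch (estc and estdprime are illustrative names; if a proportion were exactly 0 or 1, qnorm would return an infinite value and a correction would be needed):

# plug the estimated proportions into the inverse-normal formulas
estc <- -qnorm(estpFA)                     # about 0.84 with estpFA = .2
estdprime <- qnorm(estpH) - qnorm(estpFA)  # about 0.84 with estpH = .5 and estpFA = .2
estc
estdprime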

Unequal variance
