## Equal variance

### Assumed distributions of signal and noise

$$X_n \sim N(0,1), \qquad f_n(x \mid \theta=0)= \frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}$$

$$X_s \sim N(d',1), \qquad f_s(x \mid \theta=d')= \frac{1}{\sqrt{2\pi}}e^{-\frac{(x-d')^2}{2}}$$

### Decision rule

Under SDT, the observer is assumed to base the decision on the likelihood ratio

$$\Lambda = \frac{f_s(x \mid \theta=d')}{f_n(x \mid \theta=0)}$$

It can be demonstrated that

$$\log{\Lambda} = d' \left( x - \frac{d'}{2}\right)$$

so responding $$a_s$$ whenever $$\log{\Lambda} > \lambda$$ gives the following decision rule

$$\text{If } x > c = \frac{\lambda}{d'} + \frac{d'}{2} \text{ then } a=a_s$$

$$\text{If } x < c = \frac{\lambda}{d'} + \frac{d'}{2} \text{ then } a=a_n$$
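Taking $$\Lambda$$ as the ratio of the signal density to the noise density, the expression for $$\log{\Lambda}$$ follows in one line, because the quadratic terms in the two Gaussian exponents cancel:

$$\log{\Lambda} = \log f_s(x \mid \theta=d') - \log f_n(x \mid \theta=0) = -\frac{(x-d')^2}{2} + \frac{x^2}{2} = d'x - \frac{d'^2}{2} = d' \left( x - \frac{d'}{2} \right)$$

This is why the criterion $$c = \lambda/d' + d'/2$$ can be placed directly on $$x$$.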

That is, the criterion can be imposed directly on the sample from the random variable. The observer model is completely specified by $$d'$$ and $$\lambda$$.

#### Example

```r
library(dplyr)
library(ggplot2)

dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime

dat <- data.frame(trial = 1:20) %>%
  group_by(trial) %>%  # one sample per trial
  mutate(x = rnorm(1, dprime, 1), resp = ifelse(x > crit, 1, 0))
dat
## # A tibble: 20 x 3
## # Groups:   trial [20]
##    trial       x  resp
##    <int>   <dbl> <dbl>
##  1     1 -1.08       0
##  2     2  0.302      0
##  3     3  1.98       1
##  4     4  0.687      1
##  5     5  0.197      0
##  6     6  2.55       1
##  7     7  1.46       1
##  8     8  0.431      0
##  9     9  0.758      1
## 10    10 -0.174      0
## 11    11  0.0404     0
## 12    12 -1.10       0
## 13    13 -0.190      0
## 14    14  1.74       1
## 15    15  1.59       1
## 16    16  0.786      1
## 17    17  0.967      1
## 18    18 -0.279      0
## 19    19 -0.943      0
## 20    20  2.01       1
```
```r
# Note: the parameter is `args` (not `arg`); extra arguments are passed to dnorm
ggplot(dat) +
  geom_point(aes(x = x, y = 0, color = factor(resp))) +
  geom_vline(aes(xintercept = crit, color = 'c')) +
  geom_vline(aes(xintercept = dprime, color = 'fsignal')) +
  geom_vline(aes(xintercept = 0, color = 'fnoise')) +
  stat_function(fun = dnorm, args = list(mean = dprime, sd = 1), aes(color = 'fsignal')) +
  stat_function(fun = dnorm, args = list(mean = 0, sd = 1), aes(color = 'fnoise'))
```

For the noise trials:

```r
dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime

dat <- data.frame(trial = 1:20) %>%
  group_by(trial) %>%
  mutate(x = rnorm(1, 0, 1), resp = ifelse(x > crit, 1, 0))
dat
## # A tibble: 20 x 3
## # Groups:   trial [20]
##    trial       x  resp
##    <int>   <dbl> <dbl>
##  1     1  0.329      0
##  2     2 -1.17       0
##  3     3  0.881      1
##  4     4  0.156      0
##  5     5 -0.0890     0
##  6     6  1.19       1
##  7     7 -0.803      0
##  8     8  0.792      1
##  9     9 -0.574      0
## 10    10 -0.257      0
## 11    11  1.77       1
## 12    12  1.09       1
## 13    13  0.543      1
## 14    14  1.29       1
## 15    15 -0.844      0
## 16    16  1.04       1
## 17    17  0.107      0
## 18    18 -0.217      0
## 19    19 -0.164      0
## 20    20  0.778      1
```
```r
ggplot(dat) +
  geom_point(aes(x = x, y = 0, color = factor(resp))) +
  geom_vline(aes(xintercept = crit, color = 'c')) +
  geom_vline(aes(xintercept = dprime, color = 'fsignal')) +
  geom_vline(aes(xintercept = 0, color = 'fnoise')) +
  stat_function(fun = dnorm, args = list(mean = dprime, sd = 1), aes(color = 'fsignal')) +
  stat_function(fun = dnorm, args = list(mean = 0, sd = 1), aes(color = 'fnoise'))
```

### Probabilities

Under the 0-1 loss model, the observer model can be reparametrized in terms of the false-alarm and hit probabilities

$$p_{FA}=P_{\mu=0}\left(d(X)=a_s\right) = P_{\mu=0}\left(x > c\right) = 1 - \Phi(c)$$

$$p_{H}=P_{\mu=d'}\left(d(X)=a_s\right) = P_{\mu=d'}\left(x > c\right) = 1 - \Phi(c-d')$$

then

$$c = \Phi^{-1} (1 -p_{FA})=- \Phi^{-1} (p_{FA})$$

$$d' = c - \Phi^{-1} (1 -p_{H})= \Phi^{-1} (p_{H}) - \Phi^{-1} (p_{FA})$$

#### Example

```r
dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime

pFA <- 1 - pnorm(crit)
pH <- 1 - pnorm(crit - dprime)
pFA
## [1] 0.3110568
pH
## [1] 0.5820509
pCR <- 1 - pFA
pM <- 1 - pH
pCR
## [1] 0.6889432
pM
## [1] 0.4179491
```
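As a quick check (a sketch; the quantities are recomputed so the chunk stands alone), plugging $$p_H$$ and $$p_{FA}$$ into the inverse transformation recovers the generating parameters exactly:

```r
# Sketch: the (pH, pFA) parametrization inverts back to (d', c)
dprime <- .7
lambda <- .1
crit <- lambda / dprime + .5 * dprime
pFA <- 1 - pnorm(crit)
pH <- 1 - pnorm(crit - dprime)
qnorm(pH) - qnorm(pFA)  # recovers d' = 0.7
-qnorm(pFA)             # recovers c
```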

### Parameter estimation

The MLE of $$p_{FA}$$ is $$\widehat{p}_{FA}=\frac{n_{FA}}{ n_{FA} + n_{CR}}$$.

The MLE of $$p_{H}$$ is $$\widehat{p}_{H}=\frac{n_{H}}{ n_{H} + n_{M}}$$.

Given that $$(d',c)$$ is an invertible transformation of $$(p_H,p_{FA})$$, we can calculate $$\widehat{d}'$$ and $$\widehat{c}$$ from $$\widehat{p}_{H}$$ and $$\widehat{p}_{FA}$$.

#### Example

Let’s simulate 20 noise trials.

```r
nNoise <- 20
nFA <- rbinom(1, nNoise, pFA)
estpFA <- nFA / nNoise
estpCR <- 1 - estpFA
estpFA
## [1] 0.35
estpCR
## [1] 0.65
```

and 20 signal trials.

```r
nSignal <- 20
nH <- rbinom(1, nSignal, pH)
estpH <- nH / nSignal
estpM <- 1 - estpH
estpH
## [1] 0.5
estpM
## [1] 0.5
```
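With the simulated proportions above ($$\widehat{p}_{FA}=0.35$$, $$\widehat{p}_{H}=0.5$$, hard-coded here so the chunk stands alone), the inverse transformation gives the estimates of $$d'$$ and $$c$$; with only 20 trials per condition they are quite noisy:

```r
# Sketch: MLEs of d' and c from the simulated proportions above
estpFA <- 0.35
estpH <- 0.5
estdprime <- qnorm(estpH) - qnorm(estpFA)  # estimate of d'
estcrit <- -qnorm(estpFA)                  # estimate of c
estdprime
estcrit
```

Both come out near 0.39, some way from the true $$d' = 0.7$$ and $$c \approx 0.49$$, which is expected with so few trials.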
