Today’s post analyzes two orthogonal dimensions of signal detection theory. Before going any further, let’s clarify the two metrics:
“Response bias” is quantified by β (the likelihood ratio) and C (the report criterion), and reflects decision strategy;
“Perceptual sensitivity” is measured by d' (d prime), and represents true detection ability.
1 What does signal detection theory assume?
In our previous article, we mentioned that any process involving stimulus reception and response encompasses both a sensory process and a decision-making process.
Sensory process: Depends on the nature of the external stimulus;
Decision-making process: Influenced by the observer’s subjective factors.
2 What are the four outcomes of signal detection theory?
Signal detection theory (SDT) assumes that the sensory and decision-making processes are independent. Consequently, the corresponding measures of response bias and perceptual sensitivity (d') are also independent.
As we have discussed multiple times before, there are four possible types of response:

| | Signal present (SN) | Noise only (N) |
| --- | --- | --- |
| Respond “yes” | Hit (H) | False alarm (FA) |
| Respond “no” | Miss (M) | Correct rejection (CR) |
The probabilities of these four response types are related as follows:
P(H)+P(M) = 1
P(FA)+P(CR) = 1
Where:
P(H) is the hit rate, P(M) the miss rate, P(FA) the false-alarm rate, and P(CR) the correct-rejection rate.
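As a quick sanity check on these relations, here is a minimal Python sketch that estimates the four rates from raw counts; the trial counts are made up for illustration only:

```python
# Hypothetical trial counts for illustration only (not data from the article)
hits, misses = 42, 8                         # signal-present trials: H + M = 50
false_alarms, correct_rejections = 12, 38    # noise-only trials: FA + CR = 50

p_hit = hits / (hits + misses)                                    # P(H)
p_miss = misses / (hits + misses)                                 # P(M)
p_fa = false_alarms / (false_alarms + correct_rejections)         # P(FA)
p_cr = correct_rejections / (false_alarms + correct_rejections)   # P(CR)

print(f"P(H) + P(M)  = {p_hit + p_miss:.2f}")    # always 1
print(f"P(FA) + P(CR) = {p_fa + p_cr:.2f}")      # always 1
```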
3 What is response bias in signal detection theory?
There are two ways to quantify response bias in SDT: the likelihood ratio (β) and the report criterion (C).
3.1 What is the likelihood ratio?
Each time the subject receives a stimulus, a psychological perceptual magnitude x is generated; the ambiguous cases are those in which x falls within the overlapping region of the noise distribution (N) and the signal-plus-noise distribution (SN). The blue curve represents the pure noise distribution (N), while the red curve represents the signal-plus-noise distribution (SN). Now, consider a randomly presented stimulus that elicits a perceptual value x:
If x is weak and lies to the left of X₁ (far from the SN distribution), the subject will judge it as noise;
If x is strong and lies to the right of X₃ (far from the N distribution), the subject will judge it as a signal;
If x falls between X₁ and X₃, the subject may judge it as either noise or a signal, depending on the likelihood ratio.
For an x located between X₁ and X₃, a vertical line is drawn from the x-axis, intersecting both the SN and N distributions at two points:
The height of the intersection with the SN distribution is denoted O(SN);
The height of the intersection with the N distribution is denoted O(N).
The ratio O(SN)/O(N) is the likelihood ratio, denoted as lₓ.

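As a rough numerical illustration, the sketch below computes lₓ for a few values of x, assuming (as is conventional, though not stated above) that N and SN are unit-variance normal distributions; the separation d' = 1.5 is an arbitrary choice for the example:

```python
from scipy.stats import norm

def likelihood_ratio(x, d_prime=1.5, sigma=1.0):
    """l_x = O(SN) / O(N): ratio of the SN and N density heights at x.

    Assumes N ~ Normal(0, sigma) and SN ~ Normal(d_prime, sigma);
    d_prime = 1.5 is an illustrative value, not a number from the article.
    """
    o_sn = norm.pdf(x, loc=d_prime, scale=sigma)  # height of the SN curve at x
    o_n = norm.pdf(x, loc=0.0, scale=sigma)       # height of the N curve at x
    return o_sn / o_n

# l_x grows as x moves away from the N distribution and toward the SN distribution
for x in (0.25, 0.75, 1.25):
    print(f"x = {x:.2f}  ->  l_x = {likelihood_ratio(x):.2f}")
```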
To better visualize the impact of different **likelihood ratios (lₓ)**, we analyze three key points:
Point A (between X₂ and X₃)
β = 4.38
Interpretation: a β well above 1 indicates a strict criterion, where the subject avoids false alarms at the cost of more misses.
Point B (exactly at X₂)
β = 1.00
Interpretation: β = 1 reflects a neutral criterion, balancing the hit rate and the false-alarm rate.
Point C (between X₁ and X₂)
β = 0.28
Interpretation: a β below 1 suggests a lenient criterion, prioritizing hits but accepting more false alarms.
General rule:
β > 1 → strict criterion (conservative strategy);
β ≈ 1 → neutral criterion;
β < 1 → lenient criterion (liberal strategy).
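This general rule is easy to encode; the small helper below (with an arbitrary tolerance for “approximately 1”) classifies the three example points:

```python
def criterion_type(beta, tol=0.05):
    """Classify a likelihood-ratio criterion according to the general rule above."""
    if abs(beta - 1.0) <= tol:
        return "neutral"
    return "strict (conservative)" if beta > 1.0 else "lenient (liberal)"

# The three example points A, B, and C
for beta in (4.38, 1.00, 0.28):
    print(beta, "->", criterion_type(beta))
```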

3.2 The payoff matrix
Under constant signal strength and sensitivity (d'), the optimal decision criterion β_opt for maximizing expected payoff is determined by:
β_opt = [P(N) / P(S)] × [(V(CR) + C(FA)) / (V(H) + C(M))]
where P(S) and P(N) are the prior probabilities of signal and noise, V(·) is the value of a correct response, and C(·) is the cost (entered as a positive magnitude) of an error.
Key dynamic relationships:
When signal probability P(S) increases:
Rational decision-makers should relax the criterion (decrease β);
Behavioral manifestation: leftward shift of the decision threshold;
Example: β_opt may drop from 1.5 to 0.6 when P(S) increases from 20% to 60%.
When P(S) decreases:
A stricter criterion (higher β) should be adopted;
Behavioral manifestation: rightward shift of the threshold;
Example: β_opt may rise from 0.6 to 2.2 when P(S) decreases from 60% to 10%.
Payoff adjustments:
Increasing the hit reward V(H) lowers β_opt;
Increasing the false-alarm cost C(FA) raises β_opt;
Security-system case: setting a high miss cost C(M) can drive β_opt down to approximately 0.3.
Boundary condition:
When V(CR) + C(FA) = V(H) + C(M), β_opt = P(N)/P(S).
Medical-diagnosis example: β ≈ 0.1 keeps the miss rate minimal when the cost of missing the disease far outweighs the cost of a false alarm.
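To make the calculation concrete, here is a minimal Python sketch of the β_opt formula above; the priors and payoff values are made-up numbers for illustration only:

```python
def beta_opt(p_signal, v_hit, c_miss, v_cr, c_fa):
    """Optimal likelihood-ratio criterion for maximizing expected payoff.

    beta_opt = [P(N)/P(S)] * [(V(CR) + C(FA)) / (V(H) + C(M))],
    with costs entered as positive magnitudes.
    """
    p_noise = 1.0 - p_signal
    return (p_noise / p_signal) * (v_cr + c_fa) / (v_hit + c_miss)

# Symmetric payoffs: beta_opt reduces to P(N)/P(S) (the boundary condition above)
print(beta_opt(p_signal=0.2, v_hit=1, c_miss=1, v_cr=1, c_fa=1))   # 4.0

# Raising the signal probability relaxes the optimal criterion
print(beta_opt(p_signal=0.6, v_hit=1, c_miss=1, v_cr=1, c_fa=1))   # ~0.67

# A high miss cost (e.g., a security screen) pushes beta_opt well below 1
print(beta_opt(p_signal=0.5, v_hit=1, c_miss=6, v_cr=1, c_fa=1))   # ~0.29
```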
To keep this article clear and concise, we have focused on the first key measure of response bias, the likelihood ratio, and its relationship with the payoff matrix. Through the concrete examples above, we hope readers now have a better grasp of how to calculate the optimal decision criterion β_opt for maximizing expected payoff.
In our next article, we will further explore:
The nature and application of the report criterion
Perceptual sensitivity in signal detection theory
The calculation method and interpretation of the core metric d' (d prime)
Stay tuned, subscribe to Bayeslab, and master the wisdom of statistics at low cost with the AI Agent Online tool.