Objective: This study addresses the human factors challenge of designing and validating decision support that promotes less biased intelligence analysis. Background: Confirmation bias can compromise objectivity in ambiguous medical and military decision making through the neglect of conflicting evidence and through judgments that do not reflect the entire evidence spectrum. Previous debiasing approaches have had mixed success and have tended to place additional demands on users' decision making. Method: Two new debiasing interventions that help analysts picture the full spectrum of evidence, the relation of evidence to a hypothesis, and other analysts' evidence assessments were manipulated in a repeated-measures design: (a) an integrated graphical evidence layout, compared with a text baseline, and (b) evidence tagged with other analysts' assessments, compared with participants' own assessments. Twenty-seven naval trainee analysts and reservists assessed, selected, and prioritized evidence in analysis vignettes carefully constructed to contain balanced sets of supporting and conflicting evidence. Bias was measured at all three evidence analysis steps. Results: A bias toward selecting a skewed distribution of confirming evidence occurred across conditions. However, graphical evidence layout, but not other analysts' assessments, significantly reduced this selection bias, yielding more balanced evidence selection. Participants systematically prioritized the most supportive evidence as the most important. Conclusion: Domain experts exhibited confirmation bias in a realistic intelligence analysis task and apparently conflated evidence supportiveness with importance. Graphical evidence layout promoted more balanced, less biased evidence selection. Application: The results apply to real-world decision making, carry implications for basic decision theory, and offer lessons on how well-designed visualizations can help reduce bias.