
User manual

From The Open Source Analysis of Competing Hypotheses Project

Revision as of 19:44, 17 August 2010 by Matt

This manual covers the mechanics of the ACH methodology and gives instructions for using the ACH software. It presumes, however, prior knowledge of the methodology and its purpose.

Note: the entirety of this manual is also available in the software package.

Hypotheses

Deciding what hypotheses to evaluate is the first step in ACH. Hypotheses must be carefully considered and worded, as they form the foundation on which the analysis is built. Input from several different analysts with different perspectives is strongly encouraged.

A hypothesis is a testable proposition about what is true, or about what has, is, or will happen. It should usually be worded as a positive statement that can be disproved. Try to develop hypotheses that meet two tests:

  • the hypotheses cover all reasonable possibilities, including those that seem unlikely but not impossible,
  • the hypotheses are mutually exclusive. That is, if one hypothesis is true, then all other hypotheses must be false.

Sometimes it is useful to think of the hypotheses as a set of unique stories about how an event played out or will play out. As evidence is collected and added to the matrix, you may find that hypotheses need to be reworded, refined, split into two hypotheses, combined, added, or deleted.

When deciding whether to include an unlikely hypothesis, consider whether the hypothesis is virtually impossible or simply unproven because there is no evidence for it. For example, the possibility that an adversary is trying to deceive you should not be rejected just because you see no evidence of deception. If deception is done well, you should not expect to find evidence of it readily at hand. The possibility should not be rejected until it is disproved, or, at least, until after you have made a systematic search for evidence and found it lacking.

Creating and Editing Hypotheses

Hypotheses are created via a page that asks for a short, descriptive label and a more detailed explanation.

Only the Project Owner has permission to add hypotheses. You may do so immediately after creating a project, or by clicking the Enter Hypotheses tab underneath the large Project heading.

On the Enter Hypotheses page, type a brief description of your first hypothesis into the Label field; what you type here will always be displayed at the top of your hypothesis columns. If you like, you can add more details to this hypothesis in the Description field. When viewing the matrix, you can reveal this detailed description by moving your mouse over the label.

When finished, click Save.

If you are adding a new hypothesis to an ongoing project, be sure to consult with the other group members before doing so. Adding a new hypothesis may alter their understanding of the existing hypotheses, and thus their consistency scores.

Only the project owner may edit hypotheses. This is because different members of the project will be filling in their matrices at different times. It is important that evaluations of hypotheses not be affected by small changes in wording.

If you are the project owner, you can edit a hypothesis by clicking its label at the top of a matrix. On the following page, click the "Edit hypothesis information" link. This page will also let you delete the hypothesis using the red "Delete hypothesis" link.

Evidence

How do you choose your evidence? The word "evidence" is interpreted very broadly. It refers to all factors that influence your judgment about the hypotheses. This includes assumptions and logical deductions as well as specific items of intelligence reporting. Assumptions or logical deductions about capabilities, intentions, or how things normally work in the foreign country or culture involved are often more important than hard evidence in determining analytical conclusions.

The absence of evidence is also evidence and must be noted. For example, if you are analyzing the intention of an adversary to launch a military attack, the steps the adversary has not taken may be more significant than the observable steps that have been taken. Ask yourself the following question for each hypothesis: If this hypothesis is true, what are all the things that must have happened, or may still be happening, and what evidence of this should I expect to see? Then ask: If I am not seeing this evidence, why not? Is it because it is not happening, it is being concealed, or because I have not looked for it?

Not all evidence needs to be included. For some types of issues, such as warning analysis, it may be useful to prune the older evidence. A collection of older evidence is likely to bias the analysis in favor of concluding that the status quo will continue. If there is going to be a significant change (such as a military attack, a coup d'etat in the near future, or a surprising election victory or defeat), it may well be apparent only from the recent evidence.

If you are uncertain whether an item of evidence is true or deceptive, it is often advisable to enter that evidence twice, once with the assumption that it is not deceptive, and once with the assumption that it is deceptive. For example, a foreign leader makes a public statement of intentions, such as "we have no interest in developing weapons of mass destruction." You cannot rate the consistency or inconsistency of that statement with your hypotheses without pre-judging whether the foreign leader is being truthful or trying to hide a weapons development program. The solution is to enter and rate that item of evidence twice, once with the assumption that it is true, and once with the assumption it is deceptive. Entering this evidence twice with different consistency ratings forces you to think about the deception, and this is preferable to entering it only once and rating it as consistent with all hypotheses. In this case, you may also want to put a check in the flag column for both entries to remind you that you have entered the same evidence item twice.

Creating and Editing Evidence

To enter evidence, click on the Enter Evidence/Arguments link beneath the project header on any page. On the next page, you'll be given two options: Single and Multiple. By default, you are given the single item entry form. Details on some of the different evidence information fields--Notes, Type, Date/Time, and Code--are below.

The Multiple option lets you add evidence in bulk. Many analysts store evidence in a spreadsheet. ACH's Multiple Evidence Entry tool lets you quickly add many evidence items if you've been storing them in this manner. However, entering your evidence like this will only allow you to insert two pieces of information per item: Evidence Name (a short description) and Evidence Notes (a more detailed description). Other information like Date/Time and Type will have to be inserted manually at a later time. To import multiple items from a spreadsheet, follow these steps:

  • Review your evidence spreadsheet and ensure that you are using two columns: one to name the evidence, and another to describe it.
  • Copy your evidence. Using your mouse, click and drag across your evidence items so that both columns and all rows are highlighted. Then, from the Edit menu, choose Copy.
  • Paste the contents into the box on the Multiple Evidence Entry page, and click Save.
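
When you copy two spreadsheet columns, most spreadsheet programs place them on the clipboard as tab-separated text, one row per line. The steps above can be sketched in Python; the function name and the example rows are hypothetical, not part of the ACH software:

```python
# Illustrative sketch of parsing a two-column spreadsheet paste into
# (Evidence Name, Evidence Notes) pairs. Spreadsheets paste as
# tab-separated text, one row per line.

def parse_bulk_evidence(pasted_text):
    """Split pasted spreadsheet rows into (name, notes) pairs."""
    items = []
    for line in pasted_text.splitlines():
        if not line.strip():
            continue  # skip blank rows
        name, _, notes = line.partition("\t")  # split on the first tab
        items.append((name.strip(), notes.strip()))
    return items

pasted = ("Troop movements\tReported near the border on 3 May\n"
          "Public denial\tLeader's televised statement")
print(parse_bulk_evidence(pasted))
```

Rows without a tab simply end up with empty Notes, which matches the manual's point that Notes are optional.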

Evidence Notes

The Evidence Notes section of the Enter Evidence/Arguments page is for entering information beyond what can be put in the matrix. The Notes may include a fuller description of the evidence, a reference to the source of the evidence, a hyperlink directly to the source document, or extracts or a full copy of the report. You can reveal the Notes while viewing a matrix by moving your mouse over the evidence labels in the left-most column. The Notes are also displayed on each evidence item's dedicated page, which you can reach by clicking the evidence label links.

An item of evidence is not required to have a Note. In a collaborative project, the entire team of analysts shares a single set of Notes for each evidence item; changing the Notes field will change it for every other member of the project.

Enter Type of Evidence

Type of evidence often refers to the type of source that provided the evidence, but any other analytically useful set of categories may be used. For intelligence analysis, common types of evidence include such categories as HUMINT, SIGINT, Imagery, Open Source, Assumption, Logical Deduction, and Absence of Evidence. In a criminal investigation, it may be appropriate to have categories such as Police Report, Eye Witness Account, Interview, and Forensic Evidence. A counterintelligence investigator might use the column to catalogue Motive, Opportunity, Means, and Character Assessment. Your own installation may have evidence types customized for your industry.

Type of evidence can be entered on the Evidence page when adding evidence, and later by editing the evidence item (see Editing Evidence below). Sorting and analyzing the evidence by type can provide clues to the reliability of sources and signal possible deception. If all types of sources are telling a consistent story, that is a good sign. If not, try to figure out why. Are some sources vulnerable to manipulation for the purpose of deceiving you?

To sort by type of evidence, click on the Sort Evidence button and then select the Type variable. This will sort the types of evidence alphabetically. Note that when sorting evidence, you can sub-sort it by another variable. For example, if you were to choose Type as your first sorting variable and Diagnosticity as your second, you would easily be able to identify the most diagnostic or discriminating items of HUMINT or Imagery.
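
The two-level sort described above can be sketched as a sort on a compound key. The field names and diagnosticity values below are assumed for illustration only; they are not the software's internal representation:

```python
# Illustrative sketch: sort evidence by Type (alphabetical), then by
# Diagnosticity within each type, most diagnostic first.

evidence = [
    {"label": "Intercept A", "type": "SIGINT",  "diagnosticity": 2},
    {"label": "Photo B",     "type": "Imagery", "diagnosticity": 3},
    {"label": "Report C",    "type": "SIGINT",  "diagnosticity": 5},
    {"label": "Photo D",     "type": "Imagery", "diagnosticity": 1},
]

# Primary key: type; secondary key: diagnosticity, negated so higher comes first.
evidence.sort(key=lambda e: (e["type"], -e["diagnosticity"]))
print([e["label"] for e in evidence])
# -> ['Photo B', 'Photo D', 'Report C', 'Intercept A']
```

Within each Type, the most discriminating items now appear first, which is what makes it easy to spot the most diagnostic HUMINT or Imagery reports.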

When evidence is entered and then sorted by diagnosticity or any other characteristic, the hypotheses are reordered by Inconsistency Score or Weighted Inconsistency Score, with the most likely hypothesis to the left. To facilitate consistent entering of data, a click on the Enter Evidence/Arguments button returns both the evidence and the hypotheses to their original order.

Enter Serial Number

If your evidence is derived from a source document with a unique serial number, or if it came from a Web page with a URL, enter that serial number or URL into the Serial Number field. This will help you keep an accurate record of your evidence. It will also help you find other analysts working on similar problems: if you click on an evidence item from the matrix, you'll be taken to that item's dedicated page. The serial number will be displayed on that page, and clicking the link next to it ("Who else is using this?") will show you all other ACH projects in your organization that are using that source document.

Enter Date/Time

The date or time when events occurred is significant for some analyses, especially counterintelligence or criminal investigations. You can enter the date and time of events on the Enter Evidence/Arguments page, under the Date/Time field. (You can always change this later using Edit Evidence.)

When viewing the matrix, this information is hidden by default in order to simplify the presentation and make more room to list hypotheses. To show the Date/Time column, click the Show Data Columns link and check the Date/Time box. You can sort by Date/Time by clicking on the column header, or by using the Sort Evidence tools. Doing so will turn your matrix into a chronological list of events.

In order for the evidence to sort correctly, you must use the following format: YYYY-MM-DD HH:MM:SS. Both the date and the time are optional; however, if you choose to enter a time, you must also enter a date.
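
The reason this format sorts correctly is that it puts the most significant field (year) first, so plain alphabetical ordering matches chronological ordering. A quick illustration:

```python
# YYYY-MM-DD HH:MM:SS strings sort chronologically even when treated as
# plain text, because year, month, day, hour, minute, second appear in
# order of significance.

timestamps = [
    "2010-08-17 19:44:00",
    "2009-12-31 23:59:00",
    "2010-01-05 08:00:00",
]
print(sorted(timestamps))  # lexicographic sort == chronological order
```

A format like MM/DD/YYYY would not have this property, which is why the software requires year-first entry.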

Enter Code

The Code function is an extra, free-entry text field that gives the analyst the flexibility to enter any additional set of categories the analyst might wish to use to sort the evidence; it is similar to "tags" on many popular Web sites. For example, you might want to code the data by country or by state. You can also use this column to track a specific stream of HUMINT reporting, or to identify reporting from a specific news organization like the BBC. The Code function can also be used to flag specific reports that you suspect might constitute denial, deception, or inaccurate reporting. The Code column sorts alphabetically or numerically.

The default view of the matrix hides the Code column. To show this column, click the Show Data Columns link and check the Code box. To sort by Code, click the Code column's header, or use the Sort Evidence tools.

Editing Evidence

All of the above attributes can be changed. To edit evidence, click on the evidence label in any matrix; each evidence label in a matrix is hyperlinked to a dedicated page about that piece of evidence. On that page, click the Edit Evidence Information link. Make the appropriate changes and click the Save Changes button.

Only the project owner may delete evidence. If you are the project owner, you'll notice a red Delete Evidence Record link next to the Edit link. Upon clicking this link, you'll be asked to verify the deletion.

Finding Evidence

Because the order of evidence is changed whenever it is sorted, it may become hard to find a particular piece of evidence in the matrix. The easiest way to find it is with your Web browser's Find command, under the Edit menu in Internet Explorer and Firefox.

ACH also automatically numbers each item of evidence as it is entered. This reflects the order in which evidence was entered, and this numbering cannot be changed. This number is hidden by default. To reveal it, choose Order Added from the Show Data Columns menu. You can sort and reverse sort the evidence by clicking the Order Added column header. For other sorting methods, see the Sort Evidence section.

Rating the Consistency of Your Evidence

After entering your evidence and hypotheses, the next step is to assess whether each item of evidence is Consistent or Inconsistent with each hypothesis. For each hypothesis, ask yourself: if this hypothesis were true, is it likely that I would see this evidence? If the answer is "Yes," change the consistency rating to show that the evidence is Consistent (C) with the hypothesis; if "No," mark it as Inconsistent (I). If it is Very Consistent or Very Inconsistent, mark it CC or II. The Very Consistent (CC) and Very Inconsistent (II) ratings are used when you have a high degree of confidence in providing the rating or believe the item of evidence makes a compelling case supporting or contradicting a hypothesis.

Evidence may also be Neutral (N) or Not Applicable (NA) to some hypotheses. The Neutral (N) designation is used when there are alternative interpretations of the evidence, one Consistent and the other Inconsistent. The Not Applicable (NA) rating is appropriate when an item of evidence is clearly not applicable to one or more of the hypotheses.

You may sometimes find that you are not qualified to comment on the consistency of some evidence; for instance, if you are a human factors expert and the evidence concerns chemical weapons. In such cases, you are free to leave the cell blank.

How to Start Rating

To enter or edit Consistency ratings, go to your Personal Matrix and click the "Edit your consistency scores" link. Your matrix cells will become pop-up lists. Scoring each cell is as simple as choosing the appropriate scores from the pop-up lists. Each score is instantly saved the moment you select it. If your computer crashes while you're in the middle of scoring a matrix, all of the scores entered up to that point are safe.

Also note that while you enter scores, the hypothesis scores at the top of the matrix do not change. Updated hypothesis scores will be displayed once you click the "stop editing" link.

Working Across the Matrix

In entering the Consistency ratings, it is essential that you work across the matrix, assessing the Consistency of the evidence, one item at a time, against each of the hypotheses. Do not work down the matrix. That is, do not take one hypothesis at a time and assess the consistency or inconsistency of all the evidence for that single hypothesis.

This procedure enables you to assess what is called the "diagnostic" value of the evidence. Diagnosticity of evidence is an important concept that is, unfortunately, unfamiliar to many analysts. Evidence is diagnostic when it is Inconsistent with one or more hypotheses and Consistent with others. That is, it influences your judgment on the relative likelihood of the various hypotheses. If an item of evidence is Consistent or Inconsistent with all hypotheses, it has no diagnostic value. In doing an ACH analysis, it is a common experience to discover that much of the evidence supporting what you believe to be the most likely hypothesis is really not helpful, because the same evidence is consistent with all the other hypotheses. In this case, it would be misleading to use this particular item of evidence to support your analytic conclusion. In some cases, it might even prove counterproductive because someone reading your assessment who supports a contrary hypothesis could argue that the evidence you cite also is consistent with their view.

If an item of evidence is Very Inconsistent (II) with a hypothesis, and if that evidence rates high on Credibility, this is a strong indicator that the hypothesis is unlikely. To sort by Diagnosticity, go to Sort Evidence and select Diagnosticity.

The standards you use for judging Consistency and Credibility sometimes evolve as you gain a better understanding of the relationship between the hypotheses and the evidence. After entering all the evidence, go back and make sure the judgments are consistent. Change any ratings you now see in a different light. You may need to do this several times during the course of the analysis.

ACH uses the Consistency ratings to calculate an Inconsistency Score for each hypothesis. Evidence that is Inconsistent (I) counts one point, and Very Inconsistent (II) counts two points. Because the focus is on refuting hypotheses rather than confirming them, only Inconsistent and Very Inconsistent evidence is counted in the Inconsistency Score.
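
The scoring rule above is simple enough to sketch directly: Inconsistent counts one point, Very Inconsistent counts two, and every other rating (CC, C, N, NA, or blank) counts zero:

```python
# Inconsistency Score as described in the manual: I = 1 point, II = 2
# points; Consistent, Neutral, N/A, and blank ratings contribute nothing.

POINTS = {"I": 1, "II": 2}

def inconsistency_score(ratings):
    """Sum inconsistency points down one hypothesis's column of ratings."""
    return sum(POINTS.get(r, 0) for r in ratings)

# One hypothesis rated against five items of evidence:
print(inconsistency_score(["C", "I", "II", "N", "I"]))  # -> 4
```

Note that Consistent ratings do not subtract points; because ACH focuses on refuting hypotheses, a hypothesis is favored only by accumulating fewer inconsistency points than its rivals.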

Working with Matrices

The core of an ACH project is a matrix of hypotheses and evidence. There are a few different ways to look at the matrix:

  • The Personal Matrix displays your own consistency/inconsistency ratings.
  • The Group Matrix shows you how much your team agrees about each evidence-hypothesis pair.
  • You can view the matrix with a teammate's consistency scores.
  • Finally, you can generate a comparison matrix: this looks just like the Group Matrix, only it compares your scores with those of one other team member instead of the entire team.

Personal Matrix

The Personal Matrix is for your own consistency scores. Here, you will assess the consistency of each piece of evidence against each hypothesis. To begin, click the "Edit Consistency Scores" link. When you do this, that link turns into a link to "stop editing."

As you complete the matrix, be sure to work across the matrix, not down. For each hypothesis, ask yourself: if this hypothesis were true, is it likely that I would see this evidence? (For more tips, see our Help article on rating consistency scores.) You do not have to complete the entire matrix; for example, if there is a piece of evidence that requires technical expertise you do not have, you are free to leave those cells blank.

Be sure to move your mouse over the evidence and hypothesis headers, as doing so will expose more details about these items. This is very helpful when assessing consistency.

When finished, click the "stop editing" link. When you do this, the hypothesis scores will adjust to reflect your new ratings. You can come back at any time and change your ratings.

Group Matrix

The Group Matrix shows the same set of evidence and hypotheses, only instead of filling the cells with inconsistency scores, it shows you how much consensus there is among you and your teammates. Each member of an analytical team completes a matrix. The Group Matrix then highlights the root causes of your disagreements, so that you can have a more productive discussion.

As you move your mouse over the cells, each member's score for that cell is revealed.
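
The manual does not specify how the Group Matrix quantifies consensus; purely as an illustration of the idea, one simple measure is how many distinct ratings the team gave a cell:

```python
# Hypothetical disagreement measure for one evidence-hypothesis cell:
# the number of distinct ratings beyond the first. Zero means the team
# is unanimous; higher values flag cells worth discussing.

def disagreement(cell_scores):
    """cell_scores: each team member's rating for one cell."""
    return len(set(cell_scores)) - 1

print(disagreement(["I", "I", "I"]))   # unanimous -> 0
print(disagreement(["C", "I", "II"]))  # three different views -> 2
```

Whatever the actual formula, the point is the same: the cells with the most divergent scores are where discussion time is best spent.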

Compare Matrices and User Matrices

At the bottom of the Group and Personal Matrices, you are given options for two other types of matrices: one that lets you view another member's Personal Matrix, and one that lets you compare one member's scores with another's.

Matrix Tools

ACH gives you several options for manipulating matrices and making them more manageable:

  • Adjust Column Width: If you need additional space for your hypothesis columns, simply drag the bottom-right corner of your browser window to adjust its size. ACH matrices will automatically adjust their widths accordingly.
  • Hide/Show Columns: To save space in the matrix, it is possible to hide or show certain columns that are used less frequently than others. This is done by clicking on the Show Data Columns tab just above the matrix and then checking the columns you want to appear in the matrix.
  • Printing: To print a matrix, click the Printer icon or select Print from the File menu. If you have created multiple matrices, the program will only print one matrix at a time; to print all matrices, select and print each one individually. You may need to adjust the column widths or slide the panel dividers so that the printed matrix fits on one page.
  • Matrix Duplication: If you are part of a multi-member ACH project, you may want to begin a private, solo project based on the same set of evidence and hypotheses. To do this, use the Duplicate Matrix link from your Personal Matrix. This will create an entirely separate project, with you as the sole member.

ACH as a Collaborative Process

ACH is an excellent framework for collaboration among analysts. Although the software tool can be used by an individual analyst, the cross-fertilization of ideas when a group of analysts work together as a team helps analysts avoid personal bias and generates more and better ideas. When a team approach is adopted, the project can combine inputs and insights from analysts with different backgrounds. When analysts disagree, the Group Matrix can be used to highlight the precise area of disagreement. Subsequent discussion can then focus productively on the ultimate source of the differences.

It is also possible to do a sensitivity analysis to see how alternative interpretations of the evidence or different assumptions affect the likelihood of the alternative hypotheses. This often helps resolve, or at least narrow down, areas of disagreement. It is also possible to go back and enter different ratings and see how any single change or set of changes affects the overall likelihood of the various hypotheses. In this way, it is possible to clearly identify which disagreements are important to resolve and which really aren't worth arguing about.
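
Sensitivity analysis of this kind amounts to: change one rating, recompute the Inconsistency Score (I = 1 point, II = 2 points, as described earlier in the manual), and compare. A minimal sketch with illustrative data:

```python
# Sensitivity sketch: see how reinterpreting a single item of evidence
# changes a hypothesis's Inconsistency Score.

POINTS = {"I": 1, "II": 2}

def score(ratings):
    return sum(POINTS.get(r, 0) for r in ratings)

baseline = ["C", "I", "II", "N"]
print(score(baseline))        # -> 3

# Reinterpret the second item as Consistent instead of Inconsistent:
alternative = ["C", "C", "II", "N"]
print(score(alternative))     # -> 2; the hypothesis now looks more likely
```

If a one-point swing like this changes which hypothesis ranks first, that disagreement is worth resolving; if the ranking is unchanged, the dispute over that item probably is not worth arguing about.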

A collaborative ACH project can be implemented with a four-step process:

  • Identify an agreed-upon set of data relevant to the topic; remember to include assumptions, logical deductions, the absence of data, and conclusions from other analyses.
  • Convene a structured brainstorming session with a diverse group of analysts to identify all the potential hypotheses.
  • Commission one analyst or a small group of analysts to work independently--and on their own time schedules--to load the data. Then invite all interested analysts to take part in the project by analyzing the data in their own Personal Matrix. This might take several days or weeks. This should provide independent validation of the key conclusions, and because they are working from their own desks instead of together, the likelihood of groupthink is minimized.
  • Reconvene the larger group to assess the results of the working group. Use the Group Matrix to determine the sources of disagreement. Focus on what data emerges as most diagnostic, the most persuasive reasons for discounting hypotheses, the credibility of the data supporting the most likely hypotheses, and the most productive areas for future research or collection.

Collaborative Features

ACH features a number of tools to help teams collaborate:

Evidence-Based Networking:

ACH allows you to find others in your organization who are using the same source documents you are. Often, analysts doing similar work are not aware of one another. An excellent way of finding such people is by identifying those who work with the same documents as you. ACH makes this easy.

If your evidence is based on a document with a unique ID, such as a URL or serial number, you can enter this ID into the Serial Number field on the evidence page. The ID will then be displayed on that page, and clicking the link next to it ("Who else is using this?") will show you all other ACH projects in your organization that are using that source document.

Group Matrix:

If your ACH project has multiple members, each member, like you, will complete a Personal Matrix. Naturally, you and your counterparts will disagree on some of the consistency scores. The Group Matrix automatically shows you where those disagreements lie. Rolling your mouse over the cells will reveal how each project member scored the cell. See the Group Matrix section above to learn more about what you can do with it.

Persistent Chat:

Each ACH project has a dedicated chat room. When viewing your project, you can access the room by clicking the Chat button in the lower left corner of the page.

Unlike most chat rooms, you do not have to be present in the room to read what others have been saying. The ACH chat room displays the entire transcript since the project's creation, and any project member may participate.

Message Boards:

Along with the chat tool, ACH provides a separate message board for each component of your project: every hypothesis, evidence item, and evidence-hypothesis pair has a dedicated page, the bottom half of which is reserved for discussion.

You can reach the evidence and hypothesis pages from any matrix by clicking the text that corresponds to that item. For the evidence-hypothesis pair pages, click the cell while viewing the Group Matrix.


Original source: Richards Heuer