
Mark A. Gallagher and M. Kent Taylor
Linking Military Service Budgets to Commander Priorities


October 10, 2011

In May 2011, the Government Accountability Office (GAO) reported to the congressional armed services committees regarding the influence of the U.S. combatant commanders (COCOMs) on the development of joint requirements as part of the Department of Defense (DOD) acquisition process.1 The increased COCOM role in developing joint requirements was legislated by the Weapon Systems Acquisition Reform Act of 2009 and the Ike Skelton National Defense Authorization Act for Fiscal Year 2011. The GAO reported mixed results regarding the implementation of the legislation: specifically, it found that the COCOMs are now enfranchised vis-à-vis the Joint Capabilities Integration and Development System (JCIDS) for the development of military requirements. However, the COCOMs remain at the mercy of the Services when it comes to actually developing the DOD budget and acquiring materiel; the COCOMs provide only "advisory guidance to the larger acquisition and budget processes."


In this article, we review the establishment of the COCOMs under the Goldwater-Nichols Department of Defense Reorganization Act of 1986, briefly discuss the current DOD resource allocation process, and then propose a construct to evaluate the extent to which the DOD budget is aligned with COCOM operational requirements. We also discuss how this proposal could be implemented, along with its advantages and potential concerns.

Worth noting is the scope of this proposal. Specifically, it is not our intent to create an algorithm in which budget numbers and COCOM priorities are simply mashed together and the output declared to be the DOD budget. This analytical approach is intended to inform DOD decisionmakers, not to make their decisions for them; that is, we do not intend to turn the DOD budget into an engineering problem. Furthermore, we are not proposing any changes to current authorities; we are simply proposing quantifiable and tractable measures of how well military department budgets align with COCOM priorities.

Establishment of COCOMs

On July 15, 1985, President Ronald Reagan signed Executive Order 12526, creating the President's Blue Ribbon Commission on Defense Management, chaired by David Packard.2 The final report, released in June 1986, quotes President Reagan's direction for the Packard Commission:

The primary objective of the Commission shall be to study defense management policies and procedures, including the budget process, the procurement system, legislative oversight, and the organizational and operational arrangements, both formal and informal, among the Office of the Secretary of Defense, the Organization of the Joint Chiefs of Staff, the Unified and Specified Command System, the Military Departments, and the Congress.

The Packard Commission presented findings and recommendations organized around four topics in their interim report as submitted to the President on February 28, 1986:

  • national security planning and budgeting
  • military organization and command
  • acquisition organization and procedures
  • government-industry accountability.

Though DOD faced an increasing demand for joint planning and operations across the domains, the commission found that the Services were planning and conducting operations as independent, often competing organizations with little collaboration and cooperation. Similarly, the commission found that each Service advocated and acquired systems to accomplish its assigned roles and missions independently, as though each Service were the primary, if not sole, producer and consumer of its materiel. In the rare instances when the Services did close ranks, they typically did so to frustrate attempts by the Secretary of Defense to impose top-down direction that would disturb their status quo.

The Packard Commission's recommendations were implemented primarily in two ways. First, National Security Decision Directive 219, dated April 1, 1986, implemented virtually all of the commission's recommendations that did not require legislative action. Second, the remaining recommendations were addressed through congressional legislation introduced on November 24, 1985, as H.R. 3622, the "Joint Chiefs of Staff Reorganization Act of 1985." Congress enacted it as the Goldwater-Nichols Department of Defense Reorganization Act of 1986, and President Reagan signed the legislation into law on October 1, 1986 (Public Law 99–433).3 Goldwater-Nichols made sweeping changes to U.S. Code Title 10 that continue to shape the management and functions of DOD. The overall congressional intent of the Goldwater-Nichols legislation was outlined in Section 3:

  • to reorganize DOD and strengthen civilian authority in the department
  • to improve the military advice provided to the President, National Security Council, and Secretary of Defense
  • to place clear responsibility on the commanders of the unified and specified combatant commands for the accomplishment of missions assigned to those commands
  • to ensure that the authority of the commanders of the unified and specified combatant commands is fully commensurate with the responsibility of those commanders for the accomplishment of missions assigned to their commands
  • to increase attention to the formulation of strategy and to contingency planning
  • to provide for more efficient use of defense resources
  • to improve joint officer management policies
  • to otherwise enhance the effectiveness of military operations and improve the management and administration of DOD.

Section 211 of the Goldwater-Nichols legislation created a new chapter in U.S. Code Title 10 regarding COCOMs: subtitle A, part I, chapter 6, "Combatant Commands." Chapter 6 was subdivided into six sections addressing the following topics:

  • §161, establishment of COCOMs
  • §162, assignment of forces to COCOMs
  • §163, role of the Chairman of the Joint Chiefs of Staff (CJCS) with respect to COCOMs
  • §164, COCOM responsibilities and authorities
  • §165, administration and support to COCOMs
  • §166, COCOM budget proposals.

Though two sections and three subsections were added by subsequent legislation between 1986 and 2003, three sections of chapter 6 are of particular interest for DOD budgeting and resource allocation: §163, §164, and §166. Subsection (b)(2) of §163 specifies, among other things, that the CJCS "serves as the spokesman for the commanders of the combatant commands, especially on the operational requirements of their commands." It further specifies that the CJCS shall "evaluate and integrate" information related to COCOM requirements, "advise and make recommendations" to the Secretary of Defense regarding COCOM requirements (individually and collectively), and "communicate" COCOM requirements "to other elements of the Department of Defense." Otherwise stated, the CJCS is the intermediary between the COCOMs and the rest of DOD with respect to COCOM operational requirements.

Section 164 addresses COCOM responsibilities and authorities. Of note, subsection (b)(1) of §164 specifies that the chain of command flows from the President to the Secretary of Defense to the COCOMs. It also describes COCOM authorities for establishing subordinate commands, organizing their forces, employing their forces, and so forth. Finally, §164 establishes an advisory role for the CJCS, acting on behalf of the Secretary of Defense, in ensuring that the COCOMs have "sufficient authority, direction, and control over the commands and forces assigned to the command to exercise effective command over those commands and forces." The section leaves budget authority for the forces with the military departments so that the COCOMs may focus on their warfighting missions. The extent of COCOM authority in budgetary matters is confined to §166, which limits COCOM budget proposals to four activities: joint exercises, force training, contingencies, and selected operations.

While COCOMs should focus on warfighting rather than organizing and equipping units, they should be able to influence the types of units available. In an analogy to professional sports, the net effect of §164 and §166 is akin to a coach having full control of the team on the practice field and full control during actual games, but having very little say over who is actually on the team. Ideally, all levels of the sports franchise—ownership, coaches, players, and support staff—are working together when it comes to decisions on personnel, individual training, team practices, game tactics, and so forth.

DOD Resource Allocation Process

Three interrelated DOD decision support systems must be synchronized in order for COCOMs to have the general purpose forces they need to accomplish their assigned missions. The interrelationship of these three decision support systems is depicted in the accompanying figure, along with brief descriptions of each decision support system as posted on the Defense Acquisition University portal.4


Given the purpose of this article and the relative maturity of the COCOM role in the JCIDS process for the development of joint requirements, we focus on the DOD budgeting process and, to a lesser extent, DOD acquisition.

The Office of the Secretary of Defense (OSD) leads the annual Planning, Programming, Budgeting, and Execution (PPBE) process and builds the DOD budget. The Army, Navy (which includes the Marines), and Air Force begin the PPBE process by submitting their proposed budgets, called Program Objective Memorandums (POMs), to OSD. OSD then leads the Program and Budget Review (PBR), which adjusts the Service proposals with inputs across DOD, including the COCOMs. The PBR product, through the Office of Management and Budget, becomes the DOD portion of the President's annual budget. In turn, Congress reviews and revises the President's budget and sends approved legislation back to the President, who signs it into law. Each Service budget authorization includes funding requests to enable it to fulfill its Title 10 responsibility to organize, train, and equip forces. Consequently, the Services control the vast majority of the DOD budget.

Of the five appropriation categories in the DOD budget, three are germane to our discussion of how materiel is acquired by the Services and used by the COCOMs. In practical terms, these three categories capture what DOD is spending for future materiel, what DOD is building now, and what DOD is using now, respectively:

  • research, development, testing, and evaluation (RDT&E)
  • procurement (PROC)
  • operations and maintenance (O&M).

The current DOD budgeting process is widely perceived as inefficient. A common complaint is that the Services are somewhat parochial (and arguably myopic) in constructing their budgets, advocating and funding new systems that are typically Service- or domain-centric as though the Services were living in a pre–Goldwater-Nichols time warp. When they advocate and fund parochial systems, they often do so at the expense of funding the acquisition and O&M of materiel that would provide the COCOMs with joint capabilities (that is, capabilities across Services). Service-centric budgeting is not a new condition; it was a problem described by General Maxwell Taylor in his 1960 book The Uncertain Trumpet, which recounts the budget and strategy obstacles he faced in the Pentagon during the mid- and late 1950s. Regarding the parochialism in the Services' approach to budgeting, he wrote, "We look at our forces horizontally when we think of combat functions, but we view them vertically in developing the defense budget."5 The establishment of COCOMs has significantly modified the requirements process, but the budget process remains essentially unchanged. The COCOMs do submit Integrated Priority Lists (IPLs) to OSD and the Joint Staff indicating challenges that the budget should address. Going beyond this status quo of marginal recommendations to the PPBE process, our proposal would give each COCOM a quantifiable, prioritized input to Service budgets.

A COCOM Priority Rating Proposal

We propose a prioritized rating schema so that the Services' budget alignment with each COCOM's needs could be evaluated throughout the DOD budgeting process. In particular, we propose that the COCOMs score budget proposals using prioritized ratings to quantify the relative contribution of specific budget programs to the accomplishment of each COCOM's assigned military missions. These priority ratings would serve as quantitative evaluation criteria to be included during the PPBE process and would incentivize the Services to account for COCOM priorities in the annual Service budget submissions and deliberations.


Using the President's budget submission to Congress from the previous fiscal year as a baseline, each COCOM would provide weighted priority ratings for each Service's RDT&E, PROC, and O&M. Anyone in a Service, OSD, the Joint Staff, or Congress could then apply the COCOM prioritized ratings to proposals in the next fiscal year's budget and assess the impact of individual or collective changes. For the sake of simplicity, RDT&E and PROC are considered together as a composite acquisition category (ACQ). ACQ scores are intended to capture long-term preferences across the years common to both the baseline budget and the budget being evaluated. Using ACQ as the sum of RDT&E and PROC better indicates the full extent of an acquisition, and it allows the Service budget to shift funds among years to account for cost, schedule, or performance issues. We contend that O&M gives a short-term evaluation, so we recommend limiting it to just the next fiscal year. We also recommend evaluating exactly the same years in the baseline and the new budget to avoid distortions from production programs starting in the first year or terminating in the last year of the Future Years Defense Program (FYDP). Evaluating these two budget categories, acquisition (ACQ = RDT&E plus PROC) and O&M, will highlight the linkages and disconnects between Service budgets and the programs (that is, forces or capabilities) required by the COCOMs to accomplish their assigned military missions. The other appropriation categories, including military personnel and military construction, would align with O&M and ACQ without requiring direct COCOM ratings.
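
In notation introduced here only for illustration (the proposal itself prescribes no symbols), the acquisition figure for a program p would be summed over the set Y of fiscal years common to both the baseline and the budget under evaluation, while the O&M figure would be drawn from the next fiscal year alone:

```latex
% ACQ composite over the common FYDP years Y; O&M restricted to the next fiscal year y_1.
\[
  \mathrm{ACQ}_p = \sum_{y \in Y} \bigl( \mathrm{RDTE}_{p,y} + \mathrm{PROC}_{p,y} \bigr),
  \qquad
  \mathrm{O\&M}_p = \mathrm{O\&M}_{p,\,y_1}.
\]
```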

In table 1, we show a simplified notional example of the scoring proposal applied to four DOD budget programs: airplanes, ships, tanks, and education. We included education as representative of the large portion of Service infrastructure that does not directly affect the warfighting capability of COCOMs. In our proposal, the programs listed in the first column represent a compilation of Service program elements related to the given program. The baseline funding in the second column could be either the sum of acquisition over the FYDP or O&M for equipped units in the next fiscal year. For each COCOM, the priority rating reflects that COCOM's reliance upon the given program in meeting its assigned missions. The program score is the product of the baseline funding and the priority rating (that is, it combines the level of effort and the desirability of the particular programs and corresponding operational units). For the example in table 1, the first COCOM assigned a priority rating of 1 for tanks; since the baseline funding for tank programs is $2 billion, the program score for tanks by the first COCOM is 2. Similarly, the second COCOM rates tanks at 0.5, so the product with funding of $2 billion is a program score of 1. The bottom row shows the sum of each column. In particular, the sum of the program scores, which are the weighted products of funding and ratings, indicates the level of support those Service programs (and associated program elements) provide to each COCOM. Subtotals of these scores could be used to highlight contributions from various sources, such as individual Services or major commands.
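
A minimal sketch in Python of this scoring arithmetic follows. Only the airplane ($3 billion) and tank ($2 billion) baseline figures, the first two COCOMs' ratings for those programs, and the third COCOM's relative preference for ships over airplanes are taken from the discussion here; the remaining values are assumptions for illustration and may differ from table 1 in the original article.

```python
# Minimal sketch of the proposed COCOM priority-scoring arithmetic.
# Funding is in billions of dollars. Only the airplane ($3B) and tank ($2B)
# figures and the ratings discussed in the text are taken from the example;
# the ship and education values are assumed for illustration.

baseline_funding = {"airplanes": 3.0, "ships": 1.0, "tanks": 2.0, "education": 0.5}

priority_ratings = {
    "COCOM 1": {"airplanes": 1.0, "ships": 0.0, "tanks": 1.0, "education": 0.0},
    "COCOM 2": {"airplanes": 0.5, "ships": 0.0, "tanks": 0.5, "education": 0.0},
    "COCOM 3": {"airplanes": 1.0, "ships": 2.0, "tanks": 0.0, "education": 0.0},
}

def program_scores(funding, ratings):
    """Program score = baseline funding x priority rating, for each program."""
    return {program: funding[program] * ratings.get(program, 0.0) for program in funding}

def total_score(funding, ratings):
    """A COCOM's overall score is the sum of its program scores."""
    return sum(program_scores(funding, ratings).values())

for cocom, ratings in priority_ratings.items():
    print(cocom, program_scores(baseline_funding, ratings),
          round(total_score(baseline_funding, ratings), 2))
# The first COCOM's total is 3*1 + 2*1 = 5, matching the baseline score of 5 cited below.
```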

This example illustrates several points. First, each COCOM has a unique military mission assigned to it; therefore, each COCOM will prioritize Service programs differently according to its assigned missions and its perception of the likelihood and severity of future operations. The priority ratings reflect the COCOM commanders' subjective assessments of the relative contribution of the Services' programs to accomplishing their current and future missions within their assigned geographic areas or functional responsibilities. A priority rating of zero indicates that the given Service program does not contribute to that COCOM's mission; hence, Service funding of unrated programs could change or even be eliminated without changing the COCOM's overall score. In this example, ships do not contribute to the first two COCOMs, and tanks do not contribute to the third. COCOMs would not be expected to score programs that only indirectly support their missions, such as professional military education.

An intended consequence of this proposal is that the Services would be incentivized to reduce their indirect costs to the extent that doing so did not affect the quality, and hence the ratings, of their operational units. The Services could burden their operational units with some indirect costs; however, those additional costs would make the units appear less efficient. Priority ratings of the same value indicate that programs support a COCOM equally; hence, funding could be moved between such programs without affecting that COCOM's overall score. For example, funds could be transferred between tank units and plane squadrons without changing the first or second COCOM's overall score. Thus, COCOM priority ratings that do not vary much across programs have little impact on the total scores when Service budgets are modified. The third COCOM rated ship funding twice as valuable as plane squadrons, so every additional dollar spent on ships has twice the impact on its overall score as the same dollar allocated to plane squadrons. The more COCOM ratings vary across programs, the more sensitive the overall score is to adjustments in funding.

Let us examine how these priority ratings would be useful in evaluating a proposed alternative in the next budget cycle. Continuing the previous example, consider the alternative funding depicted in table 2. In this instance, a proposal to increase the funding for tank units from $2 billion to $4 billion changes the first COCOM's corresponding product to $4 billion times the priority rating of 1 for a score of 4, a 100 percent increase over the baseline score of 2. The decrease in funding for airplanes from $3 billion to $2 billion, with the first COCOM's priority weight of 1, causes that product to fall from the baseline of 3 to 2. This proposed measure enables anyone who knows a COCOM's priority ratings to calculate the scores that result from funding increases or decreases. Together, these funding changes raise the first COCOM's total score to 6, a 20 percent increase over the baseline value of 5. Similarly, the combined changes result in a total score increase of 20 percent for the second COCOM and a 23 percent decrease for the third COCOM.
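
Reusing the sketch above, the table 2 alternative can be evaluated in a few lines. The first two COCOMs reproduce the 20 percent changes described here; the third COCOM's result depends on the assumed ship figures, so it will not reproduce the 23 percent decrease exactly.

```python
# Evaluate the table 2 alternative: tanks $2B -> $4B, airplanes $3B -> $2B.
alternative_funding = dict(baseline_funding, airplanes=2.0, tanks=4.0)

for cocom, ratings in priority_ratings.items():
    base = total_score(baseline_funding, ratings)
    alt = total_score(alternative_funding, ratings)
    change = 100.0 * (alt - base) / base
    print(f"{cocom}: baseline {base:.1f}, alternative {alt:.1f}, change {change:+.0f}%")
# COCOM 1: 5.0 -> 6.0 (+20%) and COCOM 2: 2.5 -> 3.0 (+20%), matching the text;
# COCOM 3's result depends on the assumed ship funding and rating.
```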


The relative values, rather than the absolute values, of a particular COCOM's priority ratings are what matter. If the ratings are all multiplied by a factor, the weighted sum is multiplied by the same factor, but the percentage change is unaffected. In the example depicted in table 1, the priority ratings of the second COCOM are simply 0.5 times the first COCOM's ratings; hence, the percentage changes are the same. As long as evaluations are not concerned with the relative levels of support to different COCOMs, the absolute magnitude of an individual COCOM's ratings does not matter.
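
Stated compactly (again in notation introduced only for this illustration), with f_p the funding for program p and r_{c,p} a COCOM's rating, multiplying every rating by a positive factor k scales the total score by k but leaves the percentage change between a baseline budget f and an alternative f' unchanged:

```latex
% Rescaling a COCOM's ratings by k scales its total score by k but leaves percentage changes intact.
\[
  S_c(f) = \sum_{p} f_p\, r_{c,p}, \qquad
  \sum_{p} f_p\,(k\, r_{c,p}) = k\, S_c(f),
\]
\[
  \frac{k\,S_c(f') - k\,S_c(f)}{k\,S_c(f)} = \frac{S_c(f') - S_c(f)}{S_c(f)}.
\]
```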

In a further refinement, OSD or the Joint Staff could implement this proposal to reflect specified preferences among the various COCOMs by requiring that each COCOM's overall total score, the sum of the weighted funding, not exceed an assigned limit. A lower limit would force some combination of reduced individual ratings and a reduced number of programs with an assigned weight, with the result that Service budget changes would register a smaller assessed impact. The Secretary of Defense could assign each COCOM its total prioritized dollar sum as an indication of the relative importance of that COCOM's mission; for example, a geographic COCOM might be given a limit of twice the summed weighted program funding of a functional COCOM. Prescribed limits on the COCOMs' totals would also facilitate combining scores across COCOMs.
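
One possible mechanism for such a limit, offered here only as an illustrative assumption rather than part of the proposal, is to rescale a COCOM's ratings so that its weighted baseline total equals its assigned cap; this reuses the scoring functions from the earlier sketch.

```python
# One possible (assumed) way to enforce an OSD- or Joint Staff-assigned limit:
# rescale a COCOM's ratings so its weighted baseline total equals the cap.
def rescale_to_limit(funding, ratings, limit):
    factor = limit / total_score(funding, ratings)
    return {program: rating * factor for program, rating in ratings.items()}

# For example, cap the notional first COCOM at a weighted total of 10:
capped_ratings = rescale_to_limit(baseline_funding, priority_ratings["COCOM 1"], limit=10.0)
# total_score(baseline_funding, capped_ratings) now equals 10.0.
```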

Decisionmakers in the PPBE process may determine that the aggregate impact on the third COCOM of the proposal in table 2 is too severe. A revised funding alternative is shown in table 3. This revision increases funding for tanks and ships to compensate for the reduced airplane funding. While adjusting the funding of several programs, this alternative has no overall impact on any COCOM's aggregate measure from the baseline shown in table 1.

Implementation

OSD or the Joint Staff could direct COCOMs to provide scores as proposed. However, the approach could also be implemented partially in at least three ways:

  • Any Service could request the COCOMs to provide scores as part of the Service's internal POM preparations.
  • A COCOM could unilaterally score the previous President's budget and announce to the Services, Joint Staff, and OSD the desire to maintain the total of weighted funding.
  • A Capability Portfolio Manager (CPM) could request the COCOMs to score the Service programs under its authority.

OSD or the Joint Staff could also direct one of these partial implementations to test the value of this approach and work out the implementation details.

Advantages and Potential Concerns

This budget scoring proposal has three intended main advantages:

  • The proposal is simple, quantitative, fiscally constrained, and transparent; anyone throughout the budget process—including a Service developing its POM, OSD evaluating PBR alternatives, or Congress debating final law—may apply the approach and use the measures to evaluate a budget decision.
  • It highlights the direct linkages (and potential disconnects) between Service budgets and COCOM priorities.
  • It maintains roles and responsibilities consistent with current Title 10 regulations.

First, this proposal is not complicated to understand, implement, or evaluate: the COCOM prioritized rating schema provides a clear, transparent, quantitative indication of each COCOM's unique priorities over the vast range of Service programs. Since the COCOM ratings are confined to a baseline, such as the previous President's budget, the resulting measures are realistic. Given COCOM priorities, anyone may evaluate a proposed Service budget change and assess its impact on each of the COCOMs; these measures may influence decisions in the Services preparing their budget submissions, in PBR discussions among the Services, COCOMs, Joint Staff, and OSD, and in congressional debates and votes. The score may be evaluated for anything from a change to a single program to many changes throughout the budget. These COCOM priority weightings would be useful in evaluating Service budget options throughout the PPBE and budget enactment processes.

Whereas current COCOM requirements (for example, as reflected in the COCOM IPLs) can reflect unconstrained or unrealistic demands, this proposal produces achievable indicators of demand because the COCOMs are bound within the collective Service budgets in the baseline. Funding exactly the baseline budget again would result in each COCOM retaining 100 percent of its baseline weighted score. Restricting the COCOM ratings to a DOD budget baseline, such as the President's budget submission, enforces realism in the process, though at the expense that new and emerging joint military requirements cannot be expressed within this approach. As in the marketplace, customers can only purchase what is for sale; producers, however, must attend to customer demands to maintain future business.

Second, this budget scoring proposal would change the incentives and behavior among and within the Services. Within the Services, there are different communities (for example, major commands) that often compete for a larger portion of their Service's budget. This scoring system would likely shift the funding allocation toward programs that the various COCOMs rated as high priority. Program managers and unit commanders who want to maintain their funding would want to convince the COCOMs to rate their systems high. Presumably, the best way to advocate for COCOM support would be to deliver desired joint military capabilities. Hence, an expected benefit is increased dialogue between the Services and COCOMs regarding what the COCOMs really require and how the Services can meet these needs. Thus, the Services would be incentivized to add value as perceived by the COCOMs.

Moreover, this proposal would provide an incentive to reduce indirect costs to the extent that they do not add value for the COCOMs. The COCOMs would generally not perceive value in the indirect costs that the Services incur to provide capabilities; hence, a COCOM would be unlikely to give a priority rating to any indirect program. For example, we would not expect a COCOM to score professional military education or academic education. The Services would still want to continue education to the extent that it contributes to the perceived quality of their units. The Services would have two choices: either fund these indirect costs separately, realizing that education is not going to be scored by the COCOMs, or add the "burden" of these costs to their operational units. However, as the costs and budgets of the operational units and programs increase, the COCOMs could perceive a decrease in the benefit relative to the cost and would likely reduce the priority ratings. Thus, whether the Services leave the indirect costs separate or incorporate them into the budgets of their operational units, they would be incentivized to minimize indirect costs that are not reflected in the quality of their units, and hence in COCOM priority ratings. Like corporations, the Services would be motivated to control their indirect costs.

Third, Services, not the COCOMs, retain acquisition responsibility and authority, so the Services' Title 10 responsibilities to organize, train, and equip remain intact. The Services maintain flexibility to address issues, such as program cost, schedule, and performance tradeoffs. The COCOMs retain their focus on accomplishing their assigned missions without getting involved in detailed acquisition programmatic issues.

There are three potential concerns regarding this budget scoring proposal:

  • level of effort required for implementation
  • ability for combatant commands to game the system
  • ability of Services to game the system.

First, this proposal requires relatively little overhead for either the COCOMs or the Services. Unlike their roles in developing IPLs or participating in the CPM process, the COCOMs would not have to identify or evaluate programmatic challenges. Through JCIDS, IPLs, and CPMs, the COCOMs are already evaluating Service programs; this proposal would give them a means, with little additional effort, to provide quantitative assessments. The ease of application and quantitative nature of these inputs would likely give them more impact on the PPBE and acquisition processes than the IPLs now have.

An additional aspect of this first concern is that, from their inception, the COCOMs traditionally had a short-term focus on operations; their commanders and staffs were not readily able to address or prioritize future acquisitions. With their increased role in the JCIDS requirements process, however, the COCOMs have created the equivalent of a J8 office to address future acquisition issues. While the COCOMs of the late 1980s would have had difficulty implementing this approach, today's COCOMs have offices already involved in decisions regarding future acquisition programs.


Second, COCOMs could attempt to game the program scores; for example, a COCOM commander could inject parochial bias by assigning high priority weights to programs proposed by his own Service. Two aspects of the proposal, transparency and impact, mitigate this risk. First, commanders would have to withstand scrutiny from OSD, the Joint Staff, and Congress; their priority ratings should match their requests for forces. After a few years of implementation, a significant departure from a predecessor's ratings would prompt a call for justification, and scores that lack credibility could be discounted, with a default of returning to the current process. Second, high or low scores do not directly affect Service programs. Favoring a particular group of programs with high scores does not necessarily increase those programs' funding. High scores from a COCOM would force the Service to justify reduced funding, so the Services lose some flexibility from high scores. Low or zero scores allow programming flexibility, but they would undermine any attempt to object when those programs are reduced. This double-edged sword of high ratings with operational support versus low ratings with budgeting flexibility would tend to enforce honest ratings.

Third, the Services could attempt to game this scoring system in a couple of ways. First, they could maintain acquisition scores by delaying programmatic funding to the out years of the FYDP; however, the resulting unrealistic RDT&E or procurement profiles would be apparent to anyone reviewing the budget. Second, the Services could inflate their budget values, but the COCOMs would likely reduce the ratings of programs with apparently increased costs because those programs would not provide proportionally more benefit for the associated resources. A limit on total COCOM scores would further inhibit high scores for programs with inflated funding. Given the transparency and nonbinding nature of this proposed process, there is little risk in testing the scheme.

The Goldwater-Nichols legislation established the roles of the Services as materiel providers and the COCOMs as materiel customers. Subsequent legislation gave the COCOMs direct input to the development of operational requirements. This proposal extends those legislative actions by providing a simple approach for the combatant commands to assign priority weights and quantitative scores to Service budgets. The net result would be the creation of a "market" in which programs (that is, operational units) compete to initiate and maintain funding. The Services, COCOMs, Joint Staff, OSD, and Congress could use the COCOM program scores to validate RDT&E, PROC, and O&M funding and to evaluate potential changes in the development of the next budget. The COCOMs would be fiscally limited to recommending priorities within the existing Service budgets, ensuring realistic requests. The Services, including their internal fiefdoms, would be incentivized to deliver capabilities that the COCOMs value highly, as expressed through the scores, in order to maintain their funding. The approach is compatible with all existing budget processes and may be implemented, at least partially, by any COCOM or Service, the Joint Staff, or OSD. JFQ

 

Notes

  1. Government Accountability Office, "Defense Management: Perspectives on the Involvement of the Combatant Commands in the Development of Joint Requirements," available at <www.gao.gov/products/GAO-11-527R?source=ra>.
  2. President's Blue Ribbon Commission on Defense Management, Final Report, June 30, 1986, available at <www.ndu.edu/library/pbrc/36Ex2AppC1.pdf>.
  3. Public Law 99–433, section 3, October 1, 1986, available at <http://osdhistory.defense.gov/docs/Goldwater-NicholsDODReorganizationActof1986.pdf>.
  4. "DOD Decision Support Systems," available at <https://dap.dau.mil/aphome/Pages/Default.aspx>.
  5. Maxwell Taylor, The Uncertain Trumpet (New York: Harper & Brothers, 1960), 123.
