
of having dual UHF radio installations on weapon systems A and C is that pilots tend to be less critical of the radio equipment when a discrepancy occurs: because the crew can switch to the other radio and complete the communication, they are less likely to report the discrepancy on the postflight debriefing form than they would be with a single installation on which they were solely dependent.

The maintenance-handling factors likewise reflect significant differences in some cases and reasonably similar performance in others. These show both the range and magnitude of the factors of interest and reflect total fleet values for a one-year period, thereby eliminating much of the variation normally seen when dealing with more limited data samples.

The first two factors, frequency in terms of mean time between maintenance actions (MTBMA) and maintenance man-hours expended per equipment flight hour (MI), are indicative of the maintenance resources expended. The remaining maintenance-handling factors listed are expressed as percentages of the indicated "How Malfunctioned" or "Action Taken" codes relative to the total number of maintenance actions associated with the equipment item. The values give a good indication of the range of variation and the absolute values that are typical for this equipment on the six different aircraft systems.
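For concreteness, the two resource factors can be expressed directly. The following is a minimal sketch in Python; all input counts and hours are invented for illustration, since the article reports only fleet-level results.

```python
# Minimal sketch of the two maintenance resource factors described above.
# All input values are hypothetical one-year fleet totals for a single
# equipment/aircraft pairing; the time base for MTBMA is assumed here to
# be equipment operating hours.

def mtbma(operating_hours: float, maintenance_actions: int) -> float:
    """Mean time between maintenance actions (MTBMA), hours per action."""
    return operating_hours / maintenance_actions

def maintenance_index(man_hours: float, flight_hours: float) -> float:
    """Maintenance man-hours expended per equipment flight hour (MI)."""
    return man_hours / flight_hours

print(mtbma(operating_hours=46_000, maintenance_actions=410))    # ~112 hours
print(maintenance_index(man_hours=11_500, flight_hours=37_500))  # ~0.31 mmh/fh
```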

The last group of data in Figure 8 gives a good indication of the magnitude and range of the three primary use factors of interest: mission duration, utilization rate, and operating to nonoperating time ratios for each of the six weapon system applications.

In summary, the data indicate that there is a considerable range of variation in field MTBF among the six different aircraft types, as well as variations in the associated maintenance handling and use factors.

The field performance indices for the TACAN navigation set installed on weapon systems A and D are shown in Figure 9. These are both supersonic jet aircraft; weapon system A is a medium bomber, while D is a high-performance training aircraft.

The data presentation format is the same as that described previously. Examining the MTBF values, it is seen that this equipment had an MTBF requirement of 1000 hours, a predicted MTBF of 2900 hours, and a demonstrated MTBF of 669 hours. The reassessed field MTBF shows a value of 129 hours on weapon system A and 106 hours on weapon system D; the two values do not differ significantly from each other.

Figure 9. Avionics Performance Indices - TACAN Navigation Set
Maintenance and Handling Factors


The remaining data presented in Figure 9 show that, despite the slightly lower field MTBF indicated for the TACAN on system D, its maintenance index (maintenance man-hours expended per equipment flight hour) is considerably lower than that reported for system A. The remaining maintenance-handling factors are comparable for both installations.

A review of the use factors indicates that there is almost a 3 to 1 difference in mission durations between the two applications, and almost a 2 to 1 difference in the utilization rates. However, the combination of a long mission duration with a low utilization rate, or vice versa, tends to yield no significant difference in the composite operational influences for either application.

The relative contribution of operating and nonoperating equipment failures was analytically assessed to calculate the expected number of failures based on given values of operating to nonoperating failure-rate ratios and given levels of equipment utilization rates. Since data on actual failure-rate ratios were not available, ratios of 10, 20, 40, 80, and 160 to 1 were assumed for this evaluation. The range of equipment-utilization rates used for the analysis was from 15 to 100 operating hours per month per equipment. These ranges encompass the range of values characteristic of the equipments included in the study. This resulted in the development of the nomograph in Figure 10, from which one can estimate the expected distribution of operating and nonoperating failures for a given item of equipment.
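The exposure-weighting arithmetic behind such a nomograph can be sketched directly. The sketch below assumes a 730-hour calendar month, a value not stated in the article; given a ratio k of operating to nonoperating failure rates and a monthly utilization of u operating hours, the expected nonoperating share of failures follows from the relative time spent in each state.

```python
# Sketch of the analytic model behind the Figure 10 nomograph.
# Assumption: a calendar month of 730 hours (not stated in the article).

HOURS_PER_MONTH = 730.0

def nonoperating_share(k: float, u: float) -> float:
    """Expected fraction of failures occurring during nonoperating time.

    k: operating-to-nonoperating failure-rate ratio (10, 20, 40, 80, 160 assumed)
    u: utilization, operating hours per month (15 to 100 in the study)
    """
    nonop_hours = HOURS_PER_MONTH - u
    return nonop_hours / (nonop_hours + k * u)

# Tabulate the assumed ratios against a few utilization points
# from the 15-to-100 hour range used in the analysis.
for k in (10, 20, 40, 80, 160):
    row = [f"{nonoperating_share(k, u):.0%}" for u in (15, 30, 50, 100)]
    print(f"k={k:>3}:1  ", row)
```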

Maintenance action data for a one-year period on one equipment type were then analyzed and classified to derive a count of the number of failures that occurred during operating and nonoperating times. To classify the failures as either operating or nonoperating failures, the following rationale was applied to the "When Discovered" codes pertaining to each failure occurrence: If the code used was indicative of either preflight inspection, special inspection, quality control check, depot-level maintenance, or withdrawal from stock, the failure was considered to have occurred during the nonoperating time. All other failures were considered as having occurred during the equipment operating time.
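A minimal sketch of this classification rule follows. The actual "When Discovered" code values are not given in the article, so the strings below are illustrative placeholders only.

```python
# Reclassification rule described above: certain "When Discovered" codes
# imply the failure surfaced during nonoperating time; everything else is
# counted as an operating-time failure. Code strings are hypothetical.

NONOPERATING_CODES = {
    "preflight_inspection",
    "special_inspection",
    "quality_control_check",
    "depot_level_maintenance",
    "withdrawal_from_stock",
}

def classify_failure(when_discovered: str) -> str:
    """Classify one failure record as 'nonoperating' or 'operating'."""
    return "nonoperating" if when_discovered in NONOPERATING_CODES else "operating"

records = ["preflight_inspection", "in_flight", "withdrawal_from_stock"]
print([classify_failure(r) for r in records])
# ['nonoperating', 'operating', 'nonoperating']
```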

[Figure 10. Nomograph for estimating the expected distribution of operating and nonoperating failures]

The results of the failure data reclassification described indicate that 40 percent of the failures were associated with the nonoperating period and the remaining 60 percent occurred during the operating period. The equipment item investigated had a flying rate of 38.9 flight hours per month and an operating-hour to flight-hour ratio of 1.23; from these values, the equipment's utilization rate is calculated as 38.9 × 1.23 ≈ 47.8 operating hours per month. Entering these values on the nomograph, it can be inferred that the ratio of operating to nonoperating field failure rates for this equipment is approximately 20 to 1.
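Reading the nomograph in reverse can also be done algebraically. The sketch below solves the same exposure-weighting relation for the failure-rate ratio, again assuming a 730-hour month; with the observed 40 percent nonoperating share it reproduces roughly the 20 to 1 figure cited above.

```python
# Invert the nomograph relation: given the observed nonoperating share of
# failures and the monthly utilization, solve for the operating-to-
# nonoperating failure-rate ratio k. Assumption: 730-hour calendar month.

HOURS_PER_MONTH = 730.0

def implied_ratio(nonop_share: float, u: float) -> float:
    """Solve share = (H - u) / ((H - u) + k*u) for k."""
    nonop_hours = HOURS_PER_MONTH - u
    return nonop_hours * (1.0 - nonop_share) / (nonop_share * u)

u = 38.9 * 1.23                 # ~47.8 operating hours per month
print(implied_ratio(0.40, u))   # ~21, i.e. roughly the 20:1 cited
```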

In the absence of more detailed data, this approach can provide a useful estimate of the current operational experience on a typical equipment item. The point made by this analysis is that a significant percentage of failures can and does occur during nonoperating time, yet the denominator of the equation

MTBF = (equipment operating hours) / (total failures charged)

counts those nonoperating failures while the numerator credits only operating time.

In summary, this study of avionics equipment now operational on 10 different Air Force aircraft types has identified a number of significant factors which contribute to the differences between field-observed reliability and demonstrated, predicted, or required MTBF values.

While many contributing factors were revealed, the primary causes were found to be the data base used for assessment, and, to a lesser extent, the operational influences of maintenance handling and use. However, generalizations do not always fit particular cases. For this reason, the prediction models developed for estimating field operational MTBF were designed to give upper and lower bounds for the estimated value.

The definitional differences observed are attributed to the differences in the failure criteria and time base used by the logistic support community (AFLC), which collects and analyzes the data, and the engineering community (Air Force Systems Command and industry), which establishes requirements, performs predictions, and conducts reliability demonstration tests. The definitional differences are composed of two parts: one related to the time base used for MTBF assessment, the other to the failure criteria used for assessment purposes.

The following conclusions and recommendations are offered:

• The AFLC-reported MTBFs based on aircraft flight hours were increased by factors ranging from 1.2 to 2.7 when equipment operating hours were used to compute the field MTBF. The average increase for all 16 equipments was twice the AFLC-reported field MTBF. In terms of operational MTBF achievement relative to the required MTBF, the ratio increased from 0.21 to 0.42. When the failure classifications were reassessed and adjusted, the composite effect of the failure definitional adjustments increased the field-to-required MTBF ratio from 0.42 to 0.51.

• The primary operational influences are those related to maintenance handling and equipment use. These operational influences collectively account for about half of the differences between the observed (reassessed) field MTBF and the demonstrated MTBF that remain after the definitional factors are removed, and for about 45 percent of the remaining differences when related to the predicted MTBF.

• The review of the failure relevancy criteria revealed that two related, but differing, reliability characteristics are responsible for the differences in failure classification criteria. These are the inherent reliability (engineering oriented) and the operational reliability (logistics support and operations oriented). Until these differences are clearly recognized and understood, confusion as to the meaning of MTBF will continue to exist.

• A limited analysis of maintenance action data indicates that approximately 60 percent of maintenance actions are expended on the prime avionics equipment items and the remaining 40 percent on system interfaces and associated hardware. This suggests that the latter two areas may be prime targets for cost reduction and reliability improvement.

• The differences in assessed field operational MTBFs suggest that it may not be valid to use a single environmental factor for aircraft, without regard to the type of aircraft, when making reliability predictions. Possibly the environmental factors given in Military Handbook 217B should be adjusted by an appropriate modifier to reflect differences in aircraft type.

• Nonoperating failures can make a significant contribution to the assessed field MTBF for avionics equipment. It is estimated that between 20 and 60 percent of the failures recorded during operational deployment of avionics equipments are actually nonoperating failures. This suggests the need to establish several separate measures of MTBF, each directed at a specific objective: one to determine inherent reliability (based on operating hours), one to determine field operational reliability (based on flight hours), and another to determine logistic support reliability (based on calendar time; that is, months or years). Such an approach would contribute significantly to a better understanding of the meaning of MTBFs and result in more accurate data (a minimal sketch of the three measures follows). DMJ
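The sketch below computes the three proposed measures from the same failure count but different time bases. All counts and hours are hypothetical; only the structure of the calculation is taken from the recommendation.

```python
# Three MTBF measures from one failure count, differing only in time base.
# Input values are hypothetical; 730 hours per month is an assumption.

def mtbf(hours: float, failures: int) -> float:
    return hours / failures

failures        = 50           # relevant failures in the period
operating_hours = 6_000.0      # equipment power-on time
flight_hours    = 4_800.0      # aircraft flight time
calendar_hours  = 12 * 730.0   # one year of calendar time

print("inherent (operating hours):", mtbf(operating_hours, failures))  # 120.0
print("operational (flight hours):", mtbf(flight_hours, failures))     # 96.0
print("logistic (calendar hours): ", mtbf(calendar_hours, failures))   # 175.2
```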

GEORGE A. KERN is a Program Manager at Hughes Aircraft Company, Culver City, CA, with responsibility for advanced study programs for the Design Effectiveness Department of the Aerospace Groups.

Mr. Kern has some 28 years of industry experience, with specific expertise in the design, development, and test and evaluation of electronics equipments for military weapon systems.

Mr. Kern has a B.S. degree from Newark College of Engineering, Newark, NJ, and has done graduate work at Stevens Institute of Technology and at UCLA.

Taking a Look at the Request for Proposal

This very basic instrument in the procurement process needs careful consideration by DoD and industry alike.

by Cecil V. Hynes
Associate Professor of Marketing
University of Maryland, College Park, MD

Opinions expressed herein are those of the author and not necessarily those of the Department of Defense.


When the government purchases large, complex weapon systems, the request for proposal is one of the key instruments of communication between the procurement officer and the seller of the product or services. The potential seller must understand clearly the contents of the request for proposal in order to respond properly to the procurement requirements. The request for proposal, however, has been criticized variously for being too long, for requiring too much nonessential data from the offeror, and in general for being a poor communications instrument.

In 1975 a study was conducted under the auspices of the Business Research Management Center, Wright-Patterson Air Force Base, to determine how contract administrators viewed the request for proposal as the first communications instrument in the acquisition process. The study was designed to solicit the opinions of major defense contractors. It was preceded by a review of the literature regarding the request for proposal to determine what actions had previously been taken.

In the literature search, two prior studies were located. Both studies were conducted in 1969: one was completed by an Air Force team of procurement specialists; the other, by members of the Aerospace Industries Association of America.

The studies followed similar formats. Each included a critical evaluation of several requests for proposals and a review of the regulations and directives that applied to the procurement process.

The Air Force study reviewed in detail three requests for proposals which were typical of recently completed formal source selection activities. The RFPs selected were: the second-source procurement for the Minuteman III Guidance and Control Set contracted by the Space and Missile Systems Organization, the Aeronautical Systems Division AGM-65A Maverick Missile procurement, and the Electronic Systems Division TPN-19 Landing Control Central procurement.1

The Aerospace Industries Association study used proposals for the short-range attack missile, the C-5A cargo airplane, and the airborne warning and control system as case studies to demonstrate industry's experience in working with requests for proposals.2

Both studies made similar, fundamental recommendations. For example, each recommended page limitations for the requests for proposals and for the contractors' proposals. Both studies also recommended a careful screening of the requirements placed on contractors for the provision of data, certificates, and management reports.3

The only published articles located during the literature search which dealt directly with the request for proposal appeared in the January 1973 issue of the Defense Management Journal. These articles generally characterized requests for proposals as excessively long documents, written in such a manner as to encourage contractors and responding companies to place a large volume of nonessential information in their proposals. Many of the points raised in the Defense Management Journal articles were the same as those addressed in the Air Force and Aerospace Industries Association studies.

In May 1973, the Aeronautical Systems Division of the Air Force Systems Command published ASD Pamphlet 800-6 to provide guidance for improving and simplifying the request for proposal. This publication suggested reorganizing the request for proposal in a format that

1. Assistant Secretary of the Air Force (Procurement), "Air Force Request for Proposal, Study Team Final Report," Washington, DC, 1969.

2. Aerospace Industries Association of America, "An AIA Study of United States Air Force Requests for Proposals, Critiques, and Recommendations," Washington, DC, December 1969.

3. The Air Force study suggested assembling experts to assist the system program office in identifying unnecessary requirements in the RFP. The AIA study suggested a formal, top-level RFP review, preferably at the Assistant Secretary level, to take place prior to RFP issuance to ensure elimination of requirements which trigger excessive effort and documentation.

4. Brigadier General A. L. Esposito, USAF, "An Analysis of Frustrations: The RFP-Proposal Cycle," Defense Management Journal, January 1973, p. 8; Lieutenant General James T. Stewart, USAF, "Source Selection Process Faces Winds of Change," Defense Management Journal, January 1973, p. 13; and Steele Morris, "'Communications Effectiveness' Needed in RFP-Proposal-Contract Award Cycle," Defense Management Journal, January 1973, p. 17.

5. Headquarters Air Force Systems Command, Aeronautical Systems Division, ASDP 800-6, "Request for Proposal Preparation Guide," Andrews Air Force Base, MD, May 29, 1973.
