Medium-class Explorers (MIDEX)

Lessons-Learned Workshop

Hampton, Virginia

June 26-27, 1996


Final Report


August 1996


Table of Contents


Introduction............................................................................................. 1
Agenda..................................................................................................... 2
Chairman's Conclusions.......................................................................... 3
Recommendations.................................................................................... 6
I. Two-Step Process.................................................................... 6
II. Dual Mode Option.................................................................. 9
III. Selection Criteria.................................................................. 11
IV. Cost Assessment.................................................................. 12
Appendices.............................................................................................. 14
A. Attendees................................................................................. A1
B. Letter of Invitation.................................................................. B1
C. Minutes of the Meeting........................................................... C1
Science Community Perspectives...................................... C2
Industry Perspectives.......................................................... C14
Government/FFRDC Perspectives..................................... C19
Written Comments............................................................. C22
Chair’s Charge to Splinter Groups...................................... C24
Plenary Session Endorsement of Recommendations......... C26


Introduction

 

An Announcement of Opportunity (AO) for the first Medium-class Explorer (MIDEX) missions was issued in March 1995 (AO-95-OSS-02). This AO included, for the first time in the Explorer program, several features intended to facilitate the preparation and review of flight investigation proposals: a two-step procurement process, an option for a Principal Investigator (PI) mode, and strict adherence to cost caps. In addition, the AO addressed new programmatic initiatives for technology, education and outreach, and small and disadvantaged businesses.

After the selections under the AO, Dr. Wesley Huntress, Associate Administrator for Space Science, requested that a Workshop be conducted to solicit feedback on the process from the community it served. Participants in the MIDEX AO process, including proposal team members, reviewers, and NASA program management personnel, were invited to participate. Other interested parties were also invited and attended. The Workshop addressed various aspects of the solicitation and selection cycle and resulted in a set of recommendations for NASA to consider for future Explorer AO cycles.

Four areas were identified by Workshop participants as requiring focused attention: the two-step selection process, the dual-mode spacecraft options offered in the AO, the AO selection criteria, and cost assessment. The recommendations in these four areas follow the Agenda in this report. These recommendations were first formulated by separate splinter groups, then presented to a plenary session and endorsed by all attending the workshop.

The Chairman's comments precede the recommendations. Kenneth Lang, Professor of Physics and Astronomy at Tufts University, served as the very able Chair of the Workshop. Professor Lang, an experienced space scientist, had no prior involvement in this particular AO process.

Included in the Appendices are the list of attendees, a copy of the letter of invitation to the Workshop, and the Workshop minutes.

The Workshop was conducted on June 26 and 27, 1996, at the Holiday Inn Conference Center in Hampton, Virginia. It was sponsored by the NASA Langley Research Center with support from Jorge Scientific Corporation.

 

 

MIDEX Lessons Learned Workshop
June 26-27, 1996
Holiday Inn, Hampton, Virginia
~ Agenda ~

Wednesday, June 26, 1996

8:30-8:45 Introduction and Purpose D. Bohlin, NASA

8:45-9:15 Agenda, Scope of Workshop K. Lang, Tufts

9:15-9:30 MIDEX AO Genesis J. Lintott, NASA

9:30 Science Community Perspectives
     M. Janssen, JPL
     P. Evenson, Del
     N. Johnson, NRL
     R. Paulos, Wisc.
     B. Mauk, APL
     J. Raitt, Utah State
     D. Rust, APL
     B. Dennis, GSFC

Industry Perspectives
     S. Savarino, TRW
     L. Di Biasi, OSC
     W. Gail, Ball

Govt./FFRDC Perspectives
     Mike Janssen, JPL
     Larry Crawford, APL

1:00 Summary of Written Comments J. Lintott

1:30 Splinter Group Objectives and Assignments K. Lang

2:00 Splinter Group Sessions
     1. Two-Step Process
     2. Dual-Mode Spacecraft
     3. Selection Criteria
     4. Cost Accounting

 

Thursday, June 27, 1996

8:30 Plenary Session: Splinter Group Reports and Discussion

12:00 Conclusions K. Lang, D. Bohlin

 

 

Chairman's Conclusions

Representatives from academic, industrial, and government institutions met to discuss the strengths and weaknesses of the recently completed Medium-class Explorer (MIDEX) Announcement of Opportunity (AO-95-OSS-02). Participants in this workshop were experienced and well informed, but many felt that they had not been well served by the MIDEX AO process. In the course of the meeting it became clear that there had been many misunderstandings and a good deal of frustration. NASA representatives accepted the need for community comment and input; hence the workshop took place.

The workshop participants were divided on the best approach, and some of the final recommendations could be contradictory. Nonetheless, everyone agreed that the exercise was valuable, and all of the final recommendations are being seriously considered by NASA.

Participants reaffirmed their support and enthusiasm for the MIDEX program, and focused on methods for improving the evaluation process.

The meeting was carried out in three stages:

Stage 1. Presentations by twelve participants who were team members of proposals that did not win the recent MIDEX competition. Their comments, which are reproduced in this report (pages C-2 to C-21), provide an objective assessment of the positive and negative elements of the evaluation process from the perspectives of the science, industry, and government communities. This was followed by a summary of written comments sent to NASA before the workshop, which is also reproduced in this report (page C-22).

Stage 2. Four focal points were identified for further discussion, and chairpersons assigned. The assignments and objectives for these splinter groups are provided in the meeting minutes (on pages C-24 and C-25 of this report). The four groups met individually for several hours, and then met collectively in a first plenary session. The four chairpersons subsequently met late into the first night to identify the primary issues, remove overlap, and agree upon recommendations.

Stage 3. The suggestions resulting from Stage 2 were further distilled and endorsed by all attending in another plenary session during the morning of the second day. Each splinter group chairperson presented the relevant ideas, which were corrected, discussed, and eventually confirmed in an open group process. The resulting recommendations are also reproduced in this report (on pages 6-13).

 

Concluding Synopsis of Suggestions

An abridged overview of the four splinter group suggestions for further improving the MIDEX selection process follows; this reflects the Chairman's interpretation of the events and is not meant to portray all of the group perspectives.

A. Two-Step Selection Process.

The initial evaluation should depend only on science and the technology required to do that science. NASA's interpretation of community scientific priorities should be introduced during this initial evaluation, and the number of proposals continuing to further evaluation of cost and management should be reduced to no more than four per final selection. This will minimize the discontent and total cost to the proposing community. Changes in science goals and instrumentation should not be permitted after the initial evaluation.

B. Dual-Mode Selection Process.

Only one mode of proposing should be allowed, in which all proposals are treated completely equally on a level playing field. To avoid inequities, NASA should provide proposal support for cost, management, and technical issues when proposers need it; such support should not depend on inclusion of a NASA Center in the proposal. Discontent can be avoided by clearly stating, in the AO, the expected extent of Goddard Space Flight Center (GSFC) involvement in all aspects of the MIDEX program. NASA should be viewed as the contractor of last resort, providing key links to make otherwise feasible missions possible.

C. Selection Criteria.

The initial evaluation should include an assessment of existing and new technology by qualified experts. Some form of interactive feedback should be established between panel reviewers and proposers to avoid misinterpretations and misunderstandings. A common complaint was that technical risk and science objectives may have been misjudged by scientific reviewers who were not experts in the relevant area.

D. Cost Assessment.

There is overwhelming support for full cost accounting of all NASA contributions to, and participation in, flight programs. This should be accomplished without grandfathering or additional costs to the Explorer program. NASA should consider foreign, non-NASA domestic, and non-Explorer NASA contributions to the MIDEX program, with a cap to the level of foreign contribution.

To summarize, some of the most qualified members of the science, industry, and government communities have carried out a democratic and honest evaluation of the MIDEX selection process. They have identified seminal issues, and provided recommendations that will undoubtedly lead to improvements in the MIDEX program in general and the MIDEX AO in particular.

We are grateful to NASA for its open responsiveness to the external communities that it serves, and for providing this opportunity to help retain the utmost integrity in its selection processes.

Signed,

 

 

____________________________
Professor Kenneth R. Lang
Tufts University
Chairman, MIDEX Lessons-Learned Workshop

 
 

Recommendations of the MIDEX Lessons Learned Workshop


I. Two-Step Process

Chair: Michael Janssen, Jet Propulsion Laboratory

The overall goal of the selection process must be to obtain the best science for missions chosen within MIDEX program constraints. To effectively attain this goal, however, two conditions must be met:

The consensus of the workshop was that the two-step process that was used fell short of these conditions. It was felt that much unnecessary effort was spent in the development of proposals. Dissatisfaction was expressed by both industry representatives and proposers on several other aspects of the process as well.

Specific problems that were identified are:

  1. Too many proposals were selected from Step One. Industry representatives and proposers felt that they spent too much time on proposals that had too little chance to win to justify the Step Two effort.

  2. The importance of, and requirement for, cost evaluation in Step One, as expressed in the AO and as interpreted by proposers, were treated differently in the Step One reviews.
    Much wasted effort was ascribed to proposers' emphasis on detailed design and costing to arrive at a solid cost-not-to-exceed figure that proved largely irrelevant to the Step One selection.

  3. Reviewers seemed to misunderstand key elements of some proposals.
    The outcome of several debriefings was a perception on the part of proposers that the reviewers did not properly understand, or had failed to read, essential parts of their proposals.

  4. Specific complaints were directed toward the evaluation of technical risk and cost realism by the Step One review panels.
    The scientific excellence of the review panels was never questioned. However, the perception was that their evaluations of cost and risk mattered in the selections, and that these were beyond their resources to judge fairly.


Recommendations:

Overall, a two-step process is favored, with suggested improvements:

II. Dual Mode Option

Chair: Paul Evenson, University of Delaware

Issues:

Recommendation:

A binary choice of investigation mode should be replaced by a single mode. A management approach should be proposed by the investigator as part of the mission design.

Comments:

A detailed list of services available from NASA should be included in the AO. This should include management services in addition to more traditional items such as tracking, integration, test, spacecraft systems, or even complete spacecraft.

Detailed consultation with NASA on the use and costing of these items should be part of the preparation of the Step Two proposal. GSFC should develop a point of contact for these discussions and a standard procedure for costing these products and services.

The name of the single but flexible investigation mode should be chosen with care. The term "PI mode" carries with it a certain connotation within NASA, such that continued use of the term might cause confusion.

It is the express intent of this recommendation that a proposer could specify any component of a mission (such as a particular commercially provided spacecraft) but could at the same time request NASA management of the mission.

It would be appropriate for the AO to contain examples of management approaches, but the proposer should not be required to adopt one of the examples.

Issue Relating to the Scope of the Explorer Program:

As expertise in building and operating spacecraft has improved, the provision of space platforms is being transferred to commercial ventures. This has already happened at geosynchronous orbit, which now accommodates many sophisticated communication satellites. Commercialization of space is about to occur at LEO altitudes as funding is established to implement large constellations of communication satellites for telephony and data transfer between any two or more points on the earth's surface. Commercial platforms provide an opportunity to perform certain types of experiments at low cost and with unique features.

By purchasing space and resources from commercial operators, NASA could provide opportunities for certain experiments that could not be done economically without the leverage of the commercial money invested in the program. At the same time, the funding provided by NASA would represent a government investment in systems that help maintain the leadership role the USA holds in space activities.

Recommendation:

NASA should find a means of selecting and funding instrument investigations to take advantage of launch targets of opportunity. This should include the possibility of purchasing space on commercial satellites.

 

 

III. Selection Criteria

Chair: Shirley Savarino, TRW

 

Science Recommendations:

New Technology:

Recommendations:

General/Evaluation Criteria:

General Recommendations:

 

IV. Cost Assessment

Chair: Larry Crawford, Johns Hopkins University Applied Physics Laboratory

Issue: All NASA center contributions to flight programs should require full cost accounting for all personnel, services, and hardware.

Comment: It has been stated by NASA that full cost accounting for all NASA contributions to flight programs will be implemented soon. The question is: How much grandfathering will be permitted? (i.e. Will missions designed under the old rules be able to avoid full cost accounting?)

Recommendation: Full cost accounting for all NASA contributions to and participation in flight programs should be implemented. Further, this should not be accomplished at the expense of science (i.e. the amount of dollars available to the Explorer Program should not be correspondingly reduced). No grandfathering should be permitted.

Issue: Why doesn't NASA permit foreign contributions outside of the program cap?

Recommendation: NASA should permit foreign contributions to be included in proposed efforts, but should require proposers to provide contingency for foreign risk. NASA should set a cap for foreign contributions. Further, the dollar value of the foreign contribution should not be subtracted from the program cap.

Issue: The AO cost evaluation approach does not allow different costing methodologies to be proposed (e.g., cost reserves are required to be listed explicitly in proposals).

Recommendation: Allow proposers the freedom to justify their handling of contingency (cost reserves); do not force them to carry explicit contingency, but require them to explain their approach.

Issue: NASA's current approach to cost assessment during proposal evaluations relies primarily on cost model outputs. Why doesn't NASA adopt the new government guidelines based on the contractor's past cost performance?

Recommendation: NASA should review the Federal Procurement Policy guidelines on contractor past cost performance (Federal Report 0014-9063/95; Vol. 63, No. 18; page 573) and issue a NASA policy and position on their use in the source selection process.

Comment: A possible approach that combines cost estimating model outputs and a contractor's past cost performance in the cost assessment evaluation process is shown in the figure below.

[Figure: proposed cost assessment process combining cost estimating model outputs with contractor past cost performance; not reproduced in this version.]

Appendices


APPENDIX A

Attendees

 

Jay T. Bergstralh NASA Headquarters Code SR Washington, DC 20546 phone: 202-358-0313 fax: 202-358-3097 e-mail:

Gregory D. Berthiaume Massachusetts Institute of Technology Lincoln Laboratory 244 Wood Street Lexington, MA 02173-9108 phone: 617-981-4903 fax: 617-981-0969 e-mail:

J. David Bohlin NASA Headquarters Code SR Washington, DC 20546-0001 phone: 202-358-0880 fax: 202-358-3987 e-mail:

Michael L. Cherry Louisiana State University Department of Physics and Astronomy Baton Rouge, LA 70803-4001 phone: 504-388-8591 fax: 504-388-1222 e-mail: phcher@lsuvax.sncc.lsu.edu

Larry Crawford Applied Physics Laboratory Space Department 11100 Johns Hopkins Road Laurel, MD 20723 phone: 301-953-5193 fax: e-mail:

Brian R. Dennis NASA/Goddard Space Flight Center Code 682 Greenbelt Road Greenbelt, MD 20771 phone: 301-286-7983 fax: 301-286-1617 e-mail: brian.dennis@gsfc.nasa.gov

Lamont Di Biasi Orbital Sciences Corporation Business Development 20301 Century Boulevard Germantown, MD 20874 phone: 301-428-6610 fax: 301-428-6641 e-mail:

Mary S. Di Joseph NASA/Goddard Space Flight Center Code 490 Greenbelt, MD 20771 phone: 301-286-0118 fax: e-mail:

Paul A. Evenson University of Delaware Bartol Research Institute 222 S Chapel Street Newark, DE 19716-4793 phone: 302-831-2988 fax: 302-831-1843 e-mail: penguin@bartol.udel.edu

Cynthia Faulconer Lockheed Martin Astronautics Mail Stop S8110 PO Box 179 Denver, CO 80201 phone: 303-977-9277 fax: e-mail:

Orlando Figueroa NASA/Goddard Space Flight Center Code 701 Greenbelt, MD 20771 phone: 301-286-4489 fax: e-mail:

Dennis Finley TRW Space Technology Division 515B 14320 Sullyfield Circle Chantilly, VA 22021 phone: 703-802-1918 fax: e-mail:

Peter G. Friedman California Institute of Technology MS 424-47 Pasadena, CA 91125 phone: 818-395-6786 fax: e-mail: peterf@valkyrie.columbia.edu

William B. Gail Ball Aerospace Systems Engineering Communication Systems Division PO Box 1062 Broomfield, CO 80038 phone: 303-939-4418 fax: 303-939-4430 e-mail: bgail@ball.com

Tim Gehringer NASA/Goddard Space Flight Center Code 790.4 Greenbelt, MD 20771 phone: 301-286-6831 fax: e-mail:

David Gilman NASA Headquarters Astrophysics Division Washington, DC 20546-0001 phone: 202-358-0349 fax: e-mail:

Lisa Guerra NASA Headquarters Code B Washington, DC 20546 phone: 202-358-2513 fax: e-mail:

John B. Hall, Jr. NASA/Langley Research Center MS 328 Hampton, VA 23681-0001 phone: 804-864-1742 fax: 804-864-1975 e-mail: john.b.hall.jr@larc.nasa.gov

F. Rick Harnden, Jr. NASA Headquarters Code SR Washington, DC 20546-0001 phone: 202-358-0351 fax: 202-358-3096 e-mail: frh@hq.nasa.gov

Andrew Hunter NASA Headquarters Code B Washington, DC 20546 phone: 202-358-2514 fax: e-mail:

Michael A. Janssen Jet Propulsion Laboratory MC 169-506/Space Sciences Division 4800 Oak Grove Pasadena, CA 91109-8001 phone: 818-354-7247 fax: 818-354-8895 e-mail: michael.a.janssen@jpl.nasa.gov

W. Neil Johnson Naval Research Laboratory Code 7651/Space Science Division 4555 Overlook Avenue, SW Washington, DC 20375-5352 phone: 202-767-6817 fax: 202-767-6473 e-mail: johnson@osse.nrl.navy.mil

Kenneth J. Johnston US Naval Observatory Code SD 3450 Massachusetts Avenue, NW Washington, DC 20392-5420 phone: 202-762-1513 fax: 202-762-1461 e-mail: kjj@astro.usno.navy.mil

Thomas C. Jones NASA/Langley Research Center MS 431 Hampton, VA 23681-0001 phone: 804-864-7037 fax: 804-864-7202 e-mail: t.c.jones@larc.nasa.gov

Kenneth R. Lang Department of Physics and Astronomy Robinson Hall Tufts University Medford, MA 02155 phone: 617-627-3390 fax: 617-627-3878 email:

John A. Lintott NASA/Langley Research Center Code S Support Office Mail Stop 160 Hampton, VA 23681-0001 phone: 804-864-9864 fax: 804-864-8894 e-mail: j.a.lintott@larc.nasa.gov

Barry H. Mauk Applied Physics Laboratory Space Department 11100 Johns Hopkins Road Laurel, MD 20723-6099 phone: 301-953-6023 fax: 301-953-6670 e-mail: barry_Mauk@jhuapl.edu

Arlene A. Moore NASA/Langley Research Center MS 365 Hampton, VA 23681-0001 phone: 804-864-4407 fax: 804-864-8671 e-mail: a.a.moore@larc.nasa.gov

Jayant Murphy Johns Hopkins University Department of Physics Baltimore, MD 21218-2695 phone: 410-516-7027 fax: e-mail:

Paul J. Ondrus NASA/Goddard Space Flight Center Code 510.1 Greenbelt Road Greenbelt, MD 20771 phone: 301-286-9858 fax: e-mail:

Pete Partridge Applied Physics Laboratory Space Department 11100 Johns Hopkins Road Laurel, MD 20723 phone: 301-953-5623 fax: e-mail:

Robert J. Paulos University of Wisconsin, Madison Space Science and Engineering 1225 West Dayton Street Madison, WI 53706 phone: 608-263-6729 fax: e-mail:

Cynthia A. Peslen NASA/Goddard Space Flight Center Code 406 Greenbelt, MD 20771 phone: 301-286-5917 fax: e-mail:

W. John Raitt Utah State University Center for Atmospheric and Space Sciences SER Building Room 250A Logan, UT 84322-4405 phone: 801-797-2849 fax: 801-797-2492 e-mail: raitt@cc.usu.edu

R. W. Richie NASA/Langley Research Center Code S Support Office Mail Stop 160 Hampton, VA 23681-0001 phone: 804-864-9863 fax: e-mail:

Chris Roberts CTA Incorporated 1521 W. Branch Drive McLean, VA 22102 phone: 703-883-1017 fax: e-mail:

David M. Rust Applied Physics Laboratory Johns Hopkins University Johns Hopkins Road Laurel, MD 20723-6099 phone: 301-953-5414 fax: 301-953-6670 e-mail: david.rust@jhuapl.edu

Shirley A. Savarino TRW Spacecraft Technology Division Civil & International Systems One Space Park R9/1076 Redondo Beach, CA 90278 phone: 310-814-6366 fax: 310-813-3457 e-mail: shirleysavanno@gmail4.sp.trw.com

James Thieman NASA/Goddard Space Flight Center Code 633.2 National Space Science Data Center Greenbelt, MD 20771-0001 phone: 301-286-9790 fax: 301-286-1771 e-mail:

Melville P. Ulmer Northwestern University Department of Physics and Astronomy 2145 Sheridan Road Evanston, IL 60208-2900 phone: 847-491-5633 fax: 847-491-3135 e-mail: m-ulmer2@nwu.edu

Peter W. Vedder Omitron Incorporated 6411 Ivy Lane, Suite 600 Greenbelt, MD 22209 phone: 301-474-1700 fax: 301-345-4594 e-mail: peter.vedder@omitron.gsfc.nasa.gov

Richard I. Weiss NASA/Goddard Space Flight Center Code 410 Greenbelt, MD 20771 phone: 301-286-7493 fax: e-mail:

Theodore B. Williams Rutgers University Physics and Astronomy Department Piscataway, NJ 08855 phone: 908-445-2516 fax: 908-445-4343 e-mail: williams@fenway.rutgers.edu

 

 

APPENDIX B

Letter of Invitation

SR

 

Dear Colleague:

You are cordially invited to attend a Medium-class Explorer (MIDEX) Lessons-Learned Workshop to be held June 26 and 27, 1996, at an appropriate hotel in Hampton, Virginia. Dr. Wesley Huntress, Associate Administrator for the Office of Space Science, has requested that this Workshop be conducted to focus on the recently completed Announcement of Opportunity (AO-95-OSS-02) solicitation and selection cycle. The Workshop is open to all MIDEX proposers, including their proposal teams, science and technical peer reviewers, and NASA program management. The Workshop will begin at 8:30 am on both days. Prof. Kenneth R. Lang of the Department of Physics, Tufts University, will serve as the Workshop Chair. The results of the Workshop, both positive and negative, will be captured, adjudicated and approved by the Chairman.

 

The objective is to identify both the strengths and weaknesses of the process up to this point, with the goal of improving the process for the future Explorer AO cycles, including those for Small Explorers (SMEX) and MIDEX. NASA plans to release a SMEX AO this fall and the next MIDEX AO in the spring.

 

The Workshop is scheduled for a full day on June 26 and the morning of June 27. Given this tight schedule, we want to focus the Workshop on the areas of highest interest. Therefore, we are soliciting your written comments, suggestions, and lessons learned about the recently completed MIDEX AO process in advance. Your comments, which will be held anonymously, will be used to establish a final agenda that will be distributed prior to the Workshop. Your written comments should be submitted by June 10 to:

 

Mr. John Lintott
Code S Support Office
Mail Stop 160
Langley Research Center
Hampton, VA 23681-0001 
	Phone:	(804) 864-9850
	FAX:	(804) 864-8894
	email:	j.a.lintott@larc.nasa.gov

It is anticipated that after introductory remarks, the first day will be devoted to open presentations followed by splinter working group sessions — into the evening if necessary. The splinter groups would report the results of their deliberations at a plenary session on the morning of the second day. Specific areas for consideration include, but are not limited to, the following:

You may, if you wish, submit private communications in addition to, or in lieu of, your personal attendance. These comments should be addressed to Mr. Lintott at the address above or to me at the following address:

Dr. J. David Bohlin
Science Program Executive
  for Review and Evaluation
Code SR
NASA Headquarters
Washington, DC  20546
	Fax:	(202) 358-3097
	email:	jbohlin@hq.nasa.gov

Funds to support this meeting are limited to attendees from academic institutions who either proposed as MIDEX PIs or served as science and technical reviewers. Government per diem at the local Hampton rate and reimbursement of travel expenses will be provided to these individuals only. A block of rooms will be reserved for this Workshop. Detailed information on logistics, travel, accommodations, and reimbursements will follow shortly from Ms. Bonita Hawkins of Jorge Scientific Corporation. She can be reached at (202) 554-2775 if you have any immediate questions.

Sincerely,

 

 

 

J. David Bohlin
Science Program Executive
  for Review and Evaluation
Office of Space Science

 
 

APPENDIX C

Minutes of the Meeting

Wednesday, 26 June, 1996

Dr. J. David Bohlin opened the meeting and described his role in the Medium-class Explorer Mission (MIDEX) process. He had not been the selecting official but was now in charge of ensuring the integrity of the process, as a keeper of the corporate knowledge. The purpose of the meeting was to identify the strengths and weaknesses of the process and to see how things might be improved in the future. We need to be forward looking. NASA is not ignorant of the imperfections in the MIDEX AO; it was clear that, as the process went along and requirements changed, there were impacts on the AO. The results of this meeting will be applied to the upcoming Small Explorers (SMEX) AO due for release next fall. However, there is no guarantee that all inputs can be taken into account; NASA is working within certain boundary conditions.

Professor Ken Lang then outlined the purpose of the workshop: to review the strengths and weaknesses of the solicitation and selection process for the MIDEX Announcement of Opportunity (AO-95-OSS-02). He asked the participants to review what went right, what went wrong, to ask themselves how "right" and "wrong" might depend on a particular perspective, such as certain scientific disciplines, academia in general, industry, and government. This also needed to be viewed in light of NASA's stated objectives of achieving low-cost research opportunities in space.

On the basis of written comments submitted prior to the workshop, Prof. Lang suggested splinter groups should consider the following aspects of the process: the two-step process; the dual-mode selection process; the selection criteria; and full cost accounting. He asked that these groups return to a plenary session the following morning with the objective of providing an integrated set of findings for NASA to consider to improve future SMEX and MIDEX AOs.

John Lintott then described the genesis of the MIDEX AO, which was released in March 1995. It had been based on the most recent SMEX AO, issued in 1992, and previous Explorer AOs, with the incorporation of some Discovery-like features. Only charges to the Explorer program were counted: not the launch vehicle, nor civil service manpower and facilities. The Principal Investigator (PI)-mode was added as an option during development of the AO, but because Discovery had not yet flown it was regarded as an experiment, and thus proposals had to show a compelling rationale for choosing that option. NASA anticipated selecting one Goddard Space Flight Center (GSFC)-built spacecraft and one procured from industry, as stated on pages 1 and 2 of the AO.

Lintott said that the issue of NASA civil service manpower and facility charges had been addressed at the April 1995 MIDEX AO preproposal conference. He said that NASA had in fact selected, on the basis of the evaluation and selection criteria in the AO, two missions, one with a GSFC-built spacecraft and one with the spacecraft procured from industry.

Mike Janssen, JPL, asked Lintott to elaborate on the anticipated pairing of one GSFC spacecraft and one from industry: was that consistent with PI-mode? Lintott said that the industry spacecraft could have been PI-mode.

Barry Mauk, APL, asked about the inconsistency of not allowing foreign contributions to count in reducing the cost cap, whereas free civil service input was allowed. Lintott replied that the intent was to avoid having disproportionately-sized proposals. At the extreme, for example, this could have led to simply having instruments proposed for launch on major foreign missions, and so the approach on foreign contributions had been to maintain the traditional scope of the Explorer program.

Chris Roberts, CTA Space Systems, questioned the result in which all four missions finally selected proposed a GSFC-provided spacecraft. The discussion illustrated that there was also confusion in the meaning of a "GSFC-provided" spacecraft. Did that mean "built" or "procured"?

Mel Ulmer, Northwestern University, questioned the tension between cutting-edge technology and risk assessment. He said that two and a half years was not enough time to bring a new technology to a risk-free state of development.

The Chairman noted that these issues would be addressed in the splinter groups and then introduced the first speaker in the next session.

Science Community Perspectives

(i) Mike Janssen, JPL

Janssen briefly described the Primordial Structures Investigation (PSI) mission, which he believed set a good example for the PI-mode of operation. The proposal exhibited strong science, the team had put in a major effort, and he believed that the proposal was a real contender.

He had two areas of concern with the AO. It was inherently biased because the full costs of GSFC and other civil service were not to be taken into account, amounting to an unfair competition in a cost-constrained environment. His recommendation would be to adopt the PI-mode and let everyone compete equally. As currently set up, PI-mode proposals had to explain a lot more than those using the NASA-provided spacecraft.

He had taken the Step One cost cap seriously (a lesson learned from Discovery) and had worked hard to achieve a cost below $70m. This meant that he had to do a comprehensive costing at Step One. Most people had the same experience. The two-step process did not solve the problems it had been meant to solve.

Janssen felt that a down-select to 13 was too many. There was no interaction between reviewers and proposers and he felt it important to have oral presentations at that stage. The science review of PSI got sidetracked at Step One and he would have liked the opportunity to correct what he described as misunderstanding on the part of the reviewers. Finally, he commented that the evaluation criteria were not consistent with the goals as stated in the AO.

The presentation given at the meeting by Mike Janssen follows.

 

 

 

Title: The Primordial Structures Investigation (PSI)

Spacecraft Option: PI Mode
Managing Institution: JPL
Scientific Objective: Image the CMB (Astrophysics)

Status:

Passed Step One Review. Category 1 in Step Two
Missed making selection

Principal Investigator: Michael A. Janssen, JPL
Deputy PI: Charles R. Lawrence, JPL

Co-Investigators:

Barry R. Allen, TRW
Mark Dragovan, U Chicago
Todd Gaier, UC Santa Barbara
Krzysztof M. Górski, Hughes/STX
Sam Gulkis, JPL
Andrew E. Lange, CalTech
Steven M. Levin, JPL
Gerald W. Lilienthal, JPL
Philip M. Lubin, UC Santa Barbara
Peter Meinhold, UC Santa Barbara
Anthony C. S. Readhead, CalTech
Kenneth H. Rourke, TRW
Michael D. Seiffert, UC Santa Barbara
Douglas Scott, U British Columbia
Lawrence A. Wade, JPL
Martin White, U Chicago

 

TWO MODES: PI or NASA-PROVIDED

PI-mode is a wonderful addition in MIDEX; however, the two-mode structure was seriously biased:

  • Full costs of Goddard spacecraft and other civil service labor not included.
    - BUT SAME $70 M CAP!!
  • Spacecraft and interface performance and cost risks evaluated for PI-mode but not NASA-provided. Could only hurt the former in evaluation.
  • Step Two proposal page limits were the same for both modes.
    - But PI-mode had more to describe on spacecraft, integration, and test.

 

TWO-STEP PROCESS

SUCCESSES:

  • Reduced number of implementation plans prepared from 43 to 13.

PROBLEMS:

  • No interaction between proposers and reviewers.
    - No way to correct misunderstandings.
    - Makes job of reviewers harder and riskier.
  • 13 in Step Two still too many.
    - Can’t have interactive review.
    - Long odds for industry to bet real money on.
  • Detailed Step One review was based on incomplete information, compartmentalized into wavelength panels, but treated pretty much as final on science.
    - Overall technical merit depends on implementation information not available in Step One.
  • In active fields the Step One science review was out of date by Step Two.
  • The AO said science/unit cost was key, so cost was competitive. Specification of not-to-exceed cost in Step One required essentially the full costing reported in Step Two.

 

EVALUATION CRITERIA NOT CONSISTENT WITH GOALS

GOALS STATED IN AO:

  • "...to capitalize on...the investment in the design and development of small spacecraft by industry and by GSFC."
  • "...inclusion of new technology to achieve performance enhancements and reduce costs...."
  • "...to produce missions with the highest possible science value per unit cost."

INSTEAD:

  • Industry participation was discouraged by the advantage given GSFC-mode missions.
  • Innovation and new technology penalized in top criteria of Step One (p. 13, B1 and B2) and Step Two (p. 13, C1 and C3) because they were assessed as risky. Rewarded only in bottom criteria of Step Two (p. 14, C6a).
  • Science/$ not really evaluated.

 

(ii) Paul Evenson, Bartol Institute

Paul Evenson noted that he was from a small university group and so had a rather different perspective from the previous speaker. The Positron Electron Magnet Spectrometer (POEMS) investigation had a long history: it was originally selected for EOS but was later dropped in a down-select. It was also selected for the last SMEX; it completed Phase B and was judged flyable. The team saw MIDEX as a golden opportunity, ideally suited for the experiment. The team interpreted the GSFC-provided spacecraft as one procured from industry by GSFC. They did not think GSFC could provide the kind of spacecraft they needed, so they had no option other than PI-mode.

Criticisms of the proposal focused on management issues. Being a small group (but with an industrial partner who did a good job), Evenson pled guilty as charged to the criticisms of the POEMS proposal. But he felt the criticisms were based on issues that could have been worked out in Phase B; for example, a Program Manager could have been hired in Phase B once they had a budget. The team had drawn on its successful SMEX experience, which had not requested the kind of Work Breakdown Structure (WBS) that the MIDEX AO did. They had depended on their SMEX heritage.

Evenson thought that Step Two was an excellent idea but that the down-select was not stringent enough. Bartol did not have deep enough pockets to prepare the proposal adequately without being guaranteed a flight. The AO demanded a high standard of management for a small scientific group. Such standards should be fixed; that is, if a WBS is acceptable for one Explorer AO it should be acceptable for another. The POEMS experiment had once been defined for flight, and NASA had then considered its WBS process acceptable.

He said that NASA apparently may not want groups like the Bartol group, and they will not be proposing to NASA again. Instead, they will propose to groups that have enough resources to help them propose to NASA.

Kenneth Johnston, Naval Observatory, suggested that resources be made available to PIs after the down-select in order to help them meet the stringent demands of the AO in the management and cost areas.

Mel Ulmer from Northwestern University asked whether there should not be more justification of cost at Step One, with oral briefings, a Phase A, and then a down-select. Evenson replied there was not enough time for a Phase A study. The two-step process should be modified to address the technical, management, and resource issues.

 

(iii) Neil Johnson, NRL

Johnson was the PI on the Burst Locations with an Arc Second Telescope (BLAST) proposal, and was presenting from a high-energy perspective. Many of these high-energy missions are rather large and opportunities for flight are restricted although there are some sub-orbital opportunities. How can one get from SR&T level to a flight instrument? One needed to take more risk.

The team needed more information on the NASA spacecraft and ground-support. It was very difficult to get information. Johnson asked why free foreign participation was not a plus. Was the purpose of the program to do science or to create jobs in the U.S.?

Michael Cherry, Louisiana State University, noted that the Step One process set up proposals for failure. Step One proposals did not contain enough information on the cost and management issues. The science review was looking for exciting science; they identified potential technical problems. There was no feedback from Step One. He thought it would have been helpful to have engineering and cost reviewers in the Step One review.

Kenneth Johnston, Naval Observatory, commented that there should be more input from science reviewers at Step Two to see whether the science was good enough to justify whatever risks there may be. Brian Dennis asked whether this was a case of simply not being high-energy’s turn this time.

Johnston commented that in the Solar Max review process, the two-step approach had worked well. The first selection had been followed by a study, then an oral presentation, followed by a second-step selection.

The presentation given at the meeting by Neil Johnson follows.

High Energy Astrophysics Perspective
W. Neil Johnson
Naval Research Lab

(MIDEX Step Two Proposal: BLAST)

 

High Energy Astrophysics (HEA) Context

  • Instruments in this energy domain tend to be heavy and large.
  • Current and planned missions are "Great Observatories" (NASA’s Compton Gamma Ray Observatory), Large Explorers (Rossi X-ray Timing Explorer), or large international missions (ESA’s INTEGRAL mission, 2001).
  • Few opportunities for flight outside of NASA’s SR&T-funded sub-orbital program.
  • Small Explorer program generally does not provide $$ or payload capability for competitive experiments (and developmental experiments don’t win peer review).
  • MIDEX program represents the only option for high energy astrophysics below the intermediate class mission (one per decade or so).

 

MIDEX Two-Step Competition is a good idea, but

  • Step One selects most exciting science (most ambitious experiment) and it fails in Step Two on cost realism or risk.
  • Step Two review seemed to want Phase A study results—a big effort.

 

RECOMMENDATIONS:

  • Keep the two-step process; get better cost/risk evaluation in Step One.
  • Reduce the number of proposals selected for Step Two.
  • Extend the Step Two study period.
  • Fund the Step Two study period, ~$100K/proposal.
  • Reconsider the MIDEX funding cap?

     

Dual Mode—PI or NASA-Provided Spacecraft—is bad

  • NASA S/C sounded good until you discovered that there was no help in understanding or optimizing capabilities.
  • Evaluation inequities developed due to levels of S/C detail available in PI mode relative to inadequate information on NASA S/C.
  • Evaluation inequities developed due to risk factored into PI S/C while NASA S/C was considered fixed-price.

 

RECOMMENDATIONS:

  • Pick one—PI or NASA S/C. Require the same information for all proposals.
  • Establish mechanism for better communications with NASA ground segment support (important for PI or NASA S/C).
  • If NASA S/C mode, assign NASA Engineer to support each Step Two proposal.

     

The Explorer Program Summary/Recommendations:

  • Approach is good; more money would be great.

  • Small Explorers are not much of an opportunity for many high-energy astrophysics experiments.
    - Consider restructuring the number of Small vs. Medium Explorer flights.

  • The funding cap is a serious constraint on many HEA experiments.
    - Reconsider the requirement for the sum of NASA, foreign, and other agency costs to stay within the NASA MIDEX cost cap.

     

 

(iv) Robert Paulos, University of Wisconsin

Paulos presented the perspective of the Hot Interstellar Medium Spectrometer (HIMS) PI, Dr. Wilton Sanders. HIMS had chosen the NASA-provided spacecraft mode and had also had a difficult time communicating with GSFC. Paulos suggested that the basic bus, with a cost breakdown, should be described in the AO. He felt it was also difficult to address all the management points within the given page limit. On the two-step process, he suggested that reviewers, or NASA, should simply assign a ranking at the end of Step One and then leave it to the proposer to decide whether to go ahead. He endorsed finding some way of providing feedback between reviewers and proposers to avoid misunderstandings; this might be oral, or written questions submitted by reviewers referencing material already in the proposal.

 

Mel Ulmer, Northwestern University, asked Paulos whether they had subtracted anything from the spacecraft bus: they had in fact done that. Johnston said that there were clearly points in the proposal that the reviewers missed and suggested that questions could have been sent out. Paulos said a down-select to four proposals would prevent overworking the reviewers and so avoid their missing information in proposals. Evenson had learned from the debriefing that there had been misunderstanding on the reviewers' part. Friedman, CalTech, also suggested that, had there been feedback, misunderstandings could have been cleared up. The presentation given at the meeting by Robert Paulos follows.

 

 

Hot Interstellar Medium Spectrometer (HIMS)
Robert J. Paulos
University of Wisconsin—Space Science and Engineering Center

 

The AO
  • NASA provided S/C needs better definition
    - Provide detailed performance of a single bus.
    - Provide detailed cost by element.
    - Show available options with performance and cost breakdown.
  • Step Two management process section needs improvement.

The Two-Step Process

  • The two-step process is good conceptually.
  • The current process does not save time or money for the proposer.
  • Step One cost detail should be replaced with a broad cost range.
  • Limit the Step One submission to fewer total pages.
  • No down-select; assign a grade or ranking.

Debriefings

  • The debriefing was excellent.
  • Some stated weaknesses were the result of reviewers missing points that were clearly made. Is there a mechanism for preventing this?

 

(v) Barry Mauk, APL

Paulos asked whether, if Mauk had had the choice of going to Step Two, he would have done so. The answer was yes. Neil Johnson, NRL, noted that Step Two reviewers did not have the results of the Step One review. Janssen asked whether anyone from NASA could elaborate on the comment that the magnetosphere panel did not reflect the discipline, as this would be a serious flaw. (See Mauk presentation below.) The Chairman commented that he had looked at the list of reviewers and assured the audience that the reviewers were among the finest in the country. Janssen noted that he would have liked to have seen broader participation by the community. Mauk commented that his remarks about the reviewers were based on information gained in the proposal debriefing. The presentation given at the meeting by Barry Mauk follows.

 

Barry H. Mauk
The Johns Hopkins University
Applied Physics Laboratory

Character of Phase 1 Evaluations

  • It appears that the Phase 1 selection process was carried out with a level of depth no greater than is typical for the evaluation of $60K SR&T NRA proposals.
  • The process did not afford an adequate technical evaluation of the proposals nor an adequate evaluation of the proposed costs:
    - The evaluation panel came to demonstrably incorrect conclusions about the technical aspects of proposals (in the case of our proposal, this statement was validated by the debriefing team).
    - The evaluation panel passed judgment on the credibility of proposal costs. Yet, as a strictly scientific panel, the panel apparently had neither the expertise nor the resources to make such judgments.
  • This level of evaluation is not adequate for a $70M hardware program. Since only one mission per discipline is likely to be promoted to Phase 2, how does NASA safeguard against unflyable missions winning out against flyable missions that have adequately addressed technical and cost issues?

MIDEX as the Vehicle for Discipline Science

  • In today’s climate, $70M MIDEX missions can represent the major investment in the science of a particular discipline for a number of years.
  • Apparently (based on debrief comments) our magnetospheric science proposal was evaluated in Phase 1 by a panel of mostly ionospheric and other non-magnetospheric scientists.
  • Thus, non-magnetospheric scientists apparently chose what may be the major magnetospheric mission of the next half decade.
  • An informal poll of colleagues revealed numerous qualified and interested individuals who were not MIDEX proposers and who were not asked to participate as MIDEX panel members.
  • Given the anticipated role of MIDEX in carrying out discipline science, Phase 1 evaluations need to be carried out in a more comprehensive fashion and not in a fashion best suited for grant proposals.

Full Cost Accounting

  • The MIDEX AO required that foreign contributions be accounted for under the $70M MIDEX cost cap in order to make sure that all proposed investigations are of the same scope (or no bigger than a certain scope).

  • The spirit of this requirement was apparently violated by some proposals by the heavy use of essentially free civil service personnel at NASA Centers. These proposals had an unfair advantage and violated the spirit of the AO and the spirit of recent NASA administrator statements on this issue.

  • This issue was anticipated during the AO preproposal briefing with the following Q&A exchange:
    - Q: Some of our Co-Investigators are U.S. Government employees. The stated reason for including foreign contributions in the cost cap constraints is to make sure that all proposed investigations are of the same scope (or no bigger than a certain scope). For the same reason, are the full costs of the participation of U.S. Government employees to be included in the proposals (i.e., must one include the true costs of utilizing the "free" labors of U.S. Government employees)?
    - A: The proposer needs to negotiate the appropriate rate with the government agency providing the civil service personnel, and include that cost in the proposal. The rate chargeable to the proposal is included in the cost cap.

Recommendations

  • Given the importance of the MIDEX program for achieving primary science discipline goals, NASA must make sure that affected disciplines are adequately represented. Place higher priority on the fidelity of the Phase 1 evaluation process.

  • More expertise, time, and resources must be brought to bear on technical and cost evaluation during Phase 1. The science panels do not have the expertise to do this. Perhaps multiple panels are needed. Given the likelihood that only one mission per discipline will be promoted to Phase 2, safeguards must be in place to assure that unflyable missions do not eliminate flyable missions in the early phases.

  • If the second item cannot be adequately addressed, then abandon the 2-stage evaluation process.

  • Full cost accounting with regard to civil servant participation should be put in place immediately to level the playing field.

 

(vi) John Raitt, Utah State University

Raitt was the PI on COPIES (a Constellation of Polar Ionospheric Electrodynamic Satellites), which needed 8 spacecraft. He did not think that NASA could provide these spacecraft and so opted for the PI-mode. In general, he thought the two-step approach a good concept that needed more work. He was concerned that PIs overestimated costs at Step One so as not to exceed the cap at Step Two. At Step One, NASA should do a proper cost review to ensure that a PI can stay within the limit. He suggested that the AO not request a cost definition, but simply allow the PI to state that he can stay within the program cost cap.

Raitt was relieved not to have been selected for Step Two. A relatively large number of proposals had been selected for Step Two, so there was a great risk of spending resources and not winning. He suggested that funding should be provided before Step Two to enable good costing to be done, like a Preliminary Design Review. Although a small percentage of the total cost of a mission, it would be a good investment.

He was not sure whether the PI-mode put some at a disadvantage but his experiment required that choice. He did understand from the AO that they could work with GSFC to have GSFC procure what was needed but that would have required too much interaction. He felt that they were too far away, and so went with local industry.

He would have liked some feedback on why the experiment had not been selected for Step Two. He felt that the science was good as the obvious next step, but the limited page count meant the proposal omitted some points that could have been clarified through interaction with the reviewers. Perhaps NASA should consider a three-step approach: it would be easy to cut the large number of proposals by dismissing the non-contenders, arriving at a smaller number that needed serious consideration. Then there might be an interview process with this smaller group of PIs, followed by a down-select to a smaller number for a Step Three process, which might be funded at the $100K level to produce high-quality proposals for a better selection.

Ulmer asked how Raitt expected reviewers to assess his costs if the proposal did not give them any cost information. Raitt said that the proposal has to give enough technical information on implementation to support the contention that the mission can stay within the cost limit. Confidence in the estimate comes from experience; Raitt said his costs are mostly manpower, not hardware. But there was too much work involved at present in documenting that the estimate was good.

Janssen suggested the use of cost models, accepting a level of uncertainty.

Raitt asked whether NASA was really ready to spend $70m on a mission, or were they really looking for a cheaper mission?

Jayant Murphy, Johns Hopkins University, said he was very happy with the science review but felt that costing should be in the Step One proposal. Not enough information was required at Step One for reviewers to judge costs. The MIDEX program requires both science and other resources to produce a good proposal. The Step One proposals each took a different approach to costing; one could be penalized for providing too little information and also for providing too much. Neil Johnson added that it had been made clear that the cap was $70m; proposals had to be very carefully costed so as not to come in above $70m. Raitt noted, however, that that was only true where mission costs were near the cap. His experiment was not that costly.

 

(vii) Dave Rust, APL.

Rust was the PI on the Heliospheric Links Explorer (HELIX) experiment which, at the beginning of the process, used a very new approach that had since become very well-known. The MIDEX process was inherently conservative and encouraged division by discipline. The Step One selection, with one review panel per discipline, produced one proposal for each discipline. He felt that programmatics dominated the process. NASA should decide one way or another on the conception of the program: was this an Explorer program? Did NASA have to select the highest priority science, or was it discipline-divided?

He queried the cost accounting. The science panel had said they did not believe this experiment could be done for this price. But APL always rigorously costs everything. The proposal was thoroughly cost accounted, so it was outrageous for scientists to judge the cost inadequate. He suggested the need either to do real cost accounting or to check boxes for cost ranges, e.g., $30m-$60m or $50m-$70m.

He asked whether MIDEX was eating up the Solar-Terrestrial Probe line and suggested that the appropriate level of science effort in the overall program was not being achieved.

Feedback was very important. It was not a question of sealed bids. The SOHO selection had engaged in dialogue with PIs. Evenson suggested that it was ESA that had introduced the dialogue. But Rust said he did talk to GSFC during that process.

 

(viii) Brian Dennis, GSFC

Dennis spoke from a unique position: he was a Co-I on the High Energy Solar Spectroscopic Imager (HESSI) proposal, which had been chosen as runner-up, and he was also one of the "free" civil servants at GSFC. Dennis noted that full cost accounting was being introduced and that one had to wait to see how it was implemented. He had also had difficulty getting information about the spacecraft. Once selected at Step Two, they, like all NASA-mode proposers, had been invited to talk to GSFC, but then got the message that GSFC could not handle the demand for information. He preferred a single-step approach.

He said that, in the final analysis, the evaluation was ignored. NASA wanted to do certain experiments (e.g., cosmic background). NASA did not want to select a solar mission. The team was told that the proposal was perfect and could not have been improved.

HESSI had been studied for a long time, compared to HELIX which was a new, unstudied idea. He suggested that Explorer money should not be used to study new mission concepts as such money is available elsewhere. The two-step process levels the playing field for those whose ideas have not been studied; the other study routes should be used so as to get on with the selection of MIDEX.

He felt that debriefing must be face-to-face. The team could not get the written evaluations at the technical debriefing; it was based only on the notes of those who happened to be there. There should be direct contact with reviewers and PIs should have to go forward to answer reviewers' questions. There were misunderstandings of technical questions and thus communication with reviewers was needed. Too much of the process was driven by lawyers.

Contributions from foreign partners should be encouraged; the objective is to achieve the maximum science per US dollar. It was very hard to satisfy the requirements for detailed costs, and the 10-month and 22-month definition phases merely doubled the work.

Neil Johnson did not want to see the program reduced to only long-studied proposals. He noted that there had been questions about how much heritage the winning proposals had.

Bohlin addressed the question of a discipline-driven selection. He said the selection team had not been predisposed, although everyone in the community knew that there were high-priority science areas. At the ultimate selection between several excellent proposals, programmatics had to come into play. The presentation given at the meeting by Brian Dennis follows.

 

Brian Dennis — HESSI Lead Co-I
NASA GSFC

 

Clarity and Content of AO

  • Confusion over allowed spacecraft trades.
  • Unclear what spacecraft components were offered.
  • Error in altitude dispersion of launch vehicle.
  • Confusion over calculation of percent contingencies.

Evaluation Criteria

  • Generally clear.
  • Relative ratings ignored in final selection step.

Proposal Conference

  • Good, but some of the answers were wrong or misleading.

Evaluation Process

  • Telephone debriefing entirely inadequate.
  • Technical debriefing of limited use because of misunderstandings of the design concept.

General Comments

  • Allow cost trades between spacecraft, instrument, and launch vehicle.
  • Allow credit for contributions (foreign or domestic) that are free to NASA.
  • Reduce detail required in Step One budget and Step Two cost volume.
  • Improve technical review process to avoid obvious errors.

That concluded the presentations from the science community. The next item on the agenda began presentations on perspectives from industry.

Industry Perspectives

(i) Shirley Savarino, TRW Civil & International Systems Division, Space & Electronics Group

Savarino noted that no PI-mode missions won, which she thought sent a message as to what NASA wanted: it had been made very clear that PI-mode was not wanted. Five proposals had industry participation. Now that the IMAGE (Imager for Magnetopause-to-Aurora Global Exploration) team was to procure a spacecraft, industry had to bid again on the same competition.

Ulmer asked whether, as the new NASA structure encourages disciplines to merge, there were fewer chances for discipline-specific missions. Bohlin responded that the community has to work within disciplines, but that there may be opportunities for cross-discipline science. However, line-item funding from Congress for Discovery (planetary) and MIDEX (space physics and astrophysics) still continued. Jay Bergstrahl, NASA Headquarters, noted that the next Discovery AO includes Origins science.

The presentation given at the meeting by Shirley Savarino follows.

Industry Perspective

Three key issues relating to MIDEX procurement

  • Win probability low.
  • Spacecraft supplier provides low leverage on outcome.
  • Uncomfortable to compete against our customers.

Issue #1: Win Probability Low

  • PI-mode procurements are an extension of the old AO process to select science, but are now mission selections.
  • Probability of winning Step One was ~30% (13 RFPs / 40+ white papers).
  • Probability of winning Step Two was ~15+% (2 winners / 13 proposals).
  • Generally, TRW enters a competition with one to three competitors for hardware contracts (25-50% win probability).
  • NASA should consider a stronger filter at Step One so that the Step Two win probability is >30%.

Issue #2: Spacecraft Supplier Provides Low Leverage on Outcome

  • Stated objective of MIDEX is to gain maximum science/dollar; inherently, the spacecraft is not as important.
  • MIDEX results show that industry provided low positive leverage.
  • Spacecraft procurement on IMAGE.

Issue #3: Uncomfortable to Compete Against Our Customers

  • Both Goddard Space Flight Center and JPL procure or build most NASA science missions.
    - Future trends (roles and missions, program queues) strengthen this.
  • TRW had a wonderful partnership with JPL on MIDEX but went against a Goddard mission of similar science.
  • Goddard "not allowed" to team with industry, setting up a competitive environment from the start.

Thoughts for Future Procurements

  • All proposals should have the same groundrules:
    - PI mode.
    - Full cost evaluation.
  • Provide a better filter at Step One, resulting in a >30% chance of winning Step Two.
  • Evaluation criteria need to be consistent with stated objectives:
    - Science.
    - Science and cost evaluation during review need to result in a science/dollar metric.
    - Enabling technologies with well-understood risks should be rewarded and not penalized.
      » In evaluation, use of new technology provided a small plus, ~3-5%, but counted heavily against us in the science and technical evaluation, the most important evaluation factor at ~35%.

 

(ii) Lamont Di Biasi, Orbital Sciences Corporation

Di Biasi noted that it was not easy to understand the AO process. He said there were certain areas of science that Orbital Sciences felt NASA wanted to do, and so the company pursued certain investigators in certain science disciplines. He noted that the actual expenditures had been closer to $600K, rather than the $500K noted below. He was now having trouble convincing his management to invest in the next SMEX and MIDEX proposals.

He noted the inconsistency between the value of high technology and risk. At one extreme is a mere concept, others have some heritage, testbed experience, or flight history, perhaps on the Shuttle.

Citing his experience of the Earth System Science Pathfinder (ESSP) program, he said that the various Centers direct their PIs to procure industry participation in very different ways. These can vary from a one- or two-page proposal to a comprehensive effort. This is something NASA could address. In addition, universities have their own differing rules about how to procure industry partnerships, but this is not something that NASA can fix.

Di Biasi noted that when some PIs found they could not get information from GSFC on the NASA-mode spacecraft, they wanted to switch to PI-mode, but the AO did not allow this. Criteria were different in the two modes: the PI-mode proposals were evaluated on their Program Managers, but the NASA-mode proposals were not. The presentation by Lamont Di Biasi follows.

PERSPECTIVE - 1 - EXPENSIVE

  • Prior to the AO release, OSC did an extensive assessment to determine our primary science discipline targets.
  • We pursued 14 Principal Investigators:
    - 4 required written proposals.
    - 5 required several detailed presentations.
    - 5 required a meeting and a capabilities presentation.
  • We had 2 decide to no-bid, 2 used NASA-supplied spacecraft, and 1 used an in-house spacecraft.
  • We supported 5 in their Step One proposals:
    - 3 in Space Physics.
    - 2 in Astrophysics.
  • We supported 2 in the Step Two proposals:
    - 1 in Space Physics.
    - 1 in Astrophysics.
  • Approximate expenditures:
    - Bid and Proposal, Marketing, Travel: $500K.
  • Results:
    - No wins. Starting over via opportunity to bid for IMAGE spacecraft.
    - Evaluating advisability of pursuing next SMEX and MIDEX AO.

PERSPECTIVE - 2 - PROCESS

  • DURATION — One year from AO release to final selection is too long.
      -45–60 days is more than sufficient for Step One proposals.
      -60 days should be sufficient for Step One evaluations based upon 40–50 proposals.
      -60 days for Step Two proposals.
      -60 days for Step Two evaluations based upon 6–8 proposals.
  • PARTICIPATION IN STEP TWO
      -Selecting 13 for Step Two in order to pick 4 is excessive.

PERSPECTIVE - 3 - INCONSISTENT EVALUATION

  • Based on 2 specific debriefings plus discussions with other colleagues and Principal Investigators.
    COST: (2 very similar OSC spacecraft) - 1 was should-costed as bid, 1 was almost doubled.
    COST: NASA supplied spacecraft not evaluated for realism.
    TECHNICAL: High scores for use of flight-proven, heritage hardware with good weight margin, but the presumption was made that mass would increase and add costs. High scores for use of innovative, new technology, but the score was lowered because of the associated risk.
    MANAGEMENT: The PI mode was judged a more complex organization that would be weak in managing the development process, yet there was no requirement to describe how the NASA supplied spacecraft organization would be implemented or why it would be less complex.

PERSPECTIVE - 4 - INDUSTRY PARTNER SELECTION PROCESS

  • INCONSISTENT: Varied depending upon the NASA Center directives. This is fixable.
    Varied depending upon the Institutional requirements, but I don’t think this is fixable.

  • INFLEXIBLE: PI was not allowed to change after Step One even when support was not readily available from NASA. I had two calls asking me for support. Neither eventually won, so I saved time and money.

SUMMARY

  • The concept has merit: The Step One proposals are relatively inexpensive and your status is known early in the evaluation stage.
    The scope, schedule and costs are defined early.

  • The implementation needs modification: If a PI uses the NASA supplied spacecraft, require that NASA provide a Program Manager, a definition of the specific spacecraft, and a real cost, and then:
    Evaluate the NASA supplied spacecraft just as the Industry supplied spacecraft.
    Evaluate the NASA supplied management of the spacecraft just as the Industry supplied management.
    Evaluate the cost on an equal basis.

 

(iii) Bill Gail, Ball Aerospace.

The AO process should help improve the science being done through cross-pollination. The cost of the whole process is probably about 10% of the total program value, if everybody’s cost is included. Gail asked what was meant by science/dollar; does that mean low return for low cost? The down-select should be based on issues of science, not resources. Industry has limited resources and cannot support many proposals. Therefore Step One should be judged only on science; reviewers should be relied on to competently assess relative costs. Gail also supported the concept of giving Step One proposals a ranking and letting the PI judge whether to go forward. This is the approach in the Code Y ESSP program.

Mauk commented that if cost were not taken into account at Step One, then those proposals could result in "undoable" science, and beat "doable" science. Cindy Faulconer, Lockheed Martin Astronautics, echoed that sentiment. John Raitt said that he supported describing a technology approach in Step One to show how the science will be done, but without detailed costing. Step Two is really the definition phase of the proposal and people who do the costing need the resources at that point to do it well. Gail, however, emphasized that funds should be available to help people to do costing, not to define the mission. The presentation given at the meeting by Bill Gail follows.

Objectives

  • The AO process should provide the best service to the users (scientists) and the best return to the customer (taxpayers)
    For the science community, the AO should:
      Lead to selection of the proposals that return the best science.
      Enhance the quality of all proposed missions and strengthen the community through the AO process itself.
    For the taxpayer, the AO should:
      Ensure that the selected mission and the AO process is cost efficient.

Issues

  • We don’t know if the AO process gets us the best science. This question should be asked formally before/after each AO.
  • Pick the best science for the budget, not science per dollar. The best transportation per dollar may be a bicycle, but we all probably still bought airline tickets to get here.
  • The stated cost cap should be the real cost cap. We should be spending our time fine-tuning ideas, not second-guessing the "real" cost cap.
  • High cost of Step One (to all) limits ideas that can be proposed. Let the science community be the judge, not resources.
  • Let in-house/out-of-house compete equally. Full and open competition provides the best value to the taxpayer.
  • Bias against ‘new’ technologies. Disincentive to propose new technologies because reviewers differ widely as to benefit/risk.
  • Success too dependent on interpretation of rules, not merit.

Suggestions

  • Retain the two-step process, but
    Ensure order-of-magnitude difference in effort between two steps (eliminate costing in Step One! — focus on science).
    Minimize "rules interpretation" issues by giving direct detailed feedback prior to Step Two ("diffuse" rather than rigid rules).
    Review the Discovery San Juan Capistrano approach as a model for Step One
    Strengths: 1) limited 15 vugraph/15 min presentation, 2) limited costing, 3) science peers as evaluators, 4) feedback with specific strengths/weaknesses as well as scores.
    Weaknesses: 1) needs much shorter time between Step One and Step Two, 2) no funding between Step One and Step Two.
  • Enhance program feedback
    This type of meeting is a good start.
    Anonymous questionnaire addressed to all proposers and possibly outside scientists?
  • Continue to converge the AO process with Discovery and ESSP.

That concluded the presentations from the industry community. The next item on the agenda was perspectives from government/FFRDC.

 

Government/FFRDC Perspectives

(i) Mike Janssen, JPL

Mike Janssen spoke briefly, noting that he was not really equipped to speak for JPL. He did suggest that NASA eliminate the NASA-mode option.

(ii) Larry Crawford, Applied Physics Laboratory

Crawford emphasized that he did not like to see APL costs compared to model costs. Federal regulations allow NASA to strongly emphasize past performance in evaluating cost estimates. He had made the same point at the Discovery Lessons-Learned Workshop, but the current draft AO still does not include that feature. He urged OSS to consider the ESSP 7-month two-step process.

Evenson commented that scientists can be qualified to provide a technical review; many were experimenters, not theorists. Crawford felt that there should be engineers on the review panel. Cherry confirmed that his panel had consisted of scientists (including experimenters) but not engineers or cost experts. Asked whether 13 had been selected for Step Two because the panels lacked the expertise to hone the numbers, Cherry said the 13 was simply the result of each panel having been asked to come up with its best 2 or 3 proposals. The presentation given at the meeting by Larry Crawford follows.

JHU/APL MIDEX Proposals

  • MIM--Magnetospheric Imaging MIDEX
      -Two versions — NASA provided S/C and PI provided S/C.
      -PI is Barry Mauk of APL.
      -Not selected for Step Two.

  • HELIX--Heliospheric Links Explorer
      -PI is Dave Rust of APL.
      -Not selected for Step Two.

  • HUBE--Hopkins Ultraviolet Background Explorer
      -In collaboration with JHU Dept. of Physics & Astronomy.
      -PI is Dick Henry of JHU.
      -Selected as an alternate mission.

Major Shortcomings:
Was "...science & instrumentation..." really the primary emphasis for the Step One proposal?

S = Science		T = Technical merit
TC = Total NTE Cost	DR = Data Reduction & Analysis plan

S + T + TC + DR = 100% -- Evaluation Formula??

Assuming S + T = TC, DR < S + T, and DR < TC:
If DR = 20%, then TC = 40%, T = 20%, and S = 20%

i.e. Science Value Only 20–40%
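The weights inferred on the slide follow from simple arithmetic. The derivation below spells out the implicit reasoning; the equal split S = T is an assumption the slide does not state explicitly.

```latex
% Constraints from the slide:
S + T + TC + DR = 100\%, \qquad S + T = TC
% Posit DR = 20\%:
S + T + TC = 80\% \;\Rightarrow\; 2\,TC = 80\% \;\Rightarrow\; TC = 40\%, \quad S + T = 40\%
% If science and technical merit carry equal weight (assumption, S = T):
S = T = 20\%
```

Hence science alone would carry 20% of the score, or 40% if technical merit is counted with it, which is the "20–40%" figure on the slide.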

  • MIDEX two-step process did not significantly reduce up-front burden on proposers
    Proposal NTE requirements drove effort - OK, but how was the evaluation performed?
  • More detail needed in the AO about the evaluation process. How was cost evaluated in light of "NTE & outline of cost" guidance? Cost appeared to be 40–50% of score in Step One!!
    What criterion was used to judge cost? In evaluating cost, was at least 25% weight given to the proposing institution's past performance in accordance with current government guidelines? (e.g. Office of Federal Procurement Policy Guidelines; Federal Report 0014-9063/95; Vol. 63, No. 18; page 573.) Who evaluated cost? Did a panel of scientists evaluate cost? We believe the answer is YES. This would be a serious flaw in the selection process. What projects, if any, are grandfathered from full cost accounting of government participation?
  • How important were programmatics in the selection process? Were new ideas really welcomed? If not, tell proposers in the AO. Could proposers have saved their $'s if they knew?
  • Adequacy of Evaluation Panels: Some panel members were obviously not knowledgeable in the disciplines they were asked to evaluate - a serious flaw in the selection process.
  • Tasking of Evaluation Panels: Were panel members asked to use evaluation criteria outside their disciplines? e.g., were scientists tasked to evaluate costs?

Cost Evaluation

  • NASA needs to rethink the process of cost evaluation so that it allows for a variety of costing methodologies and is fair to all proposers.
  • The requirement to identify an explicit cost reserve unfairly penalizes organizations, like APL’s Space Department, which use a design-and-build-to-cost approach where cost uncertainty is implicit in the process.
  • APL’s costing methodology has a demonstrated track record for accurate forecasting of space mission costs and was used very successfully on the NEAR Project — it has recently been described in a paper given at the 2nd IAA Conference on Low Cost Planetary Missions, 16–19 April 1996.
  • A proposer’s cost should not be evaluated based on how it compares to some other organization’s cost model.
  • The single most important criterion should be the credibility of the organization which performed the cost estimate, based on demonstrated past performance (minimum weight of 25% per new Federal policy guidelines).

Recommendations

  • Retain two-step proposal process: Still preferable to a single step despite the noted deficiencies. Consider using the proposed ESSP 7-month two-step process with its abbreviated Step One proposal - appears to be an improvement but is yet to be tested.
  • Either eliminate cost as a factor in Step One (as in the proposed ESSP process) or include qualified cost analysts on the evaluation panels.
  • Require use of full cost accounting for government participation - no grandfathering.
  • Allow for costing methodologies that do not explicitly identify reserves.
  • Place more reliance on demonstrated past performance and less on comparison with cost estimates derived from models. In this regard, comply with recent Office of Federal Procurement Policy guidelines, which call for weighting demonstrated past performance at 25% or more.
  • Better match qualifications of panelists to the disciplines being evaluated.
  • In the AO, discuss all of the programmatics that ultimately will drive the selection process so that PIs will know what they are up against up front.

Spacecraft Procurement

  • The MIDEX AO forced the PI to choose between a NASA-provided spacecraft or one provided by the PI.
  • This selection had to be made in Step One or, alternatively, the PI had to submit two separate versions of his proposal, one for a NASA provided spacecraft and one for a PI provided spacecraft, and the Step One selection process made the choice for him.
  • In Step Two the PI was locked into one mode; no ability to change modes.
  • To allow for the greatest PI flexibility and potential benefit to NASA, the next MIDEX AO should use the same approach as the DISCOVERY and ESSP AO’s - provide the PI with as much flexibility as practical in spacecraft procurement throughout the process.

SMEX Cost Cap

    The launch vehicle for the last SMEX AO selection process was the standard Pegasus, since replaced with the Pegasus XL, which has greater payload weight and volume capability. To optimize NASA’s return on its investment in SMEX, PIs should be able to propose payloads that utilize the full capacity of the Pegasus XL. The next SMEX AO should provide a proportionately larger payload cost cap, after adjusting for inflation, than the last SMEX AO.

The meeting then adjourned for lunch. Reconvening in the afternoon, John Lintott, in response to the prior discussion, noted that the cost evaluation in Step One had not been an in-depth evaluation, but was only intended to provide a reality check. NASA had relied on the threat of eliminating any Step Two proposal that exceeded its Step One cost to keep PIs in line on Step One costs. Lintott summarized written comments received prior to the meeting. Comments were received from:

3 Step Two proposers
3 Step One proposers
1 set of consolidated comments from the Step One review team
1 set of consolidated comments from the Step Two review team
3 individual reviewers
3 program participants

Verbal comments received during debriefings were not included. Several compliments had also been received. Some suggested that the two-step process had considerable merit, enabling the proposer to concentrate on scientific and technological issues in Step One. The preproposal conference was thought to be beneficial.

The presentation given at the meeting by John Lintott follows.

Criticisms and suggestions for improvement

The two-step process

Modify the two-step process

    -requires more detailed proposals than in the past.
    -enormous effort on the part of the proposer.
    -expensive - some institutions do not have deep pockets.
    -reduce the number of proposals going into Step Two.
    -fund preparation of Step Two proposals.
    -reduce the length of time for submission of proposals.

Two-step process a disadvantage

    -stretched out the selection process.
    -requires a repetition in description of instrument.
    -possibility of changes in NASA operating rules and capabilities between steps.

Eliminate the two-step process

    -ask for additional detail in the proposal.
    -select 4 missions.
    -conduct a Phase A, then down-select.

 

The AO

  • The AO did not make clear that implementation issues (technical, management, cost) are now taken very seriously, nor did it specifically define what was requested in the proposal for these aspects.
  • Issue directed AOs, rather than broad ones.
  • Release a draft AO to the science community for comment.
  • Do not release a draft AO after this one unless there are drastic changes, because this draws out the process, and programs are converging anyway (SMEX, MIDEX, Discovery, and ESSP).
  • Clarify the concept of "minimum science mission" and how it should be addressed in proposals.
  • Include science value per unit cost as an explicit evaluation criterion. The AO should describe what it means and how it will be evaluated.
  • The MIDEX AO did not result in the selection of lower-cost missions - more dollars mean more science - how should science/dollar be defined to allow the selection of cheaper missions?
  • The evaluation criteria should mention any additional decision elements that will be used in the selection process.
  • Increase page limits for Step Two proposals.

 

Costing requirements of the AO

  • Do not count domestic or foreign contributions in the cost cap.
  • Include the cost of the launcher in the budget so that the savings from a lower-cost launcher are an advantage.
  • Allow trades between instrument, spacecraft, and launcher.
  • Costing requirements for Step One should be relaxed - the AO imposed substantial costing work in Step One, which favored larger universities and NASA Centers with sophisticated costing infrastructures to support the proposal-writing effort; suggest binning Step One costs in ranges, e.g., $60-70M, $50-60M, etc., making the bin the cap at Step Two.
  • Reduce costing requirements for Step Two - put a page limit on the cost volume; provide standard budget forms.
  • Provide an electronic template for the budget; proposal cost data should be provided to NASA in electronic form as well as hard copy.
  • Cost data should be submitted in real-year dollars rather than be tied to a particular fiscal year - reduces requirements on proposers; may reduce time from selection to contract award significantly.
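The cost-binning suggestion above can be sketched in a few lines. The function name and the $10M bin width are illustrative assumptions taken from the example ranges, not part of the AO:

```python
import math

def step_two_cap(step_one_cost_m: float, bin_width_m: float = 10.0) -> float:
    """Map a Step One cost estimate (in $M) to the upper edge of its
    cost bin, which would then serve as the Step Two cost cap.

    The $10M bin width mirrors the ranges suggested above
    (e.g., $50-60M, $60-70M); it is illustrative only.
    """
    return bin_width_m * math.ceil(step_one_cost_m / bin_width_m)

# A $63M Step One estimate falls in the $60-70M bin,
# so the Step Two cap would be $70M.
```

Under this scheme a proposer need only show that the estimate lands in a bin, rather than defend an exact figure, which is the burden reduction the comment is after.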

 

Preproposal conference

  • Solicit questions in advance via e-mail and fax - allows NASA time to prepare clear and considered answers to questions.

 

Options for spacecraft development/full-cost accounting

  • Unfairness to PI-mode investigations - full-cost accounting required for PI-mode but not for NASA-provided spacecraft mode; information requested for PI-mode much greater than for NASA-provided spacecraft mode; make all missions PI-mode and treat uniformly; require full-cost accounting.
  • Industry participation discouraged by the advantage given to the NASA-provided spacecraft mode.
  • Civil service manpower and NASA facilities should clearly be fully costed in future AOs, using a clearly defined, thoroughly spelled-out procedure.
  • Level the playing field when it comes to proposers getting information from NASA (e.g., spacecraft ground systems, launch vehicles); establish points of contact from organizations providing these services to support proposer questions, data requests, analyses, etc.
  • Unclear what NASA-provided spacecraft components were offered in Appendix B of the AO.

 

Technology and risk

  • The use of new technology was penalized under the most important evaluation criteria because it was perceived as risky; enabling technologies with well-understood risks should be rewarded, not penalized.
  • If future AOs seek new technology efforts, there should be some goals and examples in the AO for proposers to use.

 

Debriefings

  • Telephone debriefings are entirely inadequate; do in person.
  • Provide written evaluations to proposers.
  • Debriefings indicated apparent misconceptions on part of review team about proposals - suggest interaction between reviewers and proposers to address concerns; apparent lack of continuity between Step One and Step Two reviewers; selection process had discriminators in addition to AO criteria.

The Chairman then invited participants to divide into groups, and circulate between groups if so desired. His charge to the groups follows.

 

Two-Step Selection Process: Chair, Mike Janssen, JPL. This group was to consider:

Dual-Mode Selection Process: Chair, Paul Evenson, Bartol. This group was to consider:

Selection Criteria: Chair, Shirley Savarino, TRW. This group was to consider:

Cost Assessment: Chair, Larry Crawford, APL. This group was to consider:

The splinter groups met individually for several hours on Wednesday afternoon, and then reconvened briefly in the late afternoon for a review of progress and potential overlap. It was suggested that the groups continue to work into the evening, and that the plenary session should begin at 8:30 am the following morning, so that all participants could review all recommendations.

Thursday 27 June, 1996

The plenary session began with Mike Janssen, JPL, presenting the recommendations of his group (Two-Step Process). There was wide-ranging discussion and a suggestion that there be a survey of PIs to see how many felt dissatisfied with the review process. There was consensus that too many proposals had been down-selected after Step One. This made it particularly difficult and costly for industry to support the Step Two effort, with statistically unattractive odds of winning. The need for feedback from reviewers was reiterated. In addition, it was felt that those PIs selected for Step Two should have received the evaluations from the Step One review.

It was also felt that there was a disconnect between the emphasis in the AO on the Not-to-Exceed cost cap required at Step One, and the less than rigorous review of costs at that stage. The criteria in the AO gave cost as equal in importance to scientific and technical merit, and so many PIs had worked very hard at getting the correct costs. There was a long discussion about the approach of the review panels to cost feasibility, particularly in the light of the dictates of the AO. It was again suggested that science only should be reviewed and that technical merit and cost should be separated from the Step One phase. However, Brian Dennis, GSFC, suggested that one could not and should not separate science and technical merit, only costs.

Shirley Savarino, TRW, presented her group's recommendations on selection criteria. There were points of clarification discussed but there was general agreement on the need to separate science and programmatics out at Step One, and that the science proposed should not change from Step One to Step Two. There had not been much discussion of the role of the minimum science mission, although some PIs had been confused as to what was meant by the minimum science mission.

Larry Crawford presented his group's findings on cost assessment. There was little discussion, except on the issue of foreign contributions. NASA's goal had been to confine the costs of the mission within the MIDEX range, within NASA's control, and to avoid a proposal that simply made a contribution to a foreign mission, over which NASA would have no control. It was agreed that by imposing a cap on foreign contributions, NASA could achieve that goal without penalizing the PI.

Paul Evenson, Bartol, gave the last presentation. This group recommended a single-mode approach that would be flexible enough to include a menu of services, including management services, that NASA could provide. The terminology in the program should be changed to reflect the fact that a PI could ask NASA to manage the mission, which might incorporate a commercially provided spacecraft.

Meeting participants then unanimously endorsed the recommendations given on pages 6-13 of this report and the plenary session was adjourned.