
NASA Totally Flunks FITARA Scorecard 2 Years In A Row

By Keith Cowing
NASA Watch
May 18, 2016

Federal Information Technology Acquisition Reform Act, Wikipedia
“One of the requirements would be that the government develop a streamlined plan for its acquisitions. The bill would increase the power of existing Chief Information Officers (CIO) within federal agencies so that they could be more effective. Each agency would also be reduced to having only one CIO in the agency, who is then responsible for the success and failure of all IT projects in that agency. The bill would also require the federal government to make use of private sector best practices. The bill is intended to reduce IT procurement related waste.”

https://media2.spaceref.com/news/scorecard.2015.jpg

Oversight Committee FITARA Scorecard (2015)

https://media2.spaceref.com/news/scorecard.2016.jpg

Oversight Committee FITARA Scorecard (2016) [Note: NASA is the only agency to get an overall ‘F’ grade]
Hearing, Federal Information Technology Acquisition Reform Act (FITARA) Scorecard 2.0, House Oversight Committee
NASA CIO Wynn Testimony
“Admittedly, NASA’s scores on the FITARA scorecard are unacceptable. We have work to do, and challenges to overcome. But at the same time, I believe it is also important to reflect on the major strides NASA has already taken in improving the management of and protection of the Agency’s IT infrastructure. Thus, the remainder of my testimony today will provide a brief summary of our achievements to date, and other work in progress directed at becoming the best stewards of the Agency’s IT resources.”

Keith's note: I have to be completely honest: neither this hearing nor the FITARA report/scorecard that was released was on my news radar. I need to thank NASA's AA for Legislative Affairs, Seth Statler, for pointing out the hearing – and NASA's 'F' grade. NASA has the distinction in 2016 of being the only agency to get an overall 'F', so congratulations are in order. Of course, in telling everyone about FITARA, it is quite obvious that Statler was doing a little blame shifting as he spoke for NASA CIO Renee Wynn – while throwing her under the bus. You'd expect the @NASACIO Twitter account to say something too, but it has not tweeted anything since 15 March 2015.
Nor is there any mention on the NASA CIO website of the hearing, the CIO's testimony, the 2016 scorecard (or last year's), NASA's performance (or lack thereof), or what corrective actions NASA plans to take. Searching for "FITARA" yields only six results across all of NASA's websites. This chatty 2016 newsletter from the CIO makes no mention of NASA's abysmal score in 2015 but does say "OCIO has made significant progress in the development of a solid implementation plan." So, as long as they are working on a plan, then everything must be OK.
There is a slightly goofy post at Open.NASA.gov (not findable via the NASA search engine), "NASA's Approach to Implementing FITARA," from 10 March 2016, that opens with "My husband and I are planning a vacation to Disneyworld, an awesome destination for our five year old dreamer. How do we budget for such an grandiose trip?", and then goes on to spout happy talk – with added IT word salad – about how seriously NASA takes FITARA. If only.

NASA Watch founder, Explorers Club Fellow, ex-NASA, Away Teams, Journalist, Space & Astrobiology, Lapsed climber.

23 responses to “NASA Totally Flunks FITARA Scorecard 2 Years In A Row”

  1. duheagle says:

    So NASA manages IT about as well as it manages JWST. Why am I not surprised.

  2. cb450sc says:

    Well, this act (based on reading Wikipedia) sounds like the usual industry buzzwords patched together with little understanding of the scope of the problem. I worked for nearly 30 years in a multi-mission data processing center. NASA was computer-heavy long before the rest of the world, and as a result has more problems with this than many other agencies. There is a truly byzantine web of legacy technology (both software and hardware) that has to be maintained. "Streamlining" it is often unbelievably costly, as it requires dismantling and rebuilding all of this stuff, some of which is so old that the original developers (we used to call them programmers!) are long gone, and the requirements and specs are lost as well. Furthermore, it would all have to be re-tested and certified. And often this stuff is required for active missions and/or archives, so we can't just shut it down. We spent a lot of time sweating this and never came up with a clear solution. If I had a dollar for every time someone told me they didn't understand why a data archive costs money to maintain, I'd have enough money to maintain one! Or "can't you just move this to the cloud?" Um, no, actually.

    • duheagle says:

      Moore's Law was first asserted over a half-century ago, and the rate of change in the computer industry from 1945 to 1965 was already obviously breakneck when NASA was in its Apollo-era heyday. The fact that NASA evidently did nothing institutional to allow for inevitable digital systems evolution, even at a time when it commanded several percent of the federal budget, does not speak well of its IT management, then or since.

      I am aware of numerous private-sector instances in which significant software bases originally developed on since-obsoleted equipment were ported to newer hardware via the development of emulator packages.

      The most significant instance I can think of was the support, by Apple, of much of the Macintosh’s original Motorola 68000-based system-level code via emulation when Apple moved to the PowerPC chip. This emulation mode bought Apple time to do a proper job of re-doing the code in native PowerPC trim. I’m not personally familiar with how Apple handled the subsequent PowerPC-to-Intel migration, but I wouldn’t be surprised to find that the same general approach was used.

      Other emulations I am familiar with: the Motorola 68000-based Amiga computer on a number of subsequent hardware platforms, the IBM 704-series machines on IBM System/3 series machines, the IBM System/34 and System/36 machines on IBM RS 6000 hardware. There are, doubtless, many other relevant examples of which I am not personally aware. Perhaps this site’s readership can identify some of them.

      It seems as though this general approach of developing emulators could have saved NASA a lot of trouble. It would also have saved this site's proprietor a lot of trouble when he set out to repurpose an ancient NASA deep space probe a while back.
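      To make the idea concrete, here is a toy sketch of what an emulator boils down to (in Python, purely illustrative; the instruction set below is invented, not any real NASA or vendor hardware): a loop that fetches, decodes, and executes the legacy machine's instructions on the modern host, so the old software can keep running unmodified.

          # Hypothetical "legacy" program, expressed as (opcode, a, b) tuples.
          LEGACY_PROGRAM = [
              ("LOAD", 0, 7),       # reg0 <- 7
              ("LOAD", 1, 5),       # reg1 <- 5
              ("ADD",  0, 1),       # reg0 <- reg0 + reg1
              ("PRINT", 0, None),   # output reg0
              ("HALT", None, None),
          ]

          def emulate(program):
              regs = [0] * 4        # the legacy machine's registers
              pc = 0                # program counter
              while True:
                  op, a, b = program[pc]    # fetch
                  pc += 1
                  if op == "LOAD":          # decode and execute on the host
                      regs[a] = b
                  elif op == "ADD":
                      regs[a] += regs[b]
                  elif op == "PRINT":
                      print(regs[a])
                  elif op == "HALT":
                      break

          emulate(LEGACY_PROGRAM)           # prints 12

      Real emulators obviously also have to model memory, I/O, and timing, which is where most of the effort goes, but the basic structure is no more exotic than this.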

      These are balls NASA pretty obviously didn’t have to drop, but did anyway.

      • sunman42 says:

        Sorry, but the late Mr. Jobs told the crowd at Apple's developer conference in 2005 that _every_ version of OS X up to that time had been built for Intel CPUs as well as PowerPC ones, since the NeXT / Mach guts of the OS were not processor-dependent. I believe you are thinking back to earlier days, when OSes and hardware were much more closely interdependent.

        NASA's issues with old IT (1) have nothing to do with platforms (if a project is stuck on legacy systems, it's because the funding model for extended missions doesn't allow enough funds for periodic reengineering), and (2) sometimes old IT is just fine for taking lab measurements or doing less demanding calculations – while presenting security issues because the old OS can't be patched any longer.

        This law, quite frankly, is about handing large agency-wide contracts to IT integrators such as the one Mr. Issa once led, instead of the agency being able to be a bit more picky, project by project. This approach might save some money on sys admin and administrative overhead, but the record shows it would be a disaster for NASA's mission. Stupid law.

        • duheagle says:

          What you say sounds true for OS X. I'm sure the entire OS X operating system is written in a high-level language. The transition I'm talking about happened way before OS X came along; I think it was Mac OS 7 or 8. I'm not sure of too many details, as I've never been a Mac user. But a lot of the core system-level stuff for the early Macs, particularly graphics-related stuff, was written in Motorola 68000 assembly language. That's the stuff that was emulated on the early PowerPC-based Macs and clones (yes, Apple actually allowed clones of the Mac for a short time in the 1990s before reversing course).

  3. sunman42 says:

    The real problem here is the assumption by the bill's writers (who, let's face it, were writing a bill for the benefit of the IT industry, not the taxpayers) that all federal agencies' IT uses/needs/best practices are the same. Well, an engineering- and R&D-heavy mix of requirements needs a very different procurement model than a typical government bureaucracy, where a small number of thin desktop loads suits pretty much everyone.

    I applaud NASA’s failure to force the multifaceted peg of its real IT needs into the contractor-shaped hole of this bogus legislation.

    • duheagle says:

      The national laboratories also have an “engineering and R&D-heavy mix of requirements” and the NSF still managed to get a better grade than NASA. Not a good grade, mind you. NSF got a ‘D,’ the same grade as HHS of Obamacare infamy. Appeals to “uniqueness” always seem to be among the first excuses trotted out by the institutionally incompetent to defend their derelictions.

      • Rich_Palermo says:

        The NSF does research and engineering? I thought they only sponsored it. And, are there NSF National Labs? I thought only DoE, NIST, NIH, etc. were in that mix.

      • Michael Spencer says:

        Very nice use of 'Obamacare infamy'! A+!

        • duheagle says:

          Thanks. Whatever one may think of Obamacare as public policy, even its most rabid supporters don't pretend the initial rollout of the national web site was anything other than a world-class Charlie Foxtrot. This differed from many other governmental IT disasters only in notoriety.

      • sunman42 says:

        None of the national labs, which are operated by universities or university consortia for the most part (see also DOE, though they have a commercial partner for Los Alamos now), are included in this, so the comparison is with internal NSF "business" systems.

        • duheagle says:

          Living in California, I’m quite aware that, say, the University of California runs Lawrence Livermore National Laboratory. But the funds UC uses to do this come from the federal govt. I was under the impression said funds came via the NSF. Evidently, they come from the DOE. The DOE got the same ‘F’ grade as NASA. That suggests that engineering and scientific computing in the federal government is in even worse shape than I thought.

          The Wikipedia entry about FITARA mentions that the federal government spends $80 billion a year on IT. "Almost half of this goes to maintaining old and out-of-date systems." That means a program of emulator acquisition and/or development, as a means of assisting an orderly transition from obsolete to more modern computing infrastructure, would bear on tens of billions of dollars of annual spending (nearly half of $80 billion is roughly $40 billion a year). That makes the failure to adopt a "best practice" that is, and has for decades been, common practice in private industry arguably the most financially consequential dereliction in the history of governmental IT management.

          • Rich_Palermo says:

            The people who put in FITARA and who want to enforce it are not the friends of technology, technology organizations, or technical people. These are people who want to interpose themselves into administration/control roles where they will profit handsomely.

            There are plenty of reasons to use legacy code and systems. DOS, FORTRAN, FORTH, and other tools have been successful, tested, and robust. As sunman said, it would cost a fortune to migrate these and the funds will not be there because they’ve been siphoned off to pay the suits.

            I've seen all manner of this scam run over the years: TQM, Quality Circles, Business Process Reengineering, Six Sigma, PLM, CMMI, Enterprise Resource Planning, Industry Best Practices... you name it. If you want to know why it costs $1T to go nowhere, start there. The money is in preventing people from doing any work and then replacing them with people absolutely content to do no work but enter data into the software du jour.

          • sunman42 says:

            As Mr. Palermo points out, this has nothing to do with emulators and everything to do with which large “services,” “system integrators,” and “VARs” get to provide Procrustean “solutions” that in the best case may actually provide cost savings in standard load business management desktops, but always fail utterly to comprehend both the mix of systems needed for lab and operations work, and the need for rapid replacement when such machines fail.

            Mission operations depends on "old and out of date" systems because of a budget profile that dictates that extended missions always have decreasing resources, but prudent managers always try to find some resources for extending IT resource life and porting the mission-critical software to sustainable platforms. The emphasis is on "try," since NASA as an organization prefers throwing billions at new mission development to investing a few million per year in IT longevity.

          • Rich_Palermo says:

            'Procrustean "solutions"' – Superb turn of phrase!

            I just now noticed that the Office of Personnel Management (OPM), the one that suffered the massive data breach, managed a ‘D’ whereas NASA, DOE, and NSF got ‘F’s.

            https://www.washingtonpost….

    • Rich_Palermo says:

      Ain’t this the truth. I saw so many examples where some slick salespeople sold a “best practices” spiel to execs and left the rank-and-file employee to deal with the inevitable disaster.

  4. DDNH says:

    Looks like one-size-fits-all. A better question to ask is how many critical applications are running on obsolete hardware, and whether they are documented, the source code is available, and they can still be built. Also, whether recovery & transition plans exist. I know the USAF, IRS, and FAA would not fare well if audited.

    • sunman42 says:

      In theory, through its assessment and authorization process, NASA requires disaster recovery plans for all of its IT systems.

  5. JJMach says:

    Unfortunately, this is not the only IT fiasco affecting NASA at the moment.

    After the GAO warned about malware hidden in knockoff components purchased at too-good-to-be-true pricing, it was decided that all purchases of IT equipment needed to be vetted by IT security. After the edict came down, the necessary resources do not seem to have been provided, and the approval process quickly slowed to a crawl, with projects and labs waiting for weeks, then months, to have specialized components approved.

    Common items do not fare much better: ironically, there are only a handful of examples of something as common as a USB thumb drive on the "Assessed and Cleared List," and half of them are no longer available. Given the speed of evolution of commodity components, it seems that items will increasingly get approved at about the time they become obsolete.

    Worse, there seems to be little to nothing the engineers and scientists can do to break the logjam, so they are stuck between a bureaucratic rock and the hard place of risking their careers by trying to work around the system in order to get their jobs done.

    • Michael Spencer says:

      It’s endemic. Witness TSA as an instance of bureaucratic response to a problem.

    • sunman42 says:

      Word has it the clearance is performed by the FBI.

    • Matt Linton says:

      Very true, but don't forget that NASA IT folks didn't create that mandate; it was handed down directly by Congress and levied on NASA without providing any extra resources to handle the load of doing all that vetting.