Avoiding the Career Plateau


John C. Maxwell conveys five basic principles leading to professional success: leadership, growth, creativity, excellence, and service.  In his blog post, “Life is Difficult, Don’t Make It Harder for Yourself,” he explains that by focusing on success alone, an individual’s view of the world can become distorted, hindering their ability to reach their personal and professional goals.  The successful professional will therefore “get ahead” by thinking differently and avoiding the career “plateau” that has become all too familiar among employees at some of the biggest and most respected organizations.

Maxwell explains that a significant number of employees become complacent after they are rewarded or recognized for their most recent accomplishment, and thus they plateau.  One must ask: what if NASA GSFC employees used this same logic?

For 55 years, NASA GSFC has been a valued contributor to and leader of the nation’s civilian space program and has developed technologies and capabilities that have improved the lives of individuals nationwide.  The Agency remains relevant today because our workforce continues to pursue innovative new missions, projects, and science that improve our lives and our understanding of the world, resisting the complacency that Maxwell describes.  NASA GSFC employees were not content with simply landing on the Moon in 1969; instead, they continue to inspire the general public and the next generation of explorers by delving farther into space, exploring low Earth orbit, the solar system, Mars, and regions beyond our solar system, all in an effort to increase our understanding of the universe in which we reside.

The Agency’s focus is not based solely on each accomplishment or the success of a specific mission, nor is it distorted by unrealistic viewpoints.  At NASA GSFC, we understand that society, technology, and our human capabilities are constantly changing.  For the Information Technology and Communications Directorate, these changes yield opportunities and the need to explore advanced competencies in areas such as cloud computing, virtual desktop infrastructures, and mobility.  With that, we strive to equip our world-class scientists and engineers with the tools to improve their knowledge of aerospace, aeronautics, and the origin of our universe, and to build relationships with our external partners.  Through such collaborations, we will become and remain successful in achieving our strategic goals.

Although Maxwell makes a strong case that recognition and awards may accelerate pathways to failure, our organization and the agency take a different view.  We view recognition as an opportunity to boost employee morale, performance, and loyalty.  We see incentives as a tool to improve innovation, increase collaboration and productivity, and create a deeper passion for the incredible work we do here within NASA and GSFC.




Federal Manager Innovators and What You Can Do to Help


With a bleak outlook for federal budgets, it is vital that agencies turn to innovation to accomplish more with fewer resources. MeriTalk, an online community that fosters public-private dialogue on IT issues, recently surveyed 200 Federal managers on how cuts were impacting agency programs and what actions managers have been taking. The study, “Innovators Anonymous,” published April 1, 2013, found great challenges ahead but identified a segment of managers who are taking innovative approaches to dealing with them:

  • Agency budgets are trending lower – 73% of respondents believe that their budget will be even lower in 2015.
  • Shrinking budgets put agency missions at risk – 12% of respondents describe the impact of budget cuts as devastating, and 58% see them as impacting their agency’s mission performance.
  • Innovators are adapting – 34% of Federal innovators are already looking at alternative funding approaches.

These innovation-driven managers spend almost one-third of their time “trying to get their agency to operate or look at things differently.” At NASA, one innovative solution I have been involved with is Federal Cloud Brokerage. For more than a year, NASA has collaborated with the General Services Administration and 10 other agencies to explore how the Federal Government can leverage cloud computing and a cloud brokerage service to reduce expenses and improve IT services.  Our team has issued a Request for Information (RFI) to technology firms to continue advancing the concept of Federal cloud brokerage.

What can you do to help innovate?  Please bring me your ideas on how we can reduce costs while maintaining or improving services to our customers. You see the opportunities, since you are on the front line dealing with efficiency and effectiveness issues. Keep in mind that we owe it to the taxpayers, our co-workers, and ourselves to best prepare our agency for the great challenges that lie ahead.

What's the Deal with Big Data?


Big Data affects nearly all of us in NASA, and it is exploding – the average annual growth rate is 60%, and by the end of 2012 the digital universe is estimated to reach 2.7 zettabytes.  Goddard manages enormous volumes of data related to building and maintaining satellites, analytical simulations, and support functions.
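To put that growth rate in perspective, here is a quick back-of-the-envelope projection. This is only a sketch: the 60% rate and 2.7 ZB baseline are the figures cited above, and the three-year horizon is arbitrary.

```python
# Project the size of the digital universe forward at 60% annual growth,
# starting from the estimated 2.7 zettabytes at the end of 2012.
baseline_zb = 2.7
growth_rate = 0.60

for years_out in range(1, 4):
    projected = baseline_zb * (1 + growth_rate) ** years_out
    print(f"End of {2012 + years_out}: ~{projected:.1f} ZB")
# End of 2013: ~4.3 ZB, 2014: ~6.9 ZB, 2015: ~11.1 ZB
```

At that pace, data volume quadruples roughly every three years, which is why storage and archiving strategy cannot be an afterthought.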

So what is Big Data? In the IT industry, Big Data is defined by four V’s: volume, velocity, variety, and veracity. Volume is the sheer amount of data. Velocity is the speed with which new data is created and existing data modified. Variety is the management of various data formats and types. Veracity is a concern of many business leaders; it is estimated that one in three CEOs don’t trust the information they use. I’ll also add a fifth V, for Value: the considerable usefulness of Big Data. It allows us to see data patterns and anomalies and shifts our decision-making from reactive to proactive. So how is NASA managing the many challenges of Big Data? The NASA Open Government Plan outlines many of our approaches, such as managing and processing, archiving and distribution, and sharing data.

On managing and processing, here’s an example. The Mission Data Processing and Control System (MPCS) was recently used by the Curiosity rover on Mars. MPCS interfaces with NASA’s deep-space network, and in turn the Mars Reconnaissance Orbiter, to relay data to and from Curiosity and process the raw data in real time, a process that previously took hours, if not days, to accomplish.

For archiving and distribution, consider the Atmospheric Science Data Center (ASDC) at Langley, which processes, archives, and distributes Earth science data, and the Planetary Data System (PDS), which contains considerable planetary science data. PDS offers access to over 100 TB of space images, telemetry, models, and other products associated with planetary missions from the past 30 years.

NASA is a leader at sharing Big Data. The Earth Observing System Data and Information System (EOSDIS) manages and shares Earth science data from various sources – satellites, aircraft, field measurements, and more. The EOSDIS science operations are performed within 12 interconnected Distributed Active Archive Centers (DAACs), each with specific responsibilities for producing, archiving, and distributing Earth science data products.

To enhance our ability to manage Big Data, I believe that the IT industry should adopt the Predictive Model Markup Language (PMML), an XML-based, vendor-agnostic markup language that provides an easier way to share predictive analytical models. With PMML, proprietary formats and incompatibilities are no longer a barrier to the exchange of data and models between applications.
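As a rough illustration of what vendor-agnostic exchange looks like in practice, the sketch below parses a minimal, hand-written PMML fragment using nothing but the Python standard library. The model, its field names, and its coefficients are all invented for this example; real PMML documents produced by analytics tools are far richer.

```python
# Parse a tiny (hypothetical) PMML linear regression model. Because PMML
# is plain XML with a published schema, any consumer can recover the model
# without knowing which vendor's tool produced it.
import xml.etree.ElementTree as ET

PMML_DOC = """<PMML version="4.1" xmlns="http://www.dmg.org/PMML-4_1">
  <DataDictionary numberOfFields="2">
    <DataField name="engine_temp" optype="continuous" dataType="double"/>
    <DataField name="failure_risk" optype="continuous" dataType="double"/>
  </DataDictionary>
  <RegressionModel modelName="risk_model" functionName="regression">
    <RegressionTable intercept="0.15">
      <NumericPredictor name="engine_temp" coefficient="0.002"/>
    </RegressionTable>
  </RegressionModel>
</PMML>"""

NS = {"pmml": "http://www.dmg.org/PMML-4_1"}
root = ET.fromstring(PMML_DOC)

# Pull the model parameters back out of the shared document.
intercept = float(root.find(".//pmml:RegressionTable", NS).get("intercept"))
coef = float(root.find(".//pmml:NumericPredictor", NS).get("coefficient"))

def predict(engine_temp):
    """Score the exchanged model: risk = intercept + coef * temperature."""
    return intercept + coef * engine_temp

print(round(predict(100.0), 2))  # 0.35
```

The receiving application never touches the producer’s proprietary format; the XML document itself is the interchange contract.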

One real-world example of how NASA leverages its expertise in Big Data, and directly affects your life, is in the field of airline safety. NASA analyzes data from planes to study safety implications, which in turn helps to improve the maintenance procedures of commercial airlines and potentially prevent equipment failures. Using advanced algorithms, the agency helps sift through mountains of unstructured data to find key information that helps predict and prevent safety problems.

Meeting the Needs of the Goddard Customer


BYOD, or “bring your own device,” is an increasingly common request from Goddard employees and contractors. Simply put, BYOD enables staff to use their own computers, smartphones, tablets, and other technologies at work.  There are many benefits – allowing users to choose devices they are comfortable with improves job satisfaction, BYOD makes telecommuting more feasible, and it reduces duplicative devices. Goddard benefits from BYOD too – we save money on hardware, software, and device maintenance. And staff tend to upgrade to the latest hardware and software quicker than Goddard does.

But there are a number of challenges with BYOD.  Smartphones and tablets are susceptible to worms, viruses, Trojans, and spyware, just like desktops. Eavesdropping is an issue, since carrier-based wireless networks lack end-to-end security. Theft of devices can result in a loss of sensitive NASA data. Finally, users may be concerned that Goddard has access to sensitive personal data.

Virtualization helps us to overcome these challenges. Virtual desktops and applications are delivered to end users on any device. Little, if any, data is actually stored on the device; instead, data is requested and displayed as needed, reducing the risk of data loss. We conducted a Virtual Desktop Infrastructure (VDI) pilot this summer and will initiate a proof-of-concept study by allowing Goddard employees and visiting scientists to connect to NASA data using VDI and their own devices.

We are developing policies for allowing personal devices to connect to our network. These policies will cover who gets a mobile device, who pays for it, what constitutes acceptable use, user responsibilities, and the range of devices ITCD will support. Our Mobile Device Management (MDM) system will monitor, manage, and support these personal devices. It will provide central remote management of devices, including the distribution of applications, data, and configuration settings, as well as remote wipes.  And it will position us to better meet the needs of our customers.

A Reduction In Resources Can Mean Outsourcing for Federal Agencies


Last week, I shared an article, Federal IT Faces Budget Pressure, with Goddard’s IT leadership team. The article predicts that federal agencies will have to move toward vendor-managed cloud computing solutions within the next five years to accommodate future budget cuts.  We recently received guidance that 10% cuts would be made to all federal IT budgets within the next fiscal year, so the author’s prediction of this cost-saving initiative comes as no surprise.  During the discussion, I advised our team that outsourcing initiatives have become a harsh reality of the way we do business today, and we need to continue to be proactive and adapt.

An increased reliance on vendors for the management and maintenance of some of our services is not out of the norm for Goddard Space Flight Center or NASA.  For example, the amazing work underway at our Wallops Flight Facility, and the public-private partnership that has been established there with Orbital Corporation, is a visible example of our future business model.  Orbital’s Commercial Orbital Transportation Services (COTS) operational system consists of an International Space Station (ISS) visiting vehicle, a new privately developed medium-class launch vehicle, and all necessary mission planning and operations facilities and services.

Orbital is developing and qualifying a new launch vehicle (called Antares) to enable lower-cost COTS launches as well as future NASA science and exploration, commercial, and national security space missions. Antares will combine Orbital’s industry-leading experience in developing, building, and operating small launch vehicles with Wallops Flight Facility’s industry-leading range operations to provide services in a more cost-effective fashion.

Given our fiscal constraints, our IT programs, organizations, and scientific missions will need to apply this same strategic approach to Goddard’s infrastructure capabilities, namely containerized computing.  Containerized computing offers a fee-for-service model that can be monitored, measured, and tailored to the scientific demands we have at each Center.  A reduction in our IT funding will inevitably decrease our ability to procure appropriate resources for cloud and containerized computing services; therefore, outsourcing will potentially be both a short- and long-term solution.  Short term, we would be able to “do more with less”; long term, we would be able to facilitate partnerships to ensure and sustain security compliance, infrastructure compatibility, and more.

Overall, vendor-managed cloud computing will allow us to reprioritize and reallocate the existing resources we do have into other IT service areas.  We will then be able to focus our talents on reducing operating inefficiencies and driving innovation across the Center.  It may be a shift in the way the federal government has done business previously, but for NASA, it’s business as usual.

The Buzz Around IT Governance


Information Technology (IT) governance seems to be the hot new buzzword across NASA and the CIO community.  On September 29, 2011, the NASA Mission Support Council approved a governing model for improving the management of NASA IT. The Agency spends approximately $1.4B annually on IT. Unfortunately, today’s governance structure makes it increasingly difficult to manage IT spend at the Agency and Center level. The meaning of IT governance varies depending on who’s having the discussion.  One could simply state that IT governance is a process that puts structure and discipline around how organizations align IT strategy with business strategy, ensuring that functional elements stay on track to achieve their strategies and goals, and implementing good ways to measure IT’s performance. It makes sure that all stakeholders’ interests are taken into account and that processes provide measurable results.

I believe that every organization, large and small, public and private, needs a way to ensure that the IT function sustains the organization’s mission, strategies, and objectives. The level of sophistication we must apply to IT governance, however, may vary according to size, culture, industry, or applicable regulations. A general principle is that the larger and more regulated the organization, the more detailed the IT governance structure should be.

IT governance aligns the Agency’s and Center’s IT strategies with the mission and business strategy and ensures that our Center stays on track to achieve programmatic and mission goals, in a manner that produces an IT architecture that is efficient, effective, scalable, and measurable.  We must also take stakeholder interests (e.g., Congress, public sector partners, industry, academia) into account and establish processes to ensure measurable results. Essentially, an IT governance framework should answer these fundamental questions:

  • How closely aligned is IT to the strategic direction of the Agency or Center?
  • How are IT requirements identified, validated, and funded?
  • How is IT managed overall?
  • What are the specific metrics required to effectively manage the Center’s IT resources?
  • What is the return on investment?
  • What are the risks, and how are they being managed?

As detailed in the GSFC IT Strategic Plan, the IT governance model proposed for GSFC supports and aligns with the Agency IT governance processes by providing the framework and visibility to ensure the Center improves IT as a strategic asset. GSFC’s IT governance policy incorporates the five essential elements of effective governance:

  • Strategic Alignment – link the mission to IT investment
  • Value Delivery – ensure the IT investment delivers the benefits promised
  • Resource Management – project management personified
  • Risk Management – establish a formal framework for analysis and management of all risks
  • Performance Measures – are we meeting the business goals, how are we reporting them, and how often are we measuring?

Center-wide IT investments, programmatic and institutional, will be reviewed by governance boards comprised of representatives from every organization, creating a governance structure for IT that is truly federated and provides a strong balance between organizational and enterprise innovation.

So how do we successfully embrace the structures and processes of IT governance? We align our IT strategy with the missions of GSFC to deliver maximum value, and we establish a solid portfolio management system that integrates and lays out project components to clearly identify how assigned resources lead to the accomplishment of our goals.  This structured policy, along with senior managing governance boards, ensures that IT investments are synchronized with the missions of the various organizations, thereby providing full value to GSFC.

Using IT as a strategic asset must be one of the central tenets of the Center going forward, and IT governance is one of the mechanisms that will propel GSFC forward as we take on the new and exciting challenges and opportunities of the next five years. I welcome your feedback.


The Changing Role of the Chief Information Officer


If there’s one thing that’s certain in today’s changing economic and technological climate, it’s that “business as usual” is a thing of the past. To remain at the forefront of scientific discovery within increasing fiscal constraints, we are compelled as an Agency to do things differently.

On August 8, 2011, the White House released a pivotal memorandum regarding Chief Information Officer authorities (OMB M-11-29) that underscores this point. The memo explicitly redirects the authority and responsibility of Agency CIOs away from just policymaking and infrastructure maintenance to encompass true portfolio management for all information technology (IT) investments. The memorandum explains that the shift is intended to eliminate the barriers that get in the way of effective management of Federal IT programs.
The memo (http://www.whitehouse.gov/blog/2011/08/08/changing-role-federal-chief-information-officers) now directs Agency CIOs to take a lead role in four main areas:

1. Governance

·       Drive the investment review process for the entire IT portfolio of an Agency

·       Lead “TechStat” sessions intended to improve line-of-sight between project teams and senior executives

·       Terminate or turn around one-third of all underperforming IT investments by June of 2012

2.  Commodity IT

·       Eliminate duplication and rationalize IT investments

·       Pool the agency’s purchasing power across the entire organization to drive down costs and improve service for commodity IT

·       Show a preference for using shared services instead of standing up separate independent services

3.  Program Management

·       Identify, recruit, and hire top IT program management talent

·       Take accountability for the performance of IT program managers

4.  Information Security

·       Implement an agency-wide security program

·       Implement continuous monitoring and standardized risk assessment processes supported by “CyberStat” sessions

As I read the memorandum, I was struck by the timing: the Center is positioned to take advantage of these very improvements. We need to begin leveraging IT as a strategic asset for growth, which can occur only if leadership has visibility and influence into the broad scope of our IT investments (e.g., infrastructure, mission, and corporate). If we pool our resources, reduce duplication, create strategic partnerships, and leverage our IT investments in a strategic manner, we can provide increased capabilities to the Center at a competitive or reduced price point. In other words, we can position the Center “to remain competitive and thrive in any budget climate.”

The role of the CIO is changing to encompass more visibility and responsibility for managing a broader spectrum of IT. I look forward to partnering and collaborating with our customers across the Center as we develop and execute a strategy for IT that enables the mission of the Center and Agency.

First steps for Goddard

As I review the steps Goddard will need to take to comply with the new guidelines, I recognize the magnitude of the change. As the Center CIO, I must work with Center stakeholders and take a hard look at our current and future IT requirements to outline a technical plan that will position our Center to be as competitive as possible and achieve long-term sustainability and mission growth. Our workforce must be part of the solution.  Therefore, we must develop an aligned, skilled, and agile IT workforce equipped to achieve service and operational excellence in this competitive and dynamic environment.

Effectively using IT as a strategic asset will require partnering in ways we have never partnered before. I will need to reach out to many of you to establish stronger relationships in order to truly understand how I can enable your mission requirements through IT. I ask for your patience and help as I begin this effort.

While ideological battles over large-scale fiscal reform play out in the media every day, we have a very real need to enable the NASA mission through information technology.  We must improve IT service quality, lower costs, streamline IT governance practices, and deliver secure, scalable, and efficient IT services, products, and operations. These are common goals, toward a common good, that the entire Center can strive for!

In conclusion, I ask the following questions: Does the challenging budget climate offer opportunities for the Center to come together? Are there opportunities to share services, capabilities, and assets? Are there opportunities to lower costs and spur growth using innovation models, such as the “open-source methodology”? In my opinion, the answer is a resounding “yes.” We must encourage, embrace, and accept new ideas and manage the associated risks in order to solve today’s problems and anticipate tomorrow’s challenges.

Cloud Computing: A Game-Changing Business Strategy


On June 22, 2011, I presented at the Third Annual Cloud Computing World Forum in London.  An estimated 5,000 attendees, over 120 speakers, and 200 private and public vendors participated in this forum, which featured all of the key players within the Cloud Computing (CC) and Software as a Service (SaaS) industries.  I enjoyed the opportunity to represent our Agency at the forum, and I presented “Cloud as a Game-Changing Business Strategy” with the mindset that CC, if used effectively, can reduce the cost and schedule of various projects’ life cycles.

As a highly technical governmental agency, NASA’s success leans heavily on its computing capabilities.  Utilizing the most efficient and effective approaches for storage, processing, and bandwidth is imperative to ensure our continued success.

Mission-based projects within our agency often have very long IT life cycles, and they are also usually one-of-a-kind, complex, high-stakes endeavors.  Decisions for missions begin early in the strategic planning phase.  Due to long lead times for procurement of hardware, IT acquisitions also happen early in the life cycle.  Next, because of the long life cycle, upgrades need to occur regularly; otherwise, what was new when the planning began will be obsolete by the actual launch date.  In addition, the necessary compute must be estimated far in advance of when it will actually be needed, which can hinder the interoperability of a project’s components.  Another important aspect is that duplicate IT environments are sometimes created across the variety of projects (e.g., duplicative development environments, integration environments, and test environments), resulting also in the duplication of software, hardware, and licenses.  Furthermore, additional certifications are necessary for any duplicative environment.  These are obvious drawbacks of our current system. However, CC, when strategically used, offers solutions to many of these scenarios.  With the availability of on-demand access to compute and incredible scalability (elasticity), there is no need for advance estimating, duplicative environments, or the non-sharing of expensive licenses, hardware, and software.
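A toy comparison makes the economics of elasticity concrete. All figures below are invented for illustration; the point is only that paying for actual use can beat provisioning for an estimated peak when demand is uneven:

```python
# Illustrative cost comparison: a project whose compute demand varies
# month to month, priced either by provisioning fixed hardware sized for
# the peak month or by paying on demand for what is actually used.
monthly_demand = [10, 10, 40, 120, 30, 10]   # core-months needed (invented)
rate = 50.0                                   # cost per core-month (invented)

# Fixed provisioning: buy enough capacity for the worst month,
# then pay for all of it every month, used or not.
fixed_cost = max(monthly_demand) * rate * len(monthly_demand)

# Elastic (cloud) provisioning: pay only for each month's actual use.
elastic_cost = sum(monthly_demand) * rate

print(fixed_cost, elastic_cost)  # 36000.0 11000.0
```

The gap narrows as demand flattens, which is one reason steady, tightly coupled workloads remain better served by dedicated hardware.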

That said, CC is not a one-size-fits-all solution, because some systems (e.g., embedded systems) do not fall within CC’s use cases.  For example, there is some risk involved when dealing with highly sensitive or complex missions.  There are also other obstacles that would need to be considered carefully before proceeding, such as the introduction of possible complications to a project or potential security risks.  Additionally, latency can become an issue once data and applications become distributed and certain spectrums of compute (e.g., tightly coupled supercomputing) are needed.  Lastly, cultural resistance to change, if disruptive enough to a particular organization, can be a good reason to hold off on a rapid migration to CC.

In order for an organization to utilize CC as a game-changing business strategy, it should be implemented strategically.  First, determine where CC can bring the most value to the organization; next, gauge the levels of potential willingness for adoption.  Finally, choose an appropriate approach for execution: an enterprise-wide or a project-by-project approach.  The enterprise approach was discussed above, so I will address the project-centric approach next.

For certain organizations, as discussed above, it may not be feasible to implement CC on an enterprise-wide basis.  An alternative, then, is to use CC on a project-by-project basis.  This is possible for six different types of projects:

1.    Projects that need the latest IT on a “just-in-time” basis, e.g., system or mission development, or construction or renovation of buildings.

2.    Projects that need stop-gap capability, such as transferring information, consolidating servers, or reorganizing a data center.

3.    Projects facing risks to business or funding continuity, such as indecisive funding, changes in demand, or significant changes to IT.

4.    Missions facing challenging funding, such as cyclic demands or economic downturns.

5.    New projects, which can utilize CC until the scope is understood or as compute is needed.

6.    Proofs of concept, e.g., scoping network performance, or scoping workstation or server performance.

In summary, CC should be embedded in the project-management process to ensure business success, risk mitigation, significant cost savings, a better understanding of compute requirements, and the ability to obtain the latest technologies when buying new hardware.  If executed properly, organizations will be able to utilize CC as a game-changing business strategy.

NASA is Answering the Call


Today, more than ever, is the ideal time for the Federal Government to focus on reducing operating inefficiencies and the costs of IT investments, as opposed to spending over $80 billion per year as it has done in the past.  One alternative, which Federal Chief Information Officer Vivek Kundra strives to implement in his 25-point plan for IT Reform, is a “Cloud First” policy.  This policy requires that agencies default to cloud-based solutions whenever a secure, reliable, cost-effective cloud option exists.  This will ultimately reduce operating inefficiencies and improve overall service delivery for Federal agencies.

In December 2010, NASA’s Ames Research Center and GoddardSpace Flight Center collaborated to spearhead efforts to provide cloudcomputing as a viable option to NASA scientists and engineers.

NASA’s computing strategy involves various technologies, including Cloud Computing (CC). Currently, NASA has invested in the OpenStack software for CC and is working to operationalize a specific version of this software, which NASA calls Nebula.  The Nebula CC software will provide NASA personnel who need a private CC solution with IaaS (Infrastructure as a Service); later implementations of Nebula will offer PaaS (Platform as a Service).

NASA is also working on various technologies that support the conservation of energy (so-called Green Technologies) and the environment (so-called Clean Technologies), such as containerized computing, which offers energy efficiency, a compact footprint, and dense computing power while reducing the demand for computing facilities within buildings.  A few other technologies that NASA has as part of its computing strategy are:

·   Thin-client, zero-client, and remote thick-client technologies;

·   Data-center consolidations;

·   Server consolidations via virtual computing environments (VCEs) and CC environments (CCEs);

·   Implementing Green Technologies and Clean Technologies in new building endeavors as well as renovations of existing buildings;

·   Computing-environment consolidations;

·   Smart manufacturing environments that provide improved capabilities, availability, safety, reliability, and agility; and

·   An IT Storefront concept, which provides a mechanism for allowing customers and users to specify their needs in their terms and lets the underlying storefront software select a best-practice solution.  This may be a CC solution, a mobility solution, a thick- or thin-client solution, or a combination of these or something else.

Utilization of these technologies will not only help NASA’s personnel to focus more on their missions and less on computing infrastructure; it will also help the Agency to accrue benefits from innovation, improve resource utilization, and ultimately achieve the sustainability that is critical in our current fiscal environment.

In the upcoming months, I’m excited to see how CC will play a role in our science here at Goddard Space Flight Center.  Eventually, the phrase “take it to the Cloud” will be commonly heard among our scientists and engineers.

2011 Management of Change Conference: Partnership for Success-Delivering IT


For over 31 years, the Management of Change (MOC) Conference, hosted by the American Council for Technology (ACT) Industry Advisory Council (IAC), has provided a valuable open forum where the private and public sectors collaborate to help reduce the government’s operating inefficiencies through innovation.  The opportunity for the best of government and industry to pool resources and leverage best practices is essential; proper implementation will allow agencies to optimize their spending and, more importantly, redirect their efforts to value-added activities.  This is all highlighted in the 2011 MOC theme, “Partnership for Success-Delivering IT.”

In the midst of an uncertain fiscal budget for 2012 and the out years, this year’s MOC Conference is the perfect venue to help contribute to these efforts.  Instead of focusing on just partnering, which is one of the critical elements for success, the conference will also provide discussions on innovative IT solutions, effective IT governance, the success of outsourcing, and a variety of other key topics.

As Co-Chair of this year’s conference, I encourage all to register to attend on May 15-17 in Hot Springs, VA.  The planning committee, consisting of both industry and federal employees, has diligently worked to develop a robust agenda that will entice all who attend.  To learn more and to register, please visit http://www.actgov.org/events/managementofchange/Pages/default.aspx


Please join us as we build on partnerships for success!
