30 April 2009

"The Network" Newsletter, Part 2: Lessons in implementation of a disease surveillance system in Peru

by C. Cecilia Mundaca, MD, MPH, Uniformed Services University of the Health Sciences

This article is part of a series that will be published in the Global Outreach newsletter, "The Network." A PDF version of "The Network" is coming soon!


While employed at the US Naval Medical Research Center Detachment in Lima, Peru, I had the opportunity to lead the implementation of a technology-based disease surveillance system (Alerta) at sites across the nation. The project was a public-private partnership involving the Peruvian Navy, the US Navy, and a private company. Alerta provided the mechanism for reporting 45 diseases/syndromes via a telephone or a computer with Internet access. It was launched as a pilot project in 2002 and was expanded nationwide in the Peruvian Navy by 2006. Its success led to the incorporation of the Peruvian Army, bringing the total to almost 200 sites in 2007. Several important lessons may be of value to others planning a similar effort:

• Securing political commitment early was critical to program success. A Peruvian Navy Surgeon General directive was issued to establish the mandatory nature of the program. The directive was useful for enforcing the surveillance duties of healthcare personnel, and it also established the program's priority in the eyes of their superiors. Consequently, surveillance staff members were allowed access to the limited telecommunications and computer support at the sites.

• Mandatory formal reporting to leadership. To ensure constant support from the Peruvian military leadership, weekly formal reports summarizing the system's performance and the diseases notified were submitted.

• Pilot sites before broad implementation. Beginning implementation with a pilot phase allowed continuous monitoring of every site, supervisory visits to the regional hubs, and the early development of evaluation indicators. The small scope also made it possible to investigate noncompliant sites.

• Quality assurance site visits. Our team conducted site visits to compare electronic reporting to Alerta with local paper charting. During the visits the team identified and addressed challenges (e.g. use of limited resources, confusion about tasks) while using the opportunity for immediate training.

• Evaluation metrics were critical. We drew on CDC guidelines to develop indicators designed to measure the system's usefulness and performance (a simplified sketch of such indicators follows this list). Our evaluation data were used to refine training materials, improve our assessment indicators, and identify noncompliant sites.

• Initial and ongoing training and technical assistance were critical. Our team learned that training was important to motivate the surveillance staff. It was not enough to train them on how to use the technology to report diseases; we also needed to offer broad-based courses on the importance of surveillance, the epidemiology of the most prevalent diseases in the area, and the basics of outbreak detection and response. We also supported site outbreak response with technical assistance and laboratory supplies.

• Regular feedback as a motivator. Feedback through the distribution of epidemiological bulletins was also very important. Staff could see how their reporting efforts were translated into useful information for their organization.

• Use of incentives. Our team sent congratulatory letters to the surveillance staff of the highest-performing sites. We offered free attendance at continuing education conferences. Promotional materials used pictures of the surveillance staff in action. Overall, the use of incentives to reinforce positive behavior proved valuable.
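
To make the evaluation indicators concrete, here is a minimal, hypothetical sketch in Python of two indicators commonly used when evaluating surveillance systems along the lines of the CDC guidelines: reporting completeness and timeliness per site. The site names, report records, and the two-day timeliness cutoff are illustrative assumptions, not data or thresholds from the actual Alerta system.

    # Hypothetical weekly report records: (site, epidemiological week, days late).
    # All values below are illustrative, not taken from the Alerta system.
    reports = [
        ("Site A", 14, 0), ("Site A", 15, 1), ("Site A", 16, 4),
        ("Site B", 14, 0), ("Site B", 16, 0),
        ("Site C", 15, 2),
    ]

    sites = ["Site A", "Site B", "Site C"]
    expected_weeks = [14, 15, 16]   # weeks every site was expected to report
    ON_TIME_LIMIT_DAYS = 2          # assumed timeliness cutoff

    def evaluate(site):
        """Return (completeness, timeliness) for one site.

        completeness = received weekly reports / expected weekly reports
        timeliness   = on-time reports / received reports
        """
        received = [r for r in reports if r[0] == site]
        completeness = len(received) / len(expected_weeks)
        on_time = [r for r in received if r[2] <= ON_TIME_LIMIT_DAYS]
        timeliness = len(on_time) / len(received) if received else 0.0
        return completeness, timeliness

    for site in sites:
        c, t = evaluate(site)
        flag = "  <- follow up" if c < 0.9 else ""
        print(f"{site}: completeness={c:.0%}, timeliness={t:.0%}{flag}")

Indicators of this kind, recalculated each week, are what allowed noncompliant or consistently late sites to be flagged for follow-up visits and refresher training.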
