Using Cost Effectiveness to Plan for EBPs


In 1999, the U.S. Surgeon General presented a national report on mental health treatments and delivery.

“The main finding was that we had a lot of evidence about effective services, but we weren’t doing a very good job implementing them,” says Howard Goldman, a University of Maryland School of Medicine psychiatrist and one of the report’s editors.

The report called on states to be champions for the implementation of evidence-based practices.

Maryland answered that call. In 2001, the state established the Evidence-Based Practice Center (EBPC) at the University of Maryland School of Medicine, under the leadership of the Director of the Maryland Mental Hygiene Administration (MHA), Brian Hepburn, MD.

The innovative public–academic partnership has improved the quality of services and sustainability of evidence-based programs in Maryland by offering training opportunities and financial incentives for fidelity. Here’s how Maryland did it and what state officials learned in the process.


Establishing an Evidence-Based Practice Center

After the Maryland Public Mental Health System reorganized the way it financed services from grants to a Medicaid-based fee-for-service system, Dr. Hepburn wanted to address the issue of quality.

The state had a longstanding partnership with the Department of Psychiatry at the University of Maryland School of Medicine. Historically, the partnership had focused on improving the intervention “toolkit” for severe and persistent mental illness, says Goldman.

“We found that just providing a toolkit without the necessary state leadership or training resources wasn’t sufficient,” Goldman explains.

Using Mental Health Block Grant money, the state expanded the public–academic partnership with the creation of the EBPC.

“The EBPC created that leadership structure and served as a locus of training expertise for service providers throughout the entire state,” adds Goldman.

The idea was to help providers maintain current skills while learning new, evidence-based approaches. The EBPC would provide community mental health agencies with all of the training and support they would need to implement these evidence-based practices—free of charge, explains EBPC Director Eileen Hanson. The center started with the implementation of two evidence-based practices: supported employment and family psychoeducation.

Implementation of supported employment reflected a major policy change in Maryland. The MHA incentivized the shift away from traditional or sheltered employment programs to competitive employment by dropping Medicaid reimbursements for traditional services.

Incentivizing Fidelity

For the first few years, the EBPC trainer or consultant assigned to help an agency implement the evidence-based practice would also perform the fidelity review. Fidelity measures how closely an agency sticks to an evidence-based model for providing a particular service.

In 2006, the state of Maryland and the EBPC decided to make a change.

“We saw the importance of having an independent fidelity review process in which the state would monitor fidelity and the EBPC would provide the technical expertise,” says Hanson. This change would allow trainers to concentrate more on forming a supportive relationship with provider agencies while bringing an impartial referee to the fidelity review.

At the same time, the state recognized that the long-term sustainability of evidence-based practices would depend on stable funding. Medicaid reimbursements offered one potential source of funding. But the state needed some mechanism to ensure that the programs being put into place actually worked.

To incentivize fidelity, Maryland created an enhanced Medicaid reimbursement rate for high-fidelity, evidence-based practice programs.

“Agencies that meet fidelity for a particular program could then bill for services under that program at a higher rate,” explains Steven Reeder, assistant director of clinical services for the MHA (which was recently renamed the Behavioral Health Administration or BHA).

The difference for some services is substantial, says Reeder, creating a boon for community mental health centers that reach high fidelity.

Sources of funding for the fidelity incentive differ, depending on the service. To incentivize supported employment, for instance, Maryland has braided together dollars from the state general fund, the public vocational rehabilitation program, and Medicaid.


Measuring Success

The specialized training provided by the EBPC and the enhanced reimbursement rate for high-fidelity programs seem to be working, says Mona Figueroa, who supervises the state’s team of fidelity monitors.

There are now 43 mental health agencies in Maryland that have achieved high fidelity in one or more evidence-based practices, according to Figueroa.

“We see a positive correlation between outcomes and high-fidelity ratings,” she says.

In the case of supported employment, the change has resulted in significant improvements in employment outcomes.

According to a 2014 report from the University of Maryland School of Medicine, more than half of all consumers in Maryland receiving supported employment services achieve competitive employment. Nationally, just 20 percent of consumers taking part in traditional vocational programs are employed.

Staff Turnover is a Challenge

Despite the successes of Maryland’s EBPC, challenges to sustainability remain. “One of the biggest barriers we’re seeing now is the incredibly high turnover rate of agency staff,” says Reeder.

“We’re continually training and retraining and just when people get some level of competency, we have to start again. It’s hard to build on that momentum,” says Goldman.

At the heart of the problem, he says, are high levels of burnout among mental health workers and poor compensation for their work.

In the face of these challenges, Goldman says, it’s essential to have the statewide leadership and training infrastructure in place to support the implementation and sustainability of evidence-based practices.

Last Updated: 06/16/2017