The FDA has crowed long and hard about its improved statistics for pushing new products through its 510(k) clearance program.
The device industry’s largest trade association, AdvaMed, which should be at the forefront of holding the agency’s feet to the fire, has instead praised the agency for its improved performance. Alex Gorsky, chairman and CEO of Johnson & Johnson, even joined the FDA lovefest on January 20, 2015, telling Wall Street analysts that “governments are…taking steps to reward innovation through FDA designations that are helping to speed product review times.”
Declining Review Times
FDA Commissioner Margaret Hamburg, M.D. told attendees at a recent AdvaMed meeting that review times for 510(k) clearances have declined, with the average review time for a clearance dropping from a high of 154 days in 2010 to 144 days in 2012.
“Apart from the highest risk devices, we are I think at par with comparable other countries in terms of review times. We do ask for more clinical data often on the higher risk devices. But, I think there’s some urban mythology about where we stand in comparison to review times and leadership,” said Hamburg.
Faster review times and a greater likelihood of success. By those measures the agency seems to be improving and making life a little easier for device companies.
But what should you be measuring to determine the likely success of getting your product through the 510(k) clearance program?
The 510(k) pathway is the most common way for companies to get their products to market. Simply put, if a company can convince the FDA that its device is “substantially equivalent” to a predicate device already legally on the market, the device is cleared for marketing, a far less burdensome route than the premarket approval (PMA) process required for the highest-risk devices.
In 2013, FDA cleared approximately 140 510(k)s for every original PMA application approved.
A recent study by Jeffrey Gibbs and Allyson Mullen of Hyman, Phelps & McNamara, and Melissa Walker, the president and chief technology officer of Graematter, Inc., applied Moneyball-style statistical analysis to FDA data, yielding some new insights into the agency’s performance.
The Mask of Averages
The authors were able to get high-level data on key metrics because Congress requires the FDA to calculate and publish statistics to help companies gauge their likelihood of success in getting through the 510(k) process. But, say the authors, a major limitation is that the calculation of averages will mask product-specific or classification-specific variations. However, the agency has also released databases that enable more probing analyses than were previously available.
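The authors’ point about averages is easy to illustrate with a quick sketch. The numbers below are hypothetical, not from the study’s dataset: two made-up product codes share the same average review time while the spread of individual reviews differs widely, so a single program-wide average would describe one well and badly misstate the risk of the other.

```python
# Hypothetical review times in days for two made-up product codes.
# Illustrative only -- not the study's actual data.
code_a = [90, 95, 100, 105, 110]   # tightly clustered around 100 days
code_b = [40, 60, 100, 140, 160]   # same mean, far more spread

def mean(days):
    """Average review time."""
    return sum(days) / len(days)

def spread(days):
    """Range between the fastest and slowest review."""
    return max(days) - min(days)

# Both codes average 100 days, yet the worst case for code B is
# four times its best case -- the average alone hides that.
print(mean(code_a), spread(code_a))   # 100.0 20
print(mean(code_b), spread(code_b))   # 100.0 120
```

This is the sense in which a classification-specific analysis can reveal patterns that the published averages conceal.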
They used SOFIE, Graematter Inc.’s Regulatory Intelligence System, to analyze various 510(k) metrics from the agency’s public database for the five-year period from 2008 to 2012.
Here’s what they found.
During that time period, the FDA cleared approximately 3,027 devices per year. The total number and mix of clearances remained generally consistent over that period. About 74% of all submissions went through the traditional route, as opposed to the De Novo, Special, or Abbreviated routes.
Third-Party Reviews Dying on the Vine
The vast majority of submissions went directly to the FDA rather than through an accredited third-party reviewer. The FDA began accepting reviews from “accredited persons” in 1998. But that program, say the authors, has seen limited use because only select devices are eligible, “and companies have reported mixed experiences with their accredited persons.”
They also say recent data show that use of the third-party 510(k) review process has steadily declined, from 16% of submissions in 2008 to 9% in 2012. “There have not been any recent changes to this program, so this decrease appears to be attributable to industry’s disuse—perhaps due to lack of interest, dissatisfaction with the program, and added expense—rather than agency policy.”
Third-party-reviewed 510(k)s also took longer to get through the FDA. The authors found that average review times all appear to exceed the 30 days the FDA is allowed to take to reach a decision after receiving the third party’s recommendation.
Average 510(k) Review Times
The overall average review time for a 510(k) between 2008 and 2012 was 137 calendar days (lower than the 144 days Commissioner Hamburg reported for 2012). Average review times began increasing in 2010, then leveled out between 2010 and 2012. The increase took place across each of the four 510(k) types.
The average number of review days for traditional submissions climbed from 128 days in 2008 to 160 days in 2012. The number of review days also climbed for Abbreviated, De Novo and Special Applications in that period.
Failure of Abbreviated Pathway
The authors say this data is particularly curious because, when the abbreviated 510(k) was introduced in 1998, the goal was, in part, “to streamlin[e] the review of 510(k)s through reliance on a ‘summary report’ outlining adherence to relevant guidance documents.” While abbreviated 510(k)s may offer advantages in cost and preparation time, the projected advantage in review times did not materialize in this five-year period.
Medical Specialty Patterns
There are no apparent patterns between the number of submissions cleared within a medical specialty and the number of device types (or product codes) assigned to it. “But there is a strong pattern” in the medical specialties with the highest average number of review days: many of them relate to in vitro diagnostic devices (IVDs). The four medical specialties with the longest average 510(k) review times between 2008 and 2012 were pathology, physical medicine, immunology, and hematology (in decreasing order).
Ortho’s 121-Day Reviews
There is some good news for orthopedics: the average review time for a clearance by the orthopedic review committee was 121 days. Only radiology (92 days) and cardiovascular (108 days) did better. The other 17 review committees took longer, with pathology and physical medicine topping out at 273 and 256 days, respectively.
Overall, the average review time for IVD 510(k)s between 2008 and 2012 was significantly higher than non-IVD 510(k) review times (183 days versus 127 days) and had increased dramatically since 2003 when it was 82 days.
Device Type Reviews by Product Code
Average review times for individual device types change over time, said the authors, and they can change more quickly and dramatically than review times across the device program as a whole or by reviewing committee, even for the device types cleared most frequently.
Orthopedic-related devices accounted for three of the six product codes cleared most frequently. Those codes included:
- HRS – Plate, Fixation, Bone (233 clearances, 112 days in 2008 to 137 days in 2012)
- NKB – Orthosis, Spinal Pedicle Fixation, For Degenerative Disc Disease (225 clearances, 88 days in 2008 to 104 days in 2012)
- MAX – Intervertebral Fusion Device With Bone Graft, Lumbar (214 clearances, 87 days in 2008 to 99 days in 2012)
Three of the top 10 product codes saw their average review times increase by more than 30 days (roughly the increase in the overall 510(k) average), with two product codes, GEI (cutting and coagulation) and DZE (endoscopic implants), seeing increases in excess of 60 days (63 and 79 days, respectively). The authors call this pattern troubling and puzzling. There is no obvious explanation for the trend, except to note that no device-specific guidance documents were issued for these devices between 2008 and 2012.
In another analysis of which product codes took longer or shorter to review between 2005 and 2012, the authors found that of 82 product codes, only six (approximately 7%) required less time at the end of that period. The remaining 76 product codes took longer to review, with increases ranging from two days to 157 days. Thirteen product codes saw their average review times increase by more than 100 days.
The biggest increase in overall review times between 2005 and 2012 belonged to product code NBW (over-the-counter blood glucose test systems), with a 157-day increase in average review time.
Safety Concern Drives Review Times
This increase, say the authors, highlights an important point in their analysis:
“When there is a safety concern with a particular device type, the review time for the 510(k)s is likely to increase—and in this case, increase significantly. Beginning in early 2010, FDA began considering industry-wide actions to address concerns around the accuracy of blood glucose monitors. FDA clearly changed the requirements for these 510(k)s around that same time when it withdrew the 2006 guidance documents for insulin pump 510(k)s in 2011. This conclusion is reflected in the average review time for an NBW 510(k), which increased by 126 days between 2010 and 2011.”
The 510(k) Winner: Wound Dressing
On the other hand, the biggest winner in their 510(k) analysis was product code FRO (wound dressing). This product type required 32 fewer review days in 2012 than in 2005; nearly all of that decrease was observed between 2005 and 2008, and after 2008 average review times actually crept back up by two days. “This variability again reminds us that patterns and trends can change significantly.”
Conclusion – Write a Good 510(k)
There are many ways of predicting how long a 510(k) review will take, say the authors. Both the overall 510(k) program data and understanding how quickly particular types of 510(k)s have been cleared provide useful insight. Companies will want to draw upon multiple, complementary resources.
“In the end, the single most important variable is the 510(k) itself. A well-written, well-supported 510(k) is the sine qua non. Yet, even well-written, well-supported 510(k)s are subject to other forces and trends that can influence the speed with which a 510(k) is cleared, whether relating to the 510(k) program or the specific device,” conclude the authors.