A Parity compliance survey could be a useful starting point and a tool for structuring general Parity requirements by program type and by the Parity analysis process. A survey, if defined as a conceptual structure, could provide consistency and guidelines and identify reproducible patterns of data required for distinct types of analysis across program types. In our experience as Independent Compliance Administrators, however, what has been most effective at the outset is first to create workgroups representing the key analytic domains of a Parity analysis, such as benefits classifications, utilization management/utilization review, network adequacy, availability-of-information requirements (e.g., adverse determination notifications or instructions for consumer appeals), financial requirements, and annual dollar limits. In this respect, a survey for each workgroup could be useful for identifying the categories of inquiry, relevant data points, and benchmarks. Once the workgroups are established, however, the Parity analysis inevitably becomes an ever-evolving, complex process of data requests, data management, and analysis. To manage this process, the Compliance Administrator must develop an effective, collaborative working relationship with the health plan staff who are the custodians of the behavioral health data in question. The variety of data needed to perform a compliance analysis may not be readily available in a format that fits the requirements of the analytic data platforms. Failing to establish this collaborative working relationship can introduce unpredictable delays into the analytic process.
Because an analysis of Parity compliance entails statistical, policy, and procedural data sets, a mixed-methods approach that uses quantitative and qualitative methodologies to measure processes, strategies, and evidentiary standards is needed. For example, quantitative methods are applied to numerical data such as financial requirements and aggregate lifetime and annual dollar limits on services for mental health and substance use disorders across benefit classifications. Eventually, it must also be determined how these aggregate lifetime and annual dollar limits compare to the spend for the same levels of care across the same lines of business for medical/surgical services. Quantitative methods are also applied to behavioral health patient services, admissions, average lengths of stay, and the percentage of denied claims. An additional dimension of such analyses could be the Parity requirements for Medicaid Alternative Benefit Plans.
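As a rough illustration of the quantitative side, the sketch below compares denial rates for behavioral health and medical/surgical services within each benefit classification. The claims records, field names, and thresholds here are entirely hypothetical illustrations, not actual plan data or any prescribed Parity test:

```python
from collections import defaultdict

# Hypothetical claims records: (benefit_classification, service_type, was_denied).
# All values are illustrative only, not drawn from any actual health plan.
claims = [
    ("inpatient", "behavioral_health", True),
    ("inpatient", "behavioral_health", False),
    ("inpatient", "med_surg", False),
    ("inpatient", "med_surg", False),
    ("outpatient", "behavioral_health", True),
    ("outpatient", "med_surg", False),
]

def denial_rates(records):
    """Share of denied claims per (classification, service type)."""
    totals = defaultdict(int)
    denials = defaultdict(int)
    for classification, service_type, denied in records:
        key = (classification, service_type)
        totals[key] += 1
        if denied:
            denials[key] += 1
    return {key: denials[key] / totals[key] for key in totals}

rates = denial_rates(claims)

# Flag classifications where the behavioral health denial rate
# exceeds the medical/surgical rate, as candidates for closer review.
for classification in {c for c, _ in rates}:
    bh = rates.get((classification, "behavioral_health"), 0.0)
    ms = rates.get((classification, "med_surg"), 0.0)
    if bh > ms:
        print(f"{classification}: BH denial rate {bh:.0%} vs med/surg {ms:.0%}")
```

A flagged classification would not by itself establish a Parity violation; it simply identifies where the workgroups should direct further quantitative and qualitative inquiry.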
Just as quantitative methods are applied to numbers, the application of qualitative methods begins with words that describe policy, procedure, or why a particular claim for service or benefit classification was denied. Qualitative data are a source of rich descriptions structured by particular contexts, and they can also be quantified through coding that permits identification of underlying patterns. Qualitative data can also preserve chronological flow, show precisely which actions lead to which consequences, support meaningful explanations, and yield serendipitous findings and new integrations. Parity compliance requires a blend of both methodologies that could not possibly be represented in a survey design alone. By definition, survey designs are static documents, whereas the management and delivery of behavioral health services to insured members is active, ongoing, complex, and dynamic.
Quantitative and qualitative methodologies are also complementary analytics that, when applied to the same data point, such as adverse determination notifications (“denial letters”) or sub-classifications of benefits and cumulative financial requirements, provide a more robust and nuanced picture of Parity compliance. For example, although quantitative analyses may identify an increase in the spend level for behavioral health benefits, reflected in decreased denials of services, SAE found that a qualitative, case-by-case analysis of specific denial letters revealed that a substantial number of denials not captured in the quantitative data set were clinically inappropriate despite recognized independent standards for defined mental health and substance use disorder conditions. Further, by coding and quantifying the content of the denial letters, SAE was able to identify underlying patterns in problematic denials, such as the misapplication of medical necessity criteria or the absence of significant clinical information relevant to particular cases. Although the quantitative analyses showed the provider’s efforts toward Parity compliance trending in a positive direction, the qualitative data indicated that substantive problems remained with the clinical rationale for some adverse determination notifications. These findings provided the impetus for SAE to offer the behavioral health provider technical assistance regarding processes, strategies, and evidentiary standards in applying medical necessity criteria to individual enrollees. An additional collaborative workgroup was created to analyze the content of specific denial letters and address deficits on a granular, operational level.
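The coding-and-quantification step described above can be sketched minimally as follows. The code labels and letter data are hypothetical illustrations of the general technique, not SAE's actual coding scheme:

```python
from collections import Counter

# Hypothetical qualitative codes assigned to individual denial letters
# during case-by-case review; each inner list is one letter's codes,
# and the labels are illustrative only.
letter_codes = [
    ["misapplied_medical_necessity"],
    ["missing_clinical_information", "misapplied_medical_necessity"],
    ["criteria_met_appropriately"],
    ["missing_clinical_information"],
]

# Quantify the qualitative coding: count how often each code appears
# across the reviewed letters to surface underlying patterns.
pattern_counts = Counter(code for letter in letter_codes for code in letter)
for code, count in pattern_counts.most_common():
    print(f"{code}: {count} of {len(letter_codes)} letters")
```

In practice, the value of such a tally depends entirely on the rigor of the upstream qualitative review that assigns the codes; the quantification only makes patterns in that review visible.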
In sum, a Parity compliance template alone could not possibly capture the dynamic process of data collection and processing, the need for additional data requests, or the need for data to be provided in formats that may differ from the way an insurer collects and stores its data. There is no way to predetermine the starting point for assessing Parity, as relative compliance falls along a continuum. The process of analyzing Parity is organic and complex: disparate data sets generated chronologically by the respective workgroups must be organized and coherently integrated to reflect full compliance with the legal requirements that comprise the Mental Health Parity and Addiction Equity Act of 2008 (MHPAEA). A static Parity compliance survey design is not equal to this task. Successful compliance is informed by an ongoing, structured collaboration between the compliance monitor and the health plan’s staff; a mixed-methods approach that integrates quantitative and qualitative methodologies; and a data metrics platform that is responsive to, and supportive of, the complex interaction of quantitative and qualitative data.