Digital Health Technologies: Diagnostic Applications - CAM 30301

Description
Digital health technologies is a broad term that includes categories such as mobile health, health information technology, wearable devices, telehealth and telemedicine, and personalized medicine. These technologies span a wide range of uses, from applications in general wellness to applications as a medical device, and include technologies intended for use as a medical product, in a medical product, as companion diagnostics, or as an adjunct to other medical products (devices, drugs, and biologics). The scope of this review includes only those digital technologies that are intended to be used for diagnostic application (detecting the presence or absence of a condition, the risk of developing a condition in the future, or treatment response [beneficial or adverse]) and meet the following three criteria: 1) must meet the definition of "software as a medical device," that is, software intended to be used for a medical purpose without being part of a hardware medical device and without being software that merely stores or transmits medical information; 2) must have received marketing clearance or approval from the U.S. Food and Drug Administration through the de novo premarket process, the 510(k) process, or premarket approval; and 3) must be prescribed by a health care provider.

Summary of Evidence
For individuals who are in the age range of 18 to 72 months, in whom there is a suspicion of autism spectrum disorder (ASD) by a parent, caregiver, or health care provider, and who receive Canvas Dx, the evidence includes a single prospective study of clinical validity. Relevant outcomes are test validity, change in disease status, functional outcomes, and quality of life. The study reported that Canvas Dx outperformed conventional autism screeners in area under the curve (AUC), sensitivity, and specificity. However, multiple limitations were noted. The major limitation is the lack of clarity on how the test fits into the current pathway. Diagnosis of ASD in the United States generally occurs in 2 steps: developmental screening followed by comprehensive diagnostic evaluation for those who screen positive. To evaluate the utility of the test, an explication of how the test would be integrated into the current recommended screening and diagnostic pathway is needed. Neither the manufacturer’s website nor the FDA-cleared indication is explicit on how the test fits into the current pathway. It is unclear whether the test is meant to be used as an add-on to existing comprehensive diagnostic evaluation tests or whether it could replace those tests for confirmatory diagnosis of ASD among children at risk for developmental delay. There is also potential for "off-label" use of this test in the general population, either as a screening test or a diagnostic test. Second, the manufacturer asserts that Canvas Dx is intended to be used by a primary care physician to aid in the diagnosis of ASD, but the published study on clinical validity used a specialist rather than a primary care physician to complete the clinical questionnaire module. This is likely to result in higher sensitivity and specificity and thus confounds the interpretation of published data on clinical validity.
Further testing in primary care clinics is needed to validate the accuracy of the clinician module. In addition, all published studies were conducted on children who had been preselected as having high risk of autism. No studies on children from the general population have been published. Other limitations include differences that may occur between the testing environments of a structured clinical trial setting versus the home setting and lack of data on usability outside of a clinical trial. Evidence for Canvas Dx has not directly demonstrated that the test is clinically useful, and a chain of evidence cannot be constructed to support its utility. The evidence is insufficient to determine that the technology results in an improvement in the net health outcome.

Additional Information
Not applicable.

Policy 
Prescription digital health technologies for diagnostic application that have received clearance for marketing by the U.S. Food and Drug Administration as a diagnostic aid for autism spectrum disorder (Canvas Dx) are considered investigational and/or unproven and therefore considered NOT MEDICALLY NECESSARY. 

Policy Guidelines 
Coding
See the Codes table for details.

BACKGROUND
Autism Spectrum Disorder

Autism spectrum disorder (ASD) is a biologically based neurodevelopmental disorder characterized by persistent deficits in social communication and social interaction and restricted, repetitive patterns of behavior, interests, and activities. ASD can range from mild social impairment to severely impaired functioning; as many as half of individuals with autism are non-verbal and have symptoms that may include debilitating intellectual disabilities, inability to change routines, and severe sensory reactions. The American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) provides standardized criteria to help diagnose ASD.1

Diagnosis of ASD in the United States generally occurs in two steps: developmental screening followed by comprehensive diagnostic evaluation for those who screen positive. The American Academy of Pediatrics (AAP) recommends general developmental screening at 9, 18, and 30 months of age and ASD-specific screening at 18 and 24 months of age.2,3 Diagnosis and treatment in the first few years of life can have a strong impact on functioning, as it allows for treatment during a key window of developmental plasticity.4,5 However, early diagnosis in the U.S. remains an unmet need, even though studies have demonstrated a temporal trend of decreasing mean age at diagnosis over time.6,7 A 2020 study by the Autism and Developmental Disabilities Monitoring (ADDM) Network, an active surveillance system that provides estimates of ASD in the U.S., reported that the median age of earliest known ASD diagnosis ranged from 36 months in California to 63 months in Minnesota.8

Scope of Review
Software has become an important part of product development and is integrated widely into digital platforms that serve both medical and non-medical purposes. Three broad categories of software use in medical devices are:

  1. Software used in the manufacture or maintenance of a medical device (e.g., software that monitors X-ray tube performance to anticipate the need for replacement);
  2. Software that is integral to a medical device, or "software in a medical device" (e.g., software used to "drive or control" the motors and the pumping of medication in an infusion pump); and
  3. Software that is on its own a medical device, referred to as "software as a medical device" (SaMD) (e.g., software that can track the size of a mole over time and determine the risk of melanoma).

The International Medical Device Regulators Forum, a consortium of medical device regulators from around the world led by the U.S. Food and Drug Administration (FDA), defines SaMD as "software that is intended to be used for one or more medical purposes that perform those purposes without being part of a hardware medical device."9 Such software was previously referred to by industry, international regulators, and health care providers as "stand-alone software," "medical device software," and/or "health software," and can sometimes be confused with other types of software.

The scope of this review includes only those digital technologies that are intended to be used for diagnostic application (detecting presence or absence of a condition, the risk of developing a condition in the future, or treatment response [beneficial or adverse]) and meet the following three criteria:

  1. Must meet the definition of "software as a medical device" which states that software is intended to be used for a medical purpose, without being part of a hardware medical device or software that stores or transmits medical information.
  2. Must have received marketing clearance or approval by the U.S. Food and Drug Administration through the de novo premarket process, the 510(k) process, or pre-market approval; and
  3. Must be prescribed by a health care provider.

BCBSA Evaluation Framework for Digital Health Technologies
SaMDs, as defined by the FDA, are subject to the same evaluation standards as other devices; the Blue Cross Blue Shield Association Technology Evaluation Criteria are as follows:

  1. The technology must have final approval from the appropriate governmental regulatory bodies.
  2. The scientific evidence must permit conclusions concerning the effect of the technology on health outcomes.
  3. The technology must improve the net health outcome.a
  4. The technology must be as beneficial as any established alternatives.
  5. The improvement must be attainable outside the investigational settings.b

a The technology must assure protection of sensitive patient health information as per the requirements of The Health Insurance Portability and Accountability Act of 1996 (HIPAA).
b The technology must demonstrate usability in a real-world setting.
Other regulatory authorities such as the United Kingdom's National Institute for Health and Care Excellence (NICE) have proposed standards to evaluate SaMD.10

Regulatory Status
Digital health technologies that meet the current scope of review are shown in Table 1.

Table 1. Digital Health Technology for Diagnostic Applications

Application: Canvas DX (formerly known as Cognoa App)
Manufacturer: Cognoa
FDA-Cleared Indication: "Canvas Dx is intended for use by health care providers as an aid in the diagnosis of Autism Spectrum Disorder (ASD) for patients ages 18 months through 72 months who are at risk for developmental delay based on concerns of a parent, caregiver, or health care provider. The device is not intended for use as a stand-alone diagnostic device but as an adjunct to the diagnostic process. The device is for prescription use only (Rx only)."
Description: Artificial intelligence app for use by health care providers as an adjunct in the diagnosis of autism spectrum disorder for patients ages 18 to 72 months. Canvas DX includes 3 questionnaires: parent/caregiver, video analyst, and health care provider, with an algorithm that synthesizes the 3 inputs for use by the primary care provider.
FDA Product Code: QPF
FDA Marketing Clearance: DEN200069
Year: 2021

FDA: U.S. Food and Drug Administration.

Rationale 
This evidence review was created in April 2022 with a search of the PubMed database. The most recent literature update was performed through April 25, 2022.

Evidence reviews assess whether a medical test is clinically useful. A useful test provides information to make a clinical management decision that improves the net health outcome. That is, the balance of benefits and harms is better when the test is used to manage the condition than when another test or no test is used to manage the condition.

The first step in assessing a medical test is to formulate the clinical context and purpose of the test. The test must be technically reliable, clinically valid, and clinically useful for that purpose. Evidence reviews assess the evidence on whether a test is clinically valid and clinically useful. Technical reliability is outside the scope of these reviews, and credible information on technical reliability is available from other sources.

Autism Spectrum Disorder
Clinical Context and Test Purpose

The American Academy of Pediatrics provides details on the screening and diagnosis for autism spectrum disorder (ASD).2,3 Children with ASD can be identified as toddlers, and early intervention can and does influence outcomes.11 The Academy recommends screening all children for symptoms of ASD through a combination of developmental surveillance at 9, 18, and 30 months of age and standardized autism-specific screening tests at 18 and 24 months of age.

Screening tools typically use questionnaires that are answered by a parent, teacher, or clinician and are designed to help caregivers identify and report symptoms observed in children at high risk for ASD. While they are generally easy and inexpensive to administer, they have limited sensitivity (ability to identify young children with ASD) and specificity (ability to discriminate ASD from other developmental disorders, such as language disorders and global developmental delay).12 Results of a screening test are not diagnostic. Due to the variability in the natural course of early social and language development, some children who have initial positive screens (suggesting that they are at risk for ASD) ultimately will not meet diagnostic criteria for ASD.13 Other children who pass early screens for ASD may present with atypical concerns later in the second year of life and eventually be diagnosed with ASD. In the context of early identification and diagnosis of ASD, sensitivity is more important than specificity for a screening test as the potential over-referral of children with positive screens is preferable to missing children at risk for ASD. Once a child is determined to be at risk for a diagnosis of ASD, either by screening or surveillance, a timely referral for a comprehensive clinical diagnostic evaluation is warranted. Structured observation of symptoms of ASD during clinical evaluation is helpful to inform the diagnostic application of the DSM-5 criteria. These tools require long and expensive interactions with highly trained clinicians. To meet diagnostic criteria, the symptoms must impair function.
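The trade-off described above can be made concrete with a simple calculation: at the low prevalence typical of general-population screening, even a test with high sensitivity and specificity produces many false positives, which is why over-referral is accepted as the cost of not missing children at risk. The sketch below uses illustrative numbers only (the 91%/95% figures resemble reported M-CHAT-R/F characteristics, and the 2% prevalence is a hypothetical assumption, not a value drawn from any cited study).

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Compute positive and negative predictive values (PPV, NPV)
    from test characteristics and disease prevalence."""
    tp = sensitivity * prevalence            # true positives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    fn = (1 - sensitivity) * prevalence      # false negatives
    tn = specificity * (1 - prevalence)      # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical screener (91% sensitivity, 95% specificity) applied at an
# assumed 2% prevalence: PPV is only ~27%, while NPV exceeds 99%.
ppv, npv = predictive_values(0.91, 0.95, 0.02)
print(f"PPV: {ppv:.1%}, NPV: {npv:.1%}")
```

The low PPV means roughly 3 of 4 positive screens would be false alarms, yet the very high NPV shows why a sensitive screen still serves its purpose of ruling out risk.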

Cognoa, the manufacturer of Canvas Dx, states on its website that the test “is intended for use by health care providers as an aid in the diagnosis of ASD for patients ages 18 months through 72 months who are at risk for developmental delay based on concerns of a parent, caregiver, or health care provider” and that the device is not intended for use as a stand-alone diagnostic device but as an adjunct to the diagnostic process.14 Further, the manufacturer states, "Canvas Dx can aid primary care physician in diagnosing ASD in children starting at 18 months of age during a critical period when interventions are shown to provide/lead to optimal long-term outcomes". The manufacturer also makes direct and indirect assertions that the use of Canvas Dx may allow children with ASD to be diagnosed earlier than the current average age of diagnosis and that the test fulfills an unmet need created by delayed formal diagnosis of ASD after parental concern.14 Reasons cited for this unmet need include a shortage of specialists; time-intensive evaluations; lack of access to care for children from ethnic/racial minorities, disadvantaged socioeconomic backgrounds, and rural areas; the lack of a standard diagnostic process for ASD; and the use of multiple types of specialists for referral with no clear pathway for primary care physicians.

To evaluate the utility of the test, an explication of how the test would be integrated into the current AAP-recommended screening and diagnostic pathway is needed. The FDA-authorized indication is for children who are at risk of developmental delay. It is unclear how Canvas Dx may be used as an aid to existing testing used for diagnosis during the comprehensive clinical diagnostic evaluation. Several potential scenarios are possible. For example, Canvas Dx could be used as an add-on test to existing comprehensive diagnostic evaluation tests, or it could replace existing comprehensive diagnostic evaluation tests among a population of children at risk for developmental delay for confirmatory diagnosis of ASD. Canvas Dx could also be used as a rule-out test to identify false-positive cases among a population of children at risk for developmental delay to minimize unnecessary referrals for a comprehensive diagnostic evaluation. It remains unclear whether use of Canvas Dx eliminates the need for comprehensive diagnostic evaluation by a specialist. In addition, there is also potential for "off-label" use of this test in the general population, either as a screening test or a diagnostic test. Note that each of these hypothetical scenarios has a unique PICO formulation. The general formulation is described below. A more complete PICO discussion is only possible with explicit information on how the test should be used.

The purpose of Canvas DX in individuals who are in the age range of 18 to 72 months and in whom there is a suspicion of ASD by a parent, caregiver, or health care provider is to inform a decision whether the individual needs a comprehensive diagnostic evaluation by a specialist for a confirmatory diagnosis of ASD.

The question addressed in this evidence review is: Does testing with Canvas DX improve the net health outcome in children with ASD?

The following PICO was used to select literature to inform this review.

Populations
The relevant population of interest is children who are in the age range of 18 to 72 months and who are at risk of developmental delay. Depending on how the test fits into the existing screening and diagnostic pathway, this may include all children in this age group, only those for whom there is a suspicion of ASD by a parent, caregiver, or health care provider, or only those indicated for referral for comprehensive evaluation with existing screening. At present, the AAP does not recommend universal screening for ASD in children older than 30 months.

Interventions
The test being considered is Canvas DX (formerly known as Cognoa App). According to the manufacturer, Canvas Dx is a prescription diagnostic aid for health care professionals considering the diagnosis of ASD in patients 18 months through 72 months of age at risk for developmental delay.14 Canvas Dx incorporates 3 separate inputs. The patient’s caregiver uses a smartphone application (“App”) to complete a 4-minute caregiver questionnaire about the child’s behavior and development. The caregiver also uses the smartphone application to make video recordings of the child's behavior at home. A lightly trained video analyst reviews these videos and completes a 2-minute questionnaire. Finally, a health care professional meets with the child and a parent/caregiver and completes a 2-minute online questionnaire via a health care provider portal. Canvas Dx utilizes a machine-learning algorithm that receives the 3 independent inputs and produces one of the 3 outputs listed in Table 2.

Canvas DX uses a machine learning-based assessment of autism that combines the above-mentioned modules into a unified outcome intended to provide diagnostic-grade reliability. The parent and clinician questionnaire modules are based on behavioral patterns probed by the Autism Diagnostic Interview-Revised (ADI-R), while the video assessment module is based on behavioral patterns probed by the Autism Diagnostic Observation Schedule (ADOS).15

Abbas et al. (2020) state that the responses from the 3 modules are each considered to be a probability and "combined mathematically."15 Upper and lower thresholds are applied to produce the categories in Table 2. The paper states that "thresholds can be tuned independently to optimize the sensitivity, specificity, and model coverage."
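Abbas et al. do not publish the combination rule or the threshold values. The sketch below illustrates only the general scheme the paper describes (module probabilities pooled into a single score, with upper and lower thresholds defining the three output categories in Table 2); the combining function, the threshold values, and all names here are hypothetical, not the device's proprietary algorithm.

```python
def combine_and_classify(p_parent, p_video, p_clinician,
                         lower=0.3, upper=0.7):
    """Illustrative sketch: pool three module probabilities and apply
    upper/lower thresholds to yield three output categories.
    The actual Canvas Dx algorithm and thresholds are unpublished."""
    # A geometric mean is one simple way to pool probabilities; the
    # device's real combination rule is not public (assumption).
    score = (p_parent * p_video * p_clinician) ** (1 / 3)
    if score >= upper:
        return "Positive for ASD"
    if score <= lower:
        return "Negative for ASD"
    # Between the thresholds the algorithm abstains rather than guess.
    return "No result"

print(combine_and_classify(0.9, 0.8, 0.85))  # high agreement -> "Positive for ASD"
```

Tuning `lower` and `upper` independently trades sensitivity and specificity against "model coverage," i.e., the fraction of children for whom a determinate result is returned.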

Table 2. Outputs of Canvas Dx14

Canvas Dx Output Interpretation
Positive for ASD The patient has ASD if the health care professional confirms the clinical presentation of the patient is consistent with and meets diagnostic criteria for ASD.
Negative for ASD The patient does NOT have ASD if the health care professional confirms the clinical presentation of the patient is consistent with ruling out ASD and does NOT meet diagnostic criteria for ASD. A negative result does not necessarily mean that the patient will not develop ASD in the future and continued monitoring or evaluation for non-ASD conditions may be warranted.
No result The available information does not allow the algorithm to render a reliable result. This does not mean that the patient either has or does not have ASD.

ASD: autism spectrum disorder 

Comparators
The comparator would depend on exactly how the test fits into the diagnostic pathway. Possible comparators could be validated tools used for developmental surveillance, ASD specific-screening tools, and comprehensive diagnostic evaluation tests for confirmatory diagnosis of ASD that are commonly used in the United States.

Multiple validated screening tools are available, and tools commonly used in the U.S. are summarized in Table 3. The choice of screening test depends upon the age of the child and whether they are being screened for the first time or have been identified through developmental surveillance or screening to be at risk for developmental problems. The AAP recommends developmental and behavioral screening for all children during regular well-child visits at the ages of 9, 18, and 30 months. In addition, the AAP also recommends that all children be screened specifically for ASD during regular well-child visits at the ages of 18 and 24 months.3 At present, there are no validated screening tools available for children older than 30 months, and the Academy does not recommend universal screening for ASD in that age group.

Diagnostic tools commonly used in the U.S. are summarized in Table 4. The accuracy of many of these tools has not been well studied.16 Tools that are recommended in national guidelines and used in the U.S. include the Autism Diagnostic Interview-Revised (ADI-R), the Autism Diagnostic Observation Schedule-2nd edition (ADOS-2), and the Childhood Autism Rating Scale-2nd edition (CARS-2). The authors of a 2018 Cochrane systematic review and meta-analysis observed substantial variation in the sensitivity and specificity of all tests. Based on summary statistics for ADOS, CARS, and ADI-R, ADOS was found to be the most sensitive; all tools performed similarly for specificity.16

Table 3. Commonly Used Screening Instruments and Tools for Autism Spectrum Disorder in the United Statesa

 
Tool Age Description Sensitivity/specificity Validation Comment
M-CHAT-R/F

16 to 30 months

  • Parent/caregiver completed questionnaire designed to identify children at risk for autism from the general population
  • 20 items;17 5 – 10 minutes administration time18
  • Available in multiple languages19
  • Available for free
  • Sensitivity: 91%
  • Specificity: 95%18
  • > 15,000 children in primary care practices18
  • Validated as first tier screenb
  • This is the most frequently-used test for “screening aged” children in the United States.20
  • Assesses risk of ASD as low, medium, or high. Children at medium risk require structured follow-up questions for additional information before referral for diagnostic evaluation. Follow-up interview takes approximately 5 to 10 minutes.18
STAT 24 to 36 months21,22
  • Clinician-directed, interactive, and observation measure; requires training of clinician for standardized administration; not for population screening23
  • 12 observed activities during 20-minute play session22
  • Sensitivity: 83 to 95%22
  • Specificity: 73 to 86%24
  • 52 children with ASD and other developmental disorders and 71 high-risk children22,24
  • Not validated as a first tier screenb
  • Primarily a second-stage screen for children already suspected to have high ASD risk, to rule out ASD.20
  • Language comprehension is not required.23
SCQ 4+ years25
 
  • Parent/caregiver completed questionnaire; designed to identify children at risk for ASD from the general population; based on items in the ADI-R
  • 40 items (yes/no); < 10 minutes administration time and < 5 minutes to score
  • Sensitivity: 85%
  • Specificity: 75%25
  • 90% of children who failed (SCQ score ≥ 15) had a neurodevelopmental disorder26
  • 200 high-risk patients25
  • 247 low-risk children from school or general population26
  • Additional studies are necessary before the SCQ can be used as a first-tier screen.b
  • This tool is for older children.
  • Nonverbal children may require different cut-off scores.27
ITC

6 to 24 months28
 

  • Parent/caregiver questionnaire: screens for language delay
  • 24-item (component of CSBS-DP); 15 mins administration time28
  • Sensitivity and specificity of 88.9 for identifying ASD or other developmental delays
  • PPV: 71 to 79 and NPV: 88 to 99 for 9- to 24-month-old children.28
  • 5,385 children from a general population28
  • Digital screening (n = 57,603) as part of community-screen-evaluate-treat model29
  • Studies support the validity for children 9 to 24 months of age but not 6 to 8 months.28
POSI 16 to 35 months
  • Parent/caregiver questionnaire used to assess autism risk developed as part of a comprehensive primary care screening instrument, the Survey of Wellbeing of Young Children
  • 7-item parent/caregiver reported items; ≤ 5 minutes to complete

Age 16 to 36 months

  • Sensitivity: 83% and Specificity: 74%

Age 18 to 48 months

  • Sensitivity: 89% and Specificity: 54%

Age 16 to 48 months30

  • Sensitivity: 94% and Specificity: 41%

Age 16 to 30 months30

  • Sensitivity: 75% and Specificity: 48%
  • 232 children (16 to 36 months) from primary care and specialty clinics
  • 217 children (18 to 48 months) from specialty clinic
  • 524 children (16 to 48 months) referred to a developmental-behavioral clinic
  • Additional studies in community samples are necessary before the POSI can be recommended as a first-tier screen.b
  • This is a good choice for practices that seek integrated autism and developmental screening.

a The AAP does not approve/endorse any specific tool for screening purposes.20 This table is not exhaustive, and other tests are available such as the Autism Spectrum Screening Questionnaire (ASSQ), Developmental Behavior Checklist-Autism Screening Algorithm (DBC-ASA), Developmental Behavior Checklist-Early Screen (DBC-ES), Developmental Behavior Checklist for Pediatrics (DBC-P), Intelligence Quotient (IQ), and Rapid Interactive Screening Test for Autism in Toddlers (RITA-T).
b First-tier screening tools are used to identify children at risk for ASD from a general population; second-tier screening tools are used to discriminate ASD from other developmental disorders in children with developmental concerns.
ADI-R: Autism Diagnostic Interview-Revised; ASD: autism spectrum disorder; CSBS-DP: Communication and Symbolic Behavior Scales Developmental Profile; ITC: Infant Toddler Checklist; M-CHAT-R/F: Modified Checklist for Autism in Toddlers, Revised with Follow-Up; NPV: negative predictive value; POSI: Parent's Observations of Social Interactions; PPV: positive predictive value; SCQ: Social Communication Questionnaire; STAT: Screening Tool for Autism in Toddlers and Young Children

Table 4. Commonly Used Diagnostic Instruments and Tools for Autism Spectrum Disorder in the United Statesa

Tool Age Description Comments
ADI-R Mental age ≥ 18 months
  • 2- to 3-hour 93-point semi-structured clinical interview that probes for ASD symptoms
  • Not practical for clinical settings
  • Usually used in research settings, often combined with the ADOS-2
ADOS-2nd edition Age 12 months through adulthood
  • Semi-structured assessment by trained clinician of social interaction, play/imaginative use of materials, communication and atypical behaviors
  • 5 modules based on child's expressive language abilities (including one for toddlers)
  • Takes 40 to 60 minutes to administer
  • Reference standard for diagnosis of ASD in research studies and clinical settings
  • The information obtained from the ADOS-2 is used by the clinician in conjunction with the history of peer interactions, social relationships, and functional impairment from symptoms to determine if the DSM-5 criteria are met
CARS-2 Children ≥ 2 years of age
  • 15 items directly observed by a trained clinician and a parent unscored questionnaire
  • Takes 20 to 30 minutes to administer
  • 15 items are correlated with DSM-5


a This table is not exhaustive, and other tests are available such as the Developmental, Dimensional and Diagnostic Interview (3di), Diagnostic Interview for Social and Communication Disorder (DISCO), Gilliam Autism Rating Scale (GARS), and Social Responsiveness Scale, Second Edition (SRS). According to the AAP, validated observation tools include the Autism Diagnostic Observation Schedule, Second Edition (ADOS-2) and the Childhood Autism Rating Scale, Second Edition (CARS-2). No single observation tool is appropriate for all clinical settings.3
ADI-R: Autism Diagnostic Interview-Revised; ADOS-2: Autism Diagnostic Observation Schedule-2nd edition (ADOS-2); ASD: autism spectrum disorder; CARS-2: Childhood Autism Rating Scale 2nd edition; DSM-5: The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition

Outcomes
The general outcomes of interest are test validity, symptoms, functional outcomes, and quality of life.

Beneficial outcomes resulting from a true-negative test result include avoiding unnecessary subsequent testing.

Beneficial outcomes resulting from a true-positive test result include early referral for comprehensive evaluation and identification of ASD, leading to early intervention and improved health outcomes.

Harmful outcomes resulting from a false-positive test result include unnecessary testing or treatment, potential stigmatization, and other ethical, legal, and social implications such as educational and employment discrimination.

Harmful outcomes resulting from a false-negative test result include diagnostic delay and the possibility of missing treatment during the key window of developmental plasticity.

A fuller explanation of appropriate outcomes is not possible until the position of the test in the screening and diagnostic pathway is clarified.

Study Selection Criteria
For the evaluation of clinical validity of Canvas Dx, studies that meet the following eligibility criteria were considered:

  • Reported on the accuracy of the marketed version of the technology (including any algorithms used to calculate scores)
  • Included a suitable reference standard
  • Patient/sample clinical characteristics were described
  • Patient/sample selection criteria were described.

Clinically Valid
A test must detect the presence or absence of a condition, the risk of developing a condition in the future, or treatment response (beneficial or adverse).

Diagnostic Performance
Characteristics and results of a single clinical validity study by Abbas et al. (2020) evaluating the performance of Canvas Dx (formerly known as Cognoa App) for diagnosing ASD are summarized in Tables 5 and 6.15 The study included at-risk children 18 to 72 months of age with English-speaking parents who were referred to specialized centers in the U.S. for a comprehensive evaluation and diagnosis of ASD. All children received the Autism Diagnostic Observation Schedule (ADOS) as well as standard screening tools such as the Modified Checklist for Autism in Toddlers, Revised (M-CHAT-R) and the Child Behavior Checklist (CBCL). Before the children received the comprehensive clinical assessment, their parents completed the Canvas Dx app, which consisted of 3 modules (parent questionnaire, video, and clinician questionnaire). Much of the publication outlines the details of the machine learning methods in training datasets; these details were not reviewed. Results showed that Canvas Dx outperformed baseline screeners administered to children by 0.35 (90% CI: 0.26 to 0.43) in AUC and, although the thresholds used for categorization were not specified, by 0.69 (90% CI: 0.58 to 0.81) in specificity when operating at 90% sensitivity. Compared with the baseline screeners evaluated on children less than 48 months of age, Canvas Dx outperformed the baseline screeners by 0.18 (90% CI: 0.08 to 0.29) in AUC and 0.30 (90% CI: 0.11 to 0.50) in specificity when operating at 90% sensitivity.
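The AUC differences reported above compare ranking performance. As background, AUC is equivalent to the probability that a randomly chosen affected child receives a higher risk score than a randomly chosen unaffected child, and it can be computed directly from scores and labels. The data below are purely illustrative, not from the study.

```python
def auc(scores_pos, scores_neg):
    """Mann-Whitney estimate of the area under the ROC curve:
    the fraction of (positive, negative) score pairs ranked
    correctly, with ties counted as half."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores for children with and without an ASD
# diagnosis: 8 of 9 pairs are ranked correctly, so AUC is about 0.89.
print(auc([0.9, 0.8, 0.6], [0.7, 0.4, 0.2]))
```

On this interpretation, an AUC improvement of 0.35 over a baseline screener is a large shift in how reliably affected children are ranked above unaffected ones.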

Relevance, design, and conduct gaps in the studies are described in Tables 7 and 8. The major limitation is the lack of clarity on how the test fits into the current pathway (i.e., whether it is a screening test or a diagnostic test). As per the FDA-cleared indication, Canvas Dx is intended for use as an aid in the diagnosis of ASD. This cleared indication is not explicit about whether Canvas Dx is intended to be used as a screening test in the community setting, as an adjunct to other standard diagnostic tools at a specialist office for diagnosis of ASD, or, in a third scenario, as a diagnostic tool in a community setting used by primary care physicians. Each of the 3 scenarios would require a unique PICO formulation, and unless there is clarity on intended use, it is difficult to interpret currently available evidence. Second, the manufacturer asserts that Canvas Dx is intended to be used by a primary care physician to aid in the diagnosis of ASD, but the only published study on clinical validity used a specialist rather than a primary care physician to complete the clinician questionnaire module. This is likely to result in higher sensitivity and specificity and thus confounds the interpretation of published data on clinical validity. Further testing in primary care clinics is needed to validate the accuracy of the clinician module. In addition, all published studies were conducted on children who had been preselected as having a high risk of autism. No studies on children from the general population have been published. Other limitations include differences that may occur between the testing environments of a structured clinical setting versus the home setting.

Table 5. Characteristics of Studies of Clinical Validity of a Diagnostic Test

Study Study Population Design Reference Standard for ASD Threshold for Positive Canvas Dx Timing of Reference and Canvas Dx Blinding of Assessors Comment
Abbas et al. 202015
  • At-risk children 18 to 72 months of age referred for ASD evaluation through a standard referral program process with English-speaking parents
  • Prospective
  • Sample selection not defined
  • 2020 publication utilized currently marketed version of Canvas Dx with 3 modules (parent questionnaire, video, and clinician questionnaire)
  • Undergone full clinical examination and received a clinical diagnosis at a center specialized in neurodevelopmental disorders
  • Each child received standard autism assessment instruments (such as ADOS, M-CHAT-R, and/or CBCL) appropriate for their age
  • Thresholds not defined
  • Ordinal result: +ASD or -ASD determination by Canvas Dx
Canvas Dx was administered prior to clinical visit at the specialized center for clinical diagnosis of ASD where reference test was administered. Yes None

ADOS: Autism Diagnostic Observation Schedule; ASD: autism spectrum disorder; CBCL: Child Behavior Checklist; M-CHAT-R: Modified Checklist for Autism in Toddlers, Revised 

Table 6. Clinical Validity Results of Canvas Dx

Study Initial N Final N Excluded Samples Prevalence of Condition Sensitivity Specificity AUC
Abbas et al. 202015 375
  • For M-CHAT-R: N = 204
  • CBCL: N = 363
  • SRS: N = 302
  • Parent Coverage: N = 375
  • Parent & Video: N = 368
  • Parent & Video & Clinician: N = 204
Use of screening tools is based on age and therefore n < 375 for screening tools does not imply missing data. The total sample of 375 includes data from 162 participants from Abbas 2018 study, which did not include clinician questionnaire module.
  • Total sample (n = 375): 72.5%
  • Age < 4 years (n = 216): 76.9%
  • Age ≥ 4 years (n = 159): 66.7%
Inconclusive determination featurea turned off:
80% 75% 0.83
Inconclusive determination featurea turned on with allowance of up to 30%:
90% 83% 0.92

a Inconclusive determination was allowed for up to 25% or 30% of the cases, but how many were excluded when this feature is turned on was not reported.
AUC: area under curve; CBCL: Child Behavior Checklist; M-CHAT-R: Modified Checklist for Autism in Toddlers, Revised; SRS: Social Responsiveness Scale 

Table 7. Study Relevance Limitations

Study Populationa Interventionb Comparatorc Outcomesd Duration of Follow-Upe
Abbas et al. 202015 2. Test use in current diagnostic pathway unclear
4. Study only includes patients at high risk of ASD. Further studies are needed to verify this by conducting clinical studies on children from the general population.
5. It is unclear if racial minorities were well-represented
1. Classification thresholds were not specified
4. Other (assuming the intended use of Canvas Dx is in a community setting for purpose of screening, the clinician questionnaire module was filled by a specialist at a tertiary care center raising the issue of applicability of Canvas Dx)
3. Other (comparator would depend on exactly how the test fits into the diagnostic pathway). 6. Other (diagnostic performance characteristics not provided for general population as the studies are not representative of the intended clinical population)  

a Population key: 1. Intended use population unclear; 2. Clinical context is unclear; 3. Study population is unclear; 4. Study population not representative of intended use; 5. Enrolled study populations do not reflect relevant diversity; 6. Other.
b Intervention key: 1. Classification thresholds not defined; 2. Version used unclear; 3. Not intervention of interest (e.g., older version of test, not applied as intended); 4. Other.
c Comparator key: 1. Classification thresholds not defined; 2. Not compared to credible reference standard; 3. Not compared to other tests in use for same purpose; 4. Other.
d Outcomes key: 1. Study does not directly assess a key health outcome; 2. Evidence chain or decision model not explicated; 3. Key clinical validity outcomes not reported; 4. Reclassification of diagnostic or prognostic risk categories not reported; 5. Adverse events of the test not described (excluding minor discomforts and inconvenience of venipuncture or noninvasive tests); 6. Other.
e Follow-Up key: 1. Follow-up duration not sufficient with respect to natural history of disease (true positives, true negatives, false positives, false negatives cannot be determined); 2. Other.
ASD: autism spectrum disorder

Table 8. Study Design and Conduct Limitations

Study Selectiona Blindingb Delivery of Testc Selective Reportingd Data Completenesse Statisticalf
Abbas et al. 202015 1. Selection not described;   1. Timing of delivery of index or reference test not described (for 2018 publication) 1. Not registered 4. Other (unclear how many samples were excluded when inconclusive feature was turned on) 1. Confidence intervals not reported for sensitivity and specificity

a Selection key: 1. Selection not described; 2. Selection not random or consecutive (i.e., convenience); 3. Other.
b Blinding key: 1. Not blinded to results of reference or other comparator tests; 2. Other.
c Test Delivery key: 1. Timing of delivery of index or reference test not described; 2. Timing of index and comparator tests not same; 3. Procedure for interpreting tests not described; 4. Expertise of evaluators not described; 5. Other.
d Selective Reporting key: 1. Not registered; 2. Evidence of selective reporting; 3. Evidence of selective publication; 4. Other.
e Data Completeness key: 1. Inadequate description of indeterminate and missing samples; 2. High number of samples excluded; 3. High loss to follow-up or missing data; 4. Other.
f Statistical key: 1. Confidence intervals and/or p values not reported; 2. Comparison to other tests not reported; 3. Insufficient consideration of potential confounding; 4. Other.

Clinically Useful
A test is clinically useful if the use of the results informs management decisions that improve the net health outcome of care. The net health outcome can be improved if patients receive correct therapy, or more effective therapy, or avoid unnecessary therapy, or avoid unnecessary testing.

Direct Evidence
Direct evidence of clinical utility is provided by studies that have compared health outcomes for patients managed with and without the test. Because these are intervention studies, the preferred evidence would be from randomized controlled trials.

Chain of Evidence
Indirect evidence on clinical utility rests on clinical validity. If the evidence is insufficient to demonstrate test performance, no inferences can be made about clinical utility. There are no studies comparing clinical outcomes for patients diagnosed using Canvas Dx with alternative methods for testing for ASD (i.e., no direct evidence that the test is clinically useful). Currently, it is unclear whether a chain of evidence can be constructed because of the lack of clarity on how the test results would be used to change management practices.

Section Summary: Autism Spectrum Disorder
The evidence on Canvas Dx includes a single prospective study of clinical validity. Results of the study reported that Canvas Dx outperformed conventional autism screeners in AUC, sensitivity, and specificity. However, multiple limitations were noted. The major limitation is the lack of clarity on how the test fits into the current pathway. The FDA-authorized indication is for children who are at risk of developmental delay. It is unclear how Canvas Dx may be used as an aid to existing testing used for diagnosis during the comprehensive clinical diagnostic evaluation. Several potential scenarios for use of Canvas Dx are possible. For example, Canvas Dx could be used as an add-on test to existing comprehensive diagnostic evaluation tests, or it could replace existing comprehensive diagnostic evaluation tests among a population of children at risk for developmental delay for confirmatory diagnosis of ASD. Canvas Dx could also be used as a rule-out test to identify false-positive cases among a population of children at risk for developmental delay to minimize unnecessary referrals for a comprehensive diagnostic evaluation. It remains unclear if the use of Canvas Dx eliminates the need for comprehensive diagnostic evaluation by a specialist. In addition, there is also a potential for "off-label" use of this test in the general population, either as a screening test or a diagnostic test. Each of the scenarios will require a unique PICO formulation, and unless there is clarity on intended use, it is difficult to interpret currently available evidence. Second, the manufacturer asserts that Canvas Dx is intended to be used by a primary care physician to aid in the diagnosis of ASD, but the only published study on clinical validity used a specialist rather than a primary care physician to complete the clinician questionnaire module. This is likely to result in higher sensitivity and specificity and thus confounds the interpretation of published data on clinical validity.
Further testing in primary care clinics is needed to validate accuracy of the clinician module. In addition, all published studies were conducted on children who had been preselected as having a high risk of autism. No studies on children from the general population have been published. Other limitations include differences that may occur between the testing environments of a structured clinical trial setting versus the home setting and lack of data on usability outside of a clinical trial.

Summary of Evidence
For individuals who are in the age range of 18 to 72 months and in whom there is a suspicion of ASD by a parent, caregiver, or health care provider and who receive Canvas Dx, the evidence includes a single prospective study of clinical validity. Relevant outcomes are test validity, change in disease status, functional outcomes, and quality of life. Results of the study reported that Canvas Dx outperformed conventional autism screeners in AUC, sensitivity, and specificity. However, multiple limitations were noted. The major limitation is the lack of clarity on how the test fits into the current pathway. Diagnosis of ASD in the United States generally occurs in 2 steps: developmental screening followed by comprehensive diagnostic evaluation if screened positive. To evaluate the utility of the test, an explication of how the test would be integrated into the current recommended screening and diagnostic pathway is needed. Neither the manufacturer’s website nor the FDA-cleared indication is explicit on how the test fits into the current pathway. It is unclear whether the test is meant to be used as an add-on test to existing comprehensive diagnostic evaluation tests or if it could replace existing comprehensive diagnostic evaluation tests among a population of children at risk for developmental delay for confirmatory diagnosis of ASD. In addition, there is also a potential for "off-label" use of this test in the general population, either as a screening test or a diagnostic test. Second, the manufacturer asserts that Canvas Dx is intended to be used by a primary care physician to aid in the diagnosis of ASD, but the published study on clinical validity used a specialist rather than a primary care physician to complete the clinician questionnaire module. This is likely to result in higher sensitivity and specificity and thus confounds the interpretation of published data on clinical validity.
Further testing in primary care clinics is needed to validate the accuracy of the clinician module. In addition, all published studies were conducted on children who had been preselected as having a high risk of autism. No studies on children from the general population have been published. Other limitations include differences that may occur between the testing environments of a structured clinical trial setting versus the home setting and a lack of data on usability outside of a clinical trial. Evidence for Canvas Dx has not directly demonstrated that the test is clinically useful, and a chain of evidence cannot be constructed to support its utility. The evidence is insufficient to determine that the technology results in an improvement in the net health outcome.

The purpose of the following information is to provide reference material. Inclusion does not imply endorsement or alignment with the evidence review conclusions.

Practice Guidelines and Position Statements
Guidelines or position statements will be considered for inclusion in "Supplemental Information" if they were issued by, or jointly by, a U.S. professional society, an international society with U.S. representation, or National Institute for Health and Care Excellence (NICE). Priority will be given to guidelines that are informed by a systematic review, include strength of evidence ratings, and include a description of management of conflict of interest.

American Academy of Pediatrics
The American Academy of Pediatrics (AAP) guidelines recommend ASD-specific universal screening of all children at ages 18 and 24 months in addition to developmental surveillance and monitoring. Toddlers and children should be referred for diagnostic evaluation when increased risk for developmental disorders (including ASD) is identified through screening and/or surveillance. Children should be referred for intervention for all identified developmental delays at the time of identification rather than waiting for an ASD diagnostic evaluation to take place. The AAP neither approves nor endorses any specific tool for screening purposes. The AAP has published a toolkit that provides links to tools for developmental surveillance and screening for use at the discretion of the health care professional.20

The American Academy of Child and Adolescent Psychiatry
The American Academy of Child and Adolescent Psychiatry recommends that the developmental assessment of young children and the psychiatric assessment of all children should routinely include questions about ASD symptomatology.31

The UK National Screening Committee
The UK National Screening Committee32 does not recommend systematic population screening for ASD because:

  • There is not currently a test that is good enough for screening the general population.
  • It is not known if screening would improve long term outcomes for children with autism.
  • There is not an established approach to screening which is acceptable to parents.

These recommendations were based on a summary of evidence published in 2012. The next review is estimated to be completed in 2022.

U.S. Preventive Services Task Force Recommendations
The U.S. Preventive Services Task Force (USPSTF) published recommendations for ASD in young children in 2016.33 The USPSTF concludes that the current evidence is insufficient to assess the balance of benefits and harms of screening for ASD in young children (children 18 to 30 months of age) for whom no concerns of ASD have been raised by their parents or a clinician.

Ongoing and Unpublished Clinical Trials
Some currently unpublished trials that might influence this review are listed in Table 9.

Table 9. Summary of Key Trials

NCT No. Trial Name Planned Enrollment Completion Date
Ongoing      
NCT05223374 Extension for Community Healthcare Outcomes (ECHO) Autism Diagnostic Study in Primary Care Setting 100 Jun 30, 2023
Unpublished      
NCT04326231a Cognoa ASD Digital Therapeutic Engagement and Usability Study 30 Jul 2020
NCT04151290a Cognoa ASD Diagnosis Aid Validation Study 711 Aug 31, 2020

NCT: national clinical trial.
a Denotes industry-sponsored or cosponsored trial.

References 

  1. Lipkin PH, Macias MM, Norwood KW, et al. Promoting Optimal Development: Identifying Infants and Young Children With Developmental Disorders Through Developmental Surveillance and Screening. Pediatrics. Jan 2020; 145(1). PMID 31843861
  2. Hyman SL, Levy SE, Myers SM, et al. Identification, Evaluation, and Management of Children With Autism Spectrum Disorder. Pediatrics. Jan 2020; 145(1). PMID 31843864
  3. Dawson G, Bernier R. A quarter century of progress on the early detection and treatment of autism spectrum disorder. Dev Psychopathol. Nov 2013; 25(4 Pt 2): 1455-72. PMID 24342850
  4. Dawson G, Rogers S, Munson J, et al. Randomized, controlled trial of an intervention for toddlers with autism: the Early Start Denver Model. Pediatrics. Jan 2010; 125(1): e17-23. PMID 19948568
  5. Hertz-Picciotto I, Delwiche L. The rise in autism and the role of age at diagnosis. Epidemiology. Jan 2009; 20(1): 84-90. PMID 19234401
  6. Leigh JP, Grosse SD, Cassady D, et al. Spending by California's Department of Developmental Services for Persons with Autism across Demographic and Expenditure Categories. PLoS One. 2016; 11(3): e0151970. PMID 27015098
  7. Maenner MJ, Shaw KA, Bakian AV, et al. Prevalence and Characteristics of Autism Spectrum Disorder Among Children Aged 8 Years - Autism and Developmental Disabilities Monitoring Network, 11 Sites, United States, 2018. MMWR Surveill Summ. Dec 03 2021; 70(11): 1-16. PMID 34855725
  8. International Medical Device Regulators Forum. Software as a Medical Device (SaMD): Key Definitions. 2013. http://www.imdrf.org/docs/imdrf/final/technical/imdrf-tech-131209-samd-key-definitions-140901.pdf. Accessed December 27, 2021.
  9. National Institute for Health and Care Excellence (NICE). Evidence standards framework for digital health technologies. 2021. nice.org.uk/corporate/ecd7/chapter/section-a-evidence-for-effectiveness-standards. Accessed December 26, 2021.
  10. Zwaigenbaum L, Bauman ML, Choueiri R, et al. Early Intervention for Children With Autism Spectrum Disorder Under 3 Years of Age: Recommendations for Practice and Research. Pediatrics. Oct 2015; 136 Suppl 1: S60-81. PMID 26430170
  11. Zwaigenbaum L, Bryson S, Lord C, et al. Clinical assessment and management of toddlers with suspected autism spectrum disorder: insights from studies of high-risk infants. Pediatrics. May 2009; 123(5): 1383-91. PMID 19403506
  12. Kleinman JM, Ventola PE, Pandey J, et al. Diagnostic stability in very young children with autism spectrum disorders. J Autism Dev Disord. Apr 2008; 38(4): 606-15. PMID 17924183
  13. Canvas Dx Website. Accessed on April 25, 2022. Available at https://canvasdx.com/
  14. Abbas H, Garberson F, Liu-Mayo S, et al. Multi-modular AI Approach to Streamline Autism Diagnosis in Young Children. Sci Rep. Mar 19 2020; 10(1): 5014. PMID 32193406
  15. Randall M, Egberts KJ, Samtani A, et al. Diagnostic tests for autism spectrum disorder (ASD) in preschool children. Cochrane Database Syst Rev. Jul 24 2018; 7: CD009044. PMID 30075057
  16. Dumont-Mathieu T, Fein D. Screening for autism in young children: The Modified Checklist for Autism in Toddlers (M-CHAT) and other measures. Ment Retard Dev Disabil Res Rev. 2005; 11(3): 253-62. PMID 16161090
  17. Robins DL, Casagrande K, Barton M, et al. Validation of the modified checklist for Autism in toddlers, revised with follow-up (M-CHAT-R/F). Pediatrics. Jan 2014; 133(1): 37-45. PMID 24366990
  18. DuBay M, Watson LR, Mendez LI, et al. Psychometric Comparison of the English and Spanish Western-Hemisphere Versions of the Modified Checklist for Autism in Toddlers-Revised. J Dev Behav Pediatr. Dec 01 2021; 42(9): 717-725. PMID 34840315
  19. Autism Spectrum Disorder: Links to Commonly Used Screening Instruments and Tools (AAP Toolkits). American Academy of Pediatrics. Accessed on April 27, 2022. Available at https://publications.aap.org/toolkits/pages/asd-screening-tools
  20. Stone WL, Coonrod EE, Ousley OY. Brief report: screening tool for autism in two-year-olds (STAT): development and preliminary data. J Autism Dev Disord. Dec 2000; 30(6): 607-12. PMID 11261472
  21. Stone WL, Coonrod EE, Turner LM, et al. Psychometric properties of the STAT for early autism screening. J Autism Dev Disord. Dec 2004; 34(6): 691-701. PMID 15679188
  22. Robins DL, Dumont-Mathieu TM. Early screening for autism spectrum disorders: update on the modified checklist for autism in toddlers and other measures. J Dev Behav Pediatr. Apr 2006; 27(2 Suppl): S111-9. PMID 16685177
  23. Stone WL, McMahon CR, Henderson LM. Use of the Screening Tool for Autism in Two-Year-Olds (STAT) for children under 24 months: an exploratory study. Autism. Sep 2008; 12(5): 557-73. PMID 18805947
  24. Berument SK, Rutter M, Lord C, et al. Autism screening questionnaire: diagnostic validity. Br J Psychiatry. Nov 1999; 175: 444-51. PMID 10789276
  25. Chandler S, Charman T, Baird G, et al. Validation of the social communication questionnaire in a population cohort of children with autism spectrum disorders. J Am Acad Child Adolesc Psychiatry. Oct 2007; 46(10): 1324-1332. PMID 17885574
  26. Eaves LC, Wingert H, Ho HH. Screening for autism: agreement with diagnosis. Autism. May 2006; 10(3): 229-42. PMID 16682396
  27. Wetherby AM, Brosnan-Maddox S, Peace V, et al. Validation of the Infant-Toddler Checklist as a broadband screener for autism spectrum disorders from 9 to 24 months of age. Autism. Sep 2008; 12(5): 487-511. PMID 18805944
  28. Pierce K, Gazestani V, Bacon E, et al. Get SET Early to Identify and Treatment Refer Autism Spectrum Disorder at 1 Year and Discover Factors That Influence Early Diagnosis. J Pediatr. Sep 2021; 236: 179-188. PMID 33915154
  29. Salisbury LA, Nyce JD, Hannum CD, et al. Sensitivity and Specificity of 2 Autism Screeners Among Referred Children Between 16 and 48 Months of Age. J Dev Behav Pediatr. Apr 2018; 39(3): 254-258. PMID 29570569
  30. Volkmar F, Siegel M, Woodbury-Smith M, et al. Practice parameter for the assessment and treatment of children and adolescents with autism spectrum disorder. J Am Acad Child Adolesc Psychiatry. Feb 2014; 53(2): 237-57. PMID 24472258
  31. UK National Screening Committee. Child screening programme. Autism. Accessed on April 27, 2022. Available at https://view-health-screening-recommendations.service.gov.uk/autism/
  32. Siu AL, Bibbins-Domingo K, Grossman DC, et al. Screening for Autism Spectrum Disorder in Young Children: US Preventive Services Task Force Recommendation Statement. JAMA. Feb 16 2016; 315(7): 691-6. PMID 26881372

Coding Section 

Codes Number Description
CPT N/A No specific code
HCPCS N/A  
ICD10 CM Z13.41 Encounter for autism screening
  Z81.8 Family history of other mental and behavioral disorders
Place of Service Outpatient/Office  
Type of Service Digital Application

Procedure and diagnosis codes on Medical Policy documents are included only as a general reference tool for each policy. They may not be all-inclusive. 

This medical policy was developed through consideration of peer-reviewed medical literature generally recognized by the relevant medical community, U.S. FDA approval status, nationally accepted standards of medical practice and accepted standards of medical practice in this community, Blue Cross Blue Shield Association technology assessment program (TEC) and other nonaffiliated technology evaluation centers, reference to federal regulations, other plan medical policies and accredited national guidelines.

"Current Procedural Terminology © American Medical Association. All Rights Reserved" 

History From 2022 Forward    

09/01/2023 Annual review, no change to policy intent. Updated Table 7 and its footnotes.
09/07/2022 NEW POLICY

 
