Monday, February 6, 2023

"Can ChatGPT Summarize It?" A Litmus Test for Clarity?

Starting Point:  On January 20, 2023, I published a new white paper on trends in coding and reimbursement in solid tumors - find it here.

I used this as the basis for some AI research on ChatGPT's ability to summarize key points.


Computer Summary of Executive Summary

The white paper has a page-long executive summary, and about 10 pages of text.   

I gave the executive summary to ChatGPT to summarize, and it did a pretty good job.  (Clipped at bottom; 615 words to 165).

And...Summary of Half of Paper

Next: ChatGPT can't absorb 10 pages of text at once, so I cut and pasted the second half of the paper into ChatGPT for a summary.  This also came out pretty well.  (Clipped at bottom; 1,700 words to 163.)
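(For the technically inclined, the cut-and-paste workaround above amounts to "chunking" the document. Here is a minimal Python sketch; the 1,500-word chunk size is an illustrative assumption, not a documented ChatGPT limit.)

```python
def chunk_text(text: str, max_words: int = 1500) -> list[str]:
    """Split a long document into consecutive chunks of at most
    max_words words, so each piece fits in a model's input window.
    (The 1,500-word default is an assumption for illustration.)"""
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]

# Each chunk can then be pasted into ChatGPT for its own summary.
```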

What's the Point?  Could This be a Clarity Litmus Test?

So it occurred to me: people often think of ChatGPT in forward mode.  Can it write a paper?  Can it write a discussion section?  In today's essay, I am using it the opposite way, as a sort of instantaneous cross-check after the writing is done.

Proposed: If you have written a paper, essay, op-ed, or executive summary, can ChatGPT "understand it" and summarize it clearly?  If so - if even a computer (basically a pile of tin cans and some batteries) can understand it - then your writing is probably fairly clear.

And on the other hand, if you give your essay to ChatGPT to summarize, and it returns a confusing, conflicted mess, maybe - just maybe - your writing isn't as clear as you thought it was.

#   #   #   #  

The Computer Summary of my own Executive Summary:

Comprehensive Genomic Profiling (CGP) in cancer is widely recognized as having clinical utility and is included in updated guidelines, Medicare coverage policies, and white papers. The newer plasma-based CGP technology is preferred by some cancer patients as it is available when solid tumor tissue is not and can be performed serially to determine therapy resistance. The AMA code set for genomics does not distinguish between specimen sources but the newer PLA code set does and there are already codes for plasma-based tumor testing. Reimbursement for plasma-based testing requires codes specifying it as the underlying technology and there are already codes for two FDA-approved tests. The cost of plasma-based testing is higher due to the depth of sequencing required, but it offers cost offsets compared to tissue-based testing. The AMA CPT has recently introduced new specificity codes for RNA sequencing in cancer and is working on updating the coding system for cancer genomics. 2023 is expected to be an exciting period for policy around plasma-based CGP.

Computer Summary of Second Half of My White Paper:

The coding system for genomics in the US, the AMA CPT codes, has seen a rise in codes related to genomics in recent years, including the creation of the "Proprietary Laboratory Analysis" (PLA) codes. There are currently only a few codes for tumor genomics and these codes have not yet been specific to sample source such as plasma or paraffin. The AMA CPT has recently revised codes for DNA and RNA analyses but more needs to be done to keep up with the latest advancements. The coding system has not yet recognized the coding for plasma-based comprehensive genomic profiling (CGP) testing, which is an important area now seeing FDA-approved tests and rapid innovation from laboratory-developed tests. The resources for plasma-based CGP tests are different and higher than for FFPE-based tests, but there are cost offsets obtained from using a plasma-based approach. There is still time to debate these topics and develop the best consensus strategies through the AMA CPT meeting in May 2023.