In a couple of weeks, many of us in the nephrology community will be heading down to the “Big Easy.” New Orleans is the host city for this year’s annual Renal Physicians Association meeting. I have the privilege of participating in that meeting, and on Sunday morning will share the stage with Drs. Mark Kaplan and Adam Weinstein to discuss EHR best practices. As I was preparing for that presentation, the topic of medical record storage came up within the context of the physician’s obligation to maintain a patient’s medical record.
As many in the community know, physicians are obligated by statute to maintain a patient’s medical record for several years. The number of years varies from state to state, but in general the record must be maintained for at least seven years following the provider’s last encounter with the patient. If the patient is a minor, the rules are a bit more complex and again vary from state to state. This of course creates a remarkable storage challenge. Those of us over the age of 40 recall trips to medical records departments in the bowels of large community hospitals, which housed hundreds of thousands of paper charts. Physician practices meet the records storage challenge in a wide variety of ways today, including scanning records into an archive, housing those records within their EHR and even renting space to maintain paper records.
What does any of this have to do with the double helix? A couple of weeks ago, an intriguing article in the Economist told how two researchers from the European Bioinformatics Institute discovered a unique solution to a data storage problem they face. They do genomic research, and they face rapidly expanding data storage challenges. One night, over an adult beverage or two, Nick Goldman and Ewan Birney sketched their idea on the back of a napkin (or the equivalent one finds in an English pub) and last month published their results in Nature.
The idea centers on using DNA, nature’s own data storage system, instead of hard drives and tapes to store data. Using a clever coding scheme based on the four chemical “bases”—adenine (A), thymine (T), cytosine (C) and guanine (G), which we all fondly recall from our biochem class—Goldman and Birney were able to copy and then read several files with remarkable success. The advantage of using artificially constructed DNA as a data center? Space and time come to mind. The fellow writing for the Economist estimates there are about three zettabytes of digital data in the world today. For those of you keeping score at home, a zettabyte is 10²¹ bytes, or stated another way, roughly one billion terabytes of data. The data density of the technique described in Nature would permit storing the world’s data today in the back of a truck. The other advantage is time. Under the appropriate environmental conditions, this data structure would remain intact for thousands of years. The disadvantage of course is cost. At an estimated $12,400 per megabyte, don’t expect to see this storage option on the shelves at Walmart next year.
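For the curious, here is a toy sketch of what “encoding data as DNA bases” can look like. This is not the actual Goldman–Birney scheme from the Nature paper (theirs uses a more elaborate base-3 code designed to avoid runs of repeated bases, which are error-prone to synthesize and sequence); this simplified version just maps each 2-bit pair of a byte to one of the four bases:

```python
# Toy illustration only: map binary data to DNA letters, 2 bits per base.
# The real Goldman-Birney scheme is more sophisticated (base-3, no repeats).

BASES = "ACGT"  # A=00, C=01, G=10, T=11

def encode(data: bytes) -> str:
    """Convert bytes to a DNA-letter string, 4 bases per byte."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # most-significant 2-bit pair first
            out.append(BASES[(byte >> shift) & 0b11])
    return "".join(out)

def decode(dna: str) -> bytes:
    """Invert encode(): read 4 bases back into one byte."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

message = b"DNA"
strand = encode(message)
assert decode(strand) == message
```

At four bases per byte, the information density is easy to appreciate: a single gram of DNA contains on the order of sextillions of bases, which is how the Nature technique gets the world’s data into the back of a truck.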
Sixty years ago, Jim Watson and Francis Crick made a discovery that dramatically influenced our understanding of life. In March of 1953, in a letter to his son, Francis Crick wrote, “…we think we have found the basic copying mechanism by which life comes from life. You can understand that we are very excited.” Excited indeed. I hope to see many of you in New Orleans. If you have a medical records challenge or solution to share, drop us a comment.