Category Archives: Computers & Software

The Smartphone is an Essential Medical Instrument

The storage capacity of the human mind is amazing. One estimate of the size of the brain's "RAM" is as high as 2.5 petabytes (2.5 million gigabytes). The number is based on the total number of neurons in the brain and the total number of possible connections per neuron. I suspect it is an overestimate, given the vagaries and innate inefficiency of biological systems. Nevertheless the true figure is undoubtedly impressive. But not infinite.

There are well-documented feats of human memory and calculating prowess. Ancient Greeks could memorize and recite the epic poems of Homer. Indeed this was how the Iliad and the Odyssey were passed down for generations before the Greeks acquired writing. Savants can quickly extract cube roots of long integers or recite pi to over 20,000 decimal places. Musical prodigies like Mozart and geniuses like Einstein impress us with the capabilities of their brains. Yet for the average person who has trouble memorizing a shopping list, these stellar examples of mental fortitude provide little solace. The old myth that we use only 10% of our brain capacity has been debunked. So unless you're willing to believe the combination kelp-Ginkgo-biloba-blueberry supplement you heard about on the radio is really going to work, you are pretty well stuck with the brain and memory capacity you have right now. At least until things get worse as you get older.

While the brain's capacity may increase due to evolutionary forces over the next few thousand years (or not; see the movie Idiocracy), the amount of information it is required to hold is not constrained by such a slow process. According to one source, there are now over 50 million scientific publications, with about 2.5 million new articles published each year, and the number of publishing scientists grows 4-5% per year. No one can absorb all this. The days of the "Renaissance Man" who could quote Bulwer-Lytton while relating the latest experimental data from Maxwell, then play a Bach fugue while giving a dissertation on Baroque counterpoint, are long gone. So what's a 21st century scientist (or physician) to do?

One thing we should not do is attempt to memorize everything. It is important to offload as much information from our brains as possible. Our brains need to be more like an index than a database: we need to know what information we are looking for and where to find it. Information we use all the time is memorized automatically, and we don't have to look it up. But a lot of information that we use infrequently is better kept external to our brains. As long as it is easily retrievable, it will be available. Better to look up something we are unsure about, such as a drug dose, than hazard a guess and be wrong.

Fortunately we live in an era when we can implement this strategy very easily. We carry smartphones that are constantly connected to the Internet. All the data we need is at our fingertips and incredibly easy to look up. Similarly, we can store data on these devices for later retrieval. This constant availability of information makes life easier for doctors and undoubtedly makes for better patient care, because fewer mistakes are made due to memory errors.

There are those who would argue that relying on these devices is a crutch, and that any good doctor wouldn't need them. What would happen if a doctor's plane crash-landed on some remote island with no charging ports? How could that doctor function?

I think it’s time to put aside such nay-saying and embrace our digital assistants. These devices are our tools, as essential to modern medicine as ultrasounds, blood tests, and MRI scanners. Take away any of these tools, and doctors will be limited in what they can do. We should be proud of the impressive technology that allows us to carry powerful computers in our pockets, and we shouldn’t be ashamed to use them.

Notwithstanding the above, medical board certification is still old-school, rooted in that outmoded 19th century Renaissance Man philosophy that doctors should hold everything in their heads. Certainly some medical board questions are practical and test things all doctors should know. But thrown into the mix are a lot of obscure questions about obscure facts that may be difficult to regurgitate during a testing session, but would be easy to look up online in a few seconds in a real-world setting. So, do these tests actually test one’s abilities as a real-world practicing doctor armed with modern information technology or are they just a particularly arcane version of Trivial Pursuit?

I’ll leave the answer to this question as an exercise for the reader.

Trying Out Vim Using Emacs Evil Mode

After using the text editor Emacs for over 20 years, and after listening to debates on the merits of Emacs vs Vi/Vim (henceforth in this post referred to simply as "Vim") for at least as many years, I decided I wanted to give Vim a try. To be fair, I had used Vim before, but, also to be fair, I had never tried to master it or given it a real chance. I knew enough Vim keybindings (the "hjkl" keys and "ZZ" to save and quit) to get by when editing a file via a remote terminal. But I had never taken the time to really learn Vim to the point that it would be an efficient text editor for me. And I certainly didn't want to abandon Emacs, mostly because of Org mode, the best organizational tool there is, and Magit, the best Git interface there is. Nevertheless the constant key-chording of Emacs, which uses control key combinations for most editing tasks, continued to feel awkward despite many years of practice. The question kept coming up: was Vim a better way to edit text?

My initial resistance to Vim was not just because I liked Emacs. Vim is a modal text editor, so-called because entering text and editing text require changing modes. Moreover, the "Normal" mode in Vim is the text editing mode. To actually enter text, you use a keyboard command to switch to "Insert" mode. To return to Normal mode, you use the Escape key. So you use the Escape key a lot. On my Mac keyboard, the Escape key is a tiny sliver of a key at the top left corner of the keyboard, several inches away from my left pinky. New MacBooks don't even have a dedicated Escape key anymore.

The modal concept caused problems in my prior limited use of Vim. I would constantly forget what mode I was in and start typing in the wrong mode, wreaking havoc on my text. But still, lots of people used Vim and liked it a lot.

So I started reading more about it. I bought Drew Neil's book, Practical Vim, and skimmed through it. Something he said in chapter 2 struck me. To paraphrase: text is to the writer as a painting is to a painter. A painter spends time studying his subject, mixing paints, selecting brushes, and so forth. Only a fraction of his time is spent actually applying paint. Likewise a writer, or programmer, spends a lot of time thinking and editing rather than just putting text down on the screen.

While I suspect the analogy appeals to my vanity in comparing writing to art more than it reflects reality (I think both writers and painters probably spend most of their time applying words or paint), the theory is at least worth trying to put into practice. Editing is what turns mediocre writing into good writing, and what bit of writing wouldn't benefit from more editing?

Beyond the theoretical, Drew’s book is chock full of examples in which Vim shines as a way to edit text rapidly with a minimum of keystrokes. I had used Emacs’ macros on occasion to do repetitive tasks, but it looked like Vim had the potential to really rev up my editing speed.

Enter Evil mode for Emacs. Evil is an Emacs package (technically a global minor mode) that transforms Emacs into a Vim clone. You can edit text using Vim keybindings and still have all other Emacs functionality available. In other words, the best of both worlds. I have been using it for about a week now, and I think it's great.

It works fine out of the box, but some tweaking always helps. First off, I remapped my Caps Lock key to be the Escape key in my System Preferences. It’s right next door to the “A” key and makes changing modes (referred to as “States” in the Evil manual, since the word “mode” has its own meaning in Emacs) a snap.

Then I added some fixes so that cursor movement with the “hjkl” keys would respect visual lines instead of physical lines, since a lot of my writing uses Emacs word wrap mode. Here is what I inserted into my .emacs file:

;; play with evil mode
(use-package evil
  :ensure t
  :config
  ;; make it the default, gulp!
  (evil-mode 1)
  ;; make movement keys respect visual lines instead of physical lines
  (define-key evil-normal-state-map (kbd "<remap> <evil-next-line>") 'evil-next-visual-line)
  (define-key evil-normal-state-map (kbd "<remap> <evil-previous-line>") 'evil-previous-visual-line)
  (define-key evil-motion-state-map (kbd "<remap> <evil-next-line>") 'evil-next-visual-line)
  (define-key evil-motion-state-map (kbd "<remap> <evil-previous-line>") 'evil-previous-visual-line)
  ;; make horizontal movement cross line boundaries
  (setq-default evil-cross-lines t))

Finally, there are some unexpected niceties of Evil mode that make it perfect for someone wanting to transition to Vim. First of all, it is pretty easy to tell what mode/state you are in, because the cursor changes shape and the mode line shows a little indicator, like so: <N> for Normal state.

Second, you can easily go back to Emacs keybindings at any time by pressing C-z. The state indicator changes to <E> for Emacs state. Press C-z again to return to Vim keybindings.

Third, while in Vim Insert mode, a lot of Emacs keybindings work! You can move around with C-f, C-b, M-f, M-b, etc.! So no need to constantly change modes if you don’t want to. I expect I will use this less as I get more used to “The Vim Way,” but it sure is helpful for learning.

Finally, many other Emacs keybindings work too. C-l recenters the page around the cursor. I can use C-x C-s to save the file, as opposed to :w in Vim. Of course M-x commands all still work too. And C-g, the Emacs get-out-of-jail key, works as well.

So if you want to have the best of both worlds, and bring the editor wars to a peaceful settlement, Evil mode is the answer.

Here is a good talk on YouTube that also contributed to my decision to try Evil mode.

EHR Copy and Paste Considered Harmful

DRY principle – Don’t Repeat Yourself

How bad are Electronic Health Record (EHR) programs? Let me count the ways. Rather, let me not, as I and many other folks have already done so. Even non-tech-savvy doctors (of which there are fewer and fewer) realize something is wrong when they compare their experience using an EHR with virtually every other computer program they come across, such as the apps on their phones. As the click counts required to do simple tasks mount up and repetitive stress injury of the hand sets in, even the most sanguine of medical personnel will eventually realize that something is not quite right. And as EHR companies forbid sharing screenshots of their user interfaces, you'll just have to take my word for it that these UIs are, let us say, quaint. Hey EHRs, the 90s called and want their user interfaces back.

In this post I'll point out just one of the many problems with EHRs: they violate the DRY principle. The acronym DRY is familiar to computer programmers, but not to most medical people. It stands for "Don't Repeat Yourself." In computer programming it means don't write the same code in two or more different places. Code duplication is what some programmers refer to as a code "smell." There is no reason to duplicate code in a computer program: a single block of code can be called from multiple procedures, so no procedure needs its own copy. Code duplication leads to code bloat and code rot, where two procedures supposed to do the same thing drift out of sync because a change is made to one copy of the duplicated code and not the other.
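For the non-programmers, here is a minimal sketch of the idea in Python (the function names and dosing numbers are invented for illustration):

# WET ("write everything twice"): the same dose arithmetic pasted into two
# places. When the formula changes, someone must remember to fix both copies.
def loading_dose_text(weight_kg):
    return f"Load {weight_kg * 1.5:.0f} mg IV"

def maintenance_dose_text(weight_kg):
    return f"Then {weight_kg * 1.5 * 0.25:.0f} mg/hr"  # a stale copy waiting to happen

# DRY: the calculation lives in exactly one place, and everything else
# refers to it.
def dose_mg(weight_kg, mg_per_kg=1.5):
    return weight_kg * mg_per_kg

def loading_dose_text_dry(weight_kg):
    return f"Load {dose_mg(weight_kg):.0f} mg IV"

def maintenance_dose_text_dry(weight_kg):
    return f"Then {dose_mg(weight_kg) * 0.25:.0f} mg/hr"

Change the formula in dose_mg and both messages stay correct; change it in the duplicated version and you had better remember the second copy.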

Applying the DRY principle to a database requires that every item of data has a single location in the database. Multiple copies of the same data increase the size of the database and invariably cause confusion. Which copy is the original? Which copy is the true copy when copies disagree?

An EHR program is at root a gigantic database. Ideally Patient Smith’s X-ray report from 1/1/2017 is filed away properly in the database and easily retrieved. Same with his blood work, MRI results, etc., etc.

Enter Copy and Paste.

Copy and Paste is evil. Unlike Cut and Paste, its close cousin that moves data around without duplicating it, Copy and Paste is bad, lazy, and sloppy. Copy and Paste needlessly duplicates data. Copy and Paste violates DRY.

EHR notes are rife with Copy and Paste. X-ray reports are copied and pasted. Blood work too. Even whole notes can be copied and pasted. It is easy to copy and paste a prior progress note and then make a few changes to make it look like it wasn't copied and pasted. Everyone does it.

Many EHR progress notes fall just short of novel length. Whole cath reports, MRI results, other doctors' notes, kitchen sinks, and other potpourri are thrown in for good measure. Usually, with a bit of skillful detective work, one can determine the small fraction of the note that is original. Usually it is the last line. Something like: "Continue current plans." These could be the only words actually typed on the keyboard. Everything else is just copied and pasted.

So you get all the downsides of violating DRY: bloated notes, duplicated data, inaccuracies, and synchronization problems. The X-ray report may be revised by the radiologist after it is copied and pasted into the note. Nevertheless the unrevised report persists forever, sitting as a big blob of text in the middle of a now-inaccurate note. Of course there is some consolation that no one will ever read the whole note anyway, with the possible exception of a malpractice lawyer.

Why is Copy and Paste so prevalent in EHR notes? It isn’t just laziness. Like the pulp fiction writers of the 30s, doctors are effectively paid by the word, so that the longer the note the better. Longer notes reflect higher levels of care, more intricate thought processes, more — wait a minute! No they don’t. Longer notes reflect mastery of Copy and Paste, something that’s not too difficult to master. Even non-tech docs seem to have no trouble with it. Long notes are a way to justify billing for a higher level of care, i.e. more dollars. Since the Powers That Be Who Control All of Medicine (i.e. not doctors) decided that billing would not be based on what doctors do, but on what doctors write in the chart, it doesn’t take a crystal ball to predict that note bloat, electronically enhanced, would be the inevitable outcome of such a stupid policy.

What are the alternatives to Copy and Paste? The best is the use of hyperlinks, something that you might be familiar with if you ever use something called the World Wide Web. If I want to put a YouTube video on my blog, I don’t copy the video and paste it here, I just provide a link. Similarly, if you want to refer to an X-ray report in a progress note it should be possible to just provide a link to it. Something short and sweet.
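To make the contrast concrete, here is a toy sketch in Python (the report name and note structure are made up; real EHR internals are of course far more complicated). The pasted note keeps a private copy of the report that can go stale, while the linked note stores only a reference and always resolves to the current version:

# A toy "EHR database": one authoritative copy of each report.
reports = {
    "xray-2017-01-01-smith": "Right lower lobe infiltrate.",
}

# Copy and Paste: the note owns a private copy of the report text.
pasted_note = "Imaging: " + reports["xray-2017-01-01-smith"] + " Continue current plans."

# The radiologist later amends the report...
reports["xray-2017-01-01-smith"] = "Addendum: infiltrate resolved on repeat film."

# ...so the pasted note is now silently out of date, while a linked note
# that stores only the report's ID resolves to the corrected text.
linked_note = {"imaging": "xray-2017-01-01-smith", "plan": "Continue current plans."}
print(pasted_note)                      # stale private copy
print(reports[linked_note["imaging"]])  # current version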

Of course the example note I referred to above would be reduced to just a handful of links and the sentence "Continue current plans." This will hardly satisfy the coders and billing agents and whoever else is snooping around the EHR trying to find ways not to pay anyone (i.e. insurance companies). Nevertheless these shorter notes would be much easier to digest and might even encourage a doctor to elaborate a bit more in his or her own words on the history, physical, diagnosis, and plans. Unlinking billing and documentation would go a long way towards making EHR notes more manageable and informative. No one seems too keen on doing this, however. Documentation as a proxy for care is just one of many broken pillars of the Byzantine edifice known as the American Health Care System.

[note: the title refers to a famous (in computer circles) 1968 letter by Edsger Dijkstra entitled “Goto Statement Considered Harmful.” It has inspired tons of computer articles with similar titles, including this one.]

CenturyLink Sucks, Part 57

Blogging at Panera’s

I don’t usually work at a coffee shop, but here I am, at Panera’s dealing with their bad (also CenturyLink) internet service, because my internet service is down at home. Yes we are going into DAY NUMBER 4 of the great CenturyLink Internet Service Outage of Parker, Colorado. This started inauspiciously, perhaps coincidentally, during a mild thunderstorm on Friday before the Memorial Day Weekend. Internet could not be reached, internet light on router out, though DSL was on. After the obligatory multiple router reboots, no change. Call to CenturyLink. Outage in our area, should be fixed in 12 to 24 hours. About 30 people affected. This being the start of Memorial Day Weekend, I was not optimistic.

As the weekend has dragged on, my worst fears have been confirmed. That is why I am sitting here, nursing a cup of coffee at Panera's, writing this. After multiple calls to CenturyLink, the story has not changed, other than the expected duration of the outage, which has grown from 12-24 hours, to 24-48 hours, and, at the most recent estimate, to 48-72 hours. When I suggested to the customer service person that their technicians were goofing off over the holiday, I was answered with an aggrieved "Our technicians work 24/7" and "the technician is there now trying to fix it." Sure.

A little background may be in order. I live within 20 miles of Denver, supposedly a telecommunications hub. I can walk to the top of the hill in my neighborhood and see the buildings of downtown Denver. Despite this, the only option for internet service in my neighborhood is CenturyLink, via the phone lines. And, up until a year or so ago, the only speed we could get was 1.5 Mbps. After writing to the FCC and complaining multiple times, our service has been upgraded to a whopping 3 Mbps. This is in the era of Gigabit internet service. As you may know, the federal government granted billions of dollars of incentives to the ISPs in order to improve the internet backbone with a goal of providing broadband service to “rural” America.  Broadband internet is now defined as a minimum of 25 Mbps.  3 Mbps doesn’t cut it. Sadly, the US is way behind the rest of the world in this regard. It is clear that the ISPs took the federal money and used it to pad their executive salaries. No wonder the most hated company in the US is an ISP, though I bet with the next go-around the airlines will give them a run for their money.

Given the context of baseline sucky internet service and no alternative ISP in our neighborhood, I have very little patience with a 3 day and counting outage. CenturyLink, Shame! (Ding).

EP Studios App Updates

Here’s what’s going on with the EP Studios apps:

EP Calipers

Most of the new stuff is in EP Calipers. Probably the most useful new feature is available in the Mac and Windows versions: a transparent floating caliper window. Use it to overlay calipers on any open window on the desktop. Check the figures in journal articles. Use it during slide shows. Use it on webpages or on your EHR. No longer are you limited to image files you have downloaded onto your computer. Unfortunately, due to the nature of mobile device platforms, there is no way (that I know of) to implement similar functionality on a phone or tablet.

Using the floating transparent window to check measurements in a published academic paper. It appears the pacing CL is actually 240, not 250 msec.

Several users suggested the capability to color each caliper differently. This is now implemented. Others wanted a way to fine-tune caliper position besides just dragging with your finger or trackpad/mouse. This is also implemented, via keyboard arrow keys or buttons that "micromove" or "tweak" caliper positioning.

Finally, in case you missed it, angle calipers are available. They can be useful in Brugada syndrome, in which the so-called beta angle may have predictive value. In addition, the work of Dr. Adrian Baranchuk of Queen's University in Kingston, Ontario indicates there is prognostic value in measuring the base of the triangle formed by the angle, 5 mm inferior to the triangle's apex. EP Calipers now supports this: provided amplitude has been calibrated in mm, the triangle base is automatically drawn, showing this measurement. Dr. Baranchuk has dubbed this technique the "Brugadometer." More information on these Brugada syndrome ECG measurements can be found here.

Using the Brugadometer to measure the beta angle and the triangle base 5 mm below the apex.
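As a rough geometric aside (my own back-of-the-envelope reasoning, not anything from Dr. Baranchuk's papers): if you idealize the tracing so that the two limbs of the triangle open symmetrically about the vertical with apex angle β, the base measured 5 mm below the apex works out to base = 2 × 5 mm × tan(β/2). A β of 60°, for example, would give a base of about 5.8 mm. Real tracings are rarely that symmetric, which is why the app measures the base directly on the calibrated image rather than deriving it from the angle.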

EP Coding

EP Coding also received a major update earlier this year. After a few years of relative stasis, the AMA decided to shake up the coding of EP procedures once again by unbundling the sedation component from the procedure codes. The result is a relatively complex coding system for sedation, which depends on the patient's age, who performs the sedation, and the sedation duration. EP Coding now calculates the sedation codes automatically using a sedation coding calculator.

Sedation coding calculator
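For the curious, the logic runs roughly as follows. Here is a simplified sketch in Python of my reading of the 2017 moderate sedation codes (CPT 99151-99157). The time rounding is reduced to whole 15-minute blocks, which glosses over the fussier CPT rounding conventions, so treat this as an illustration and consult the current CPT manual (or the app) for actual coding:

# Simplified sketch of 2017 CPT moderate sedation coding (99151-99157).
# Assumptions: under 10 minutes is not separately reportable; the initial
# code covers the first 15 minutes; add-on codes are counted here only for
# whole additional 15-minute blocks.
def moderate_sedation_codes(age_years, minutes, same_provider):
    """same_provider: True if the physician doing the procedure also
    supervises the sedation; False if a second provider gives it."""
    if minutes < 10:
        return []
    if same_provider:
        initial = "99151" if age_years < 5 else "99152"
        addon = "99153"
    else:
        initial = "99155" if age_years < 5 else "99156"
        addon = "99157"
    extra_blocks = max(0, minutes - 15) // 15  # whole 15-minute add-on blocks
    return [initial] + [addon] * extra_blocks

# Example: 40 minutes of sedation in a 60-year-old, given by a second provider
print(moderate_sedation_codes(60, 40, same_provider=False))  # ['99156', '99157']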


EP Mobile

EP Mobile has been relatively static. It is already chock-full of calculators, drug information, risk scores, pictures of ECGs, etc. It is our best-selling app, so we must be doing something right. I am always happy to add features; just email me at mannd@epstudiossoftware.com with your requests.

Final thoughts

This is a bit off-topic, but probably not worth a separate blog post either. My old Motorola Droid Maxx Android phone is getting a bit long in the tooth, and way past upgrade time. I was an early adopter of Android, and though I use other Apple products (a MacBook Pro and an iPad Mini 2), I have never owned an iPhone. This may change. In many ways I think Android is a more innovative operating system than Apple's iOS. Nevertheless we live in an insecure world, and I can't get timely updates to Android via my phone and Verizon. My phone is stuck on Android 4.4.4 (I even forget what candy that is), whereas the most recent Android version is Android 7 Nougat. Apple doesn't have this problem. Having an outdated, obsolete OS in the current world of bad-guy hackers is untenable. I think the problem is (as usual) with the providers, who couldn't care less about updating an older phone when they could be pushing the latest phones on customers. The 2-year cycle of upgrading phones is ridiculously wasteful. But that's what is driving the industry, with the carriers all too eager to get you in and sign another rip-off contract. So, it might be goodbye to Android soon.

A Tale of Two Histories

Compare the following two versions of the same medical history:

Version 1

CC: chest pain
Mr. Smith is a 57 y/o white man who comes into the office today for the first time with a complaint of chest pain. He states he has been in generally good health in the past, though he has smoked about 40 pack-years and admits to not exercising much, other than occasional games of golf. He has trouble keeping his weight down. He has been a middle-level manager for many years, but about a month ago changed jobs and took a pay cut. He says this has been quite stressful. He has changed jobs before, but states "I'm getting too old to keep doing this." About 2 weeks ago he started noting some mild heaviness in his chest, lasting up to 5 or 10 minutes. He attributed this at first to eating heavy meals at dinner, but now thinks it occurred after climbing stairs following meals. He took some Tums, but was not sure if the pain eased from this or just from resting. These episodes of discomfort were localized to his anterior chest, without radiation or other associated symptoms at first. Over the last 2 weeks he thought they were getting a little more frequent, occurring up to twice a day. Two days before this visit, he had an episode of more intense pain that woke him from sleep at night. This episode lasted about 15 minutes and was associated with diaphoresis. "My pillow was soaking wet." He woke up his wife, who wanted to call 911, but he refused, though he somewhat reluctantly agreed to make this appointment. He has had no further episodes of chest pain, and feels that he is here just to satisfy his wife at this point. He generally doesn't like to come to the doctor. He doesn't know his recent lipid levels, though he says a doctor once told him to watch his cholesterol. His BP has been high occasionally in the past, but he attributes it to white coat syndrome: his BP is always normal when he uses an automatic cuff at the store, he claims. He is on no BP or lipid-lowering meds. He takes a baby aspirin "most days." His parents are deceased: his mother had cancer, but his father died suddenly in his 40s, probably from a heart attack, he thinks.

Version 2
  • Mr. Smith
  • CC: chest pain
  • Age: 57 y/o Sex: M Race: Caucasian
  • Onset: 1 month
  • Frequency: > daily [X] weekly [ ] monthly [ ]
  • Location: Anterior chest [X] Left precordium [ ] Left arm [ ] Other [ ]
  • Radiation: Jaw [ ] Neck [ ] Back [ ] Left arm [ ] Right arm [ ] Other [ ]
  • Pattern: Stable [ ] Unstable [X] Crescendo [X] Rest [X] With exertion [X]
  • Duration: < 15 min [X] 15 min or more [X]
  • Risk factors: Tobacco [X] Family history CAD [X] HTN [?] DM [ ] Hyperlipidemia [?]
  • Relief: Rest [?] Medications [?] Other [ ]
  • Associated symptoms:  N, V [ ] Diaphoresis [X] Dizziness [ ] Other [ ]
Which is better?

Version 1 is an old-fashioned narrative medical history, the only kind of medical history that existed before the advent of Electronic Health Record (EHR) systems. This particular one is perhaps chattier than average. It is certainly not great literature or particularly riveting, but it gets the job done. Version 2 is the kind of history that EHR systems provide, though entry of a Version 1-type history is usually still possible, albeit discouraged. With an EHR, entering a long narrative history requires either a fast, skilled physician typist, or a transcriptionist — either human (frowned upon due to cost) or artificial, such as Dragon dictation software. This latter beast requires careful training and is frustratingly error-prone, at least in my experience. The Version 2 example is not completely realistic. In practice there are more check boxes, pull-down lists, and other data entry fields than can be shown here. But you get the idea.

Version 2 seems to have a higher signal-to-noise ratio than Version 1. It's just Version 1 boiled down to its bare essentials, stripped of unnecessary verbs, conjunctions, prepositions, and other useless syntax. It contains everything a medical coder, a medical administrator, or a computer algorithm needs to do his, her, or its job. It has taken the medical history, the patient's story, and put it into database form.

But Version 1 is not just Version 2 embellished with a bunch of fluff. Certainly Version 1 is more memorable than Version 2. There is a chance the physician who wrote Version 1 will remember Mr. Smith when he comes back to the office for a follow-up visit: Mr. Smith, that middle-aged fellow who was stressed out when he took a pay cut while starting a new job and started getting chest pain. Another physician meeting Mr. Smith for the first time might, after reading this history, modify his approach to dealing with Mr. Smith. One gets the impression that Mr. Smith is skeptical of doctors and a bit of a denier. Maybe it will be necessary to spend more time with him than average to explain the need for a procedure. Maybe it would be good to tell his long-suffering wife that she did the right thing insisting that he come in to the doctor. All this subtlety is lost in Version 2.

There are some cases where Version 2 might be preferable.  In an Emergency Department, where rapidity of diagnosis and treatment is the top priority, a series of check boxes saves time and may be all that is needed to expedite a patient evaluation.  But for doctors who follow patients longitudinally, Version 1 is more useful.  A patient’s history is his story: it is dynamic, organic, personal, individual.  No two patient histories are identical or interchangeable.  Each history has a one-to-one correspondence with a unique person.  A good narrative history is an important mnemonic aid to a physician.   A computer screen full of check boxes is no substitute.

While the Version 2 history was designed for administrators, coders, billers, regulators, insurance agents, and the government, the Version 1 history was designed by doctors for doctors.  We should be wary of abandoning it, despite the technical challenge of its implementation in EHR systems.


Escape from Escape

Ye Olde Escape Key

During my college days computers were run from teletype machines. These teletypes had a typewriter keyboard layout extended with unfamiliar keys like Control (Ctrl) and Escape (Esc). You could press Ctrl-G and make the teletype ring its bell — ding! You could press Esc when you mistakenly wrote a BASIC program with an infinite loop and make the program terminate. When I got an Apple ][+ in the early 1980s, Ctrl and Esc keys were present, though there was no Caps Lock key — all letters were uppercase on the Apple ][. I had to buy a separate Videoterm card to get lowercase letters and perform the "shift key mod" inside the case to get the Shift keys to work. Ah, the good old days!

ASR-33 Teletype keyboard layout (by Daniele Giacomini [CC BY-SA 2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons)
When the IBM PC came out, its keyboard combined the IBM typewriter keyboard with the new computer keys, adding the Alt key and a set of Function keys to Control and Escape. The Alt key originated with the Meta key on MIT keyboards, and is still called the Meta key in Emacs documentation — so delightfully retro! Apple renamed the Alt key the Option key and, with the Macintosh, added the Apple key that later became the Command key. Windows certainly couldn't have an Apple key, so Microsoft named its equivalent the Windows key.

Apple ][ keyboard from http://www.hp9845.net/9845/history/comparison/
Apart from the Control key, which is combined with other keys to generate non-printing ASCII characters like Bell (ASCII 7) and Escape (ASCII 27), these other keys originally manipulated the high-order bit of a character code. They could get away with this because ASCII only uses 7 bits of an 8-bit byte. However, with internationalized keyboards and Unicode, character sets now not only require all 8 bits of a byte, but often more than one byte per character. So modern keyboards send scancodes with each keypress, and it is up to the computer operating system to make sense of them.
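The Control-key trick is easy to demonstrate. Holding Ctrl on an ASCII terminal kept only the low five bits of the character code, which is why Ctrl-G rang the bell and why Ctrl-[ still works as Escape in most terminals. A quick illustration in Python:

# Ctrl on an ASCII terminal keeps only the low 5 bits of the character code.
for ch in "G[":
    print(f"Ctrl-{ch} -> ASCII {ord(ch) & 0x1F}")
# prints:
# Ctrl-G -> ASCII 7   (BEL, the teletype bell)
# Ctrl-[ -> ASCII 27  (ESC, the Escape character)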

I have to admit I haven't used the Function keys (F1-F12) much since my WordPerfect and Lotus 1-2-3 days long ago. I use the Escape key mostly to get out of full-screen mode when I am watching a YouTube video. But many developers use the vi or Vim editor to create their source code and depend on the Escape key. I am more an Emacs man myself, but sometimes use Vim for simple editing tasks. Vim is a modal editor, meaning there are separate text entry and editing modes. The Escape key is used to change modes. If you use Vim, you are constantly hitting the Escape key. Given the importance and long history of the Escape key (it was created in 1960), a developer who relies on Vim might be forgiven for thinking that the venerable key would be sticking around a bit longer.

IBM PC keyboard (credit http://www.vintage-computer.com/ibm_pc.shtml)

So if I were Apple and designing the next-generation MacBook Pro (MBP), eliminating the Escape key would not be high on my list of priorities. But that is what they did, turning the Escape key into an evanescent luminosity on the new Touchbar interface. This is depressing. Up to this point, the MBP has been a great developer machine. I have a "late 2013" 15-inch MBP. It is a fast, sturdy laptop. Since macOS (formerly Mac OS X) is a user interface veneer over BSD Unix, all the Unix development tools are there, as opposed to Windows devices, where installing a Unix environment is a pain. It is impossible to develop for macOS or iOS without an Apple machine. With my MBP I can develop for both Android and Apple. It is even possible to develop Windows software on a Mac, though I haven't tried this. Because of these advantages, lots of developers use an MBP.

It seems Apple has turned its back on developers. Fortunately my current machine is working well, and I don't need to buy a new one yet. Ideally, by the time I need a new machine, the next iteration of the 15-inch MBP will offer a standard keyboard and fix some of the other problems the new versions seem to be having. Apple should focus on features that developers and other professional computer users want in a computer: more than 16 GB of memory, the return of the MagSafe power cable, and at least one full-sized USB port so that old USB devices can be used without a dongle. They can continue to sell a Touchbar, USB-C-only version of the 15-inch MBP for people who like that sort of thing. The 13-inch MBP is available with and without a Touchbar, so why not do the same with the 15-inch version? Perhaps the death of the Escape key isn't the end of the world, but it does seem to symbolize a lack of interest on Apple's part in its developers. But if developers switch to non-Apple machines, they will no longer be able to develop Apple apps. In the long run this will hurt Apple's major money-maker, the iPhone.

Geeky Docs

I remember the disdain some of the EHR trainers had for their trainees back when our hospital system "went live" several years ago. Of course this disdain was tempered by their knowledge that if docs weren't so computer-illiterate, or the user interfaces of the EHR systems weren't so awful, or the EHR software wasn't so bug-ridden, their jobs wouldn't exist. So they soldiered bravely on, undaunted by grumpy old docs who now had to type their notes despite never having learned to touch-type, who had to reconcile medication lists a mile long, including meds like cinnamon that they couldn't care less the patient was taking but had to reconcile nevertheless, who had to painstakingly enter orders using an interface designed by an engineer who knew as much about medicine as — an engineer, and who were angry and resentful that this newfangled computer system was being shoved down their throats under threat of loss of government Medicare reimbursement. Given the tensions and personalities involved, it still amazes me that the EHR transition was accomplished without loss of life or limb.

Maybe the classes helped. Long before the go-live date, we went to EHR school. This consisted of several days of classes, during which the world of health care delivery was supposed to stop (it didn't) while all medical personnel sat around drinking coffee and listening to talks about how the EHR was supposed to work. Even though this was a useful education into what the life of a hospital administrator must be like, the real world of patients and disease tended to encroach on the world of mouse clicks and meaningful-use buttons, to the point that I skipped the last afternoon of classes and the final exam. Unfortunately my truancy was detected and, under penalty of garnishment of wages, I was forced to do a make-up class. Despite the rigorous training, the number of months that elapsed between EHR school and going live ensured that I and my colleagues pretty much forgot everything we learned — hence the need for the EHR trainers.

I was a little disappointed that I wasn't selected to be a "superuser." A superuser is a user who is technically savvy and enthusiastic about using the EHR — a true believer who could help other users who were having problems, even after the EHR trainer cadre had long since departed to initiate other hospital systems into the EHR religion. I suppose I failed to qualify on account of my lack of zealotry. I also kept my technical savvy under the radar. So I became merely a user. I found that, unlike my experience with other forms of technology, the EHR was making my life worse. Simple tasks became complex. My work slowed down. More mistakes were made. I was stunned. I could not think of any other example where a computer program was less efficient than the technology it was designed to replace. Yet EHR systems appeared to be exactly that.

So I decided to write a few blog posts about how bad our EHR was, but the EHR company, which employs people whose sole purpose is to scour the internet looking for screenshots or bad-mouthing of their precious software, caught wind of this and reported it to the administrators of the health care chain I worked for. After some angst, I agreed to shut up for a while, though now that I am retired, I don't feel bound by any non-disclosure agreements the hospital system signed with the EHR company.

EHR advocates have sometimes commented that once all the old, non-technological, non-touch-typing doctors die off, everyone will be pleased as punch with their EHRs. The new generation of doctors, raised on technology, able from infancy to handle a PlayStation controller with aplomb, will have no problem using EHRs. There is some truth to this, but it misses the point of my and others' criticisms of current EHR software. There are plenty of technologically sophisticated doctors of all ages who are uncomfortable with the state of EHR systems today. I have written computer software, and most would consider me one of these "geeky docs." Most of the critiques of EHRs that I have read have been written by tech-savvy doctors, not by the technological dinosaurs that the EHR pushers believe make up the majority of doctors today. None of us wants to go back to a pen-and-paper chart system. All of us want to see EHR systems improve in usability and interconnectivity. We all use computer software in our daily lives and know that EHR programs don't measure up to the standards that other computer programs meet. We don't like the secrecy of the EHR companies or the astronomical cost of the software. But mostly we just want the software to get better. This won't happen unless the software designers start listening to users. Tech-savvy docs need to be at the forefront of this. We need to push for change and not allow the EHR companies to keep falling back on their old excuse: if you docs only knew how to type, you'd love our system.

More Microsoft Migraines

When I ported the EP Calipers app (an electronic calipers app) to Microsoft Windows, I initially planned to write a Universal Windows Platform (UWP) app. The UWP is an initiative by Microsoft to allow developers to have one code base that can run on all Windows platforms, from PCs to phones to Xbox to HoloLens. UWP apps can be distributed from the Windows Store and are more secure than traditional Win32 programs, as they don't install DLL files all over the place and don't write to the registry. These apps are sandboxed, similar to iOS and Android apps. The UWP is a laudable goal, but inherently difficult to implement in Windows, which was not originally designed with security in mind, as opposed to Unix/BSD-based operating systems like Android and iOS. Unfortunately, I found the application programming interface (API) of the UWP difficult, and so I ported EP Calipers to .NET, creating a traditional Windows PC program. At the time I did this, there was no way to distribute a Windows PC program on the Windows Store, so I was forced to distribute the program via a third party. I would have much preferred to distribute via the Windows Store, as I thought this would create more visibility for the app.

With the Anniversary update of Windows 10, I learned about Project Centennial. This project provides a route to convert Windows desktop programs to APPX format — something that is not quite a UWP app, but something that can be distributed on the Windows Store. After reading the MSDN documentation and viewing some instructive videos, I launched into making the conversion.

The transition has not been as easy as I had hoped.

I downloaded the DesktopAppConverter (DAC) from the Windows Store. Despite this origin, it is not really an app. Instead it runs in PowerShell (which you must run as Administrator). However, before you can run the conversion, you also need to download a "Windows Image" (WIM) file that matches your current version of Windows. This WIM file is something akin to a Windows virtual machine, and the download is large, as you might expect (3.3 GB). I live in an area with poor internet service (3 Mbps speeds), so a download like this takes longer than a coffee break (3.3 GB is roughly 26,000 megabits, which at 3 Mbps works out to about two and a half hours). I used the Microsoft Edge browser, and it took several attempts to download the file. The browser would declare the download finished, even though only a little more than 2 GB had been downloaded. Finally it looked like I had the right number of gigabytes, but when I ran the DAC I got an obscure error. I suspected my download was corrupt. Yet another download, using Chrome, got me a non-corrupt file. I wish Microsoft would include a checksum with their downloads so they could be verified, but they don't. Anyway, this last download worked — to a point.

Ah, I should have read the fine print. The DAC only works on Windows Pro or Enterprise. I was running Windows Home edition. Sigh. $99 and a short download later, I was running Windows Pro. And I was off to the races — well, not exactly!

DAC works by running your program installer in a virtual Windows environment and capturing whatever changes your installer makes to the file system and registry. It then encapsulates that information into an appx file that can be uploaded to the Windows Store. The documentation suggests that you can use any installer, as long as it installs silently (i.e. without user input). I had created an installation package using Visual Studio's ClickOnce packaging system. Oops, too bad. That's the one kind of installer package that doesn't work with DAC.

In other words, they wanted something like an InstallShield installer: an MSI file. Well, I had an old version of InstallShield from about 10 years ago. Alas, it was so old that it wouldn't install itself on Windows 10. Well, maybe I'd have to buy a new edition of InstallShield. I went to the website. Hm, the cheapest version, InstallShield Express, was listed for $699. Yikes!

So I ended up using the open-source WiX installer, with the help of the Wax extension for Visual Studio, to create my MSI installer file (all for free). And so I could finally run the DAC, after tweaking the -InstallerArguments (the "/S" silent installation argument suggested in the documentation didn't work, and wasn't needed, since the WiX installer I made was silent by default). I also had to make sure my code was compiled as an x64 (64-bit) binary, because the DAC only supports 64-bit platforms at this time.

The next step was to sign the appx package. I created a certificate as suggested by the documentation, but it didn't work: I needed a trusted certificate. It is not clearly documented, but you have to right-click on your certificate and add it to the Trusted Root Certification Authorities store. After this, I actually had an appx file that I could install by double-clicking it.

I then spent a few days fiddling with icons. Although UWP apps use 60-plus icon images to create various sizes of tiles, it appears that appx files need just 3 images. These are added to the Assets folder generated by the DAC, and the AppxManifest.xml file has to be hand-edited to point to them. Here, as throughout the conversion process, the documentation is quite rough and hard to interpret. The 50×50-pixel image used by the application installer appeared fuzzy, so I substituted a higher-quality image. I still don't think all the icons (for example, the one used in the Open With dialog) work quite right. Anyway, after editing these files, you use the MakeAppx.exe program included in the Windows SDK to recreate your appx file, and then sign it again. I created some PowerShell scripts to do this.

I applied to submit apps to the Windows Store. I applied as a company, which costs $99 (which the website states is a one-time fee, though the email receipt I received mentions "Annual Developer Registration"). Because I registered as a company, I had to wait a few days to be approved, after getting a couple of phone calls from a company whose job is to verify that other companies are real companies — a step that Google and Apple don't bother with. But EP Studios was duly approved as a real company, such as it is, and all appeared ready for my first Windows Store submission.

And so I submitted my app. Only to get this error message:

Package acceptance validation error: Your developer account doesn’t have permission to submit apps converted with the Desktop App Converter at this time.

Whaaaaa?? Further research disclosed that even after paying your fee and becoming a certified Microsoft developer, you still have to ask Microsoft, "pretty please, can I upload apps converted with the Desktop App Converter?" I duly filled out the form to request this. Twice. And I have not heard back from Microsoft.

When I compare this app development and submission process to that of, say, Google or Apple, there is no comparison. The Microsoft process is a chamber of horrors, requiring more time, patience, and money than the others. Documentation is poor, and, presumably because not many developers are trying to publish to the Windows Store (I wonder why?), there is very little help in the usual places like StackOverflow. I would Google DAC error messages and get nothing but crickets chirping. All this is unfortunate, especially as I plan to leave for Europe in a few days and had wanted to get this app uploaded before then. My Windows development machine is a heavy, bulky desktop computer (remember them?) that certainly is not coming with me.

Maybe by the time I return to the US in a few months, Microsoft will be ready to allow me to submit my app.

Relic from Computer History

The M

Sitting on my mantel is a bronze letter M. This M has been in my family as long as I can remember. When I was growing up I didn't think about where it had come from. I knew it stood for our family name of Mann. Later on I learned the story of the M from my parents. As it turns out, this particular bronze M is a relic from a bygone era of computer history.

I grew up in the 1950s just outside of Philadelphia, a block north of the city limits. This was an Irish-Catholic neighborhood. Our neighbors all had 9 or 10 kids. Dads worked and moms stayed home. It was a fun time and place to grow up as there were kids to play with everywhere.

Our neighbors to the right of our house were the Williams (we always referred to them as the Williamses). The father worked in construction. He was the one who gave my father the M. The M came from a building that his company was demolishing. For many years that’s all I knew about the M.

Eckert-Mauchly building

When I was older I asked my parents for more details about the origin of the M. The M came from the lettering over the entrance to the Eckert-Mauchly Computer Corporation building, which stood at 3747 Ridge Avenue in Philadelphia in the early 1950s. I have only been able to find one picture of this building. It is low resolution and the lettering is not clear, but certainly the M in my possession looks similar to the M of Mauchly on the building.

During and after the Second World War there was a massive stimulus to science and technology. In England, Alan Turing and his colleagues at Bletchley Park built machines to break German codes: Turing's electromechanical "Bombe" was used to decode transmissions encrypted with the Enigma machine, and the later all-electronic "Colossus," built by the engineer Tommy Flowers, attacked the even more complex Lorenz cipher. There is little doubt that the intelligence gathered through these efforts was instrumental in the Allies' winning the war. Sadly, Turing's reward was prosecution and persecution for his homosexuality, leading to his suicide with a cyanide-laced apple — one of the most ignominious events in the history of humanity.

Mauchly, Eckert, and UNIVAC

In America, at the end of the war, John Mauchly and J. Presper Eckert joined forces at the Moore School of Engineering at the University of Pennsylvania to develop the ENIAC computer. Mauchly was what today we would call the "software" guy, and Eckert was the "hardware" guy. Their computer was as big as a house and contained thousands of vacuum tubes. It worked, though of course its processing power was infinitesimal compared with what we carry around in our pockets nowadays. After doing computing work for the Army at Penn, Mauchly and Eckert decided to form their own company. This decision was due to an issue still familiar today: a dispute over intellectual property rights with the university. In 1946 they formed the first commercial computer corporation. Originally called the Electronic Control Company, it was renamed the Eckert-Mauchly Computer Corporation (EMCC) in 1948. The company developed several computers that were sold mostly to government agencies such as the Census Bureau. Of these computers the most famous was UNIVAC, which was used to predict (successfully) the presidential election results on TV in 1952. Although we take this use of computers for granted now, at the time it was an amazing feat. Grace Hopper, the computer pioneer who only recently has been getting the recognition she deserves, worked at EMCC; she went on to develop the first computer language compiler. Unfortunately EMCC lost government funding due to suspicions that it had hired "communist-leaning" engineers (this was the McCarthy era), and the company was taken over in 1950 by the Remington Rand corporation, which at the time made typewriters. Eckert stayed on at Remington Rand (later Sperry, now Unisys), while Mauchly became a consultant. You can see both of them in all their glorious 1950s nerdiness in this YouTube video.

Marker at the site of EMCC

At some point in the early 1950s the original building was demolished. I have been unable to determine the exact year. And from that building, as far as I know, only the M sitting on my mantel remains.