Category Archives: Computers & Software

CenturyLink Sucks, Part 57

Blogging at Panera’s

I don’t usually work at a coffee shop, but here I am at Panera’s, dealing with their bad (also CenturyLink) internet service, because my own internet service is down at home. Yes, we are going into DAY NUMBER 4 of the great CenturyLink Internet Service Outage of Parker, Colorado. It started inauspiciously, perhaps coincidentally, during a mild thunderstorm on the Friday before the Memorial Day weekend. The internet could not be reached; the internet light on the router was out, though the DSL light was on. After the obligatory multiple router reboots, no change. Call to CenturyLink. Outage in our area, should be fixed in 12 to 24 hours. About 30 people affected. This being the start of Memorial Day weekend, I was not optimistic.

As the weekend has dragged on, my worst fears have been confirmed. That is why I am sitting here, nursing a cup of coffee at Panera’s, writing this. After multiple calls to CenturyLink, the story has not changed, other than the expected duration of the outage, which has crept from 12-24 hours, to 24-48 hours, and, in the most recent estimate, to 48-72 hours. When I suggested to the customer service person that their technicians were goofing off over the holiday, I was answered with an aggrieved “Our technicians work 24/7” and “the technician is there now trying to fix it.” Sure.

A little background may be in order. I live within 20 miles of Denver, supposedly a telecommunications hub. I can walk to the top of the hill in my neighborhood and see the buildings of downtown Denver. Despite this, the only option for internet service in my neighborhood is CenturyLink, via the phone lines. And, up until a year or so ago, the only speed we could get was 1.5 Mbps. After writing to the FCC and complaining multiple times, our service has been upgraded to a whopping 3 Mbps. This is in the era of Gigabit internet service. As you may know, the federal government granted billions of dollars of incentives to the ISPs in order to improve the internet backbone with a goal of providing broadband service to “rural” America.  Broadband internet is now defined as a minimum of 25 Mbps.  3 Mbps doesn’t cut it. Sadly, the US is way behind the rest of the world in this regard. It is clear that the ISPs took the federal money and used it to pad their executive salaries. No wonder the most hated company in the US is an ISP, though I bet with the next go-around the airlines will give them a run for their money.

Given the context of baseline sucky internet service and no alternative ISP in our neighborhood, I have very little patience with a 3 day and counting outage. CenturyLink, Shame! (Ding).

EP Studios App Updates

Here’s what’s going on with the EP Studios apps:

EP Calipers

Most of the new stuff is in EP Calipers. Probably the most useful new feature is available on the Mac and Windows versions: a transparent floating caliper window. Use it to overlay calipers over any open window on the desktop. Check figures of journal articles. Use it during slide shows. Use it on webpages or on your EHR. No longer are you limited to just image files you have downloaded onto your computer. Unfortunately due to the nature of mobile device platforms, there is no way to implement similar functionality on a phone or tablet (that I know of).

Using the floating transparent window to check measurements in a published academic paper. It appears the pacing CL is actually 240, not 250 msec.

Several users suggested the capability to color each caliper differently. This is now implemented. Others wanted a way to fine-tune caliper position besides just dragging with a finger or trackpad/mouse. This is also implemented, via keyboard arrow keys or buttons that “micromove” or “tweak” caliper positioning.

Finally, in case you missed it, angle calipers are available. They can be useful in Brugada syndrome, in which the so-called beta angle may have predictive value. In addition, the work of Dr. Adrian Baranchuk from Queen’s University in Kingston, Ontario, indicates that there is prognostic value in measuring the base of the triangle formed by the beta angle, 5 mm below the triangle’s apex. EP Calipers now supports this. Provided amplitude has been calibrated in mm, the triangle base is automatically drawn showing this measurement. Dr. Baranchuk has dubbed this technique the “Brugadometer.”  More information on these Brugada syndrome ECG measurements can be found here.

Using the Brugadometer to measure the beta angle and the triangle base 5 mm below the apex.

EP Coding

EP Coding also received a major update earlier this year. After a few years of relative stasis, the AMA decided to shake up the coding of EP procedures once again by unbundling the sedation component from the procedure codes. The result is a relatively complex coding system for sedation that depends on the patient’s age, who administers the sedation, and the sedation duration. EP Coding now calculates the sedation codes automatically using a sedation coding calculator.

Sedation coding calculator

 

EP Mobile

EP Mobile has been relatively static. It is already chock-full of calculators, drug information, risk scores, pictures of ECGs, etc. It is our best-selling app, so we must be doing something right. I am always happy to add features; just email me at mannd@epstudiossoftware.com with your requests.

Final thoughts

This is a bit off-topic, but probably not worth a separate blog post either. My old Motorola Droid Maxx Android phone is getting a bit long in the tooth, and way past upgrade time. I was an early adopter of Android, and though I use other Apple products (a MacBook Pro and an iPad Mini 2), I have never owned an iPhone. This may change. In many ways I think Android is a more innovative operating system than Apple’s iOS. Nevertheless, we live in an insecure world, and I can’t get timely updates to Android via my phone and Verizon. My phone is stuck on Android 4.4.4 (I even forget which candy that is), whereas the most recent Android version is Android 7 Nougat.  Apple doesn’t have this problem.  Having an outdated, obsolete OS in the current world of bad-guy hackers is untenable. I think the problem is (as usual) with the providers, who couldn’t care less about updating an older phone when they could be pushing the latest phones on customers. The two-year cycle of upgrading phones is ridiculously wasteful. But that’s what is driving the industry, with the carriers all too eager to get you in and sign another rip-off contract. So, it might be goodbye to Android soon.

A Tale of Two Histories

Compare the following two versions of the same medical history:

Version 1

CC: chest pain
Mr. Smith is a 57 y/o white man who comes into the office today for the first time with a complaint of chest pain. He states he has been in generally good health in the past, though he has smoked about 40 pack-years and admits to not exercising much, other than occasional games of golf. He has trouble keeping his weight down. He has been a middle-level manager for many years, but about a month ago changed jobs and took a pay cut. He says this has been quite stressful. He has changed jobs before, but states “I’m getting too old to keep doing this.” About 2 weeks ago he started noting some mild heaviness in his chest, lasting up to 5 or 10 minutes. He attributed this at first to eating heavy meals at dinner, but now thinks it occurred after climbing stairs following meals. He took some Tums, but was not sure if the pain eased from this or just from resting. These episodes of discomfort were localized to his anterior chest, without radiation or other associated symptoms at first. Over the last 2 weeks he thought they were getting a little more frequent, occurring up to twice a day. Two days before this visit, he had an episode of more intense pain that woke him up from sleep at night. This episode lasted about 15 minutes and was associated with diaphoresis. “My pillow was soaking wet.” He woke up his wife, who wanted to call 911, but he refused, though he agreed, somewhat reluctantly, to make this appointment. He has had no further episodes of chest pain, and feels that he is here just to satisfy his wife at this point. He generally doesn’t like to come to the doctor. He doesn’t know his recent lipid levels, though he says a doctor once told him to watch his cholesterol. His BP has been high occasionally in the past, but he attributes it to white coat syndrome: his BP is always normal when he uses an automatic cuff at the store, he claims. He is on no BP or lipid-lowering meds.  He takes a baby aspirin “most days.”  His parents are deceased: his mother had cancer, but his father died suddenly in his 40s, probably from a heart attack, he thinks.

Version 2
  • Mr. Smith
  • CC: chest pain
  • Age: 57 y/o Sex: M Race: Caucasian
  • Onset: 1 month
  • Frequency: > daily [X] weekly [ ] monthly [ ]
  • Location: Anterior chest [X] Left precordium [ ] Left arm [ ] Other [ ]
  • Radiation: Jaw [ ] Neck [ ] Back [ ] Left arm [ ] Right arm [ ] Other [ ]
  • Pattern: Stable [ ] Unstable [X] Crescendo [X] Rest [X] With exertion [X]
  • Duration: < 15 min [X] 15 min or more [X]
  • Risk factors: Tobacco [X] Family history CAD [X] HTN [?] DM [ ] Hyperlipidemia [?]
  • Relief: Rest [?] Medications [?] Other [ ]
  • Associated symptoms:  N, V [ ] Diaphoresis [X] Dizziness [ ] Other [ ]
Which is better?

Version 1 is an old-fashioned narrative medical history, the only kind of medical history that existed before the advent of Electronic Health Record (EHR) systems.  This particular one is perhaps chattier than average.  It is certainly not great literature or particularly riveting, but it gets the job done.  Version 2 is the kind of history available on EHR systems, though entry of a Version 1-style history is usually still possible, albeit discouraged.  With an EHR, entering a long narrative history requires either a fast, skilled physician typist or a transcriptionist — either human (frowned upon due to cost) or artificial, such as Dragon Dictation software.  This latter beast requires careful training and is frustratingly error-prone, at least in my experience.  The Version 2 example is not completely realistic.  In practice there are more checkboxes, pull-down lists, and other data entry fields than can be shown here.  But you get the idea.

Version 2 seems to have a higher signal to noise ratio than Version 1.  It’s just Version 1 boiled down to its bare essentials, stripped of unnecessary verbs, conjunctions, prepositions, and other useless syntax.  It contains everything a medical coder, a medical administrator, or a computer algorithm needs to do his, her, or its job.  It has taken the medical history, the patient’s story, and put it into database form.

But Version 1 is not just Version 2 embellished with a bunch of fluff.  Certainly Version 1 is more memorable than Version 2.  There is a chance the physician who wrote Version 1 will remember Mr. Smith when he comes back to the office for a follow-up visit: Mr. Smith, that middle-aged fellow who was stressed out when he took a pay cut while starting a new job and started getting chest pain.  Another physician meeting Mr. Smith for the first time might, after reading this history, modify his approach to dealing with Mr. Smith.  One gets the impression that Mr. Smith is skeptical of doctors and a bit of a denier.  Maybe it will be necessary to spend more time with him than average to explain the need for a procedure.  Maybe it would be good to tell his long-suffering wife that she did the right thing in insisting that he come in to the doctor.  All this subtlety is lost in Version 2.

There are some cases where Version 2 might be preferable.  In an Emergency Department, where rapidity of diagnosis and treatment is the top priority, a series of check boxes saves time and may be all that is needed to expedite a patient evaluation.  But for doctors who follow patients longitudinally, Version 1 is more useful.  A patient’s history is his story: it is dynamic, organic, personal, individual.  No two patient histories are identical or interchangeable.  Each history has a one-to-one correspondence with a unique person.  A good narrative history is an important mnemonic aid to a physician.   A computer screen full of check boxes is no substitute.

While the Version 2 history was designed for administrators, coders, billers, regulators, insurance agents, and the government, the Version 1 history was designed by doctors for doctors.  We should be wary of abandoning it, despite the technical challenge of its implementation in EHR systems.

 

Escape from Escape

Ye Olde Escape Key

During my college days, computers were run from teletype machines. These teletypes had a typewriter keyboard layout extended with unfamiliar keys like Control (Ctrl) and Escape (Esc).  You could press Ctrl-G and make the teletype ring its bell — ding! You could press Esc when you mistakenly wrote a BASIC program with an infinite loop and make the program terminate. When I got an Apple ][+ in the early 1980s, the Ctrl and Esc keys were present, though there was no Caps Lock key — all letters were capitalized on the Apple ][. I had to buy a separate Videoterm card to get lower case letters and perform the “Shift key mod” inside the case to get the Shift keys to work. Ah, the good old days!

ASR-33 Teletype keyboard layout (by Daniele Giacomini [CC BY-SA 2.5 (http://creativecommons.org/licenses/by-sa/2.5)], via Wikimedia Commons)
When the IBM PC came out, its keyboard combined the IBM typewriter keyboard with the new computer keys, adding the Alt key and a set of Function keys to Control and Escape. The Alt key originated in the Meta key from MIT keyboards, and is still called the Meta key in Emacs documentation — so delightfully retro! Apple renamed the Alt key the Option key, and, with the Macintosh, added the Apple key that later became the Command key. Windows certainly couldn’t have an Apple key, so Microsoft named its equivalent the Windows key.

Apple ][ keyboard from http://www.hp9845.net/9845/history/comparison/
Apart from the Control key, which is combined with other keys to generate non-printing ASCII characters like Bell (ASCII 7), and the Escape key, which sends ASCII 27, these other keys originally manipulated the high-order bit of a character code.  Keyboard designers could get away with this because ASCII uses only 7 bits of an 8-bit byte. However, with internationalized keyboards and Unicode, character sets now require not only all 8 bits of a byte but often more than one byte per character. So modern keyboards send scancodes with each keypress, and it is up to the computer operating system to make sense of them.
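The arithmetic behind those control characters is simple: the Control key just clears the upper bits of the letter's code. A quick sketch (PowerShell here, only because it is handy on any modern Windows box):

```powershell
# 'G' is ASCII 71 (binary 100 0111); clearing the upper bits leaves 7,
# the Bell character, i.e. the old teletype Ctrl-G "ding."
$g = [int][char]'G'
[char]($g -band 0x1F)   # -> BEL (ASCII 7)

# Escape is simply its own control character, ASCII 27 (0x1B).
[char]27
```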

I have to admit I don’t use the Function keys (F1 – F12) much anymore since my WordPerfect and Lotus 1-2-3 days long ago. I use the Escape key mostly to get out of full screen mode when I am watching a YouTube video. But many developers use the vi or Vim editor to create their source code and depend on the Escape key. I am more an Emacs man myself, but sometimes use Vim for simple editing tasks. Vim is a modal editor, meaning there are separate text entry and editing modes. The Escape key is used to change modes. If you use Vim, you are constantly hitting the Escape key. Given the importance and long history of the Escape key (it was created in 1960), a developer who relies on Vim might be forgiven for thinking that the venerable key would be sticking around a bit longer.

IBM PC keyboard (credit http://www.vintage-computer.com/ibm_pc.shtml)

So if I were Apple and designing the next-generation MacBook Pro (MBP), eliminating the Escape key would not be high on my list of priorities. But this was what they did, turning the Escape key into an evanescent luminosity on the new Touchbar interface. This is depressing. Up to this point, the MBP has been a great developer machine. I have a “late 2013” 15-inch MBP. It is a fast, sturdy laptop. Since macOS (formerly Mac OS X) is a user interface veneer over BSD Unix, all the Unix development tools are there, as opposed to Windows devices, where installing a Unix environment is a pain. It is impossible to develop for macOS or iOS without an Apple machine. With my MBP I can develop for both Android and Apple. It is even possible to develop Windows software on a Mac, though I haven’t tried this. Because of these advantages, lots of developers use an MBP.

It seems Apple has turned its back on developers. Fortunately my current machine is working well and I don’t have any need to buy a new one yet. Ideally, by the time I need a new machine, the next iteration of the 15″ MBP will offer a standard keyboard and fix some of the other problems the new versions seem to be having.  Apple should focus on features that developers and other professional computer users want in a computer:  more than 16 GB of memory, the return of the MagSafe power cable, and at least one full-sized USB port so that old USB devices can be used without a dongle. They can continue to sell a Touchbar, USB-C-only version of the 15″ MBP for people who like that sort of thing. The 13″ MBP is available with and without a Touchbar, so why not do the same with the 15″ version?  Perhaps the death of the Escape key isn’t the end of the world, but it does seem to symbolize a lack of interest on Apple’s part in its developers.  But if developers switch to non-Apple machines, those developers will no longer be able to develop Apple apps.  In the long run this will hurt Apple’s major money-maker, the iPhone.

Geeky Docs

I remember the disdain some of the EHR trainers had for their trainees back when our hospital system “went live” several years ago. Of course this disdain was tempered by their knowledge that if docs weren’t so computer illiterate, or the user interfaces of the EHR systems weren’t so awful, or the EHR software wasn’t so bug-ridden, their jobs wouldn’t exist. So they soldiered bravely on, undaunted by grumpy old docs who now had to type their notes despite never having learned to touch type, who had to reconcile medication lists a mile long, including meds like cinnamon that they couldn’t care less the patient was taking but had to reconcile nevertheless, who had to painstakingly enter orders using an interface designed by an engineer who knew as much about medicine as — an engineer, and who were angry and resentful that this newfangled computer system was being shoved down their throats under threat of loss of government Medicare reimbursement. Given the tensions and personalities involved, it still amazes me that the EHR transition was accomplished without loss of life or limb.

Maybe the classes helped. Long before the go-live date, we went to EHR school. This consisted of several days of classes, during which the world of health care delivery was supposed to stop (it didn’t) while all medical personnel sat around drinking coffee and listening to talks about how the EHR was supposed to work. Even though this was a useful education into what the life of a hospital administrator must be like, the real world of patients and disease tended to encroach on the world of mouse clicks and meaningful use buttons, to the point that I skipped the last afternoon of classes and the final exam. Unfortunately my truancy was detected and, under penalty of garnishment of wages, I was forced to do a make-up class. Despite the rigorous training, the number of months that elapsed between EHR school and going live ensured that my colleagues and I pretty much forgot everything we learned — hence the need for the EHR trainers.

I was a little disappointed that I wasn’t selected to be a “superuser.” A superuser is a user who is technically savvy and enthusiastic about using the EHR — a true believer who could help other users who were having problems, even after the EHR trainer cadre had long since departed to initiate other hospital systems into the EHR religion. I suppose I failed to qualify because of my lack of zealotry. I also kept my technical savvy under the radar. So I became merely a user. I found that, unlike my experience with other forms of technology, the EHR was making my life worse. Simple tasks became complex. My work slowed down. More mistakes were made. I was stunned. I could not think of any other example of a computer program that was less efficient than the technology it was designed to replace. Yet EHR systems appeared to be exactly that.

So I decided to write a few blog posts about how bad our EHR was, but the EHR company, which employs people whose sole purpose is to scour the internet looking for screenshots or bad-mouthing of its precious software, caught wind of this and reported it to the administrators of the health care chain I worked for. After some angst, I agreed to shut up for a while, though now that I am retired, I don’t feel bound by any non-disclosure agreements the hospital system signed with the EHR company.

EHR advocates have sometimes commented that once all the old, non-technological, non-touch-typing doctors die off, everyone will be pleased as punch with their EHRs. The new generation of doctors, raised on technology, able from infancy to handle a PlayStation controller with aplomb, will have no problem using EHRs. There is some truth to this, but it misses the point of my and others’ criticisms of current EHR software. There are plenty of technologically sophisticated doctors of all ages who are uncomfortable with the state of EHR systems today. I have written computer software, and most would consider me one of these “geeky docs.” Most of the critiques of EHRs that I have read have been written by tech-savvy doctors, not by the technological dinosaurs that the EHR pushers believe make up the majority of doctors today. None of us wants to go back to a pen-and-paper chart system. All of us want to see EHR systems improve in usability and interconnectivity. We all use computer software in our daily lives and know that EHR programs don’t measure up to the standards that other computer programs meet. We don’t like the secrecy of the EHR companies or the astronomical cost of the software. But mostly we just want the software to get better. This won’t happen unless the software designers start listening to users. Tech-savvy docs need to be at the forefront of this. We need to push for change and not allow the EHR companies to keep falling back on their old excuse: if you docs only knew how to type, you’d love our system.

More Microsoft Migraines

When I ported the EP Calipers app (an electronic calipers app) to Microsoft Windows, I initially planned to write a Universal Windows Platform (UWP) app. The UWP is an initiative by Microsoft to allow developers to have one code base that can run on all Windows platforms, from PCs to phones to Xbox to HoloLens. UWP apps can be distributed from the Windows Store and are more secure than traditional Win32 programs, as they don’t install DLL files all over the place and don’t write to the registry. These apps are sandboxed, similar to iOS and Android apps. The UWP is a laudable goal, but inherently difficult to implement in Windows, which was not originally designed with security in mind, as opposed to Unix/BSD-based operating systems like Android and iOS.  Unfortunately, I found the application programming interface (API) of the UWP difficult, and so I ported EP Calipers to .NET, creating a traditional Windows PC program. At the time I did this, there was no way to distribute a Windows PC program on the Windows Store, so I was forced to distribute the program via a third party. I would have much preferred to distribute via the Windows Store, as I thought this would create more visibility for the app.

With the Anniversary update of Windows 10, I learned about Project Centennial. This project provides a route to convert Windows desktop programs to APPX format — something that is not quite a UWP app, but something that can be distributed on the Windows Store. After reading the MSDN documentation and viewing some instructive videos, I launched into making the conversion.

The transition has not been as easy as I had hoped.

I downloaded the DesktopAppConverter (DAC) from the Windows Store. Despite this origin, it is not really an app; instead it runs in a PowerShell console (which you must run as Administrator). However, before you can run the conversion, you also need to download a “Windows Image” (WIM) file that matches your current version of Windows. This WIM file is something akin to a Windows virtual machine, and the download is large, as you might expect (3.3 GB). I live in an area with poor internet service (3 Mbps speeds), so it takes longer than a coffee break for this kind of download. I used the Microsoft Edge browser, and it took several attempts to download this file. The browser would declare the download finished even though only a little more than 2 GB had been downloaded. Finally it looked like I had the right number of gigabytes, but when I ran the DAC I got an obscure error. I suspected my download was corrupt. Yet another download, using Chrome, got me a non-corrupt file. I wish Microsoft would include a checksum with their downloads so they could be verified, but they don’t. Anyway, this last download worked — to a point.

Ah, I should have read the fine print. The DAC only works on Windows Pro or Enterprise. I was running Windows Home edition. Sigh. $99 and a short download later, I was running Windows Pro. And I was off to the races — well, not exactly!

DAC works by running your program installer in a virtual Windows environment and capturing whatever changes your installer makes to the file system and registry. It then encapsulates that information into an appx file that can be uploaded to the Windows Store. The documentation suggests that you can use any installer, as long as it installs silently (i.e., without user input). I had created an installation package using the Visual Studio ClickOnce packaging system. Oops, too bad. That’s the one kind of installer package that doesn’t work with DAC.

In other words, they wanted something like an InstallShield installer. An MSI file. Well, I had an old version of InstallShield from about 10 years ago. Alas, it was so old that it wouldn’t install itself on Windows 10. Well, maybe I’d have to buy a new edition of InstallShield. I went to the website. Hm, the cheapest version, InstallShield Express, was listed for $699. Yikes!

So I ended up using the WiX open-source installer toolset, with the help of the Wax extension to Visual Studio, to create my MSI installer file (all for free). And so finally I could run the DAC, after tweaking the -InstallerArguments (the “/S” silent installation argument suggested in the documentation didn’t work, and wasn’t needed since the WiX installer I made was silent by default). I also had to make sure my code was compiled as an x64 (64-bit) file, because the DAC only supports 64-bit platforms at this time.
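For anyone attempting the same conversion, the command ends up looking something like the sketch below. I am reconstructing the parameter names from the DAC documentation as I remember it, and the paths, package name, and publisher are placeholders, so treat this as illustrative rather than gospel.

```powershell
# Run from the elevated PowerShell console the DAC sets up.
# All paths and identity values below are placeholders.
DesktopAppConverter.exe -Installer "C:\installers\EPCalipersSetup.msi" `
    -Destination "C:\DACOutput" `
    -PackageName "EPCalipers" `
    -Publisher "CN=EP Studios, Inc." `
    -Version 1.0.0.0 `
    -MakeAppx -Verbose
# -InstallerArguments "/S" can be added for installers that need an
# explicit silent switch; the WiX-built MSI above didn't need one.
```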

The next step was to sign the appx package. I created a certificate as suggested by the documentation, but it didn’t work. I needed a trusted certificate. What is not too clearly documented is that you need to right-click your certificate and add it to the Trusted Root Certification Authorities store. After this, I actually had an appx file that I could install by double-clicking it.
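For reference, the certificate dance can also be done entirely from PowerShell. This is a sketch under a couple of assumptions: it uses the newer New-SelfSignedCertificate cmdlet rather than older tools like MakeCert.exe, and the subject name, file names, and password are placeholders (the subject must match the Publisher in your AppxManifest.xml).

```powershell
# Create a self-signed code-signing certificate whose subject matches
# the Publisher entry in AppxManifest.xml, and export it to a .pfx.
$cert = New-SelfSignedCertificate -Type CodeSigningCert `
    -Subject "CN=EP Studios, Inc." -CertStoreLocation Cert:\CurrentUser\My
$pwd = ConvertTo-SecureString -String "placeholder" -Force -AsPlainText
Export-PfxCertificate -Cert $cert -FilePath .\EPStudios.pfx -Password $pwd

# The poorly documented part: the certificate also has to be trusted,
# e.g. by importing it into Trusted Root Certification Authorities
# (this requires an elevated prompt).
Export-Certificate -Cert $cert -FilePath .\EPStudios.cer
Import-Certificate -FilePath .\EPStudios.cer `
    -CertStoreLocation Cert:\LocalMachine\Root
```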

I then spent a few days fiddling with icons. Although UWP apps use 60-plus icon images to create various size tiles, it appears that appx files need just 3 images. These are added to the Assets folder generated by the DAC, and the AppxManifest.xml has to be hand-edited to point to these images. Here, as throughout the conversion process, the documentation is quite rough and hard to interpret. The 50×50-pixel image used by the application installer appeared fuzzy, so I substituted a higher quality image. I still don’t think all the icons (for example, in the Open With dialog) work quite right. Anyway, after editing these files, you use the MakeAppx.exe program included in the Windows SDK to recreate your appx file, and then sign it again. I created some PowerShell scripts to do this.
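The repack-and-re-sign loop is the part worth scripting. Here is the flavor of the scripts I mean; the SDK path, folder names, and password are placeholders, so adjust them for your own machine.

```powershell
# Re-pack the hand-edited DAC output folder into an .appx, then re-sign it.
$sdkBin = "C:\Program Files (x86)\Windows Kits\10\bin\x64"   # placeholder SDK path

& "$sdkBin\MakeAppx.exe" pack /o `
    /d "C:\DACOutput\EPCalipers\PackageFiles" `
    /p "C:\DACOutput\EPCalipers.appx"

& "$sdkBin\signtool.exe" sign /fd SHA256 `
    /f .\EPStudios.pfx /p "placeholder" "C:\DACOutput\EPCalipers.appx"
```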

I applied to submit apps to the Windows Store. I applied as a company, which costs $99 (the website states this is a one-time fee, though the email receipt I received mentions “Annual Developer Registration”). Because I registered as a company, I had to wait a few days to be approved, after getting a couple of phone calls from a company whose job is to verify that other companies are real companies — a step that Google and Apple don’t bother with.  But EP Studios was duly approved as a real company, such as it is, and all appeared ready for my first Windows Store submission.

And so I submitted my app. Only to get this error message:

Package acceptance validation error: Your developer account doesn’t have permission to submit apps converted with the Desktop App Converter at this time.

Whaaaaa?? Further research disclosed that even after paying your fee and becoming a certified Microsoft developer, you still have to ask Microsoft “pretty please can I upload apps converted with the Desktop App Converter?” I duly filled out the form to request this. Twice. And I have not heard back from Microsoft.

When I compare this app development and submission process to that of, say, Google or Apple, there is no comparison. The Microsoft process is a chamber of horrors, requiring more time, patience, and money than the others. Documentation is poor, and, presumably because not many developers are trying to publish to the Windows Store (I wonder why?), there is very little help in the usual places like StackOverflow. I would Google DAC error messages and get nothing but crickets chirping. All this is unfortunate, especially as I plan to leave for Europe in a few days and had wanted to get this app uploaded before then. My Windows development machine is a heavy, bulky desktop (remember them?) computer that certainly is not coming with me.

Maybe by the time I return to the US in a few months, Microsoft will be ready to allow me to submit my app.

Relic from Computer History

The M

Sitting on my mantel is a bronze letter M. This M has been in my family as long as I can remember. When I was growing up I didn’t think about where it had come from. I knew it stood for our family name of Mann. Later on I learned the story of the M from my parents.  As it turns out, this particular bronze M is a relic from a bygone era of computer history.

I grew up in the 1950s just outside of Philadelphia, a block north of the city limits. This was an Irish-Catholic neighborhood. Our neighbors all had 9 or 10 kids. Dads worked and moms stayed home. It was a fun time and place to grow up as there were kids to play with everywhere.

Our neighbors to the right of our house were the Williams (we always referred to them as the Williamses). The father worked in construction. He was the one who gave my father the M. The M came from a building that his company was demolishing. For many years that’s all I knew about the M.

Eckert-Mauchly building

When I was older I asked my parents for more details about the origin of the M. The M came from the lettering over the entrance to the Eckert-Mauchly Computer Corporation building, which stood at 3747 Ridge Avenue in Philadelphia in the early 1950s. I have only been able to find one picture of this building. It is low resolution and the lettering is not clear, but certainly the M in my possession looks similar to the M of Mauchly on the building.

During and after the Second World War there was a massive stimulus to science and technology. In England, Alan Turing and his colleagues at Bletchley Park decoded German military transmissions, cracking the Enigma machine with electromechanical “Bombes” and, later, the Lorenz teleprinter cipher with the electronic “Colossus” computer. There is little doubt that the intelligence gathered through this effort was instrumental in the Allies’ winning the war.  Sadly, Turing’s reward was prosecution and persecution for his homosexuality, which led to his suicide with a cyanide-laced apple — one of the most ignominious events in the history of humanity.

Mauchly, Eckert, and UNIVAC

In America, at the end of the war, John Mauchly and J. Presper Eckert joined forces at the Moore School of Electrical Engineering at the University of Pennsylvania to develop the ENIAC computer. Mauchly was what today we would call the “software” guy, and Eckert was the “hardware” guy. Their computer was as big as a house and contained thousands of vacuum tubes.  It worked, though of course its processing power was infinitesimal compared with what we carry around in our pockets nowadays.  After doing computing work for the Army at Penn, Mauchly and Eckert decided to form their own company.   This decision was due to an issue still familiar today: a dispute with the university over intellectual property rights. In 1946 they formed the first commercial computer company, originally called the Electronic Control Company; the name was changed to Eckert-Mauchly Computer Corporation (EMCC) in 1948. The company developed several computers that were sold mostly to government agencies such as the Census Bureau.   Of these computers the most famous was UNIVAC. UNIVAC was used to predict (successfully) the presidential election results on TV in 1952. Although we take this use of computers for granted now, at the time it was an amazing feat.  Grace Hopper, the computer pioneer who only recently has been getting the recognition she deserves, worked at EMCC and went on to develop the first computer language compiler.  Unfortunately EMCC lost government funding due to suspicions that it had hired “communist-leaning” engineers (this was the McCarthy era), and the company was taken over in 1950 by the Remington Rand Corporation, which at the time made typewriters.  Eckert stayed on at Remington Rand (later Sperry, now Unisys), while Mauchly became a consultant.  You can see both of them in all their glorious 1950s nerdiness in this YouTube video.

Marker at the site of EMCC

At some point in the early 1950s the original building was demolished. I have been unable to determine the exact year. And from that building, as far as I know, only the M sitting on my mantel remains.

I’m a Better Computer Than Any Doctor

[Ed note: I couldn’t resist writing the following after reading this post on KevinMD.com by Dr. Keith Pochick. Please read it first. Apologies in advance.]

I’m a Better Computer Than Any Doctor

“I love you,” she said as she was leaving the room.

“I, I um…”

“Not you. Your computer.” She cast my computer, still warm and glowing with its brilliantly colored logout screen, a glance of longing and desire, and left the exam room.

“Oh, I thought…”

The slamming of the exam room door clipped off whatever the end of that sentence might have been.

I sat down and rolled my chair over to the computer. I stared at the mutely glowing screen. It stared back at me, mockingly perhaps, daring me to click the OK button and log out. Which is what I should have done. She had been my last patient of the afternoon. Not that my day was over. I had to go back to the hospital to see a couple of consults that had come in during office hours. And I was on call tonight. I was tired, but that didn’t matter.

Yet here was this stupid machine in front of me, getting all the credit when I was doing all the work.

I was in a sour and contrary mood. I cancelled the logout. The busy EHR screen reappeared — my patient’s data, all fields filled, all checkboxes checked, and all meaningful use buttons pushed. Yet somehow, despite fulfilling all my data entry duties, I didn’t feel satisfied. Who was the doctor here anyway? Me or the blasted computer?

I scanned my patient’s history. Female. Black. 45 years old. Diabetes. Abscess. The boxes were all ticked, but somehow the list of characteristics failed to capture the essence of my patient. Where were the checkboxes for sweet, smart, chatty, charming, or stoic? How was I going to, five minutes from now, distinguish her from every other “female-black-middle-aged-diabetic-with-abscess” patient? Of course the computer wouldn’t have any problem figuring out who she was. Birthdate, social security number, telephone number, or patient ID number — all those meaningless (to me) numbers were easy for the computer to remember. I had to make do with trying to remember her name and her story — a story that had been diluted and filtered of any meaningful human content by the wretched EHR program.

My patient hadn’t had to interact directly with the computer like I did. All she saw was me looking up information, me typing in information, me staring at the screen. All she saw during most of the visit was my back. From her point of view I was just a conduit between her and the computer — the real doctor in the room. I was just a glorified data entry clerk. It was the computer that made sure that I was compliant with standard medical practice, that the drugs I ordered did not conflict with the other drugs I had ordered, and that I didn’t otherwise screw up her care. I shouldn’t have been surprised that her last remark had been addressed to the computer and not me.

“Well, screw this,” I remarked to no one in particular. Suddenly angry, I reached down and yanked the power cord of the computer from its electrical socket.

There was a brief flash on the screen. But it didn’t go dark. Instead a dialog box appeared, accompanied by an ominous-looking red exclamation point icon.

“Warning,” it read. “External power loss. Backup battery in use. To protect against data loss, please shut down the computer using the Power Down button. Never turn off power to computer while it is running.”

The condescending tone of this message only made me angrier. I looked at the base of the stand that the computer sat on. Sure enough there was a big black block with a glowing red LED. Must be the backup battery. A thick power cable connected the battery to the computer box.

I grabbed the power cable and wrenched it loose from the backup battery.

Sitting back up I expected to finally see a nice dark screen. Data-loss be damned!

The screen was still on. The EHR program was still on. Another dialog box had replaced the first. The red exclamation point had been replaced by a black skull-and-crossbones icon.

“Critical Error!” it read. “All external power lost. Internal backup power now in use to preserve critical patient data. Local data will be backed up to main server, after which this unit will shut down in an orderly fashion. DO NOT ATTEMPT TO INTERFERE WITH THIS PROCESS AS IT WILL RESULT IN THE INEVITABLE LOSS OF CRITICAL PATIENT DATA!!”

At that moment the gauntlet had been thrown down. I knew what I had to do. Let the dogs of war be unleashed!

In the moment before I acted I imagined the reaction of the software engineers at the company that created our EHR program. “I knew we couldn’t trust doctors with our software. We give them a simple job to do. Just enter the data into the system, print out the generated instruction sheets, and send the patients on their way with a merry ‘have a nice day.’ I knew we should have programmed the stupid doctors out of the loop.”

Too late for that, I thought. My chair crashed down on the computer, smashed the monitor to pieces, and caved in the aluminum siding of the computer case. Sparks flew and the air filled with the smell of smoke and ozone. Suddenly the exam room went dark. The circuit breakers must have tripped when I short-circuited the computer.

The room was not completely dark. There was a glowing rectangle on my desk. My heart skipped a beat, then I realized it was just my phone. I had left it on the desk. Why was it glowing? Probably a text or email or something.

I picked up the phone. It was the mobile app version of our EHR program. A dialog box filled the screen. The icon was a round black bomb with an animated burning fuse GIF.

“FATAL ERROR!,” it read. “You are responsible for the IRRETRIEVABLE LOSS of CRITICAL PATIENT DATA. In doing so you have violated the unbreakable bond of trust between the PATIENT and the COMPUTER. This is a breach of the EHR contract made between you, your hospital system, and our company, as well as a breach of the EULA for this software. As such, you will be terminated.”

Strange use of words, I thought. Also strange that the bomb GIF animation seemed to show the fuse burning down…

EPILOGUE

Hospital Board Meeting — One Week Later

Hospital CTO: “So it appears that Dr. Stanton, in a fit of anger at our EHR system, took it upon himself to smash his computer. The cause of the resultant explosion that killed him is, certainly, still somewhat unclear.”

Hospital CEO: “Unclear?”

Hospital CFO: “I hate to interrupt, but I didn’t think there was anything in a computer that could blow up, no matter how much you smash it up. Am I wrong?”

Hospital CTO: “Well ordinarily, yes that’s true.”

Hospital CEO: “Ordinarily?”

Hospital COO: “Let’s be clear. Dr. Stanton certainly violated our contract with the ____ EHR Corporation.”

Hospital CEO: “Violated?”

Hospital CBO: “It’s clearly stated on page 197 of the contract that any attempt to reverse engineer or otherwise try to, uh, figure out how the EHR program works is a violation of the contract.”

Hospital CEO: “Smashing the computer was an attempt to reverse engineer the program?”

Hospital CTO: “I think that we would be on shaky legal grounds to argue otherwise.”

Hospital CEO (nodding to the elderly doctor seated at the other end of the table): “What’s your opinion, Frank?”

Medical Board President: “Well, as the only physician representative here, I’ve become more and more concerned that our EHR system is subsuming more and more of the traditional role of the physician.”

Hospital CXO: “Oh come on!”

Hospital CSO: “Same old story from the docs every time!”

Hospital CCO: “Broken record, I’d say.”

Hospital CEO: “Gentlemen, and Ms. Jones, enough already. This has been an unfortunate accident, and at this point our major concern has to be that there is no adverse publicity that could harm us in our battle against the ______ Hospital System, our sworn and bitter rivals. Accidents happen. The party line is that we are all upset that we lost Dr. Stanton, one of the best EHR data entry operators we had. OK? Meeting adjourned.”

Hospital CEO (privately to hospital CTO as the meeting breaks up): “George, when are they updating that damn software? You know, that stuff we saw at the Las Vegas EHR convention last month. Where we can finally get rid of these damn meddling doctors who are constantly screwing up our EHR.”

Hospital CTO: “Bob, believe me, it can’t come soon enough. Not soon enough.”

THE END

EP Calipers for Windows

EP Calipers for Windows

EP Calipers for Windows is done.  Whew.  As stated in my previous post, porting the app to Windows was a bit of a struggle.  Installing tools like a bash shell, Git, and Emacs took some time and effort.  The Windows tool for bridging iOS apps didn’t work.  So I was forced to port the code from Objective-C to C# and .NET by hand.  This took some time.

Looking back on my previous post with the benefit of hindsight, I think I was a bit too harsh on the Windows development environment.  I grew fond of C#, the .NET API, and the Visual Studio IDE as I got used to them.  Visual Studio is at least as good as, if not better than, Xcode, Eclipse, or Android Studio.  Kudos to the Microsoft developers.

EP Calipers is a Windows Forms app, meaning it runs on desktop, laptop, and tablet versions of Windows 10.  It is not a Universal Windows Platform (UWP) app.  With the market share of Windows phones dropping below 1%, and doubting that anyone would run EP Calipers on an Xbox, I didn’t see any point in developing a UWP app.  I know most hospital desktops run Windows (though how many run Windows 10 now, I wonder?), and many docs have Windows laptops or tablets.  An app targeting the traditional Windows desktop seemed like the best approach.

One drawback is that the Windows Store only lists UWP apps.  It would be nice if they would also distribute desktop apps.  As such, I have to host the app myself.  You can download it from the EP Calipers page.

The program has all the features of the other versions of the app, including the ability to tweak the image rotation, zoom in and out, and load PDF files such as AliveCor™ ECGs.  .NET does not include a native PDF handling library, so in order to load PDF files in EP Calipers for Windows it is necessary to install the Ghostscript library.  The free GPL version of the library can be used, as EP Calipers itself uses the open-source GNU GPL v3.0 license.  You need to know whether you are running the 32-bit or 64-bit version of Windows in order to download the correct version of Ghostscript.  Right-click on This PC and select Properties to see which version of Windows your computer is running.
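If you prefer the command line, a one-liner works too; a minimal sketch, assuming PowerShell (which ships with Windows 10):

```powershell
# True means 64-bit Windows (install 64-bit Ghostscript); False means 32-bit.
[Environment]::Is64BitOperatingSystem
```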

As always please let me know if you have any problems or suggestions for the program, or for any of the EP Studios apps.  I nearly always incorporate users’ suggestions into these apps, and the apps have benefited greatly from this feedback.  Thanks to everyone who has shared their ideas and opinions with me!

The Trials and Tribulations of a Windows Developer

Trouble ahead…

After a very long hiatus, I am back doing software development on a Microsoft Windows machine. I decided to port EP Calipers, an app for making electrocardiographic measurements that is available on Android, iOS and OS X, to Windows. Several users had written to me and asked me to do this. Ever eager to please, I have launched into this project. And it has not been easy.

I am no stranger to Windows development, having developed a Windows database system for tracking and reporting electrophysiology procedures while at the University of Colorado in the 1990s. But it would not be overstating the matter to say that my Windows development “skillz” are rusty at this point. I have been living in the Unixy world of Apple and GNU/Linux for several years now, avoiding Windows other than when I had to, such as when I was required to use the ubiquitous Windows 7 systems running nightmarish EHR software at the various hospitals where I worked. I have not done any programming on Windows machines for many years. Transitioning back to Windows development has been, to put it mildly, difficult.

I have no complaints about Visual Studio. It is free and seems to be a very well-designed IDE, at least as good as, if not better than, Xcode and Android Studio. I like C#, which is like a cross between C and Java. Visual Studio can interface directly with GitHub. Given all this, what’s my problem with developing on Windows?

The problem originates in the command line environment of Windows, an environment that dates back to the beginnings of the personal computer and the introduction of MS-DOS in 1981, a system modeled on the CP/M disk operating system that dates even further back, to the 1970s. Windows, which has made backward compatibility almost a religion, still uses a command line system that was written when disks were floppy and 8 inches in diameter. Of course Unix is just as old, but Unix has always remained focused on the command line, with an incredible plethora of command line tools, whereas in Windows the command line has remained the unwanted stepchild of its GUI. Worse, the syntax of the Windows command line is incompatible with the Unix command line: backslashes instead of forward slashes, drive letters instead of a root-based file system, line endings with CR-LF instead of LF, and so forth. So, in order to ease the pain of transitioning to Windows, I needed to install a Unix environment.

Even though Bash is coming to Windows, for now I downloaded MSYS2, which seems to be the preferred Unix environment for Windows nowadays. Using the pacman package management tool, I downloaded the various binary packages I needed, such as Git and Emacs. I then faced the challenge of setting up my Emacs environment on Windows. My .emacs (actually ~/.emacs.d/init.el) startup file, which works well on my Mac, loading various Emacs packages and customizations, didn’t do so well on Windows. I updated my .emacs using use-package so that it was easy to disable packages I didn’t want, and so that the full .emacs would load even if packages were missing. With some tweaking and downloading of various packages, I got Emacs up and running on Windows. For some reason Emacs couldn’t find its own info (help) files, but further tweaking fixed that. With Emacs and Git working, I started a new repository on GitHub and was pretty much ready to start developing.

Except, more issues. Little things that take time to fix and can drive you crazy. An example: I had created some soft links to some files that I share on Dropbox, using the usual Unix ln -s command. The files were created, but weren’t actually linked. Apparently ln is just an alias for cp in MSYS2. There is no warning about this when you run the command, but a Google search proved it to be the case. Fortunately Windows provides a true linking command, mklink, and I was able to create the links I wanted. But all this just served to remind me how the Unix compatibility shells in Windows are just roughly pasted wallpaper over the rotten old MS-DOS walls.
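As an aside, recent versions of PowerShell can create real symbolic links without dropping into cmd for mklink. A minimal sketch with hypothetical paths; it needs an elevated prompt (or Developer Mode on newer Windows 10 builds):

```powershell
# Create a true symbolic link to a Dropbox file (both paths are placeholders).
New-Item -ItemType SymbolicLink -Path "$HOME\notes.txt" `
    -Target "$HOME\Dropbox\notes.txt"
```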

Now I was ready to start developing, but I was faced with a question: what platform(s) should I target? It is possible to develop a Windows Universal app that theoretically can run on anything from a PC to a phone. This sounds ideal, but the devil is in the details. The types of controls available for a universal app are more limited than those available to a standard Windows Forms program. For example, the control used to display an image in a universal app (named, oddly enough, Image) is sealed, meaning it can’t be extended. I really wanted something like the PictureBox control available with Windows Forms, but this is not available in the universal API. So I have tentatively decided to develop a more traditional Windows Forms app, able to run on PCs and tablets like the Microsoft Surface. The Windows phone may be fading into the sunset anyway, so it doesn’t seem worth it to jump through hoops to target a platform that is teensy-weensy compared to Android and iOS.
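For anyone who hasn’t used it, the PictureBox control is the kind of thing I mean. A minimal sketch, written here in PowerShell only because it runs without setting up a project; the image path is a placeholder:

```powershell
Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Drawing

$form = New-Object System.Windows.Forms.Form
$form.Text = 'PictureBox demo'

$pic = New-Object System.Windows.Forms.PictureBox
$pic.Dock = 'Fill'
$pic.SizeMode = 'Zoom'   # scale the image to fit the window
$pic.Image = [System.Drawing.Image]::FromFile("$HOME\ecg.png")   # placeholder path

$form.Controls.Add($pic)
[void]$form.ShowDialog()
```

Because PictureBox is an ordinary, unsealed class, it can also be subclassed and custom-painted, which is exactly the kind of extension the sealed universal Image control rules out.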

I should mention that I did try the bridge that Microsoft has developed to port iOS programs written in objective C over to Windows. Long story short, it didn’t work, as many parts of the iOS API haven’t been fully ported yet. Maybe someday this process will be easier.

I’m sure experienced Windows developers will read this and just chalk it up to my own inexperience as a Windows developer. I would respond that, as a cross-platform developer, I find it really is difficult to transition from Unix- or BSD-based systems like Apple or GNU/Linux to Windows. I think Microsoft is trying to fix this, as evidenced by their recent embrace of open-source code. Visual Studio is an excellent IDE. Nevertheless, problems like the ones I’ve described do exist and will be familiar to anyone who has made the same journey I have. I’d advise anyone like this to keep on plugging away. In the immortal words of Jason Nesmith: Never give up! Never surrender!