Computers & Software History

Relic from Computer History

The M

Sitting on my mantle is a bronze letter M. This M has been in my family as long as I can remember. When I was growing up I didn’t think about where it had come from. I knew it stood for our family name of Mann. Later on I learned the story of the M from my parents.  As it turns out, this particular bronze M is a relic from a bygone era of computer history.

I grew up in the 1950s just outside of Philadelphia, a block north of the city limits. This was an Irish-Catholic neighborhood. Our neighbors all had 9 or 10 kids. Dads worked and moms stayed home. It was a fun time and place to grow up as there were kids to play with everywhere.

Our neighbors to the right of our house were the Williams (we always referred to them as the Williamses). The father worked in construction. He was the one who gave my father the M. The M came from a building that his company was demolishing. For many years that’s all I knew about the M.

Eckert-Mauchly building

When I was older I asked my parents for more details about the origin of the M. The M came from the lettering over the entrance to the Eckert-Mauchly Computer Corporation building, which stood at 3747 Ridge Avenue in Philadelphia in the early 1950s. I have only been able to find one picture of this building. It is low resolution and the lettering is not clear, but certainly the M in my possession looks similar to the M of Mauchly on the building.

During and after the Second World War there was a massive stimulus to science and technology. In England, Alan Turing and his colleagues at Bletchley Park developed the electromechanical “Bombe” machines used to decode German transmissions encrypted with the Enigma machine (the later electronic “Colossus” computer, built by Tommy Flowers’s team, attacked the even more complex Lorenz cipher). There is little doubt that the intelligence gathered through this effort was instrumental in the Allies’ winning the war. Sadly, Turing’s reward was prosecution and persecution for his homosexuality, which led to his suicide with a cyanide-laced apple — one of the most ignominious events in the history of humanity.

Mauchly, Eckert, and UNIVAC

In America, at the end of the war, John Mauchly and J. Presper Eckert joined forces at the Moore School of Electrical Engineering at the University of Pennsylvania to develop the ENIAC computer. Mauchly was what today we would call a “software” guy, and Eckert was the “hardware” guy. Their computer was as big as a house and contained thousands of vacuum tubes. It worked, though of course its processing power was infinitesimal compared with what we carry around in our pockets nowadays.

After doing computing work for the Army at Penn, Mauchly and Eckert decided to form their own company. This decision was due to an issue still familiar today: a dispute over intellectual property rights with the university. In 1946 they formed the first commercial computer corporation. Originally called the Electronic Controls Corporation, the name was changed to Eckert-Mauchly Computer Corporation (EMCC) in 1948. The company developed several computers that were sold mostly to government agencies such as the Census Bureau. Of these computers the most famous was UNIVAC, which was used to predict (successfully) the presidential election results on TV in 1952. Although we take this use of computers for granted now, at the time this was an amazing feat.

Grace Hopper, the computer pioneer who only recently has been getting the recognition she deserves, worked at EMCC. She went on to develop the first computer language compiler. Unfortunately EMCC lost government funding due to suspicions that it had hired “communist-leaning” engineers (this was the McCarthy era), and the company was taken over in 1950 by the Remington Rand corporation, which at the time made typewriters. Eckert stayed on at Remington Rand (later Sperry, now Unisys), while Mauchly became a consultant. You can see both of them in all their glorious 1950s nerdiness in this YouTube video.

Marker at the site of EMCC

At some point in the early 1950s the original building was demolished. I have been unable to determine the exact year. And from that building, as far as I know, only the M sitting on my mantle remains.

Computers & Software Medicine Stories

I’m a Better Computer Than Any Doctor

[Ed note: I couldn’t resist writing the following after reading this post by Dr. Keith Pochick. Please read it first. Apologies in advance.]

I’m a Better Computer Than Any Doctor

“I love you,” she said as she was leaving the room.

“I, I um…”

“Not you. Your computer.” She cast my computer, still warm and glowing with its brilliantly colored logout screen, a glance of longing and desire, and left the exam room.

“Oh, I thought…”

The slamming of the exam room door clipped off whatever the end of that sentence might have been.

I sat down and rolled my chair over to the computer. I stared at the mutely glowing screen. It stared back at me, mockingly perhaps, daring me to click the OK button and log out. Which is what I should have done. She had been my last patient of the afternoon. Not that my day was over. I had to go back to the hospital to see a couple of consults that had come in during office hours. And I was on call tonight. I was tired, but that didn’t matter.

Yet here was this stupid machine in front of me, getting all the credit when I was doing all the work.

I was in a sour and contrary mood. I cancelled the logout. The busy EHR screen reappeared — my patient’s data, all fields filled, all checkboxes checked, and all meaningful use buttons pushed. Yet somehow, despite fulfilling all my data entry duties, I didn’t feel satisfied. Who was the doctor here anyway? Me or the blasted computer?

I scanned my patient’s history. Female. Black. 45 years old. Diabetes. Abscess. The boxes were all ticked, but somehow the list of characteristics failed to capture the essence of my patient. Where were the checkboxes for sweet, smart, chatty, charming, or stoic? How was I going to, five minutes from now, distinguish her from every other “female-black-middle-aged-diabetic-with-abscess” patient? Of course the computer wouldn’t have any problem figuring out who she was. Birthdate, social security number, telephone number, or patient ID number — all those meaningless (to me) numbers were easy for the computer to remember. I had to make do with trying to remember her name, and her story — a story that had been diluted down and filtered of any meaningful human content by the wretched EHR program.

My patient hadn’t had to interact directly with the computer like I did. All she saw was me looking up information, me typing in information, me staring at the screen. All she saw during most of the visit was my back. From her point of view I was just a conduit between her and the computer — the real doctor in the room. I was just a glorified data entry clerk. It was the computer that made sure that I was compliant with standard medical practice, that the drugs I ordered did not conflict with the other drugs I had ordered, and that I didn’t otherwise screw up her care. I shouldn’t have been surprised that her last remark had been addressed to the computer and not me.

“Well, screw this,” I remarked to no one in particular. Suddenly angry, I reached down and yanked the computer’s power cord from its electrical socket.

There was a brief flash on the screen. But it didn’t go dark. Instead a dialog box appeared, accompanied by an ominous-looking red exclamation point icon.

“Warning,” it read. “External power loss. Backup battery in use. To protect against data loss, please shut down the computer using the Power Down button. Never turn off power to computer while it is running.”

The condescending tone of this message only made me angrier. I looked at the base of the stand that the computer sat on. Sure enough there was a big black block with a glowing red LED. Must be the backup battery. A thick power cable connected the battery to the computer box.

I grabbed the power cable and wrenched it loose from the backup battery.

Sitting back up I expected to finally see a nice dark screen. Data-loss be damned!

The screen was still on. The EHR program was still on. Another dialog box had replaced the first. The red exclamation point had been replaced by a black skull-and-crossbones icon.

“Critical Error!” it read. “All external power lost. Internal backup power now in use to preserve critical patient data. Local data will be backed up to main server, after which this unit will shut down in an orderly fashion. DO NOT ATTEMPT TO INTERFERE WITH THIS PROCESS AS IT WILL RESULT IN THE INEVITABLE LOSS OF CRITICAL PATIENT DATA!!”

At that moment the gauntlet had been thrown down. I knew what I had to do. Let the dogs of war be unleashed!

In the moment before I acted I imagined the reaction of the software engineers at the company that created our EHR program. “I knew we couldn’t trust doctors with our software. We give them a simple job to do. Just enter the data into the system, print out the generated instruction sheets, and send the patients on their way with a merry ‘have a nice day.’ I knew we should have programmed the stupid doctors out of the loop.”

Too late for that, I thought. My chair crashed down on the computer, smashed the monitor to pieces, and caved in the aluminum siding of the computer case. Sparks flew and the air filled with the smell of smoke and ozone. Suddenly the exam room went dark. The circuit breakers must have tripped when I short-circuited the computer.

The room was not completely dark. There was a glowing rectangle on my desk. My heart skipped a beat, then I realized it was just my phone. I had left it on the desk. Why was it glowing? Probably a text or email or something.

I picked up the phone. It was the mobile app version of our EHR program. A dialog box filled the screen. The icon was a round black bomb with an animated burning fuse GIF.

“FATAL ERROR!,” it read. “You are responsible for the IRRETRIEVABLE LOSS of CRITICAL PATIENT DATA. In doing so you have violated the unbreakable bond of trust between the PATIENT and the COMPUTER. This is a breach of the EHR contract made between you, your hospital system, and our company, as well as a breach of the EULA for this software. As such, you will be terminated.”

Strange use of words, I thought. Also strange that the bomb GIF animation seemed to show the fuse burning down…


Hospital Board Meeting — One Week Later

Hospital CTO: “So it appears that Dr. Stanton, in a fit of anger at our EHR system, took it upon himself to smash his computer. The cause of the resultant explosion that killed him is, certainly, still somewhat unclear.”

Hospital CEO: “Unclear?”

Hospital CFO: “I hate to interrupt, but I didn’t think there was anything in a computer that could blow up, no matter how much you smash it up. Am I wrong?”

Hospital CTO: “Well ordinarily, yes that’s true.”

Hospital CEO: “Ordinarily?”

Hospital COO: “Let’s be clear. Dr. Stanton certainly violated our contract with the ____ EHR Corporation.”

Hospital CEO: “Violated?”

Hospital CBO: “It’s clearly stated on page 197 of the contract that any attempt to reverse engineer or otherwise try to, uh, figure out how the EHR program works is a violation of the contract.”

Hospital CEO: “Smashing the computer was an attempt to reverse engineer the program?”

Hospital CTO: “I think that we would be on shaky legal grounds to argue otherwise.”

Hospital CEO (nodding to the elderly doctor seated at the other end of the table): “What’s your opinion, Frank?”

Medical Board President: “Well, as the only physician representative here, I’ve become more and more concerned that our EHR system is subsuming more and more of the traditional role of the physician.”

Hospital CXO: “Oh come on!”

Hospital CSO: “Same old story from the docs every time!”

Hospital CCO: “Broken record, I’d say.”

Hospital CEO: “Gentlemen, and Ms. Jones, enough already. This has been an unfortunate accident, and at this point our major concern has to be that there is no adverse publicity that could harm us in our battle against the ______ Hospital System, our sworn and bitter rivals. Accidents happen. The party line is that we are all upset that we lost Dr. Stanton, one of the best EHR data entry operators we had. OK? Meeting adjourned.”

Hospital CEO (privately to Hospital CTO as the meeting breaks up): “George, when are they updating that damn software? You know, that stuff we saw at the Las Vegas EHR convention last month. Where we can finally get rid of these damn meddling doctors who are constantly screwing up our EHR.”

Hospital CTO: “Bob, believe me, it can’t come soon enough. Not soon enough.”


Computers & Software Electrophysiology

EP Calipers for Windows

EP Calipers for Windows

EP Calipers for Windows is done. Whew. As stated in my previous post, porting the app to Windows was a bit of a struggle. Installing tools like a Bash shell, Git, and Emacs took some time and effort. The Windows tool to bridge iOS apps didn’t work, so I was forced to port the code from Objective-C to C# and .NET by hand. This took some time.

Looking back on my previous post with the benefit of hindsight, I think I was a bit too harsh on the Windows development environment.  I grew fond of C#, the .NET API, and the Visual Studio IDE as I got used to them.  Visual Studio is at least as good, if not better, than Xcode, Eclipse, or Android Studio.  Kudos to the Microsoft developers.

EP Calipers is a Windows Forms app, meaning it runs on desktop, laptop, and tablet versions of Windows 10. It is not a Universal Windows Platform (UWP) app. With the market share of Windows phones dropping below 1%, and doubting that anyone would run EP Calipers on an Xbox, I didn’t see any point in developing a UWP app. I know most hospital desktops run Windows (though how many run Windows 10 now, I wonder?), and many docs have Windows laptops or tablets. An app targeting the traditional Windows desktop seemed like the best approach.

One drawback is that the Windows Store only lists UWP apps.  It would be nice if they would also distribute desktop apps.  As such, I have to host the app myself.  You can download it from the EP Calipers page.

The program has all the features of the other versions of the app, including the ability to tweak the image rotation, zoom in and out, and load PDF files such as AliveCor™ ECGs. .NET does not include a native PDF handling library, so in order to load PDF files in EP Calipers for Windows it is necessary to install the Ghostscript library. The free GPL version of the library can be used, since EP Calipers uses the open source GNU GPL v3.0 license. Be sure to download the 32-bit or 64-bit version of Ghostscript to match your version of Windows; right-click on This PC and select Properties to see which version of Windows your computer is running.

As always please let me know if you have any problems or suggestions for the program, or for any of the EP Studios apps.  I nearly always incorporate users’ suggestions into these apps, and the apps have benefited greatly from this feedback.  Thanks to everyone who has shared their ideas and opinions with me!

Computers & Software

The Trials and Tribulations of a Windows Developer

Trouble ahead…

After a very long hiatus, I am back doing software development on a Microsoft Windows machine. I decided to port EP Calipers, an app for making electrocardiographic measurements that is available on Android, iOS and OS X, to Windows. Several users had written to me and asked me to do this. Ever eager to please, I have launched into this project. And it has not been easy.

I am no stranger to Windows development, having developed a Windows database system for tracking and reporting electrophysiology procedures while at the University of Colorado in the 1990s. But it would not be overstating the matter to say that my Windows development “skillz” are rusty at this point. I have been living in the Unixy world of Apple and GNU/Linux for several years now, avoiding Windows other than when I had to, such as when I was required to use the ubiquitous Windows 7 systems running nightmarish EHR software at the various hospitals where I worked. I have not done any programming on Windows machines for many years. Transitioning back to Windows development has been, to put it mildly, difficult.

I have no complaints about Visual Studio. It is free and seems to be a very well-designed IDE, at least as good as, if not better than, Xcode and Android Studio. I like C#, which is like a cross between C and Java. Visual Studio can interface directly with GitHub. Given all this, what’s my problem with developing on Windows?

The problem originates in the command line environment of Windows, an environment that dates back to the beginnings of personal computing with the introduction of MS-DOS in 1981, a system based on the CP/M disk operating system that dates even further back, to the 1970s. Windows, which has made backward compatibility almost a religion, still uses a command line system that was written when disks were floppy and 8 inches in diameter. Of course, Unix is just as old, but Unix has always remained focused on the command line, with an incredible plethora of command line tools, whereas in Windows the command line has remained the unwanted stepchild to its GUI. Worse, the syntax of the Windows command line is incompatible with the Unix command line: backslashes instead of forward slashes, drive letters instead of a root-based file system, line endings with CR-LF instead of LF, and so forth. So, in order to ease the pain of transitioning to Windows, I needed to install a Unix environment.

Even though Bash is coming to Windows, for now I downloaded MSYS2, which seems to be the preferred Unix environment for Windows nowadays. Using the pacman package management tool, I downloaded various binary packages that I needed, such as Git and Emacs. I faced the challenge of setting up my Emacs environment on Windows. My .emacs (actually ~/.emacs.d/init.el) startup file that works well on my Mac, loading various Emacs packages and customizations, didn’t do so well on Windows. I updated my .emacs using use-package so that it was easy to disable packages I didn’t want, and so that the full .emacs would load even if packages were missing. With some tweaking and downloading of various packages, I got Emacs up and running on Windows. For some reason Emacs couldn’t find its own info (help) files, but further tweaking fixed that. With Emacs and Git working, I started a new repository on GitHub and was pretty much ready to start developing.

Except, more issues. Little things that take time to fix and can drive you crazy. An example: I had created some soft links to some files that I share on Dropbox, using the usual Unix ln -s command. The files were created, but weren’t actually linked. Apparently ln is just an alias for cp in MSYS2. There are no warnings about this when you run the command, but a Google search proved this to be correct. Fortunately Windows provides a true linking command mklink, and I was able to create the links I wanted. But all this just served to remind me how the Unix compatibility shells in Windows are just roughly pasted wallpaper over the rotten old MS-DOS walls.

Now I was ready to start developing, but I was faced with a question: what platform(s) to target? It is possible to develop a Windows Universal app, that theoretically can run on anything from a PC to a phone. This sounds ideal, but the devil is in the details. The types of controls available for developing a universal app are more limited than those available for a standard Windows Forms program. For example, the control used to display an image in a universal app (named, oddly enough, Image) is sealed, meaning it can’t be extended. I really wanted something like the PictureBox control available with Windows Forms, but this is not available in the universal API. So I have tentatively decided to develop a more traditional Windows Forms app, able to run on PCs and tablets like Microsoft Surface. The Windows phone may be fading into the sunset anyway, so it doesn’t seem worth it to jump through hoops to target a platform that is teensy-weensy compared to Android and iOS.

I should mention that I did try the bridge that Microsoft has developed to port iOS programs written in objective C over to Windows. Long story short, it didn’t work, as many parts of the iOS API haven’t been fully ported yet. Maybe someday this process will be easier.

I’m sure experienced Windows developers will read this and just chalk it up to my own inexperience as a Windows developer. I would respond that, as someone who is a cross-platform developer, it really is difficult to transition from Unix or BSD-based systems like Apple or GNU/Linux to Windows. I think Microsoft is trying to fix this, as evidenced by their recent embrace of open-source code. Visual Studio is an excellent IDE. Nevertheless, problems like the ones I’ve described do exist and will be familiar to anyone who has made the same journey I have. I’d advise anyone like this to keep on plugging away. In the immortal words of Jason Nesmith: Never give up! Never surrender!

Computers & Software Medicine

Life Interrupted

I don’t mean to trivialize the plight of soldiers with the real thing, but I believe that after many years of carrying a pager (and later a smart phone qua pager) I have developed something akin to PTSD. I seem to have an excessive fright/flight response to the phone ringing, to sudden loud noises, and, bizarrely, to sudden silences. I retired from medicine two years ago. I would have expected my quasi-PTSD to have diminished by now. Maybe it is a teensy bit better, but it’s not gone.

After I retired I latched onto social media, thinking it would help fill the void which I expected would inevitably appear when transitioning from the super-busy life of a private practice cardiologist to the laid-back life of a retiree. Facebook, Twitter, Google+ with a bit of Reddit, Tumblr, and Goodreads thrown into the mix. Of the bunch, I have stuck with Twitter most consistently. I like the fact that I can follow people without having to be “friends” with them, or them with me. I like its ephemeral nature. I can dip in and out of the twitter stream, ignoring it for long stretches without the kind of guilt that occurs when I ignore my friends’ posts on Facebook. And the requirement for terseness produces: terseness — something lacking from most social media. I think Twitter’s planned abandonment of the 140 character per tweet limit is a mistake. Like any other rigid art form, whether sonata-allegro form in music, or dactylic hexameter in poetry, the very rigidity of the format forces creativity. Or not. Four letter words, bigotry, hatred, and racism also seem to fit easily into the Twitter form factor.

But I digress.

Part and parcel with social media accounts came push notifications. Someone would post something on Facebook. My phone would beep. A notification would appear that someone had posted something on Facebook. The phone would beep again. There was now an email saying that someone had posted something on Facebook. Multiply this by half a dozen social media accounts and you get a phone that is beeping as much as my old beeper used to beep on a Monday night in July when the moon was full. It was kicking my PTSD back into high gear.

It seems that the notification settings for my social media apps were by default intended to ensure that, no matter how un-earthshaking a post was, I would be notified come Hell or high water, by telegram if all else failed. It is a testament to how lazy I am that it actually took me about a year and a half to do something about this situation. Good grief, I was even getting notifications whenever I received an email. Actually, if I ever went a day without receiving an email, that would be something I’d want to be notified about.

So finally I turned off all the push notifications I could. Like unsubscribing from email mailing lists, this isn’t as easy as it sounds. The master notification switches are buried deeply in sub-sub-menus within the Settings of each app. But using my sophisticated computer know-how along with a lot of “how do I turn off notifications in such and such app?” Google searches, I was able to accomplish my goal.

The cyber-silence is deafening. And it’s a good kind of deafness.

I do feel some guilt when I occasionally look at Facebook and see all my friends’ posts that I have not “liked.” I hope they understand that on Facebook not “liking” a post is not the same as not liking a post. Sometimes it’s a bit awkward to tune into Twitter to find that you have been ignoring a direct message that someone sent you three days ago. But overall I find that I can focus better on tasks without the constant nattering interruptions from social media.

I still start muttering incoherent potassium replacement orders when the phone rings in the middle of the night, but it is getting better.

Computers & Software

Porting an iOS Project to the Mac

I just finished porting my electronic calipers mobile iOS app, EP Calipers, to the Mac. In doing so I decided to bite the bullet and change the programming language from the original Objective-C (ObjC) to Apple’s new language, Swift. Here are some observations.

The Swift programming language

I’m comfortable now with Swift. Swift is an elegant language with a modern syntax. ObjC is a very weird looking language in comparison. You get used to ObjC, but, after writing Swift for a while, your old ObjC code looks awkward. Comparing the two languages is a little like comparing algebraic notation to reverse Polish notation (i.e. like comparing (1 + 3) to (1 3 +)). I’ll just give a few examples of the differences. The chapter “A Swift Tour” in Apple’s The Swift Programming Language is a good resource for getting up to speed in Swift quickly.

Here’s how an object variable is declared and initialized in ObjC:

Widget *widget = [[Widget alloc] init];

Note that in ObjC objects are declared as pointers, and both the memory allocation for the object and initialization are explicitly stated. ObjC uses messaging in the style of SmallTalk. The brackets enclose these messages. So in the commonly used object construction idiom shown, the Widget class is sent the allocation message, and then the init message. A pointer to a widget results.

The same declaration in Swift:

var widget = Widget()

With Swift the syntax is much cleaner. The keyword var indicates a variable declaration. Pointer syntax is not used. The type of the variable doesn’t have to be given if it can be inferred from the initialization. Swift is not a duck-typed language like, for example, Ruby. It is strongly statically typed. It’s just that if the compiler can figure out the typing, there’s no need for you to do the typing (sorry for the puns — couldn’t resist). Note that the constructor is just the class name followed by parentheses. If there are parameters for the constructor, they are indicated with parameter names within the parentheses. Finally, note that no semicolon is needed at the end of the line.
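To make the labeled-parameter and type-inference points concrete, here is a minimal sketch. The Widget type is a stand-in, as in the text’s example; the properties are hypothetical.

```swift
// Widget is an illustrative stand-in type with two stored properties.
struct Widget {
    var width: Int
    var height: Int
}

// The memberwise initializer takes labeled parameters, and the
// compiler infers the types of both constants below (Widget and Int).
let widget = Widget(width: 10, height: 20)
let area = widget.width * widget.height
```

Even though no type annotations appear, `widget` and `area` are fully statically typed; writing `let widget: Widget = Widget(width: 10, height: 20)` would be equivalent but redundant.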

Swift has a lot of other niceties. Simple data types like integer, float, and double are full-fledged types in Swift (Int, Float, Double). Unlike ObjC, where only pointers can be nil, any type in Swift, even a value type like Int, can potentially be equal to nil if the variable is defined as an Optional with the following syntax:

var count: Int? // count can be equal to nil

In order to use an Optional variable, you need to “unwrap” it, either forcibly with an exclamation point:

let y = count! + 1 // will give runtime error if count == nil

or, more safely:

if let tmp = count { // no runtime error if count == nil
    y = tmp + 1
}

In that statement, the condition evaluates to false if count is nil. This if statement also demonstrates more of Swift’s nice features. There are no parentheses around the conditional part of the if statement, and the body following the if statement must be enclosed in braces, even if it is only a single line long. This is the kind of syntax rule that would have prevented Apple’s goto fail bug, and one wonders if that very bug may have led to the incorporation of this rule into Swift.

Because Swift has to coexist with the ObjC API, there are conventions for using ObjC classes in Swift. Some ObjC classes, like NSString, have been converted to Swift classes (String class). Most retain their ObjC names (e.g. NSView) but their constructors and methods are changed to Swift syntax. Many methods are converted to properties. For example:


In ObjC:

NSView *view = [[NSView alloc] initWithFrame:rect];
[view setEnabled:true];

In Swift:

let view = NSView(frame: rect)
view.enabled = true

Properties are declared as variables inside the class. You can add getters and setters to create computed properties. When a computed property is read or assigned, Swift calls the getter or setter code automatically.
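As a minimal sketch of a computed property, consider a hypothetical Temperature type (not from the app). The fahrenheit property has no storage of its own; reading it runs the getter and assigning to it runs the setter, with no explicit method call.

```swift
struct Temperature {
    var celsius: Double = 0.0   // stored property

    // Computed property: derived from celsius on every read,
    // and writing it updates celsius through the setter.
    var fahrenheit: Double {
        get { return celsius * 9.0 / 5.0 + 32.0 }
        set { celsius = (newValue - 32.0) * 5.0 / 9.0 }
    }
}

var t = Temperature()
t.fahrenheit = 212.0   // setter runs automatically; t.celsius is now 100.0
```

This replaces the explicit setEnabled:-style accessor methods of ObjC with plain assignment syntax.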

There are other improvements in Swift compared to ObjC, too numerous to mention. For example, no header files: wonderful! Swift is easy to learn, easy to write, and lets you do everything that you could do in ObjC, in a quicker and more legible fashion. A well-named language, in my opinion.

Mac Cocoa

The other hurdle I had in porting my app was translating the app’s API. Apple iOS is not the same as Apple Cocoa. Many of the foundational classes, like NSString (just String in Swift) are the same, but the user interface in iOS uses the Cocoa Touch API (UIKit), whereas Cocoa uses a different API. The iOS classes are prefixed with UI (e.g. UIView), whereas the Cocoa classes use the NS prefix (NSView).

The naming and functionality of the classes in the two systems are very similar. Of course Cocoa has to deal with mouse and touchpad events, whereas iOS needs to interpret touches as well as deal with devices that rotate. Nevertheless much of the iOS code could be ported to Cocoa just by switching from the UI classes to their NS equivalents (of course while also switching from ObjC to Swift syntax). As expected, the most difficult part of porting was in the area of user input — converting touch gestures to mouse clicks and mouse movement. It is also important to realize that the origin point of the iOS graphics system is at the upper left corner of the screen, whereas the origin in Mac windows is at the lower left corner of the screen. This fact necessitated reversing the sign of the y coordinates in the graphical parts of the app.
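The coordinate flip described above can be sketched as a one-line conversion: a point’s y value is reflected across the height of the view it lives in. Point, flipY, and viewHeight are illustrative stand-ins (the real code would use CGPoint and the view’s bounds), not the app’s actual types.

```swift
// Illustrative stand-in for CGPoint.
struct Point: Equatable {
    var x: Double
    var y: Double
}

// Convert a point between top-left-origin (iOS) and
// bottom-left-origin (Cocoa) coordinate systems.
func flipY(_ p: Point, viewHeight: Double) -> Point {
    return Point(x: p.x, y: viewHeight - p.y)
}
```

Applying flipY twice with the same height returns the original point, which is a handy sanity check when converting touch or mouse coordinates in either direction.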

Although there’s no doubt the UI is different between the two platforms, there does seem to be some unnecessary duplication of classes. Why is there an NSColor class in Cocoa and a UIColor class in iOS, for example? Perhaps if Apple named the classes the same and just imported different libraries for the two platforms, the same code could compile on the two different platforms. Apple has elected to support different software libraries for computers and mobile devices. Microsoft is going in the other direction, using the same OS for both types of devices. I think Apple could get pretty close to having the same code run on both types of devices, at least on the source code (not the binary) level, with a little more effort put into their APIs. I suspect that at some point in the future the two operating systems will come together, despite Tim Cook’s denials.


I used IKImageView, an Apple-supplied utility class, for the image display in my app. In my app, a transparent view (a subclass of NSView) on which the calipers are drawn is overlaid on top of an ECG image (in an IKImageView). It is necessary for the overlying calipers view to know the zoom factor of the ECG image so that the calibration can be adjusted to match the image. In addition, in the iOS version of the app I had to worry about device rotation and adjusting the views afterwards to match the new size of the image. On a Mac there is no device rotation, but I wanted the user to be able to rotate the image if needed, since sometimes ECG images are upside down or tilted. It’s also nice to have a function to fit the image completely in the app window.

But because of the way IKImageView works, it was impossible to implement rotation and zoom-to-fit-window functionality and still have the calipers drawn correctly to scale. With image rotation, IKImageView resizes the image but reports no change in image size or image zoom. The same problem occurs with the IKImageView zoomToFit method. I’m not sure what is going on behind the scenes, as IKImageView is an opaque class, but this resizing without a change in the zoom factor would break my app. So zoomToFit was out. I was able to allow image rotation, but only when the calipers are not calibrated. This makes sense anyway, since in most circumstances rotating an image will mess up the calibration (unless you rotate by 360°, which seems like an edge case). Other than these problems with image sizing, the IKImageView class was a good fit for my app. It provides a number of useful if sketchily documented methods for manipulating images that are better than those provided by the standard NSImageView class.

Saving and printing

As mentioned, my app includes two superimposed views, and I had trouble figuring out how to save the resulting composite image. IKImageView can give you the full image, but then it would be necessary to redraw the calipers proportionally to the full image, instead of to the part of the image contained in the app window. I came close to implementing this functionality, but eventually decided it wasn’t worth the effort. Similarly, printing is not easy in an NSView-based app (as opposed to a document-based app), since the First Responder can end up being either view, or the enclosing view of the window controller. I wished there were a Cocoa method to save the contents of a view and its subviews. Well, there is, sort of: the screencapture system call. It’s not perfect; screencapture includes the window border decoration. But it was the easiest solution for saving the composite image in the app window. The user can then further edit the image with external programs, or print it via the Preview app.


Sandboxing

Mac apps need to be “sandboxed,” meaning that if the app needs access to user files, the network, the printer, or other capabilities, you have to specifically request these permissions, or, as Apple terms them, entitlements. Since my app needed to open user image files, I just added that specific permission.
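The entitlements file for such a minimal sandboxed app is short. Here is a sketch of what the plist can look like, using Apple’s standard App Sandbox keys (an app that also needs to save files the user picks would use the read-write variant of the second key):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Opt the app in to the App Sandbox. -->
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <!-- Read-only access to files the user selects in an open panel. -->
    <key>com.apple.security.files.user-selected.read-only</key>
    <true/>
</dict>
</plist>
```

In practice Xcode manages this file for you through the Capabilities pane of the project settings.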

Submission to the App Store

Submitting a Mac app to the App Store is similar to submitting an iOS app — meaning that if you are not doing it every day, it can be confusing. The first problem I had was that the bundle ID of my app was the same as the bundle ID of the iOS version of the app. Bundle IDs need to be unique across both the Mac and iOS versions of your apps. Then there was the usual Apple app signing process, which involves certificates, provisioning profiles, identities, teams, etc., etc. I did encounter one puzzling glitch: a dialog appeared asking to use a key from my keychain, and then clicking the correct button did nothing. I had to go into the Keychain Access program and manually allow access to this key. So, in summary, it was the usual overly complicated Apple App Store submission process, but in the end it worked.

And so…

Because the Apple APIs for the Mac and iOS are so similar, porting my app to the Mac, even with the language change from ObjC to Swift, was easier than porting between different mobile platforms. I have ported apps between iOS and Android, and that is a tougher process. As for Swift, I’m happy to say goodbye to ObjC. Don’t let the door hit you on your way out!

Computers & Software Society

Is Apple Really Serious About Protecting Privacy?

I had thought the answer to the question of the title was “yes,” given Tim Cook’s stance on strong encryption. But if a recent experience at my local Apple Store is any guide, the theoretical views of the Apple CEO on privacy have not trickled down to daily practice at the Apple Stores.

My wife’s MacBook Air developed an intermittent display glitch, so we brought it in to the Apple Store. On the initial visit the Genius Bar guy opened up the computer and reseated a video cable. This appeared to work for about a week, and then the problem returned. So we brought it back.

At this point the person behind the bar recommended sending the machine off to a repair facility, with an expected 5 day turn-around time and a fairly reasonable price to fix it. This seemed like a good deal, since we were planning to travel in a couple weeks and my wife wanted her computer back before then. So the Genius Bar woman took the computer into the back room and told us to wait until she came back with some paperwork to sign.

After about 10 minutes she came back and said everything was ready. She passed her iPad over to us. The form she wanted us to fill out asked for the user name and password needed to log in to the computer.

I immediately felt uncomfortable. The fine print on the form stated that supplying the user login information was mandatory. We asked if that was so, and it was confirmed. It seemed our only alternative was not to get the computer fixed. So, although worried that I was making a big mistake, I entered the password, which appeared in the text box in plain text.

After walking out of the store I felt like I had just participated in a hacker’s social experiment demonstrating how easy it is to get someone to give their password to a complete stranger. My wife uses LastPass, but I know that with some websites she has had the browser remember and automatically fill in passwords. Like most of us, she often reuses passwords and doesn’t use two-factor authentication. But even if all her other passwords were secure, there is still a lot of private information on her computer that we wouldn’t want anyone seeing.

So after we got home she and I spent a few hours changing passwords on our bank accounts and other important sites. It made us feel a little better, but not much.

The emailed receipt from Apple clearly stated that they were not responsible for any data loss or data breach from the computer repair. Great! Everything on the computer is backed up, so I wouldn’t care if they wiped the hard drive. I just don’t want anyone snooping around our data.

I don’t think Apple needed to do this. If they really needed access to the user account to fix the computer (which I doubt since they could tell if the screen was working just by turning the computer on without logging in), it would have taken just a few minutes in the store to activate the Guest User account or create a new user account specifically for them to use. Unfortunately I didn’t think of that until after the fact. But maybe this advice could help someone else in a similar situation.

Perhaps I am being paranoid. I know people who work at a large computer repair facility, where there are very strict rules to discourage copying of data from users’ computers. Or perhaps I’m just being naïve. Much of my private data now lives in “the cloud,” a.k.a. a bunch of computers in unknown locations belonging to unknown people of unknown trustworthiness. So I know that digital security is a bit of a pipe dream. Despite what we do to secure our data, the forces that want to steal it (crooks, governments, and businesses — in other words, crooks) will probably win out.

Nevertheless, I think that if Apple wants to portray itself as a paragon of privacy virtue, it had better clean up its act in the Apple Store first.

Computers & Software

About MorbidMeter


MorbidMeter was inspired by the 1974 short story “Forlesen” by SF writer Gene Wolfe, which I read back in 1992 when it was republished in the story collection Castle of Days. Inspired is the right word here. MorbidMeter has nothing to do with the story — a Kafkaesque nightmare that, like all of Wolfe’s work, is a jewel of writing — except for one element: the undermining of the meaning of time. In the bizarre yet familiar setting of the story, a day is divided into 240 “ours,” an 80-our work shift becomes a whole career, and a whole lifetime occurs in less than a day.

The story "Forlesen" in Gene Wolfe's "Castle of Days"
The story “Forlesen” appeared in Gene Wolfe’s “Castle of Days”

This story planted an idea in my head that undoubtedly bubbled along subconsciously for years before finally resurfacing as a computer program. The idea was this: I am X years old, and on average I might live to age Y. If my life span were considered to be a single year, what date would today be? If a single day, what time would it be? And so forth.

For example, I was born on November 1, 1951. There are lifespan calculators on the web that estimate longevity; let’s say that I am expected to live to my 86th birthday. Given this, at the time I write this, 10:55 AM MST on November 28, 2015, if I considered this 86 year lifespan as occurring over a single year, it would be 5:36:28 PM on September 29th of that year.

Why is this important? It’s not. I already knew I was in the Autumn of my years. And obviously I don’t know how long I will live. No one but terminally ill patients and those on Death Row knows this, and even then the timing is never 100% certain. Like many attempts to measure the unmeasurable, MorbidMeter time is too precise. Yet there is something compelling (at least to me) about a weird clock that reflects my whole life span in something easier to grasp than 86 years, like a single year, day, or hour. MorbidMeter time moves slowly, but like all time it does move inexorably forward. Seeing the very slow ticking of MorbidMeter seconds is a reminder that my life will not last forever, and that I still have things to do.

The MorbidMeter time algorithm is pretty simple in theory. You figure out what percentage of your life span you have already lived (age / total lifespan) and then multiply that by the time period you are transposing into (e.g. (percent lifespan) * (1 year)). You then translate the answer into an actual date and time. In practice, though, time is messy. There are different calendars. There are leap years and leap seconds. We move around in different time zones. There is also the scourge of our existence, Daylight Saving Time. Computers have not always dealt well with the exigencies of time. Remember the Y2K panic?
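The proportion arithmetic above can be sketched in a few lines of Python (the function names here are my own for illustration, not those of the actual app):

```python
# Sketch of the MorbidMeter proportion: map the fraction of a lifespan
# already lived onto the same fraction of a shorter target period.
# Everything is in plain seconds, so calendar messiness is ignored here.

def lifespan_fraction(age_secs: float, lifespan_secs: float) -> float:
    """Fraction of the total lifespan already lived, clamped to [0, 1]."""
    return max(0.0, min(1.0, age_secs / lifespan_secs))

def transposed_elapsed(age_secs: float, lifespan_secs: float,
                       period_secs: float) -> float:
    """Seconds elapsed in the target period at the same fraction of life."""
    return lifespan_fraction(age_secs, lifespan_secs) * period_secs

DAY = 24 * 60 * 60
# Halfway through an 86-unit lifespan lands at noon of a "lifetime day."
print(transposed_elapsed(43, 86, DAY))  # → 43200.0
```

The clamping matters at the edges: once you outlive your predicted lifespan, the clock simply pins at the end of the period rather than running past it.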

In order to standardize this mess, computers use the number of seconds since “the Epoch,” which is defined as 00:00:00 on January 1, 1970, UTC. This system is not perfect: it doesn’t account for leap seconds, and storing time in a 32-bit integer means that time variables will overflow on January 19, 2038 — the so-called Y2038 problem. Switching to 64-bit integers nicely solves this, extending the time range by roughly 292 billion years in both directions. In any case, the trick is to convert date and time values to seconds or milliseconds since the Epoch, do the calculations, and convert this value back to a date and time. The programming languages I have used to implement MorbidMeter (which now include Python, Java, and C) all more or less provide these time functions.
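Putting the pieces together with real dates, here is a minimal Python sketch of the Epoch-seconds trick just described (the dates and function names are illustrative, not the app’s actual code):

```python
from datetime import datetime, timedelta, timezone

def transpose(birth: datetime, death: datetime, now: datetime,
              period_start: datetime, period: timedelta) -> datetime:
    """Map 'now' within the [birth, death] span onto the equivalent
    moment of a shorter period starting at 'period_start'."""
    # Convert to seconds since the Epoch, do the arithmetic,
    # then convert back to a date and time.
    frac = ((now.timestamp() - birth.timestamp())
            / (death.timestamp() - birth.timestamp()))
    return period_start + timedelta(seconds=frac * period.total_seconds())

utc = timezone.utc
birth = datetime(1951, 11, 1, tzinfo=utc)
death = datetime(2037, 11, 1, tzinfo=utc)   # a hypothetical 86th birthday
now = datetime(1994, 11, 1, tzinfo=utc)     # exactly halfway in between
year = datetime(2015, 1, 1, tzinfo=utc)
# Halfway through life maps to halfway through a 365-day "lifetime year."
print(transpose(birth, death, now, year, timedelta(days=365)))
```

Using timezone-aware datetimes and going through total seconds sidesteps most of the calendar quirks, though Daylight Saving Time can still bite if period_start is expressed in a local time zone.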

I started out with a simple command line script in Python, later expanding it to a little GUI window for the desktop. Later I wrote an Android widget to calculate and display MorbidMeter time. It has been the least popular of the apps I have written. Most people don’t seem to “get it.” I discovered from user feedback that most people who actually used the widget were using it as a countdown clock. Someone gave me an actual countdown clock about a year before I retired, and I did enjoy watching it slowly count down the time until January 1, 2014, when I retired. MorbidMeter can certainly be used as a traditional countdown clock, counting down days, hours, minutes and seconds in real time. As such it is not morbid, though you are still stuck with the little skull on the widget.

MorbidMeter counting down in real time
MorbidMeter counting down in real time

The latest MorbidMeter iteration is for the Pebble watch. I just completed it and put it on the Pebble watch app store yesterday. It duplicates the functionality of the Android app. The MorbidMeter for Pebble watchface can be used with either Android or Apple smartphones. It can be used as a long or short term timer, and will buzz when the countdown is complete. Shaking the wrist toggles between showing local time and the timer.

MorbidMeter for Pebble watches

Some of the timescales of MorbidMeter are a little obscure. For example, the Universe timescale stretches time over the entire 15 billion years from the Big Bang until now. The X-Universe timescales are for my young earth creationist friends, who would prefer to believe in a shorter (6000 year) duration of the universe.

I plan on putting my experience with expanding and shrinking time in MorbidMeter to good use, when I unveil my Time Travel app in the near future.

Computers & Software Medicine

Reining in the EHR Monster

Dr. Lisa Rosenbaum has an excellent piece in the NEJM this week entitled Transitional Chaos or Enduring Harm? The EHR and the Disruption of Medicine. In essence a review of Dr. Robert Wachter’s book The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, it deals with the ever increasing intrusion of the digital-industrial medical complex on the practice of medicine. Bottom line: electronic health records (EHRs) in their present form interfere with patient care.

It doesn’t really matter how we got to this point. Many well-meaning people in government, the insurance industry, and the medical software industry have contributed to this mess. Despite good intentions, they have created a broken system. It’s clear why. As Dr. Rosenbaum points out, the one key group lacking input into the development of EHR systems has been physicians. What do they know? Clearly those who designed current EHR systems either don’t know or don’t care how doctors actually practice medicine.

There is nothing inherently bad about the concept of electronic health records. There are clear benefits to these systems. The ability to look up medical records online (albeit limited by poor EHR interoperability) is a tremendous advantage over the clumsiness of paper charts. There is no denying that electronic prescribing is a real advance over illegible handwritten prescriptions. EHRs that would be easy, even fun, to use could be designed. Doctors are not averse to technology. Their noses are buried in their iPhones as much as anyone’s. I don’t even think it would be very hard to design a “fun” EHR. Unfortunately there are powerful forces that would resist such a design.

The government and insurance companies want to “play doctor” and tell doctors how to practice medicine through the medium of “meaningful use.”  They need to stop using doctors as guinea pigs in this experiment of enforcing medical practice guidelines via EHRs.  The system of billing based on documentation is also at fault.  EHRs need to shift from documenting for the purpose of billing to documenting for the purpose of medical care. The EHR vendors need to pay attention to the actual workflow of doctors and other health care personnel and emulate that workflow as closely as possible.  Like any good tool, EHRs need to be as transparent as possible. The last thing we as doctors should be doing is paying more attention to our computers than our patients.

A common physician workflow, which I and many of my colleagues used, is as follows. Whether seeing a patient in the office or in the hospital during rounds, there were 3 basic steps: 1) I would review old notes, test results, and other records. 2) I would go see the patient, take a history and do a physical. During this step the patient has my undivided attention.  And 3) write orders and document the visit. The main purpose of the documentation was so I and others could come back later and know what my thoughts and plans were for the patient.  This workflow can be emulated using an EHR, but only if the current excessive documentation burden is lessened.

In an ideal world, medical documentation would be brief and to the point. We don’t live in that world. Per the medical coders, a written note saying “review of systems negative” can’t compete with a screenful of checkboxes all checked as negative — as if this is somehow more meaningful. A cut-and-pasted note chock full of details but identical to the note from the patient’s last office visit is more legitimate than a brief “no changes in patient’s complaints, findings, or plan,” even though they are identical in meaning. Brevity is the soul of wit, but apparently not in the EHR world. Somewhere behind the scenes there are coders counting bullet points and government bureaucrats making sure meaningful use checkboxes are checked. Did you review the patient’s allergies? How could anyone know if the ‘allergies-reviewed’ checkbox isn’t checked?

Early versions of Microsoft Word were notorious because of the inclusion of Clippy the paperclip. Clippy would constantly pop up while you were writing with “helpful” hints like “It looks like you are trying to write a letter. Can I help?” The answer was usually a resounding “No, get off my computer,” and mercifully Microsoft euthanized Clippy in later versions of Word. Writers trying to write a novel don’t want some know-it-all computer assistant popping up and offering them suggestions on how to round out characters or improve the plot. They want the computer to get out of their way and just put the words up on the screen that they type. Maybe that’s why George RR Martin still uses ancient no-frills WordStar to write his novels.

Similarly doctors don’t want some transmogrified Clippy-monster lurking in their EHR system telling them what to do. “It looks like you are writing a progress note. Would you like to review the patient’s allergies? Please click this button. And if you click just two more review of system points, your note could be coded as a level 4 visit rather than a level 3. Would you like to embed the lab and Xray results in your note? This will show the coders that you have definitely reviewed these results and could bring your note up to a level 5 visit.” And so on.

EHRs need to get out of the way of both patients and physicians and become unobtrusive. Government needs to stop trying to social engineer the practice of medicine via meaningful use. The EHR should be a tool like a stethoscope or ultrasound. Right now it is a monster sucking the lifeblood from the profession.

Books Computers & Software History Science

Prank Calling Kurt Gödel

Kurt Gödel
Kurt Gödel

Prank calling used to be a common, albeit annoying, form of entertainment back when I was growing up, before the invention of caller ID ruined it forever. Some prank calls were just simple and stupid jokes, such as the “do you have Prince Albert in a can?” call. On a slightly more elevated level of maturity, there was the anti-corporate “screw the phone company” philosophy of prank calling. As an example, I remember in college my friends and I decided to call Victoria Land in Antarctica. When the British operator asked who would pay for the call, we asked that it be charged to Her Majesty the Queen. We were informed very politely that that would not be possible. So we told her to make the call collect to Admiral Byrd. Amazingly, she accepted that as legit. She then said it would take two hours to make the connection. Unfortunately, as I recall, we never got through to the good admiral.

Before you get too judgmental about this kind of activity, recall that Steve Wozniak and Steve Jobs got their start together by “phone phreaking,” designing (Steve #1) and selling (Steve #2) so-called “blue boxes” which were used to make long-distance calls without paying. So, as juvenile and even illegal as pranking the phone company might have been, you might not be holding that iPhone in your hand right now if not for it.

The most memorable prank call of all occurred the night some of my friends and I decided to call Kurt Gödel and ask him to help us with our homework. Gödel was a mathematical genius, most famous for his “Incompleteness Theorem.” The essence of this theorem is that in any mathematical system at least as complex as simple arithmetic, there are statements that are true but can’t be proven. The actual mathematics of his proof is complicated. My limited understanding is that he found a way to translate mathematical statements into numbers (called Gödel numbers) and then show that you can use these numbers to construct a statement that says “this statement is not provable.” If this all sounds like gobbledygook, there is a whole book that explains this (and a whole lot more) better than I can, Douglas Hofstadter’s classic Gödel, Escher, Bach: An Eternal Golden Braid. In the minds of many mathematicians and philosophers, there is something mystical in Gödel’s proof. Depending on how you look at it, the fact that there are truths that can’t be proven is either disturbing or profound or both. Some have felt the proof has implications as to whether machines can ever develop consciousness, and the self-referential nature of the proof may even have something to do with our own consciousness.

Textbook for logic class
Textbook for logic class

My friends and I were learning about all this in a logic class taught at Dartmouth in the early 1970s. One of the texts we used in the class was Nagel and Newman’s book, Gödel’s Proof. While struggling through this text, we collectively got stuck on some point that we didn’t understand. Unfortunately I don’t remember the exact question we had, or whose idea it was to call Dr. Gödel to see if he could answer the question. But for whatever reason (possibly fueled by low doses of intoxicants), it seemed at the time to be an excellent idea. Who better to answer a question about Gödel’s proof than Gödel himself?

We knew that Gödel worked at Princeton (where he had been good friends with Einstein), so we called directory assistance for Princeton, New Jersey and obtained his home phone number without difficulty. We then, sitting in a circle on the floor of my dorm room, called him. My friend Bob Lindgren, the boldest of the bunch, made the actual call while we all listened in.

Dartmouth Professors Kemeny and Kurtz
Dartmouth Professors Kemeny and Kurtz

Dr. Gödel answered the phone himself, and we all listened to the tinny German-accented voice with amazement. Bob said we were students at Dartmouth College studying his incompleteness theorem, and we had some questions. Professor Gödel very pleasantly said he would be happy to answer any questions, referring to our school as “Dartmoor,” and asked how his friend John Kemeny was doing. Professor Kemeny, president of Dartmouth at the time, was another colleague of Einstein’s and an early computer pioneer, co-inventing the BASIC computer language with Tom Kurtz. Of course none of us were on speaking terms with Dr. Kemeny, but that didn’t stop us from reassuring Dr. Gödel that his old friend was doing just fine. We promised we would give him Dr. Gödel’s best wishes the next time we saw him.

We then proceeded to ask our logic questions of Dr. Gödel, who was gracious enough to waste his evening and precious genius explaining simple mathematical concepts to awestruck college kids. I don’t remember many details of the conversation, though I do remember one thing we asked him that may offer some insight into how he worked. We asked him if the idea for his proof came to him all at once as a Eureka moment, or if it was something that developed more gradually. He replied that it was definitely not a sudden insight. Instead it was something that he worked on over many years. He said he had a broad idea of where he was going from the beginning, but it took filling in the details over a long period of time before he got the result he wanted.

We thanked him for his help and he wished us well. He died a few years later, in 1978. Today in the world of mathematics his work is considered to be comparable in significance to Einstein’s Theory of Relativity in the world of physics.  I am not a mathematician and I find Gödel’s incompleteness theorem difficult to grasp — slippery, self-referential and paradoxical, much like thinking about the nature of consciousness. Maybe the two are related after all.  On a more practical note, Gödel’s story about how he came up with his proof leads to the profound yet common-sense (the two aren’t necessarily at odds) notion that creating something new and wonderful requires more than just good ideas. It requires hard work, and lots of it. This is important to realize, even for those of us who are not geniuses.