My Digital Divide

Every so often, I remember that I’m no longer creative in computer science or even in programming.  I’m now far from the source of creativity in computer science; by that I mean I’m no longer at the cutting edge.  Stepping away from creativity in that domain makes good sense for me at this point.  But I could still be a programmer; one doesn’t even need a formal computer science education for that. (This is not to downplay the difficulty of programming.  It’s just not necessary to major in computer science to be a programmer, just as you don’t need an art degree to be an artist.)

After I left graduate school, I didn’t program for years.  This was primarily because I was very busy with other stuff (medical school, residency, and everything that accompanied and followed those years).  When I wasn’t busy, I was distracted.  I did have spurts of interest in web programming (Rails, HTML/CSS, etc.) and “fun” languages (Haskell, Ruby, Python, etc.), but nothing really stuck with me.  I became dismayed by the fact that so much is incompatible with so much else, and that pretty much everything becomes obsolete after a few years.

Good algorithms/ideas are forever.  Unfortunately, software itself is ephemeral; hardware is also ephemeral, and vulnerable while it exists.  The open-source movement hasn’t been as prominent as the for-profit companies whose proprietary software and hardware often prevent customers from taking their data elsewhere. I do invest casually in my blog, but that’s mostly writing and pictures that can be stored as hardcopy.

Much of the programming code I wrote in college, for a large variety of interesting and often fun projects, is probably no longer accessible because it was backed up to CD-Rs that are almost certainly corrupted. (I’ve confirmed that some of my old CD-Rs from my college days are corrupt now.)  When they first came out, CD-Rs were touted as a storage technology that would last at least a century.  That turned out not to be the case.

A simple, inexpensive pen or pencil can be used to write or draw on paper produced by other manufacturers, or to write on many other materials. Paint produced by one manufacturer can be blended with paint made by others, and the range of colors produced by mixing paint at home is infinite. You can even create your own pigments (some artists do this). If you learn to play guitar on one brand of guitar, you can play on any brand.

Today’s digital creator, on the other hand, is at the mercy of manufacturers, programmers, and hackers.*  The relevance of any computer-based product is short-lived. Tech culture, as it is in 2016 (it doesn’t have to be this way), is largely different from the culture of, say, art or music or literature, in which there are pieces extant from thousands of years ago that are still relevant.**

*Imagine if Apple and Google sold paint, you bought a tube of it from each company, and they blended incorrectly because of proprietary formulations requiring you to stick with one company or the other.  This rarely happens with actual paint.

**For example, there are paintings (such as the “remarkably evocative renderings of animals and some humans that employ a complex mix of naturalism and abstraction” on cave walls at Lascaux, Chauvet-Pont-d’Arc, and elsewhere from tens of thousands of years ago that we can still appreciate), pieces of music (such as “White Snow in Early Spring” from ~552 BC, attributed to Shi Kuang, that is still played live), and well-known poetry and literature from ancient cultures that people still read and admire.


Old MacBook, Resurrected

Quick personal update:  This is my first blog post in quite a while.  In the past couple of months, my girlfriend and I moved again, this time to be closer to each other.  Much has happened since then, including what follows.  We’ve been very busy!

My “clamshell” unibody MacBook is about six-and-a-half years old.  Initially wooed by Apple with the lovely 12-inch PowerBook G4 from coffee-shop windows and the crooks of other people’s arms throughout my early twenties, I eventually succumbed to Apple’s charms and bought this MacBook–the only Apple computer I’ve ever owned other than my iPhone and iPad (and both of those were gifts from my dear mother).  I made a critical mistake when I first bought it, though: I didn’t max out the RAM. I’ll never make that mistake again. You’d think I would have learned my lesson from the Lenovo laptop I purchased a few years before the MacBook.  I didn’t maximize the RAM on that one, either.  When that machine was soon hobbled by operating system updates that slowed everything down, I had to upgrade the RAM modules. (The upgrade was easy and effective.  The machine is still quite fast when running Linux or Windows.*)

Insufficient RAM was the Achilles’ heel of my MacBook for years.  Like the Lenovo, the MacBook was hobbled by operating system updates soon after purchase, and, like a person with chronic, untreated inflammatory arthritis, I adapted to this impaired functionality by reducing the scope of my activities.  A few months ago, I installed cloud backup software, after which the computer’s performance degraded even further.  The coup de grâce was an OS X upgrade to El Capitan several weeks ago, which rendered the machine practically unusable.

Suddenly desperate for a functional laptop, and realizing I had never gone so long without buying a new computer, I looked at the options at the online Apple Store and did some research.  I learned that the new MacBooks are clearly not targeted at my demographic:  they are ridiculously difficult to upgrade or repair (1, 2), have few ports, don’t have an optical drive, have relatively small storage space, and are becoming more dependent on Apple’s cloud-based services.  (Read this horror story by a blogger whose entire personal music collection was deleted from his hard drive by Apple when he signed up for Apple Music, and whose rare, cherished tracks were replaced in the cloud by more popular versions.)  As we all know, Little Brother has long since become Big Brother.

I wouldn’t buy this sort of computer unless I was absolutely forced to do so.

Curious about other options, I looked at PC laptops, then started looking at upgrade options for my current MacBook.  After a tiny bit of research, I learned that I could massively improve my current laptop’s performance to approximately that of the latest Apple MacBooks by spending at most 1/5 the price of the new MacBook Pro I had been looking at a few days prior.  This gave me pause.  Why hadn’t I thought of such an easy solution in the years before my laptop came to a grinding halt?  A moment later, I realized it was for the same reason so many new patients see me when they can no longer tolerate the pain of chronic arthritis: I was used to my laptop’s slow performance, just as many patients become used to chronic pain and to a smaller comfort zone until various treatments get them back up to speed.

The actual RAM/hard drive upgrade was easy.  I maxed out the RAM, replaced the hard drive with a solid state drive almost double the size of the old one, installed OS X El Capitan from a USB flash drive I had set up earlier, then used Migration Assistant and a USB drive adapter to transfer my old data over.  I didn’t replace the battery because after more than six years, it still has half of the maximum number of charge cycles left!  What this tells me is that I was so busy the first three years I owned the laptop–during internal medicine residency–that I hardly used it.

Anyone with a modicum of technical interest can upgrade or repair her old MacBook, too.  Here’s an excellent, minimally technical overview of the most common repairs.  Let me know if you’d like the step-by-step breakdown of my own approach.

My old MacBook is now the fastest computer I’ve ever used.  It’s quieter–the fan is basically never on–runs cooler, has a noticeably longer battery life, powers on and off more quickly, opens applications instantly, and doesn’t slow down.  I saved more than $1600 by upgrading this laptop instead of buying a new MacBook Pro.  A couple weeks post-upgrade, I’m still giddy, still amazed, that OS X can run so quickly.  This reminds me of some of my patients who are ecstatic for weeks or months after their debilitating conditions are treated because of how little pain they now have and by how much more they can do comfortably.  Perspective is only 20/20 in hindsight.

*Addendum 5/20/16: I’ve dual-booted Windows and Ubuntu Linux on my Lenovo laptop for years.  I use the Windows partition for Windows-specific applications and the Ubuntu side for a bit of programming, for writing, and for going online.  As I updated Ubuntu over time, it became more and more difficult for this ten-year-old computer to handle it.  I just reformatted the laptop and installed Windows alongside Lubuntu–a lightweight version of Ubuntu designed for netbooks and old computers–and am pleased to say that Lubuntu gives me what I like best about Ubuntu without all the graphics- and other resource-intensive frills.  I also bought a brand-new battery on eBay for only $17!  The laptop is highly usable once more.

*Update 8/28/16: I discovered today that my updated MacBook can videoconference smoothly.  I used it for an online guitar lesson via videoconference without any problems.  Previously I had used my iPad 4 for this purpose, but it was slow and dropped or distorted sound and video; before the upgrade, the MacBook couldn’t handle videoconferencing at all because everything slowed down and the audio and video quality was poor.

*Update 1/8/17: My MacBook and my Lenovo are still performing as well as they did post-upgrade! No problems so far.

Is Chess Obsolete?

Is chess an obsolete game now that computers are better than humans?

Is strength training obsolete now that we have machines that can easily lift much more than humans can?

Are bicycles obsolete now that we have cars?

Is walking obsolete now that we have bicycles?


Computers can certainly play much better chess than any human in the world.  Computers “think” about chess differently, but they have the advantage of being able to “look” farther into the future while analyzing each possible position accurately.
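That kind of lookahead can be sketched as a depth-limited minimax search.  Below is a minimal Ruby sketch (Ruby being the language used elsewhere on this blog); the `ToyGame` class and its method names are purely illustrative, not a chess engine.

```ruby
# Depth-limited minimax: the engine "looks" `depth` moves ahead,
# assuming both sides choose their best option at every step.
# `game` is any object providing:
#   moves(state)       -> array of legal moves
#   apply(state, move) -> the resulting state
#   score(state)       -> numeric evaluation of the state
def minimax(state, depth, maximizing, game)
  moves = game.moves(state)
  return game.score(state) if depth.zero? || moves.empty?

  values = moves.map do |m|
    minimax(game.apply(state, m), depth - 1, !maximizing, game)
  end
  maximizing ? values.max : values.min
end

# A deliberately silly "game" just to exercise the search:
# the state is an integer, each move adds 1 or 2, higher is better.
class ToyGame
  def moves(_state)
    [1, 2]
  end

  def apply(state, move)
    state + move
  end

  def score(state)
    state
  end
end

# Two plies of lookahead from state 0, against a minimizing opponent:
minimax(0, 2, true, ToyGame.new)  # => 3
```

Deepening `depth` is exactly the “looking farther into the future” advantage; real engines add pruning (alpha-beta) and fast position evaluation so they can search much deeper than any human.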

So, if you’re a human and you want to play the most accurate/brilliant chess games ever, you’re already guaranteed to fail.

But you still appreciate a nice walk, even though you own a bicycle, and you still use a car or public transportation at least some of the time.

And if you go to the gym, you never think about the fact that a mechanical crane can “deadlift” much more than you can!

For the same reason, it simply doesn’t follow that chess is obsolete.

Even though humans are much less accurate at playing chess, there are numerous significant benefits to playing this and other strategy/logic games at a casual (or even serious!) level.  These benefits carry over into the rest of your life.

Thoughts on “Hello Doctor” CEO’s Talk at Platform Houston

I just returned from a talk by Maayan Cohen, CEO of Hello Doctor. Hello Doctor is an app that uses OCR to digitize patients’ medical documents (lab & radiology results, H&Ps, progress notes, etc.), storing them locally on their mobile devices as well as in the cloud (everything is encrypted).

It’s a simple concept that’s actually very powerful in the fragmented world of medicine.  (In my mind, any moderately clever idea is revolutionary in medicine, because so many things in medicine are inefficient or broken.)  Physicians often see patients who have had extensive care elsewhere, but even when we officially request those records, we sometimes don’t receive them.  The patient who accurately remembers her medical history is quite rare.

An interesting feature of the app is the use of “big data” to help patients see what other patients have chosen at decision points.  (E.g., “65% of patients diagnosed with rheumatoid arthritis were started on this or that medication.”)

Overall, I was impressed with the app and I hope that it does well.  It has a direct competitor, Picnic Health.

This was the first “startup presentation” I’ve ever attended.  I surprised myself by asking her more questions than I expected.  I might have asked more questions than anyone else at the venue!  (Several other people asked very interesting questions, too.)  I asked her if physicians can easily transfer patient data from the app to their own medical information systems.  She said that’s tricky to do, both for legal and other reasons.  E.g., you can download only one document at a time; you can’t download everything as, say, a single PDF.

After the talk, I introduced myself to a financial analyst who had asked several startup-related questions from the back of the room.  (I was introduced to the concept of “stickiness factor” by a question that he asked.)  We exchanged contact info.

A few weeks ago, I attended a Python programming meetup–this was my first “social” foray back into programming in at least a decade–and met a general surgery resident who had spent two years of a research fellowship creating a “smart” display for the EMR used in the SICU at his hospital.  He’d hired programmers and bought hardware with grants.  He told me something I won’t easily forget:  he said that administrators are too far removed from patient care, clinicians are too busy seeing patients, and that researchers are too busy trying to permanently establish their names in the literature for any of these groups to fix our broken healthcare system.  He said that it’s incumbent upon those physicians with a technological or entrepreneurial bent to try to do something to help out.

A few days later, he gave me a tour of the SICU, showed me the hardware and software, and took me up to meet his team.  If things go well, they might start selling the system to other hospitals.  He suggested that they might have a position open this fall if the startup continues to succeed.  I told him I would consider it.

How Shading a Line Drawing Reminded Me of Graphics Algorithms

Someone I know drew this cute frontal portrait:


They didn’t know how to shade it, so they asked me to do so.  I used Procreate for iPad and a stylus.  A half hour later:


This got me thinking about graphics algorithms!  (I loved my college course in graphics algorithms, back in the day.  We used OpenGL and C++ to do some pretty neat stuff.  The projects were challenging but rewarding.)

I’m thinking that it shouldn’t be too difficult to write an algorithm that takes an unshaded frontal portrait and uses some heuristics about relative angles and lengths to transform it into a believable three-dimensional face.  Then it can place a light source at a prespecified location and “shade” the face automatically.  I don’t know if this has been done, but it seems fairly straightforward, so it probably has!
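The automatic-shading step could bottom out in something as simple as Lambertian (diffuse) shading: brightness proportional to how directly the surface faces the light.  Here’s a minimal Ruby sketch with illustrative names; a real implementation would evaluate this per pixel over the inferred 3-D surface.

```ruby
# Lambertian (diffuse) shading: intensity is the dot product of the
# surface normal and the light direction, clamped at zero so surfaces
# facing away from the light are dark rather than "negatively lit".

def dot(a, b)
  a.zip(b).sum { |x, y| x * y }
end

def normalize(v)
  len = Math.sqrt(dot(v, v))
  v.map { |x| x / len }
end

# Shade intensity in [0, 1] for a surface normal under a directional light.
def lambert(normal, light_dir)
  [dot(normalize(normal), normalize(light_dir)), 0.0].max
end

# A surface facing the light head-on is fully lit:
lambert([0, 0, 1], [0, 0, 1])  # => 1.0
# One perpendicular to (or facing away from) the light gets nothing:
lambert([1, 0, 0], [0, 0, 1])  # => 0.0
```

The interesting (and hard) part of the idea above isn’t this formula; it’s inferring plausible surface normals from a flat line drawing in the first place.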

Hurdles in Software Development: My Experience

Great Wave off Kanagawa, by Hokusai

I’ve been intermittently working on a little Ruby application with a GUI (graphical user interface) for my department.  This is my first real piece of software in about a decade.  Interestingly, I encountered many of the same issues I recurrently wrestled with in college while writing programs.

The natural history of software development, for me, seems to go something like this:


I start working on a project, and things go pretty smoothly until I encounter hurdles, such as any of the following:

1. Lack of knowledge (e.g., not knowing how to create a GUI wrapper for a Ruby program).

SOLUTION: Search online or in a book until I find the right information (e.g., how to create GUIs in Ruby).

2.  A broken system (e.g., a particular Ruby gem, green_shoes in my case, just will not install on my particular Mac OS X installation).

SOLUTION: After bashing your head against the problem by trying many different possible solutions, either give up entirely or switch to a different system (e.g., green_shoes installed successfully on my Linux machine).

3. Lack of insight (e.g., not realizing that I should have coded something a different way).

SOLUTION: That “Eureka!” moment when you suddenly realize what you did wrong and go back and fix it.  This often comes to you when you’re not programming at all but are doing something entirely different.  (Some people attribute these solutions to the “unconscious mind”.)

4. THE FINAL HURDLE.  There is always (almost always, anyway) a final hurdle that is way, way bigger than any of those that came before it.  You won’t finish your program without surmounting this tsunami.  Sometimes, THE FINAL HURDLE is followed by “aftershock” hurdles that try to trip you up before you reach the finish line.  THE FINAL HURDLE in my current project took several times longer to surmount than any that came before it:  I wrote my basic Ruby program in a half hour, it worked, and then I wrapped a GUI around it by jumping over several minor/submajor hurdles.  I didn’t anticipate being blindsided by THE FINAL HURDLE, though, which was that green_shoes just would not install on my OS.  (Why did I use green_shoes instead of the original shoes.rb, you ask?  It’s because I couldn’t install gems beyond those already included in Shoes…trust me, I tried every reasonable thing.  Anyway, green_shoes is a more elegant solution than Shoes because it’s an ordinary gem that you require from within your program.)

SOLUTION:  Cycle through the solutions to hurdles 1-3 until you surmount THE FINAL HURDLE and finish your program.   Save some energy for any sneaky minor/submajor hurdles waiting for you before the finish line.  You don’t want to die before you get there.

Programming takes a lot of perseverance.  You’re an explorer, often lost and grappling with profound difficulties, but you’re getting stronger.  Eventually, if you stick with it long enough, you get better and the hurdles become smaller, for whatever that’s worth.

However, the programming world is always changing, so new programming languages and other systems are popping up all the time, with new 100-ft tsunamis headed your way…

Learning Ruby with Codecademy


I just finished the interactive Ruby course on Codecademy.  I highly recommend it!  It’s a free, interactive, engaging introduction to Ruby and is much more fun than reading a book.

I initially learned some Ruby (while learning Rails) back in my fourth year of medical school and liked it a lot.  I then didn’t get to use it for years and sort of forgot it (but you don’t really forget…when you get back into it, it feels like reconstituting powdered milk or something: things come back to you pretty quickly).

I haven’t programmed seriously for a decade and was surprised at how much came back to me while taking this course.  I even remembered some esoteric stuff from C++ (I used to program in C++, Java, Lisp, and several other languages before I left computer science for medicine).

When I was a computer science undergraduate student, the emphasis was on theory, not on practical stuff like web programming (which was, quite frankly, rather looked down upon).  If you knew how to create websites, it was usually because you taught yourself.