Pages

Thursday, August 17, 2017

Why I Choose Delphi

Lately my work as an enterprise architect has focused on the problem of data reconciliation and legacy modernization in the healthcare space. This focus began for me while working at the US Department of Veterans Affairs on web-enablement of their "legacy" VistA electronic health record (EHR).

VistA is written in a language called MUMPS that provides very fast, flexible storage referred to as "globals." More precisely, VistA is based on a DBMS written in MUMPS called FileMan that also happens to be shared by other EHRs, such as the Indian Health Service's RPMS and the DoD's CHCS. FileMan, VistA, and RPMS are in fact open source.

For years the primary user interface to VistA has been a Windows desktop program called CPRS, the "Computerized Patient Record System," which was written in Delphi. Although few doctors actually "like" their EHR, CPRS has a solid reputation among clinicians as one of the least-bad out there from a usability point of view.

CPRS communicates over TCP according to a protocol called the "RPC Broker" which is built into FileMan. RPC stands for, of course, Remote Procedure Call. On the server side, RPC calls are simply MUMPS extrinsic functions that meet a certain signature and are registered as such, usually by an installation package. Much of what is considered "bad" about CPRS stems from how chatty and slow the RPC Broker approach is when you put the demands of a multi-threaded, highly modular application like CPRS on it. Anyway.
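The server-side registration idea can be sketched by analogy. This is Python standing in for MUMPS, and the registry and dispatcher below are my own illustrative constructs, not FileMan's actual mechanism; "ORWPT SELECT" is shown only as an example RPC name:

```python
# Illustrative analogy only: the RPC Broker exposes registered MUMPS
# extrinsic functions by name; here a plain dict plays the registry's role.
rpc_registry = {}

def rpc(name):
    """Decorator that registers a handler function under an RPC name."""
    def register(fn):
        rpc_registry[name] = fn
        return fn
    return register

@rpc("ORWPT SELECT")  # example name; real handlers read FileMan globals
def select_patient(dfn):
    # Return a canned record in place of a real FileMan lookup.
    return {"dfn": dfn, "name": "DOE,JOHN"}

def dispatch(name, *params):
    """Look up a registered handler by name and invoke it with parameters."""
    if name not in rpc_registry:
        raise KeyError(f"RPC not registered: {name}")
    return rpc_registry[name](*params)
```

Each CPRS screen element that needs data issues one or more such named calls over the wire, which is exactly where the chattiness comes from.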

I had an opportunity while working on a medication reconciliation application at the VA to integrate with CPRS and rekindle my old Delphi flame by building a prototype "tools menu" extension.

I was delighted to learn how much it had grown up in the intervening years since I last had a chance to work on it. Although CPRS has been "stuck" on Delphi 2007 for reasons beyond my fathoming, my "tools menu" extension was built on a much newer version (XE8 at the time, IIRC), and thank God. I was able to use LiveBindings simply and to good effect, and I discovered a way to use the MVVM pattern for cross-platform development of VCL and FMX applications that actually worked. (MVVM was originally conceived for WPF, which didn't provide anywhere near as seamless a reuse story between the then-disparate .NET runtimes.) I missed nothing from Java or C# as languages --- anonymous methods, generics, attributes, you name it.
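For readers unfamiliar with the pattern, here is the bare shape of MVVM, sketched in Python rather than Delphi (the class and property names are mine; the property-change notification is the part a binding framework like LiveBindings automates):

```python
class ViewModel:
    """Holds presentation state and notifies observers when it changes.
    Views (VCL, FMX, or a test harness) bind to this, never to each other."""

    def __init__(self):
        self._observers = []
        self._patient_name = ""

    def subscribe(self, callback):
        """Register a callback fired as (property_name, new_value)."""
        self._observers.append(callback)

    @property
    def patient_name(self):
        return self._patient_name

    @patient_name.setter
    def patient_name(self, value):
        self._patient_name = value
        for cb in self._observers:
            cb("patient_name", value)

# Any "view" binds the same way, which is what makes the view model reusable:
events = []
vm = ViewModel()
vm.subscribe(lambda prop, val: events.append((prop, val)))
vm.patient_name = "DOE,JOHN"
```

Because the view model knows nothing about VCL or FMX widgets, the same presentation logic can back both UI frameworks.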

As a certified linguaphile and something of a polyglot, I have a deep appreciation for "old" computer programming languages, and how little is really "new" in the new languages that are continually popping up these days. Truly, we live in a kind of Cambrian explosion of programming languages! Yet even the ancient MUMPS, which comparatively speaking makes Delphi look like a young puppy of a language, has some interesting redeeming features that I "get." I am also a huge fan of Lisp, for that matter, which, I'll note parenthetically (ahem), lately has been enjoying a bit of a renaissance in the form of Clojure, a wonderfully well-designed Lisp built to be hosted by runtimes like the JVM.

Anyway, my point is that as much as I love the bleeding edge of programming language evolution, I am not one to pooh-pooh a language just because it's been around the block a couple of decades. Legacy modernization in my experience has more to do with fundamental design and architecture considerations than it does with language, though people often blame languages when a particular architectural pattern or paradigm goes out of vogue, or when a way to repackage tried-and-true patterns with hip new buzzwords captures the industry's attention for a season.

I would love to see a Delphi renaissance, specifically in the service of legacy application modernization. Embarcadero's focus on distributed systems and the Internet of Things is timely, needed---not to mention hip---and I'm glad that beating MS on Windows is no longer Delphi's only reason for existence.

My other major professional focus these days is data reconciliation---the problem of truth in the presence of multiple sources of truth that are out of sync (which is a nice way of saying that one or more of them are currently untrue). Needless to say, data-driven applications have always been a sweet spot for Delphi, and working with its native data set abstractions has always been a pleasure. Although I consider Delphi's tooling still a bit too centered on SQL-based data, SQL remains totally relevant in a world looking to migrate to new, mobile-savvy computing paradigms, and will be for a long time.
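As a toy illustration of the out-of-sync problem (the record shape and the last-write-wins rule here are my own simplifications for the sketch, not a recommended reconciliation policy):

```python
from datetime import datetime

def reconcile(sources):
    """Merge records keyed by id from several sources, keeping the most
    recently updated copy and flagging ids where sources disagree."""
    merged, conflicts = {}, set()
    for source in sources:
        for rec in source:
            rid = rec["id"]
            if rid in merged and merged[rid]["value"] != rec["value"]:
                conflicts.add(rid)  # two "truths" for the same record
            if rid not in merged or rec["updated"] > merged[rid]["updated"]:
                merged[rid] = rec
    return merged, conflicts

# Two systems hold different versions of the same medication order:
ehr_a = [{"id": 1, "value": "lisinopril 10mg", "updated": datetime(2017, 3, 1)}]
ehr_b = [{"id": 1, "value": "lisinopril 20mg", "updated": datetime(2017, 5, 1)}]
merged, conflicts = reconcile([ehr_a, ehr_b])
```

In real medication reconciliation the conflicts set is the interesting output: those are the records a clinician has to adjudicate, because no timestamp rule can decide which dose is clinically true.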

I love the "true native" focus of Embarcadero's mobile strategy as well. Cross compilers have made the job of compiling to multiple platforms much easier than in the early days, when surrendering freedom to nanny runtimes like the JVM and CLR in exchange for not having to worry about it made more sense. And app stores have largely taken away the big advantage web-based applications claimed over mere "desktop" apps---namely, ease of deployment and maintenance.

Although there are some rough edges to this strategy for Delphi component developers in the short term, if Embarcadero can build on the success of CPRS, Skype, and other popular applications that are part of the Delphi story, then long term I think Delphi and RAD Studio can evolve into the premier language and programming environment for building long-running, well-loved, and highly usable distributed mobile and IoT applications.

So, that's why I choose Delphi. The only way to make that happen is to be part of it.


8 comments:

  1. I agree where you say the data tooling is a bit too centered on SQL-based data. Tabular relational data goes back decades, and it's still somewhat of a new vogue to have more nuance in 'local' structure (e.g. NoSQL), so I guess it's safe to say there's still some time needed for more patterns to emerge. In the meantime I tried to make something work for myself, and left the typical Delphi data tooling behind to roll a MongoDB connector myself, heavily based on Variants in an attempt to get a really succinct syntax: https://github.com/stijnsanders/TMongoWire

  3. Forgot to mention this: though there's quite an age difference, what I understand about MUMPS globals is that they're almost a spitting image of what JSON has to offer for structuring data... I guess history really does repeat now and again.

  4. Would love to see support for graph databases in the near future.

  6. @Stijn - Actually, MUMPS globals aren't entirely compatible with JSON structure. A MUMPS global node at some subscript, say, ^Foo("Bar","Baz"), can have both a value and multiple children, e.g.

    ^Foo("Bar","Baz")="Jabberwocky"
    ^Foo("Bar","Baz","Qux")

    There is no agreed-upon way to represent this in JSON by default; you must adopt a convention that specifies how to represent node _values_ distinctly from node _children_ for each logical global node as a JSON object. It can be done, but to call it "hand meet glove" ignores the nuances of globals. If that doesn't convince you, note also that MUMPS globals are automagically sorted for you (by a collation that assumes ASCII alphanumeric characters by default). JSON objects, by contrast, have no guaranteed key order according to the spec.
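    One possible convention is sketched below in Python (the reserved "_v" key is an arbitrary choice of mine, not a standard):

    ```python
    import json

    # Mapping a MUMPS global node that has BOTH a value and children into
    # JSON requires reserving a key ("_v" here, an arbitrary convention)
    # for the node's own value, with child subscripts stored alongside it.
    node = {
        "_v": "Jabberwocky",   # value of ^Foo("Bar","Baz")
        "Qux": {"_v": ""},     # child node ^Foo("Bar","Baz","Qux")
    }
    tree = {"Foo": {"Bar": {"Baz": node}}}

    # MUMPS subscripts are always stored in collated order; JSON objects
    # promise no order, so a faithful serialization must sort explicitly.
    encoded = json.dumps(tree, sort_keys=True)
    ```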

  7. @"The Chief Priest" : Index-free adjacency (IFA) for the win! ;)

    I'm a huge fan of graph databases. SQL/relational databases paradoxically do better when the number of entities and relationships is comparatively small relative to the size of the data. You very quickly get to a place in any *enterprise* data model where more rows of data mean measurably slower query performance. The nice thing about the graph model is that there is nothing to "compute" at query time - you're just iterating real edges between nodes, and that is FAST. You do have to think differently about how you model graphs, though: model it the way you plan to actually query it, because deeply nested traversals can undo the benefits of IFA.
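    The "iterating real edges" point can be sketched with plain Python dicts standing in for a graph store's node records (the node ids and edge layout are invented for illustration):

    ```python
    # Index-free adjacency, sketched: each node record carries direct
    # references to its neighbors, so traversal is pure pointer-chasing
    # and never consults a separate index or performs a join.
    nodes = {
        "patient:1":    {"edges": ["med:10", "med:11"]},
        "med:10":       {"edges": ["ingredient:7"]},
        "med:11":       {"edges": []},
        "ingredient:7": {"edges": []},
    }

    def neighbors(node_id, depth):
        """Collect every node reachable within `depth` hops of outgoing edges."""
        frontier, seen = {node_id}, set()
        for _ in range(depth):
            frontier = {e for n in frontier for e in nodes[n]["edges"]} - seen
            seen |= frontier
        return seen
    ```

    Note the cost of each hop depends only on the edges actually touched, not on the total number of rows in the store - which is the IFA win.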

  8. I would have loved to meet you while I was there last year at the WorldVistA meet or at the OSEHRA summit, but other than perhaps Kevin T and a few within the VA like Anthony, Harvey, Julie, etc., I thought all the CPRS/Delphi professionals had perhaps already migrated to greener pastures...

    By the way, if my understanding is correct, CPRS is now officially on XE3, though I had compiled it for XE7 about 2 years back while working to replace its MS Office-based spell checker for the WorldVistA client...

    cheers
