2011/04/30

Change management

When developers hear of "version control" they tend to think of CVS, SVN, Git or other source code version control systems. The idea behind version control is to have a repository of the work done so far, accompanied by the corresponding metadata (who changed a source code file, what was changed, what comment the committing developer left, when the change was committed, etc.). Version control is such a basic and essential facility that no developer or programming hobbyist who takes their work seriously can live without it.

From the perspective of the end-user, the corresponding process for managing the life cycle and operation of a payment system is change management. The term has a somewhat different meaning here: it refers to the process used to track changes made to the payment system. Change management encompasses the complete range of activities that have to do with the configuration of a payment system. These may include installation of new modules, installation of patches, database or file configuration changes, operating system patches or configuration changes, the method by which each change is applied and other related information.

The main deliverable of change management is a detailed audit trail of installable modules, configuration changes and people that authorized and performed an installation or change to the payment system. This has the following benefits:
  • A change management process ensures proper change authorization.
  • It enforces separate roles for the people requesting a change and those who actually apply it.
  • A change management process can identify change prerequisites or areas affected by a change and it becomes easier to properly manage any related issues.
  • Change reporting is feasible in a very structured fashion. It is easy to get a snapshot of the latest installed modules, patches or configuration. Likewise, it's also easy to find out the path of installs and changes that resulted in the current system. And it's also trivial to attribute changes to the business entity or person that authorized them. 
  • Once in place, a properly used change management system forces you to formally evaluate each change and assess the related implications. Likewise, it ensures that a change is applied by the authorized people in the correct manner. In short, it forces you to do things the right way.
It appears that for some organizations change management is a difficult notion to grasp, despite its easily defined purpose. After all, what's it good for? All organizations have some kind of process that needs to be followed in order to authorize and make changes to a payment system, so what would change management add to this process?

Well, sooner or later (most probably sooner) one of the following questions will pop up:
  • "I want to create a new QA system, what do I need to install and what do I need to configure?"
  • "Who authorized taking that patch live?"
  • "I want to create an image of the production server for DR purposes. What's installed in production?"
  • "Why did we make that change in the database?"
  • "How is the system configuration different to the default configuration?"
  • "What are the prerequisites before installing this patch in production?"
Without change management, answering each of the above is a time-consuming and error-prone task. Change management is therefore very important. It doesn't matter how it is implemented. An organization may elect to create a home-grown change management system, purchase a license for an existing system or opt for a web-based or SaaS offering. The important thing is to have the process available and functioning.
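To make the idea of an audit trail concrete, here is a minimal sketch of the kind of record a change management process produces. The field names and the separation-of-duties check are illustrative assumptions, not a prescription for any particular product.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ChangeRecord:
    """One entry in the change management audit trail."""
    target: str          # module, patch or configuration item
    action: str          # "install", "patch", "config-change", ...
    requested_by: str
    authorized_by: str
    applied_by: str
    applied_at: datetime
    comment: str = ""

class ChangeLog:
    def __init__(self):
        self.records: list[ChangeRecord] = []

    def record(self, rec: ChangeRecord) -> None:
        # Enforce separation of duties: the person applying a change
        # must not be the one who authorized it.
        if rec.applied_by == rec.authorized_by:
            raise ValueError("authorizer and applier must differ")
        self.records.append(rec)

    def who_authorized(self, target: str) -> str:
        """Answers 'Who authorized taking that patch live?'"""
        return next(r.authorized_by for r in reversed(self.records)
                    if r.target == target)

    def current_state(self) -> list[str]:
        """Snapshot of everything applied, in order of application."""
        return [r.target for r in
                sorted(self.records, key=lambda r: r.applied_at)]
```

Even a toy structure like this answers most of the questions above mechanically instead of by archaeology; a real system would add prerequisites, affected areas and approval workflow on top.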

2011/04/21

Tokenization

For several reasons, the acquirers and banks in the part of the world that I live in have not seriously gone after the web merchants to make them become PCI compliant. I guess that PCI is one of those topics that everyone hates. I believe that for merchants in particular, the subject must really be viewed as a major nuisance.

The word is out, though, that Visa is getting stricter and that there's going to be a flurry of activity. Well, it was about time. I'm not a supporter of PCI just for the sake of PCI. But if you think about it, Visa is doing the merchants and the acquirers a favor. Security holes and procedural gaps that could potentially hurt the merchant and the acquirer will be assessed and addressed, and that's a good thing. A security breach can be serious enough to close you down. Even if you put monetary losses aside, a breach can generate enough negative publicity to put you out of business.

One way to minimize the scope of PCI is to introduce tokenization. Web merchants that use this technology can get to a point where no sensitive data ever gets into their systems, regardless of what those systems are. Only a card token and possibly a transaction token are stored. This information, if stolen or intercepted, is useless to a data thief and cannot be used to send fraudulent transactions from some other part of the world.
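The core mechanism is simple: the card number never reaches the merchant, only an opaque token does, while the real mapping lives inside the tokenization provider's (PCI-scoped) vault. The sketch below is a deliberately naive illustration of that idea; real vaults use hardware security, format-preserving tokens and far stronger controls.

```python
import secrets

class TokenVault:
    """Toy token vault, as a tokenization service provider might run one.
    The merchant never sees this mapping; it stores only the token."""

    def __init__(self):
        self._pan_to_token: dict[str, str] = {}
        self._token_to_pan: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        # Reuse the existing token for a known card (useful for
        # recurring payments); otherwise mint a random token that
        # cannot be reversed without the vault.
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        token = "tok_" + secrets.token_hex(8)
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault, inside the provider's secure perimeter,
        # can map a token back to the card number.
        return self._token_to_pan[token]
```

A thief who steals the merchant's database gets only `tok_…` values, which are useless without access to the vault itself.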

There obviously need to be changes in the authorization flow if tokenization is to be introduced. For the acquirer the exercise is not exactly straightforward. Tokenization is not the typical service that acquirers provide to their merchants, at least not at this point in time. For banks that provide acquiring services to web merchants (and hence have smaller numbers of e-commerce transactions), the economics of the business case are even trickier. This is the reason why there are service providers with offerings that are centered mostly around tokenization and data security.

As payment systems continue to evolve, they will doubtlessly include tokenization as part of the standard transaction flow and the window of opportunity for tokenization service providers will close. Securing the authorization process with card and transaction tokens will be the first step. Handling recurring payments is somewhat trickier and could be a batch-based process, but eventually payment systems will provide a solution for that business need as part of their standard out-of-the-box packaging as well.

2011/04/14

Ownership continued

Some time after blogging about "Ownership", it so happened that I got into a meeting with a company that provides payment consulting services. The people from that consultancy seemed nice and they have certainly been around the block a few times. After discussing a migration project, I was left with the impression that they knew what they were talking about.

As the conversation unfolded, we started broaching the topic of testing and certification. I quickly discovered that the consultants had a very different view of how testing and certification should be handled. I am of the opinion that when an end-user (a bank) wants to in-source their payment system, they should take ownership of all testing and certification activities and immerse themselves in these tasks.

To my initial surprise, I discovered that the consultants favored the extreme opposite approach. They pretty much wanted to come up with the test cases, execute the tests and be in charge of certification cycles. In short, they felt it preferable to take over these tasks themselves rather than help the end-user do them.

"How can the end-user ever get off the ground if they do not own these processes?", I thought to myself. "How can the end-user even operate their own payment system and innovate with it if they can't even go through a test cycle themselves?".I presented these concerns to them. Their response was along the lines "We can provide this service to the end-user".

Sometimes I'm pretty slow, but at this point I grasped the obvious difference between us. The consultants are trained to act on behalf of the end-user. They are hired by the end-user and most of the time adopt the viewpoint of the end-user. When they are successful in a project, they think and act as the end-user and in essence they are the end-user. At the other end of the spectrum lies the vendor that provides the payment system software - the vendor would be very comfortable just selling a license. I happen to sit comfortably in the middle. I work with payment system vendors and, when I'm successful, I help the end-users integrate the payment system into their organization.

In the end, the consultants virtually re-validated my opinion: for an in-sourced payment system, the end-user must step up and take ownership of the system. Whether this is achieved by the end-user's own resources or by getting hired guns to do the job is a different matter. I'm not by default against getting consultants to run an operation and take on all the things that an end-user might find distasteful. Sometimes the conditions and the economics may be right to do just that, perhaps for a large retailer, a specialized processor or a global bank. But in the long run it just doesn't make sense for the average case. If continuous integration, patching, testing and certification activities appear to be more than an end-user can bear, there's a perfect alternative which is out-sourcing.

2011/04/08

Complexity

Looking back at the 90s, the tools of the trade of programmers implementing payment solutions were horrific by today's standards. A great deal of work was done using Telnet and I remember that I quickly learned to hate the blue, text-only interface of the TTWin terminal emulator. Everything was slow. It wasn't a problem with the terminal emulator or with the text-based environment in itself, just a lack of programming accelerators and facilities. Whenever a project came along that allowed development outside of the proprietary box running the EFT switch, the contrast was evident.

Auxiliary tools were also non-existent or downright primitive. Message protocol simulators were basic and moody. Terminal simulators were starting to emerge but were costly and sometimes hard to use. Writing and executing tests was done with ad-hoc tools so esoteric that it was often too tempting to just do the whole thing manually. Traces were hard to read. Network protocols like SNA and X.25 required considerable expertise to get a hold of. And if you had to remotely connect to another system to do something, you had to go through a modem - it often turned out that the bandwidth of myself getting up and physically commuting to the remote system was considerably larger than that of the modem.

Well, nowadays the tools have been considerably upgraded. We're now coding in Eclipse or Studio, two very powerful IDEs with great facilities. Workstations have become very powerful. Dual monitor setups are becoming the norm, providing valuable additional screen real estate. Simulators are sometimes built into the payment products and, if they aren't, they're easy to build, free to download or inexpensive to buy. Test facilities and test harness suites are abundant, easy to work with and sometimes even free of charge. TCP/IP has dominated and has become the network protocol of choice; the implementation of other network protocols is usually left to dedicated network devices. Remote connections can be established in a secure and fast manner.

The weird thing though is this: while the state of the facilities and tools available has improved by an order of magnitude, productivity hasn't improved as fast. Back in the 90s the rule of thumb for estimating the implementation of an extensive stream message protocol was six months or more. Now it's three months or more. Way faster than before - after all, being able to do a job in half the time is a big improvement. But it's not an order of magnitude improvement.

Implementation of small-sized projects has benefited greatly from software evolution and maturity of tools. I have repeatedly been able to complete in a week projects that could easily take months in the bad old days. But the speed improvements diminish rapidly with the expanding size of projects. We're still faster but not by the same degree. And there's a very simple explanation for it. Complexity has increased. A gazillion new factors and acronyms have been introduced in our daily development cycle since the 90s. BCP, CNP, EMV, PCI, ISO 20022, NFC, CAP are some of the things that we have to keep in our heads.

This is by no means unique to the payments world. Software in general has become much more complicated. Attempts to simplify things and hide all this complexity behind frameworks, SDKs and APIs are partially successful, but so far frameworks are not always able to help us keep up with increased complexity. Perhaps the abstractions provided by the frameworks will become better over time. Perhaps frameworks, like software, are themselves becoming too complex and contribute to the problem in their own way. The one thing that's clear is that the human factor is becoming more important in our line of work. In the bad old days, a good programmer without payment systems experience could stand on his own two feet after a few months. This is not the case anymore. The individual has to capture and process a massive amount of information before beginning to see the light. Getting, and keeping, good people is now more important than ever.

2011/04/01

Authorization scripting

Modern payment systems usually offer a multitude of configuration options that cater for a large variety of situations. The thing, though, with such configuration parameters is that they are meant to cover the situations that were thought up by the system designers. The extent of configuration parameters is limited by the architectural imagination of the system designers. When users of payment systems delve outside these limits (something that tends to happen quite often) there comes a moment when configuration just doesn't cut it.

When configuration can't do the job, the traditional way of implementing changes in a payment system is through adding code. This could manifest in many forms depending on the system architecture. An external message protocol may be coded as a plug-in. A component may allow user exits to be written. If code for core components is available, end users may hack away at it and change the system behavior to implement their requirements (a practice that they universally regret in the years to come).

On the business side, end users don't like code changes for several reasons.
  • They usually take a disproportionate amount of time to implement and thoroughly test.
  • They carry a financial cost.
  • Custom code tends to cater for uncommon situations and is generally not well covered by patches and upgrades.
  • When piled up, custom code may erode the system into a kind of spaghetti state.
A not-so-new development that claims to minimize the need for custom code is scripting. What a scripting facility in a payment system offers is an abstraction of many of the intricacies of application languages, making it possible for non-programmers to quickly implement a script. In addition, scripts are generally interpreted and thus more dynamic.
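To illustrate what "interpreted and dynamic" buys you, here is a toy sketch of an authorization scripting hook: rules are plain source strings that can be swapped in at runtime, with no rebuild or restart. The class and the rule syntax are my own illustrative assumptions, not any particular vendor's facility.

```python
class ScriptEngine:
    """Toy scripting hook for an authorization flow. A 'script' is a
    source string that sets the variable `approved` after inspecting
    the transaction dictionary `txn`."""

    def __init__(self):
        # Default rule: approve everything.
        self._script = compile("approved = True", "<script>", "exec")

    def load(self, source: str) -> None:
        # Re-compiling replaces the rule on the fly - no downtime,
        # which is the main draw of an interpreted facility.
        self._script = compile(source, "<script>", "exec")

    def authorize(self, txn: dict) -> bool:
        env = {"txn": txn, "approved": True}
        exec(self._script, env)   # interpreted per transaction
        return env["approved"]
```

A business user could change the floor limit in such a rule without touching the core system, which is exactly the appeal discussed next - along with the performance and maintenance costs of interpreting a script on every transaction.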

To the user, scripting seems like a panacea, but is it? It's true that a scripting facility offers some advantages.
  • The interpreted nature of scripts makes it easy to dynamically inject them in the authorization process without downtime.
  • Some customizations are very easy to implement with a script.
  • Implementation of a script generally requires less time and resources than a code change.
  • Scripts can be much more easily implemented by end users than code changes can.
However, not everything is as it seems.
  • By default, execution of scripts is an order of magnitude slower than execution of code.
  • Just as with configuration, the scripting model places restrictions on the customizations that are possible with a scripting language.
  • Making it easier to implement changes doesn't necessarily translate to a more manageable situation. Instead of code spaghetti you could have script spaghetti of the same complexity, especially with script-happy users.
  • The burden of script maintenance and proper alignment with system patches and upgrades is generally on the end user.
A proper scripting facility is obviously a nice addition to a payment system. It's better to have it than not to have it. But it should also be appropriately rated by end users. No scripting facility, regardless of how useful it may be, can make up for shortcomings of a system in other critical areas. And, regardless of the versatility of the scripting facility, there will always come a time when implementing custom code is inevitable. In that respect, a scripting facility is useful, but a software development kit is what's really invaluable.