Integration Or Incantation?

I was travelling recently with Virgin Atlantic. I went to check in online, typed in my booking code and selected both our names, clicked "Next", and got an odd error saying that I couldn’t check in. I wondered momentarily if it was yet more pre-Brexit paranoia about Frances’ Irish passport, but there was a "check in individually" option which rapidly revealed that Frances was fine; it was my ticket which was causing the problem.

The web site suggested I ring the reservation number, which I did, listened to 5 minutes of surprisingly loud rock music (you never mistake being on hold for Virgin with anyone else), and got through to a helpful chap. He said "OK, I can see the problem, I will re-issue the ticket." Two more minutes of the distinctive music, and he invited me to try again. Same result. He confirmed that we were definitely booked in and had our seat reservations, and suggested that I wait until I got to the airport. "They will help you there." Fine.

Next morning, we were tackled on our way into the Virgin area by a keen young lady who asked if we had had any problems with check-in. I said we had, and she led us into what can best be described as a "kraal" of check-in terminals, and logged herself into one. This displayed a smart check-in agent’s application, complete with all the logos, the picture of Branson’s glamorous Mum, and so on. She quickly clicked through a set of very similar steps to the ones I had tried, and then clicked OK. "Oh, that’s odd", she said.

Next, she opens up a green screen application. Well, OK, it’s actually white on a Virgin red background, but I know a green screen application when I see one. She locates my ticket, checks a few things, and types in the command to issue my pass. Now I’m not an expert on Virgin’s IT solutions, but I know the word "ERR" when I see it. "Oh, that’s not right either" says the helpful young lady "I’ll get help".

Two minutes later, the young lady is joined by a somewhat older, rather larger lady. (OK, about the same age as me and she looked a lot better in her uniform than I would, but you get the idea.) "Hello Mr Johnston, let’s see if we can sort this out". She takes one look at the screen and says "We actually have two computer systems, and they don’t always talk to each other or have the same information."

… which could be the best, most succinct summary of the last 25 years of my career I have heard, but I digress…

Back to the story. The new lady looks hard at both applications, and then announces she can see the problem (remember, all this is happening on a screen I can see as well as the two Virgin employees). "Look, they’ve got your name with a ‘T’ here, and no ‘T’ here" (pointing to the "red screen" programme).

Turning to the younger lady, she says "Right, this is how to fix it." "Type DJT, then 01" (The details are wrong, but the flavour is correct…) "Put in his ticket number. Type CHG, then enter. Type in his name, make sure we’ve got the T this time. Now set that value to zero, because this isn’t a chargeable change, and we can do a one letter change without a charge. Put in zero for the luggage, we can change that in a minute. Type DJQ, enter. Type JYZ, enter. OK, that’s better. Now try and print his pass." Back to the sexy new check-in app, click a few buttons, and I’m presented with two fresh boarding passes. Job done.

Now didn’t we have a series of books where a bunch of older, experienced wizards taught keen young wizards to tap things with sticks and make incantations? The solution might as well have been to tap the red screen programme with a wand and shout "ticketamus"…

The issues here are common ones. Is it right to be so dependent on what is clearly an elderly and complex legacy system? Are the knowledge transfer processes good enough, or is there a risk that, next time, the more experienced lady who knows the magic incantations might not be available? Why is such a fundamental piece of information as the passenger names clearly being copy-typed, rather than handled by the automated integrations? And is this a frequent enough problem that there should really be an easier way to fix it? Ultimately the solutions are traditional ones: replace the legacy system, or improve its integrations, but these are never quick or easy.

Now please note I’m not trying to get at Virgin at all. I know for a fact that every company more than a few years old has a similar situation somewhere in the depths of their IT. The Virgin staff were all cheerful, helpful and eventually resolved the problem quickly. However it is maybe a bit of a management error to publicly show the workings "behind the green screen" (to borrow another remarkably apposite magical image, from the Wizard of Oz). We expect to see the swan gliding, not the feet busily paddling. On this occasion it was interesting to get a glimpse, and I was sympathetic, but if the workings cannot be less dependent on "magic", maybe they should be less visible?

Posted in Agile & Architecture, Thoughts on the World

Singing With Each Other

We went to see The Hollies at G Live in Guildford last night. While the words and melodies were those we loved, and the instrumental performances were good, the trademark harmonies sounded, frankly, a bit flat, and I wondered if they had finally lost it.

Then, towards the end of the first set, they announced an experiment. They would sing a song around one microphone (“you know, like in the days when we only had one mike”). The three main vocalists moved together and sang Here I Go Again. Suddenly the sound was transformed. The magic was back. It sparkled. It flew. It disappeared, sadly, when the song ended and they moved back to their respective positions on the large stage.

If I had to characterise what happened, and I was being slightly harsh, I would say for a short time they were singing together, but the rest of the time they were singing at the same time.

Now we know that G Live has an odd, flat acoustic. We have seen experienced stand-up comedians struggle because they can’t hear the laughter, and other experienced musicians ask for monitor/foldback adjustments mid-performance. However we seem to have really found the Achilles heel of this otherwise good venue – it doesn’t work if you need to hear what other people are singing and wrap your voice around theirs.

Next time, guys, please just ignore the big stage and use one mike. We’ll love it!

Posted in Thoughts on the World

Collection, or Obsession?

I have decided to start another collection. Actually the real truth is that I’ve got a bit obsessive about something, and now I’m trying to put a bit of shape and control on it.

I don’t generally have an addictive personality but I do get occasional obsessions where I get one thing and then have to have more similar things, or research and build my kit ad infinitum, until the fascination wears off a bit. The trick is to make sure that it’s something I can afford, where ownership of multiple items makes some sense and where it is possible to dispose of the unwanted items without costing too much money.

Most of my collections involve clothing, where it makes reasonable sense to buy another T-shirt, or bright jacket, or endangered species tie (of which I may well have the world’s largest collection). They can all be used, don’t take up too much space, and have some natural turnover as favourites wear out. Likewise I have a reasonable collection of malt whiskies, but I do steadily drink them.

Another trick is to make sure that the collection has a strong theme, which makes sure you stay focused, and which ideally limits the rate of acquisition to one compatible with your financial and storage resources. I don’t collect "ties", or even "animal ties", I collect Endangered Species ties, which only came from two companies and haven’t been made for several years. Likewise my jackets must have a single strong colour, and fit me, which narrows things down usefully.

The new collection got started innocently enough. For nearly 18 years my only "good" watch was a Rado Ceramica, a dual display model. About a year ago I started to fancy a change, not least because between changes in my sight, a dimming of the Rado’s digital display, and a lot of nights in a very dark hotel room I realised it was functioning more as jewellery than a reliable way of telling the time. So I wanted a new watch, but I wasn’t inspired as to what.

Then I watched Broken Arrow, and fell in lust with John Travolta’s Breitling Aerospace. The only challenge was that they are quite expensive items, and I wasn’t quite ready to make that purchase. In the meantime we watched Mission Impossible 5, and I was also quite impressed with Simon Pegg’s Tissot T-Touch. That was more readily satisfied, and I got hold of a second-hand one with nice titanium trim and a cheerful orange strap for about £200. This turned out to be an excellent "holiday" watch, tough, colourful and with lots of fun features including a thermometer, an altimeter/barometer, a compass, and a clever dual time zone system. That temporarily kept the lust at bay, but as quite a chunky device it wasn’t the whole solution.

The astute amongst you will have recognised that there are a couple of things going on here which could be the start of a "theme". Firstly I very much like unusual materials: the titanium in all the watches I’ve mentioned, the sapphire faces of the Breitling and the Rado, and that watch’s hi-tech ceramic.

Second, all these watches have a dual digital/analogue display. I’ve always liked that concept, ever since the inexpensive Casio watch which I wore for most of the 90s. Not only is it a style I like, it’s also now a disappearing one, being displaced by cleverer smartphones and smart watches. Of the mainstream manufacturers, only Breitling and Tissot still make such watches. That makes older, rarer examples eminently collectable.

To refine the collection, there’s another dimension. I like my stuff to be unusual, ideally unique. Sometimes there’s a functional justification, like the modified keyboards on my MacBooks, but it’s also why my last two cars started off black and ended up being resprayed. Likewise, when I finally decided to take advantage of the cheap jewellery prices in Barbados and bought my Breitling I looked hard at the different colour options and ended up getting the vendor to track down the last Aerospace with a blue face and matching blue strap in the Caribbean.

Of course, if I’m being honest there’s a certain amount of rationalisation after the event going on here. What actually happened is that after buying the Breitling I got a bit obsessed and bought several and sold several cheaper watches before really formulating the rules of my collection. However I can now specify that any new entrant must be (unless I change the rules, which may happen at any time at the collector’s sole option 🙂 ):

  • Dual display. That’s the theme, and I’m happy to stick to it, for now.
  • Functional and in good condition. These watches are going to be worn, and having tried to fix a duff one it’s not worth the effort.
  • Affordable. This is a collection for fun and function, not gain. While there’s a wide range between the cheapest and most expensive, most have cost around £200, and are at least second-hand.
  • The right size. With my relatively small hands and wrists, that means a maximum of about 44mm, but a minimum of about 37mm (below which the eyes may be more challenged). As I’m no fan of "knuckle dusters" most are no more than 11mm thick, although I’m slightly more flexible on that.
  • Beautiful, or really clever, or both. Like most men, I wear a watch as my only jewellery, and I want to feel some pride of ownership and pleasure looking at it. Alternatively I’ll give a bit on that (just a bit) for a watch with unusual functionality or materials.
  • Unusual. Rare colour and material combinations preferred, and I’m highly likely to change straps and bracelets as well.

Ironically I’m not so insistent that it has to be a great "time telling" device. There are honourable exceptions (the Breitling), but there does seem to be a rough inverse relationship between a watch’s beauty and its clarity. I’m prepared to accommodate a range here, although it has to be said that most of the acquisitions beat the Rado in a dark room.

So will these conditions control my obsession, or inflame and challenge it? Time will tell, as will telling the time…

Posted in Thoughts on the World, Watches

Back to the Future

I’ve opined before about how Microsoft have made significant retrograde steps with recent versions of Office. However this morning they excelled themselves when Office 2016 started complaining about not being activated, and the recommended, automated solution was to do a complete download and "click to run" installation of some weird version of Office 365 over the top of my current installation.

In the meantime, I’ve been working with a main client whose standard desktop is based on Office 2010, and, you know what, it’s just better.

I’ve had enough. Office 2016 and 2013 have been removed from the primary operating systems of all my machines. In the unlikely event that I need Office 2016 (and the only real candidate is Skype for Business), I’ll run it in a VM. Long live Office 2010!

Posted in Thoughts on the World

Business Models

Here’s a business model:

I’m a drug dealer. I sell you a crack cocaine pipe complete with a packet of wraps for £220. It’s a good pipe (assuming that such things exist) – burns clean and always hits the spot (OK I’m making this bit up, it’s not exactly an area of first-hand knowledge.)

To make my business plan work the packet of wraps is half high quality crack cocaine and half icing sugar. You come back to me and I’m very happy to sell you another packet of wraps. This time the price is £340, again for half high quality crack and half icing sugar.

This business model is illegal, and for a number of very good reasons.

OK here is a completely different business model, nothing at all like the last one:

I am a manufacturer of consumer electronics. To be specific I’m a Korean manufacturer of occasionally explosively good consumer electronics. I sell you a printer complete with a set of toner cartridges for £220. It’s a very good printer – quiet, reliable, lovely output (I’m on safer ground here.)

To make my business plan work I put a little circuit in each toner cartridge so that at 5000 pages it says that it’s empty even if it’s still half full. You come back to me and I’m very happy to sell you another set of cartridges; this time the price is £340. Again each cartridge is wired to show empty even when it’s still half full.

For reasons I fail to understand, this model is legal, certainly in the UK.

There is of course an answer but it feels morally wrong. I just put my perfectly good printer in the bin and buy a new one complete with toner cartridges. I have also found a little chap in China who for £40 will sell me a set of chips for the cartridges. Five minutes with a junior hacksaw and some blu-tack and I can double their life.

Maybe the answer is just to throw the printer away every time the cartridges are empty. Surely it is not sustainable for the manufacturer if everyone just does this. But it doesn’t feel right…

Posted in Thoughts on the World

A "False Colour" Experiment

Infrared trees with false colour
Camera: Panasonic DMC-GX7 | Date: 05-07-2017 09:54 | Resolution: 4390 x 1756 | ISO: 200 | Exp. bias: 0.33 EV | Exp. Time: 1/640s | Aperture: 8.0 | Focal Length: 17.0mm | State/Province: Swinhoe, Northumberland | See map

This is a bit of an experiment, but I think it works. I started with an infrared image in its standard form: yellow skies and blue foliage. I then performed a series of fairly simple colour replacement operations in Photoshop Elements: yellow to red, blue in top half of image to dark green, blue in bottom half of image to pale green, red to blue. The result is a bit like a hand-coloured black and white image. I like it, do you?

Posted in Photography

Infrared White Balance

Alnwick Castle Reflections in the Infrared
Camera: Panasonic DMC-GX7 | Date: 05-07-2017 14:29 | Resolution: 4653 x 2908 | ISO: 200 | Exp. bias: 0 EV | Exp. Time: 1/800s | Aperture: 6.3 | Focal Length: 12.0mm | State/Province: Alnwick, Northumberland | See map | Lens: LUMIX G VARIO 12-35/F2.8

"I’m shooting infrared. My main output is RAW files, and any JPGs are just aides memoire. Between my raw processor and Photoshop I’m going to do some fancy channel mixing to either add false colour, or take it away entirely and generate a monochrome image. So I’m assuming my white balance doesn’t matter. Is that right?"

Nope, and this article explains why. If you’re struggling with, or puzzled by, the role of white balance in infrared photography, hopefully this will help untangle things.

Posted in Photography

Liberation from the "Frightful Five"

There’s an interesting NY Times article on our dependency on "Tech’s Frightful Five", which includes a little interactive assessment of whether you could liberate yourself, and if so in which order. I thought it would be interesting to document my own assessment.

  1. Facebook. No great loss. I’ve only started recently and I’m not a terribly social animal. I also have my own website and LinkedIn. Gone.
  2. Apple. Momentary wrench. My only connection to Apple is my MacBook Pro laptop, which is a great bit of kit. However it runs Windows and I’m sure Dell or Sony could sell me a reasonable replacement, although I would really miss the large 16×10 Retina screen.
  3. Alphabet/Google. Harder work, but straightforward. There are alternatives to Chrome as a browser, Google as a search engine, even Android as a phone/tablet operating system. It helps that Google has a bit of a track record of providing something you get to like, and then without warning disabling or crippling it, rendering it of reduced or no value (think Android KitKat, Google Currents, I could go on). There’s a bit of work here, but it could be done.

And then I’m stuck. Like Farhad Manjoo, I find that Amazon has worked its way into a prime (or should that be "Prime"?) position in not only our shopping but also our viewing and reading habits. Yes, there are options, but the pain of transition would be substantial, and the loss of content (almost 400 Kindle books, Top Gear, Ripper Street and the Man in the High Castle among others) expensive. Amazon probably gets 4th place, but don’t ask me to do it! Steps 1-3 would leave me with an even heavier dependency than today on Windows and other Microsoft products and subsidiaries for all my day to day technical actions, and unless we’re going back to the Dark Ages I don’t see good alternatives, so Microsoft gets 5th by default, but it’s not really on the list. Well played, Bill.

Who are you most dependent on?

Posted in Thoughts on the World

What Are Your Waypoints?

Country singer at the Listening Room, Nashville, providing important routeing information!
Camera: Panasonic DMC-GX7 | Date: 24-09-2014 18:14 | Resolution: 3424 x 3424 | ISO: 3200 | Exp. bias: 0 EV | Exp. Time: 1/25s | Aperture: 5.6 | Focal Length: 46.0mm | Location: The District | State/Province: Tennessee | See map | Lens: LUMIX G VARIO 35-100/F2.8

How do you remember the waypoints and landmarks on a journey? What are the key features by which you can replay in your mind, or to someone else, where you went and what you did?

Like any good Englishman, I can navigate substantial sections of our sceptred isle by drinking establishment. This is, of course, a long tradition and officially recognised mechanism – it’s why British pubs have recognisable iconic signs, so that even if you were illiterate you could get yourself from inn to inn. It’s a bit more difficult today thanks to pub closures and the rise of pub chains with less distinguishable names, but it still works. Ask me to navigate you around Surrey, and there will be a lot of such landmarks in the discussion.

When I look back at other trips, especially to foreign parts, the mechanisms change. I can usually remember where I took favourite photographs, even without the GPS tagging, and I could immediately point to the locations of traumatic events whether in motion ("the Italian motorway with the big steel fences either side") or at rest ("the hotel with the sticky bathroom floor"). I also tend to hold in my head a sort of "moving map" picture of the journey’s flow, which might not be terribly accurate, but could be rendered more so quite quickly by studying a real map.

Frances, despite appearances to the contrary, navigates largely using food. Yesterday we had a typical example: "do you remember that lovely town square where we had breakfast in front of the town hall and we had to ask them whether they had real eggs because the powdered eggs were disagreeing with me? I think it was on the Washington trip." This was a challenge. "Breakfast" was probably right, so that narrowed things down a bit. "The Washington trip" was probably correct, but I have learned to treat such information with an element of caution.

At this point we had therefore to marry up two different reference systems, and try and work out where they overlapped. My first pass was to run the moving map of the Washington trip in my head, and call out the towns where we stayed. That eliminated a couple of stops, where we could both remember the breakfast arrangements (the very good restaurant at the Peaks of Otter Lodge, and a nice diner in Gatlinburg), but we were still missing an obvious match.

Then Frances said "I think we had to drive out of town for a bit because we’d had to change our route". Bingo! This now triggered the "traumatic event" register in my mind, specifically listening to a charming young lady in Nashville singing a song about the journey of a bottle of Jack Daniels, and suddenly realising I had put the wrong bloody Lynchburg on our route! Over dinner I had to do a quick replan and include Lynchburg Tennessee as well as Lynchburg Virginia in our itinerary. That meant an early start from Nashville next morning, heading south rather than directly east, and half-way to Lynchburg (the one with the Jack Daniels distillery) we stopped for breakfast because the offering at the hotel had looked very grim. Got there in the end.

(If you’re wondering, I do actually have a photographic record of this event. The young lady above is the one who sang the song with the critical routeing information.)

We’ve also had "that restaurant where we were the only white faces and the manager kept asking if we were OK" (Memphis, near Graceland), and "that little store where they did the pulled pork sandwiches and the woman’s daughter lived in Birmingham" (Vesuvius, Virginia). In fairness to my wife, she can also accurately recall details of most of our retail transactions on each trip, including the unsuccessful ones. ("That town where we bought my Kokopelli material, and the old lady had to run across the street although there was no traffic"). Again there’s the challenge of marrying these up with my frame of reference, but the poor old lady in Cortez, Colorado, desperately trying to beat the countdown timer on the pedestrian crossing, despite a traffic level of about 1 vehicle a minute, sticks in my mind as well, so that one was easy. Admittedly, I remember Cortez as "that nice town just outside Mesa Verde", but that’s me.

What’s your frame of reference?

Posted in Thoughts on the World, Travel

How Strong Is Your Programming Language?

Line-up at the 2013 Europe's Strongest Man competition
Camera: Canon EOS 7D | Date: 29-06-2013 05:31 | Resolution: 5184 x 3456 | ISO: 200 | Exp. bias: -1/3 EV | Exp. Time: 1/160s | Aperture: 13.0 | Focal Length: 70.0mm (~113.4mm)

I write this with slight trepidation as I don’t want to provoke a "religious" discussion. I would appreciate comments focused on the engineering issues I have highlighted.

I’m in the middle of learning some new programming tools and languages, and my observations are coalescing around a metric which I haven’t seen assessed elsewhere. I’m going to call this "strength", as in "steel is strong", defined as the extent to which a programming language and its standard tooling avoid wasted effort and prevent errors. Essentially, "how hard is it to break?". This is not about the "power" or "reach" of a language, or its performance, although typically these correlate quite well with "strength". Neither does it include other considerations such as portability, tool cost or ease of deployment, which might be important in a specific choice. This is about the extent to which avoidable mistakes are actively avoided, thereby promoting developer productivity and low error rates.

I freely acknowledge that most languages have their place, and that it is perfectly possible to write good, solid code with a "weaker" language, as measured by this metric. It’s just harder than it has to be, especially if you are free to choose a stronger one.

I have identified the following factors which contribute to the strength of a language:

1. Explicit variable and type declaration

Together with case sensitivity issues, this is the primary cause of "silly" errors. If I start with a variable called FieldStrength and then accidentally refer to FeildStrength, and this can get through the editing and compile processes and throw a runtime error because I’m trying to use an undefined value, then the programming "language" doesn’t deserve the label. In a strong language, this will be immediately questioned at edit time, because each variable must be explicitly defined, with a meaningful and clear type. Named types are better than those assigned by, for example, using multiple different types of brackets in the declaration.
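
To make that concrete, here is a minimal sketch of the edit-time protection I have in mind, using VB.Net (my usual point of reference, of which more below); the module and variable names are purely illustrative:

    Option Explicit On    ' the default in VB.Net, but worth stating

    Module ExplicitDeclarationDemo
        Sub Main()
            Dim FieldStrength As Double = 1.5
            ' The next line, if uncommented, is rejected by the editor and
            ' compiler because FeildStrength is not declared anywhere
            ' Console.WriteLine(FeildStrength)
            Console.WriteLine(FieldStrength)
        End Sub
    End Module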

2. Strong typing and early binding

Each variable’s type should be used by the editor to only allow code which invokes valid operations. To maximise the value of this the language and tooling should promote strong, "early bound" types in favour of weaker generic types: VehicleData not object or var. Generic objects and late binding have their place, in specific cases where code must handle incoming values whose type is not known until runtime, but the editor and language standards should then promote the practice of converting these to a strong type at the earliest practical opportunity.

Alongside this, the majority of type conversions should be explicit in code. Those which are always "safe" (e.g. from an integer to a floating point value, or from a strong type to a generic object) may be implicit, but all others should be spelt out in code with the ability to trap errors if they occur.
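
Sticking with VB.Net for the sketch (Option Strict On gives roughly the behaviour described above): a safe widening conversion can stay implicit, a narrowing one has to be spelt out, and a conversion which might fail can be trapped. The names are again just for illustration:

    Option Strict On

    Module ConversionDemo
        Sub Main()
            Dim count As Integer = 42
            Dim ratio As Double = count          ' safe widening conversion: implicit is fine
            ' Dim back As Integer = ratio        ' narrowing: rejected at compile time under Option Strict
            Dim back As Integer = CInt(ratio)    ' narrowing spelt out explicitly in the code
            Try
                Dim parsed As Integer = CInt("not a number")    ' explicit conversion which can fail...
                Console.WriteLine(parsed)
            Catch ex As InvalidCastException
                Console.WriteLine("Conversion failed: " & ex.Message)   ' ...and be trapped
            End Try
            Console.WriteLine(back)
        End Sub
    End Module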

3. Intelligent case insensitivity

As noted above, this is a primary cause of "silly" errors. The worst case is a language which allows unintentional case errors at edit time and through deployment, and then throws runtime errors when things don’t match. Such a language isn’t worth the name. Best case is a language where the developer can choose meaningful capitalisation for clarity when defining methods and data structures, and the tools automatically correct any minor case issues as the developer references them, but if the items are accessed via a mechanism which cannot be corrected (e.g. via a text string passed from external sources), that access is case insensitive. In this best case the editor and compiler will reject any two definitions with overlapping scope which differ only in case, and require a stronger differentiation.

Somewhere between these extremes a language may be case sensitive but require explicit variable and method declaration and flag any mismatches at edit time. That’s weaker, as it becomes possible to have overlapping identifiers and accidentally invoke the wrong one, but it’s better than nothing.
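
For what it’s worth, the "best case" described above is roughly how VB.Net behaves, so a quick illustrative sketch:

    Module CaseDemo
        Sub Main()
            Dim FieldStrength As Double = 1.5
            ' Same variable despite the different case; a decent editor will
            ' normalise this back to "FieldStrength" as soon as it is typed
            fieldstrength += 1
            ' A second declaration in the same scope differing only in case
            ' is rejected outright:
            ' Dim fieldstrength As Double
            Console.WriteLine(FieldStrength)   ' prints 2.5
        End Sub
    End Module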

4. Lack of "cruft", and elimination of "ambiguous cruft"

By "cruft", I mean all those language elements which are not strictly necessary for a human reader or an intelligent compiler/interpreter to unambiguously understand the code’s intent, but which the language’s syntax requires. They increase the programmer’s work, and each extra element introduces another opportunity for errors. Semicolons at the ends of statements, brackets everywhere and multiply repeated type names are good (or should that be bad?) examples. If I forget the semicolon but the statement fits on one line and otherwise makes syntactic sense then the code should work without it, or the tooling should insert it automatically.

However, the worse issue is what I have termed "ambiguous cruft", where it’s relatively easy to make an error in this stuff which takes time to track down and correct. My personal bête noire is the chain of multiple closing curly brackets at the end of a complex C-like code block or JSON file, where it’s very easy to mis-count and end up with the wrong nesting. Contrast this with the explicit End XXX statements of VB.Net or name-matched closing tags of XML. Another example is where an identifier may or may not be followed by a pair of empty parentheses, but the two cases have different meanings: another error waiting to occur.
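
To show the kind of contrast I mean, here is an illustrative nested block in VB.Net (the scores are made up): every closing statement names the construct it ends, so mis-nesting is caught immediately, in a way that a chain of anonymous closing braces is not:

    Module BlockEndDemo
        Sub Main()
            For Each score As Integer In {45, 72, 88}
                If score >= 70 Then
                    Console.WriteLine(score & ": pass")
                Else
                    Console.WriteLine(score & ": fail")
                End If      ' explicitly closes the If...
            Next            ' ...and this closes the For Each, so the compiler
        End Sub             ' catches any mismatch immediately
    End Module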

5. Automated dependency checking

Not a lot to say about this one. The compile/deploy stage should not allow through any code without all its dependencies being identified and appropriately handled. It just beggars belief that in 2017 we still have substantial volumes of work in environments which don’t guarantee this.

6. Edit and continue debugging

Single-stepping code is still one of the most powerful ways to check that it actually does what you intend, or to track down more complex errors. What is annoying is when this process indicates the error, but it requires a lengthy stop/edit/recompile/retest cycle to fix a minor problem, or when even a small exception causes the entire debug session to terminate. Best practice, although rare, is "edit and continue" support which allows code to be changed during a debug session. Worst case is where there’s no effective single-step debug support.

 

Some Assessments

Having defined the metric, here’s an attempt to assess some languages I know using it.

It will come as no surprise to those who know me that I give VB.Net a rating of Very Strong. It scores almost 100% on all the factors above, in particular being one of very few languages to express the outlined best practice approach to case sensitivity. Although fans of more "symbolic" languages derived from C may not like the way things are spelled out in words, the number of "tokens" required to achieve things is very low, with minimal "cruft". For example, creating a variable as a new instance of a specific type takes exactly 5 tokens in VB.Net, including explicit scope control if required and with the type name (often the longest token) used once. The same takes at least 6 tokens plus a semicolon in Java or C#, with the type name repeated at least once. As noted above, elements like code block ends are clear and specific, removing a common cause of silly errors.
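
To illustrate the token counting, using the VehicleData type mentioned earlier purely as an example (it isn’t a real library class):

    Public Class VehicleData
        Public Property Registration As String
    End Class

    Module DeclarationDemo
        Sub Main()
            ' Five tokens, with the type name appearing exactly once:
            Dim Vehicle As New VehicleData
            Vehicle.Registration = "AB12 CDE"
            ' The conventional C# equivalent, for comparison, repeats the type
            ' name and needs the trailing semicolon:
            '     VehicleData vehicle = new VehicleData();
            Console.WriteLine(Vehicle.Registration)
        End Sub
    End Module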

Is VB.Net perfect? No. For example if I had a free hand I would be tempted to make the declaration of variables for collections or similar automatically create a new instance of the appropriate type rather than requiring explicit initialisation, as this is a common source of errors (albeit well flagged by the editor and easily fixed). It allows some implicit type conversions which can cause problems, albeit rarely. However it’s pretty "bomb proof". I acknowledge there may be some cause and effect interplay going on here: it’s my language of choice because I’m sensitive to these issues, but I’m sensitive to these issues because the language I know best does them well and I miss that when working in other contexts.

It’s worth noting that these strengths relate to the language and are not restricted to expensive tools from "Big bad Microsoft". For example the same statements can be made for the excellent VB-based B4X Suite from tiny Israeli software house Anywhere Software, which uses Java as a runtime, executes on almost any platform, and includes remarkable edit and continue features for software which is being developed on PC but running on a mobile device.

I would rate Java and C# slightly lower as Pretty Strong. As fully compiled, strongly typed languages many potential error sources are caught at compile time if not earlier. However, the case-sensitivity and the reliance on additional, arguably redundant "punctuation" are both common sources of errors, as noted above. Tool support is also maybe a notch down: for example while the VB.Net editor can automatically correct minor errors such as the case of an identifier or missing parentheses, the C# editor either can’t do this, or it’s turned off and well hidden. On a positive note, both languages enforce slightly more rigor on type conversions. Score 4.5 out of 6?

Strongly-typed interpreted languages such as Python get a Moderate rating. The big issue is that the combination of implicit variable declaration and case sensitivity allows through far too many "silly" errors which cause runtime failures. "Cruft" is minimal, but the reliance on punctuation variations to distinguish the declaration and use of different collection types can be tricky. The use of indentation levels to distinguish code blocks is clear and reasonably unambiguous, but can be vulnerable to editors invisibly changing whitespace (e.g. converting tabs to spaces). On a positive note the better editors make good use of the strong typing to help the developer navigate and use the class structure. I also like the strong separation of concerns in the Django/Jinja development model, which echoes that of ASP.Net or Java Server Faces. I haven’t yet found an environment which offers edit and continue debugging, or graceful handling of runtime exceptions, but my investigations continue. Score 2.5 out of 6?

Weakly-typed scripting languages such as JavaScript or PHP are Weak, and in my experience highly error prone, offering almost none of the protections of a strong language as outlined above. While I am fully aware that, like King Canute, I am powerless to stop the incoming tide of these languages, I would like to hope that maybe a few of those who promote their use might read this article, and take a minute to consider the possible benefits of a stronger choice.

 

Final Thoughts

There’s a lot of fashion in development, but like massive platforms and enormous flares, not all fashions are sensible ones… We need a return to treating development as an engineering discipline, and part of that may be choosing languages and tools which actively help us to avoid mistakes. I hope this concept of a "strength" metric might help promote such thinking.

Posted in Agile & Architecture, Code & Development

3D Photos from Myanmar

Small temple at the Shwedagon Pagoda, Yangon
Camera: Panasonic DMC-GX8 | Date: 10-02-2017 08:22 | Resolution: 5240 x 3275 | ISO: 200 | Exp. bias: 0 EV | Exp. Time: 1/80s | Aperture: 14.0 | Focal Length: 21.0mm | Location: Shwedagon Pagoda | State/Province: Wingaba, Yangon | See map | Lens: LUMIX G VARIO 12-35/F2.8

I’ve just finished processing my 3D shots from Myanmar. If you have a 3D TV or VR goggles, download a couple of the files from the following link and have a look.

https://www.andrewj.com/public/3D/

Posted in Myanmar Travel Blog, Photography, Travel

Why I (Still) Do Programming

It’s an oddity that although I sell most of my time as a senior software architect, and can also afford to purchase software I need, I still spend a lot of time programming, writing code. Twenty-five years ago people a little older than I was then frequently told me “I stopped writing code a long time ago, you will probably be the same”, but it’s just turned out to be completely untrue. It’s not even that I only do it for a hobby or personal projects, I work some hands-on development into the majority of my professional engagements. Why?

At the risk of mis-quoting the Bible, the answer is legion, for they are many…

To get the functionality I want

I have always been a believer in getting computers to automate repetitive actions, something they are supremely good at. At the same time I have a very low patience threshold for undertaking repetitive tasks myself. If I can find an existing software solution great, but if not I will seriously consider writing one, or at the very least the “scaffolding” to integrate available tools into a smooth process. What often happens is I find a partial solution first, but as I get tired of working around its limitations I get to the point where I say “to hell with this, I’ll write my own”. This is more commonly a justification for personal projects, but there have been cases where I have filled gaps in client projects on this basis.

Related to this, if I need to quickly get a result in a complex calculation or piece of data processing, I’m happy to jump into a suitable macro language (or just VB) to get it, even for a single execution. Computers are faster than people, as long as it doesn’t take too long to set the process up.

To explore complex problems

While I am a great believer in the value of analysis and modelling, I acknowledge that words and diagrams have their limits in the case of the most complicated problem domains, and may be fundamentally difficult to formulate and communicate for complex and chaotic problem domains (using all these terms in their formal sense, and as they are used in the Cynefin framework, see here).

Even a low-functionality prototype may do more to elicit an understanding of a complex requirement than a lot of words and pictures: that’s one reason why agile methods have become so popular. The challenge is to strike a balance, and make sure that an analytical understanding does genuinely emerge, rather than just being buried in the code and my head. That’s why I am always keen to generate genuine models and documentation off the back of any such prototype.

The other case in which I may jump into code is if the dynamic behaviour of a system or process is difficult to model, and a simulation may be a valid way of exploring it. This may just be the implementation of a mathematical model, for example a Monte Carlo simulation, but I have also found myself building dynamic visual models of complex interactions.
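
As a trivial, made-up illustration of the sort of throwaway simulation I mean (in VB, naturally): estimating the chance that two dice total ten or more by brute force, rather than working it out analytically:

    Module MonteCarloSketch
        Sub Main()
            Dim rng As New Random()
            Dim trials As Integer = 1000000
            Dim hits As Integer = 0
            For i As Integer = 1 To trials
                ' Roll two six-sided dice and count the "successes"
                Dim total As Integer = rng.Next(1, 7) + rng.Next(1, 7)
                If total >= 10 Then hits += 1
            Next
            ' The analytic answer is 6/36, about 0.1667, so the estimate should land close to that
            Console.WriteLine("Estimated P(total >= 10) = " & (hits / trials).ToString("F4"))
        End Sub
    End Module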

To prove my ideas

Part of the value I bring to professional engagements is experience or knowledge of a range of architectural solutions, and the willingness to invoke unusual approaches if I think they are a good fit to a challenge. However it’s not unusual to find that other architects or developers are resistant to less traditional approaches, or those outside their comfort zones. Models and PowerPoint can go only so far in such situations, and a working proof of concept can be a very persuasive tool. Conversely, if I find that it isn’t as easy or as effective as I’d hoped, then “prove” takes on its older meaning of “test” and I may be the one being persuaded. I’m a scientist, so that’s fine too.

To prove or assess a technology

Related to the last, I have found by hard-won experience that vendors consistently overstate the capabilities of their solutions, and a quick proof of concept can be very powerful in confirming or refuting a proposed solution, establishing its limitations or narrowing down options.

A variant on this is where I need to measure myself, or others, for example to calibrate what might or might not be adequate productivity in a given situation.

To prove I can

While I am sceptical of overstated claims, I am equally suspicious if I think something should be achievable, and someone else says "that’s not possible". Many projects both professional and personal have started from the assertion that "X is impossible", and my disbelief in that. I get a great kick from bending technology to my will. To quote Deep Purple’s famously filthy song, Knocking At Your Back Door, itself an exploration into the limits of possibility (with censorship), "It’s not the kill, it’s the thrill of the chase".

In the modern world of agile development processes, architect and analyst roles are becoming blurred with that of "developer". I have always straddled that boundary, and proving my development abilities may help my credibility with development teams, allowing me to engage at a lower level of detail when necessary. My ability to program makes me a better architect, at the same time as architecture knowledge makes me a better programmer.

To make money?

Maybe. If a development activity can help to sell my skills, or advance a client’s project, then it’s just part of my professional service offering, and on the same commercial basis as the rest. That’s great, especially if I can charge a rate commensurate with the bundle of skills, not just coding. My output may be part of the overall product or solution or an enduring utility, but more often any development I do is merely the means to an end which is a design, proof of concept, or measurement.

On the other hand, quite a lot of what I do makes little or no money. The stuff I build for my own purposes costs me little, but has a substantial opportunity cost if I could use the time another way, and I will usually buy a commercial solution if one exists. The total income from all my app and plugin development over the years has been a few hundred pounds, probably less than I’ve paid out for related tools and components. This is a “hobby with benefits”, not an income stream.

Because I enjoy it

This is perhaps the nub of the case: programming is something I enjoy doing. It’s a creative act, and puts my mind into a state I enjoy, solving problems, mastering technologies and creating an artefact of value from (usually) a blank sheet. It’s good mental exercise, and like any skill, if you want to retain it you have to keep in practice. The challenge is to do it in the right cases and at the right times, and remember that sometimes I really should be doing something else!

Posted in Agile & Architecture, Code & Development, Thoughts on the World