On-Time Performance and Punctuality League

Incompatible Data

Mark from OAG directed my attention this week to OAG’s Punctuality League, which they offer for free download and also compile into a “dashboard”, though I find that exceptionally unintuitive and more confusing than helpful. FlightStats offers similar information in tables and graphs I find far more intuitive: the On-Time Performance Awards.

Even after a quick first look, it already shows that the two sources are incompatible.

Let’s just look at the first OAG graph, “Top 20 Airlines by LCCs/Mainline Airlines”:

  1. Hawaiian Airlines (89.87%)
  2. Copa Airlines (88.75%)
  3. KLM (87.89%)

and compare that to FlightStats, where Hawaiian shows neither in the Top 10 International Airlines nor in the Major Airlines (neither Mainline nor Network), but only ranks 1st among Regional Airlines. KLM is 1st on international network flights and 4th on mainline flights.

When I first encountered the FlightStats monthly statistics for airlines and airports, I contacted them (with no reply) to ask if I may add them as an indicator to our airport data, as I consider that valuable information for aviation network planners.

But as I immediately stumble over such differences, questions arise. It might be a good idea if OAG and FlightStats talked to each other to make sure they use the same data and logic before they dig into detail. Or at least explained how they weigh and interpret the data. As is, there are unexplained differences. Sorry, but now I distrust both sources…

Indicator. Indicator?

It can only be an indicator, as both sources fail to relate one to the other. My first question would be to correlate the on-time performance of airports to that of their hub airlines, because it is utterly unfair to blame an airport if its major hub airline is notoriously late.
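To illustrate what such a correlation could look like (all airports and figures below are invented, not taken from OAG or FlightStats), here is a minimal Python sketch relating each hub airport’s on-time performance (OTP) to the OTP of its dominant hub airline:

```python
# Hypothetical illustration: relate an airport's on-time performance (OTP)
# to the OTP of its dominant hub airline. All figures are invented.
airports = {
    # airport: (airport OTP %, hub airline OTP %)
    "HUB-A": (78.0, 76.5),
    "HUB-B": (85.2, 84.0),
    "HUB-C": (70.1, 69.8),
    "HUB-D": (90.3, 91.0),
}

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

otp_airport = [a for a, _ in airports.values()]
otp_airline = [h for _, h in airports.values()]
print(f"correlation airport OTP vs. hub airline OTP: {pearson(otp_airport, otp_airline):.2f}")
```

A correlation close to 1 would support the suspicion that an airport’s ranking largely mirrors its hub airline’s punctuality, and that blaming one for the other is unfair.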

One should also keep the size of an airport and its congestion in mind, e.g. British Airways suffering from congestion in London-Heathrow or Thai Airways in Bangkok. Who is the cause? Who is the victim?

Yes, for CheckIn.com we emphasize that all that data can only be indicators, to be interpreted by an experienced network planner. A single new flight makes a major impact on a new or small airport, but has little statistical relevance at a major hub. That said, isochrones are in themselves valuable statistical data and we put them into our analyses for a reason: they are a necessity, in combination with the catchment area analysis, to interpret the possible impact of a route. In forecasting, you work with indicators; you have no facts.

Big Data – Big Trouble

At the same time you work with big data, and the more data you work with, the more vital it is to get it from a sound source and have it integrated into a common system. Most established data providers, be it OAG, FlightStats, SITA, etc., have not yet addressed that, for a “good reason”. But as an industry, it is vital that we address it, and integration is very high on our backlog at CheckIn.com of where we want to go!

For the time being, national statistics differ from Eurostat, differ from aviation industry statistics, differ from common sources. The differences in the data you get from FlightStats and OAG are just one example that this is also an issue in aviation. Who’s right? I even have examples where the numbers differ within an airport’s own website for a given year. In order to improve, we have got to tear down the walls! And yes, that’s part of what I will talk about at the coming Passenger Terminal Conference & Expo in March. Will you be there? Let’s meet!

Rotational Impact

So. Why do I give these on-time-performance, no, these delay statistics so much thought? Aside from the cost of delays summing up to millions, they are not just a nuisance, but a problem. When I did the additional case study on cost savings for SAE G12 and WinterOps.ca, based on Zurich Airport’s de-icing, I learned an important fact from Swiss (the airline): while the passengers on the immediately impacted flight understand the problem and accept force majeure, the aircraft does not operate a single flight, but an entire rotation (a chain of flights) during the day or week. Any major delay has a rippling effect in the network. And if you have a snow-caused delay in the morning in Zurich, your passengers on the evening flight back from their Mediterranean summer vacation will not understand, and will file for compensation. And the airline usually pays!
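The ripple effect can be sketched in a few lines of Python. The rotation, block times and the 35-minute minimum turnaround below are invented for illustration, not Swiss or Zurich data:

```python
MIN_TURNAROUND = 35  # minimum minutes on the ground between two flights (invented)

# One aircraft's rotation: (scheduled departure in minutes after midnight,
# block time in minutes). Invented example schedule with small buffers.
rotation = [(360, 90), (495, 90), (625, 120), (790, 120)]

def propagate(rotation, initial_delay):
    """Departure delay of each leg, given a delay on the first leg."""
    delays, ready = [], None
    for sched_dep, block in rotation:
        if ready is None:                 # first leg: hit by the delay directly
            actual_dep = sched_dep + initial_delay
        else:                             # later legs: must wait for the aircraft
            actual_dep = max(sched_dep, ready)
        delays.append(actual_dep - sched_dep)
        ready = actual_dep + block + MIN_TURNAROUND
    return delays

print(propagate(rotation, 60))  # a 60-minute de-icing delay in the morning
```

With this tight schedule the 60-minute morning delay still arrives as a 35-minute delay on the last leg of the day: the evening passengers suffer for the morning snow.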

And for network planning, it is vital to know whether you have to build (expensive) buffers into your schedule to cover for potential delays. That means your aircraft and especially your crews are not airborne as much as they could be, causing further loss of revenue. There is a very good reason airlines increasingly add clauses to their handling contracts with airports, punishing for creating delays and rewarding for reducing them. Said to be an expert in winter-ops planning myself: technical or natural (weather) delays are bad enough. But yes, delays are also caused by aviation management, be it the handling agent, airline operations or air traffic control.
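A back-of-the-envelope sketch of what such buffers cost. All figures here (block-hour value, legs per day, operating days) are invented round numbers, purely to show the order of magnitude:

```python
BLOCK_HOUR_COST = 4000   # EUR of revenue flying an aircraft+crew hour is worth (invented)
LEGS_PER_DAY = 6         # invented short-haul rotation
DAYS_PER_YEAR = 360      # operating days (invented)

def annual_buffer_cost(buffer_minutes_per_leg):
    """Yearly value of flying time bound in schedule buffers, per aircraft."""
    hours = buffer_minutes_per_leg * LEGS_PER_DAY * DAYS_PER_YEAR / 60
    return hours * BLOCK_HOUR_COST

print(f"{annual_buffer_cost(10):,.0f} EUR per year and aircraft")
```

Under these assumptions, ten minutes of buffer per leg binds roughly 1.4 million euros of potential flying time per aircraft and year, which is why such delay clauses are worth real money.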

A Summary…

So what now? I think the availability of delay statistics is compelling, useful and needed. But take them with care, as you should take all statistics: try to understand how they are computed and the logic behind them, and ask your provider accordingly. Yes, that includes our own. That’s why we publish the CheckIn.com methodology. Only if you understand it can you interpret it yourself. And trust it.

We have got to understand the value of data and common data structures in our industry. A delay is a delay? Nonsense. As I mentioned three years ago in the article about A-CDM.

And I distrust any “closed source” company that does not provide me with the methodology behind its analyses. Like many airports do. On the other side, at CheckIn.com, the value is not really the methodology (which is sound), it’s the work behind it: the compilation of data from different sources and the constant improvements we make to it. Only given sound data can we provide quality analyses; given quality data, anyone can come up with more or less professional analyses. Even the calculations we do to compute an airport’s impact on a traveler’s likeliness to choose one airport or the other can be replicated. No, we don’t explain in detail how we do it, but we do explain the general concept. The hard work we spend every day merging data from different sources and compensating for mistakes and other shortcomings is what makes our work so hard to copy. And it is a main part of our USP (Unique Selling Proposition), what makes us “unique”.

Food For Thought
Comments welcome!


Not Invented Here

Image courtesy Oxford Creativity

This week, I happened to stumble across this Wikipedia article: https://en.wikipedia.org/wiki/Not_invented_here

Then I stumbled across this image.

It reminded me very much of my experience with A-CDM, where most larger airports’ IT departments reject external solutions in order to build custom-made ones. After several years of work, we have several tires (or tiers?) of different sizes, incompatible to build upon.

It’s the same argument I hear from many airports and airlines when talking to them about CheckIn.com.

It will take time (and interest) until they understand that it’s not just another “same”, but something fundamentally new.

Linus Torvalds, inventor of the Linux operating system, said: “The NIH syndrome is a disease.”

Food for Thought
Comments welcome


Cloud vs. Security. And the Internet of Things

The Travel Industry and the Cloud

Back in 2000, in my presentation at the ITB Travel Technology Congress, I addressed the changes e-commerce brought to our distribution. Aviation and travel have a very strong history in what we today experience as “new” and call “cloud computing”.

Aviation has been a pacemaker in pre-Internet e-commerce, ever since the invention of the first “computerized reservation systems” (CRS), based on American’s ground-breaking development of the “Semi-Automated Business Research Environment” (Sabre); read the Sabre history for more. Thanks to the global SITA communications network (yes, those guys I temporarily worked for last year after they acquired my employer), aviation has appreciated near-instantaneous communication ever since I started working in aviation back in the late 80s. What we call e-mail today, we called “queue messages” back then. To date, bookings, called “Passenger Name Records” (PNRs), are created and maintained “in the cloud”, where the “cloud-based server” is either one of the Global Distribution Systems (GDS) and/or the airline’s own CRS.

Airline IT managers celebrating this as the next big thing are simply selling you old wine in new barrels. In the mid-90s, just about 20 years ago, the last “dumb terminals” were taken out of service, replaced by PCs with more sophisticated interfaces, which have meanwhile largely been replaced by web clients running in standard browsers. The only difference is that those browsers often still use closed networks (such as SITA’s) for data transport instead of the Internet. Aside from the obviously more reliable and stable data speed, this leads directly to the next question:

Cloud Security

Where the GDS and CRS frequently work in a closed environment, reducing the danger of hacking and other insecurities, recent developments make those services available through Internet links. Being a commodity, this is much cheaper. But it also opens the communication to a number of security issues: it needs complex security layers to avoid hacking or other unintended communication disrupting those large host systems. And this is important to understand: “working in the cloud” is “clouding” (disguising) reality with fuzzy, hip wording. All it is, is communicating through “the cloud” (a word used to disguise “the Internet”) with servers that are not local but “elsewhere”.

Amadeus Datacenter Munich

The cloud servers of Apple, Amazon, Microsoft, Amadeus, Worldspan or Sabre. Where the “Sabre” computers have been sold to HP and Sabre uses “commercial services”, Amadeus still has its own data center and also publishes quite some diagrams and images I frequently refer to.

But one fact holds in all such cases: if you believe it’s your data, that is self-deception. You have to trust the company where you store your data to be trustworthy. Recently, there have been quite some concerns about governmental insight into data. As I mentioned back in 2008, it is questionable when a national government demands access to data without a guarantee that this confidential commercial information does not reach the company’s competitors in that country. The example was not Russian, but American. Who watches the watcher?

As I mentioned in my ITB presentation in 2004, there are possibilities to use alternative services from the Open Source world. With cloud computing, you’re no longer required to use commercial services: I recently shifted all my personal data, especially calendar and contacts, from Google into my OwnCloud. I trust the friend maintaining my own server. It’s in a huge computer center, but my friend secures it against “unfriendly” or unauthorized access. And I hope what I have is not interesting enough for the server-center operator to have someone physically access my server to steal data. A theoretical possibility. It’s a (semi-constant) assessment of who to trust.

I also mentioned in my 2013 blog about Big Data: “The first, Big-Data-experts came up with, have been personal profiles, coming from a variety of different sources. That Google and Facebook still offer me young Russian ladies for marriage is a good sign that they are way off even that goal.” It’s a simple question of big data. From the same post: “And as the amount of data grows faster than the processing power, the real problem is predictable.”

Open Data

As much as you want to keep your personal and commercial data private in some areas, there was a mantra in the 90s: “My data is my capital”. It was the time the Internet started to make data available to everyone, and whoever “owned” data could sell it expensively. To date, that is the value of the GDSs, OAG, Albatross, CH-Aviation and other such data-collecting companies. It is relatively easy to process aviation data, as most of it is very clearly standardized. But as much as the data processing adds some value, its life cycle is ending. More and more “common data” becomes openly available. Where that started with e.g. OpenStreetMap, meanwhile basic cadastre (land registry) data like street data, administrative boundaries, etc. are made openly available. Others still try to charge horrendous amounts, but they are becoming a minority and will become extinct soon. The value is no longer in “owning” the data, but in the meaningful analysis and use of it.

Having been a pacemaker in e-commerce, aviation today is light years behind other industries. U.S. tools showing aircraft in flight on maps, like Harris Corp.’s (Exelis) NextVue, do not have access to Canadian data, as NAV Canada wants to sell it. Expensively. Not exchange it (to also have access to U.S. data). It’s mine. So planes not traveling to/from U.S. airspace simply don’t show. And the NAV Canada data is very often “a problem” for web services providing such information in other markets. Dear NAV Canada, this is your wake-up call. The same goes for many other government-owned “businesses”. Open Data is here. If you don’t come along, you will find yourself bypassed before long.

I had the same experience in my past years working on Airport Collaborative Decision Making (A-CDM). Our industry has yet to learn that it is to the benefit of the entire business to share work data at reasonable cost. Base data is freely available today, but it’s fascinating how much of the base data we get from the “official sources” (like IATA, ICAO and the likes) is of lousy quality, requiring manual review and updates.

That’s aviation. Believe me, working with data from 33 European countries so far, basic data like population on municipality level, and associating that to commercial or openly available map data from the same country’s cadastre… it’s a challenge. In many countries the name of a city is not unique; a municipality may have three or four different names within the country. Not to mention that there are duplicate municipality names even within the same state. Open data is needed, but it would be something if each country could decide on a unique name for a given municipality, and if Eurostat and the national statistics offices could agree on a unique identifier. And make sure their data matches. Else, a lot of people in the world will have a full-time job repeating the stunt we did, and the data correcting others did. Again. And again. And again.
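A tiny Python sketch of why the unique identifier matters. The municipality names and the “code” column are invented; the point is the join behavior, not the data:

```python
# Joining statistics data to map data by name fails when names are
# duplicated; a shared unique identifier makes the join unambiguous.
population = [
    {"code": "XX-001", "name": "Neustadt", "population": 12000},
    {"code": "XX-002", "name": "Neustadt", "population": 53000},  # same name!
]
boundaries = [
    {"code": "XX-001", "name": "Neustadt", "area_km2": 40.0},
    {"code": "XX-002", "name": "Neustadt", "area_km2": 120.0},
]

# Join by name: ambiguous, every "Neustadt" matches every other one.
by_name = [(p, b) for p in population for b in boundaries
           if p["name"] == b["name"]]
print(len(by_name))  # 4 pairs instead of 2: the join exploded

# Join by unique code: exactly one match per municipality.
by_code = [(p, b) for p in population for b in boundaries
           if p["code"] == b["code"]]
print(len(by_code))  # 2 pairs, as it should be
```

Scale the two-row example up to tens of thousands of municipalities across 33 countries, and the manual disambiguation work described above becomes obvious.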

The Internet of Things

“Big Data is like teenage sex. Everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it…”

Over the last weeks, the messages on LinkedIn hyping the “Internet of Things” (IoT) have been “exploding”. At this point, it’s very much like “Big Data”, because just like with big data, the concerns mentioned above apply. As long as everyone does something different and there is no common understanding about how to connect the IoT, it’s a lot of smoke and distracting noise, but not much in real results. No matter if it’s global players announcing their understanding of IoT: as long as they don’t agree and establish open standards, IoT is a buzzword without much substance.

As an example from another industry, more common to us all: for many years I have been looking at “house IoT”. It would be so nice to have programmable thermostats and blinds. Unfortunately, all makers of “intelligent” thermostats have their own “standard”, making it impossible to mix them. So if you want to buy, you have to select a system. And you’re stuck with it… That’s like the times of VHS vs. Betamax or DVD±R, where you usually selected the wrong technology…

Babelfish

Just as “video tape” or “DVD” came, evolved a standard and became household normality, the IoT will need to develop common standards that allow common tools to exchange information with devices in a default way, instead of 150 different “interpreters” each trying to talk to all those devices in their own language…
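The “interpreter” problem can be sketched in code: today a common tool needs one adapter per vendor; a shared standard interface would make those adapters obsolete. The vendor APIs below are of course invented:

```python
# Two incompatible vendor thermostat APIs (invented for illustration).
class VendorAThermostat:
    def set_temp_fahrenheit(self, f):
        return f"vendor A set to {f:.1f} F"

class VendorBThermostat:
    def apply_setpoint(self, celsius):
        return f"vendor B set to {celsius:.1f} C"

# The common interface every "interpreter" (adapter) must expose.
class Thermostat:
    def set_celsius(self, c):
        raise NotImplementedError

class AdapterA(Thermostat):
    def __init__(self, dev): self.dev = dev
    def set_celsius(self, c):                       # translate units, too
        return self.dev.set_temp_fahrenheit(c * 9 / 5 + 32)

class AdapterB(Thermostat):
    def __init__(self, dev): self.dev = dev
    def set_celsius(self, c):
        return self.dev.apply_setpoint(c)

# The common tool no longer cares who made the device...
for t in (AdapterA(VendorAThermostat()), AdapterB(VendorBThermostat())):
    print(t.set_celsius(21))
```

…but somebody still has to write and maintain one adapter per vendor. A single open standard, agreed by the device makers, would remove that layer entirely.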

Food for Thought
Comments welcome!
