One trend has stuck with IT from the days of mainframes, through the golden years of ITIL and the swift adoption of agile, and now onwards into the much-anticipated age of AI. This trend, of course, is the need for speed!

I have seen countless technologies, practices and frameworks come and go, but what unites them all is the constant requirement for improvement and change. Year on year and month on month, the speed at which all this change takes place keeps increasing. But growing amongst this garden of great technological advancement and capability is something incredibly threatening; something that, for every layer of complexity we create for ourselves, grows and multiplies at faster and faster rates. I am of course talking about cyber security, or to be more precise, awareness of information security.

Awareness is key because it highlights our often organisation-wide lack of education around the risks. It also suggests that there is a gap between our understanding of the opportunities of new technologies versus the risks that come with them; risks which often evolve faster than many of us can keep up with.

The arrival of GDPR may lay this on thick for a great number of businesses, at least from a policy perspective. But what I hope most is that GDPR brings about a new level of thinking around data and information. I want companies and IT teams to understand intimately where their data is, how it moves from place to place and supplier to supplier, and how policy and compliance can be used as an enabler.

For me, this is so significant because my work involves complex multi-vendor IT environments, all of which require large numbers of robust, secure integrations and controlled information flows across huge ecosystems of tools, services and suppliers. Many of these fall under SIAM frameworks and have data and security written into the project plans, but in reality the challenge lies in making all the right people aware of what that means, what's involved in getting it right and, more importantly, the consequences of getting it wrong.

In our need for speed and new-found ease in quickly integrating more and more cloud-based services, we just as quickly forget to review, analyse and manage what sensitive or confidential data and information might be passing between these systems, and what sort of accountability should apply to the ways we work to make that security effective.

Let me take the good old VPN as an example. A VPN is very useful for connecting your employees to services on your internal network, and for connecting separate sites that you control over the public internet. But for sending and receiving integration messages over HTTP between fixed endpoints, a VPN offers no additional security over a reverse proxy solution.

Instead, it makes things much more complicated. VPNs are complicated to set up, there are interoperability issues between the hardware and software implementing them, and VPN tunnel problems can cause downtime for your integrations. As integrations predominantly use HTTP, it is also much simpler to understand and limit the allowed traffic and ports with a reverse proxy than with a VPN, which usually allows all protocols.
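To make that concrete, here is a minimal sketch of the kind of allow-listing a reverse proxy gives you for HTTP-based integrations: only the methods, paths and source addresses the integration actually needs get through, and everything else is rejected. The route names and IP addresses below are hypothetical examples, not a definitive configuration.

```python
# Allow-list of (method, path) pairs this integration actually uses.
# These example routes are illustrative, not from any real system.
ALLOWED_ROUTES = {
    ("POST", "/api/incidents"),   # e.g. inbound ticket updates from a supplier
    ("GET", "/api/health"),       # e.g. a monitoring probe
}

# Fixed endpoints mean you can also pin the permitted source addresses
# (example values from the documentation range 203.0.113.0/24).
ALLOWED_SOURCES = {"203.0.113.10", "203.0.113.11"}

def filter_request(method: str, path: str, source_ip: str) -> bool:
    """Return True only for traffic on the explicit allow-list."""
    return (method, path) in ALLOWED_ROUTES and source_ip in ALLOWED_SOURCES

# Known integration traffic passes; anything else is dropped:
filter_request("POST", "/api/incidents", "203.0.113.10")  # True
filter_request("POST", "/admin", "203.0.113.10")          # False
```

Compare that with a VPN tunnel, where once the tunnel is up, any protocol on any port can typically flow through it unless you add further firewalling on top.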

It’s quite likely that IT leaders reading this will share my concerns, especially those looking after complex, highly integrated multi-vendor ecosystems. My main message to you is this: gain accurate visibility and auditability of your data, and ensure you can control process flows end-to-end. Security is an essential part of that journey, so here are a few ideas on how to start moving forward a little faster with it.

Supplier documentation

For large integration projects, ensure supplier documentation is well reviewed and suitable for your approach to managing internal and customer data. Where data is stored, how it’s backed up and so on are all useful things to know. But truly modern IT services need to know much more, such as how quickly and easily those suppliers can show and explain all the data of yours that they hold, and how they manage passing data from one system to another. All of this should be documented and included in contracts. It also applies to any third parties providing the integrations for you.

Get ready for GDPR

Some businesses are really on top of GDPR, but others have barely heard of it. If you’re the former, begin having more detailed discussions with suppliers and make sure they are still suitable partners moving ahead. Apply pressure where needed and ensure their methods of integration are as compliant as their storage and security. In addition, ensure that everyone knows and understands their role under GDPR. For instance, Service-Flow’s role as an integration service is that of a data processor.

Organisational awareness

It’s not just IT that has to be an adequate custodian of data anymore. Every business user has contact with important data now, and if things go wrong it can’t simply be passed on as a problem for IT. Organisations big and small should be developing programmes to educate all parts of the business about these topics, teaching the skills and techniques needed to become self-sufficient and responsible in managing data.

Real-time visibility

As mentioned, I believe an important goal for any modern IT service is to make the service more and more invisible to the customer. This is achieved by first creating a greater level of real-time visibility of service data within IT. The more you can see, the more you can manage what customers then see. So, get on top of what data IT owns and how it’s passed around the organisation, suppliers, cloud and third-party services. Then make this information highly available to those who’ll benefit from it most.
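One practical way to start is a simple data-flow inventory: record every flow between systems and suppliers, note what data travels over it, and flag the flows carrying personal data, since those are the ones GDPR makes you account for. The systems and flows below are hypothetical examples of what such an inventory might look like, not a prescribed tool.

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    source: str                   # system sending the data
    destination: str              # system or supplier receiving it
    data_types: tuple             # what kinds of data travel over this flow
    contains_personal_data: bool  # does GDPR apply to this flow?

# Example inventory for a small multi-vendor ecosystem (illustrative names):
FLOWS = [
    DataFlow("ITSM tool", "Supplier A service desk", ("incident records",), True),
    DataFlow("Monitoring platform", "ITSM tool", ("alerts",), False),
]

def flows_needing_review(flows):
    """The flows carrying personal data - review, document and contract these first."""
    return [f for f in flows if f.contains_personal_data]

print([f.destination for f in flows_needing_review(FLOWS)])
# → ['Supplier A service desk']
```

Even a lightweight inventory like this gives you something auditable to put in front of suppliers and, crucially, a single place to check when a flow changes.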

If you would like to learn more about solving some of these issues in your organisation, please feel free to get in touch with me and my team at