NFC contactless is the next evolution

“Smartcard to wipe out cash”, a headline which sounds like it may have been used any time in the last few years, is actually taken from the Evening Standard in 1993 just before the launch of Mondex, one of the earliest smart card cashless payment systems.

Launched in Swindon, UK, Mondex promised to revolutionise payments using what is today known as a ‘closed loop’ system, where money is transferred to a smart card containing a chip and the card is then used to pay for items via a special reader until the virtual cash is used up.

It sounded great and launched to much fanfare, but four years later it quietly disappeared, never to be heard of again. Its lack of success is generally put down to the hassle of loading the cash, the limited locations at which it could be used and the infrastructure required to support it. Soon after this, chip & pin started to emerge, offering an ‘open loop’ solution whereby the cash is debited directly from your bank account, and within a few years this became the norm.

Skip forward twenty years and it feels like we are seeing history repeat itself.

For the last five years or so the talk of cashless payment at events has fuelled many a debate, but implementation and adoption, in the UK at least, have been very slow and fraught with issues. The basic idea has been the same as Mondex all those years ago – a closed loop system with the chip (now wirelessly contactable) typically embedded in a wristband rather than a card.

Many of the same challenges still exist today – the hassle of adding credit to the wristband, the dedicated infrastructure required, limited areas of acceptance and redeeming unused balances. Then there is the user aspect: many of the benefits are for the organiser and promoter rather than the attendee. This is coupled with attendees’ concerns about too much information on their purchases and payments being made available to the event.

The aforementioned issues with closed loop systems have allowed the next generation of open loop contactless systems to gain adoption at a much faster rate. Open loop contactless using an existing debit or credit card is a natural progression from chip & pin and removes many of the hurdles seen with closed loop. It is quite telling that one of the world’s largest closed loop systems – Transport for London’s Oyster card – is now moving to an open loop approach.

What is interesting is that some countries have seen higher adoption of closed loop – the US for example. The US was much later to the chip & pin party and has been behind the curve on contactless, and this may have left a window for closed loop in the short term.

The question is where this leaves events, which have several drivers to move to a cashless environment. With the rapid adoption of open loop contactless in day-to-day life, coupled with disruptors like Apple Pay, Android Pay and PayPal Here – all of which use an open loop approach with NFC (Near Field Communication) embedded in smart devices – the modern generation of event goers will move to these trusted services and closed loop will quietly die away.

What remains is the challenge at events in terms of how to deal with smart reader based payments as the infrastructure cost can be a hurdle to adoption. There are several components to this:

Universal Payment Terminals – The banking world needs to move faster in providing good quality payment terminals that are certified across multiple methods of communication (wired, Wi-Fi & mobile data) and multiple payment methods (chip & pin, contactless & NFC). Today different terminals have to be used depending on the connection method and payment type, which means merchants have to hire terminals for use at events because they cannot use their normal terminal. A universal terminal would also make deployment on event sites much easier and more cost effective.

Merchant IDs – Many smaller traders at events do not have the magic ‘Merchant ID’ required to set up card-based payment terminals. Merchant IDs are controlled by payment houses and can be costly and complex for very small businesses, so a better mechanism is needed to facilitate access to open loop systems for those traders. This sounds like an easy area, but it has some complexities due to money laundering rules. Systems such as iZettle help with this but generally carry higher fees.

Access to Data – A difference between closed loop and open loop for a promoter or organiser is the ability to easily access usage data. As closed loop is in the control of the organiser, they get full visibility (although this can be seen as a negative by attendees). With open loop the data is held by the payment providers, so some form of agreed consolidated reporting across the entire event (involving many Merchant IDs) would be needed to address organisers’ concerns about visibility.

Providing Infrastructure – Open loop systems tend to have a slightly higher requirement for readers to be connected to a network (although many closed loop systems are not as offline as promoted). A modern event has such high connectivity requirements in other areas that adding payment systems is not the concern it once was. It is now well accepted that providing access to contactless card-based payments drives higher spend, so it should be recognised that increased spend on infrastructure will reap returns overall.

In the last few years we have seen a rapid swing to providing a resilient payment environment across events and the feedback is very positive – fast and easy transactions, and an increased spend by attendees. It just needs more support from the banking world to resolve the last few issues and make the cashless (or near cashless) event a reality.

Notes:

Swindon Advertiser – How Smart was that?


We may be heading into UK winter but Etherlive are already busy planning for the summer of 2016. We are always on the lookout for new recruits, with opportunities for students and seasoned professionals for the peak summer season. Our environment does not suit everyone – this is IT with a difference: tight deadlines, indoors and outdoors across the UK and Europe, with events ranging from business briefings to large music festivals. These are roles for people who thrive on a challenge, can demonstrate problem solving and have excellent communication skills.

Our teams deploy technology services which support some of the largest and greatest events throughout the UK and Europe. Etherlive is the power behind the scenes which helps the production crews, security, artists and broadcasters bring their events to the millions who attend.

For some roles excellent technical knowledge is required, especially around wireless technologies and core IT such as TCP/IP, networking and server management, but for others it is customer skills and project management that are key. Knowledge of the events industry, particularly outdoor events and festivals, is an advantage but not essential.

We will be running a series of wide ranging skills and training workshops throughout the winter and spring so whatever your skillset or experience we are keen to hear from energetic and enterprising candidates with a passion for working in the events industry.

Those looking to apply should be ready to spend periods working away from home, sometimes several weeks. In the first instance please send your CV and a covering letter to introduce yourself to jobs@etherlive.co.uk.

WOMAD offers free public Wi-Fi to all attendees

The topic of public or attendee Wi-Fi at events creates more churn and discussion than just about any other aspect in the technology arena. Organiser questions come thick and fast – Should we provide it? How should we charge for it? Will it work? Why does it cost so much? How many people will use it? The list goes on.

The approach to production, exhibitor and trader Wi-Fi is clear cut, but for the public, opinion on the approach, the need and the value flips on a regular basis. This is not entirely surprising given the confusing and often incorrect messaging which swirls around the industry, and the fact that the topic is more complex than it first looks.

If you are running an event in a location with little or no mobile coverage, then the desire to provide connectivity for attendees is well placed: there is an expectation in today’s world of ubiquitous connectivity, and attendees will quickly rally round to complain if they are disconnected from the rest of the world.

Mobile 3G & 4G coverage at events is improving, but outside of a select few sites the reality is that the mobile networks are not designed to service the volume of users at large events, which leads to sporadic or non-existent performance. Even if there is good mobile coverage, the drive to provide a public Wi-Fi network may come down to different factors, not least that a dedicated network is in the control of the organiser, providing opportunities to gather statistics, target advertising, monitor usage and offer interactive services.

How do I pay for it?

Monetising the provision is, however, a difficult area, as directly charging for Wi-Fi access is not a good approach and sees very limited take-up. Users are offended by the idea that, after paying to attend an event, they are asked to pay extra for internet access, which in their view is a utility and a basic right, especially when in most scenarios Wi-Fi access is ‘free’. An organiser may accept that any provision is simply an overhead cost, the value being in the good feedback and enhanced social media presence such an offering provides, but in most cases there is an expectation of some direct value or cost recovery.

The key point is not to focus on the Wi-Fi connection itself but to look beyond at what the connection delivers – that may be additional paid-for content, sponsorship and advertising, attendee interaction, geo-fencing and location services, add-on experiences sold through the network, payment systems or other value-add elements which may be more readily accepted as a paid-for offering.

What capacity do I need?

One of the hardest things about public Wi-Fi at events is predicting the usage and capacity required. There are multiple vectors to this, but historical data and experience provide a good starting point. The key aspect is the likely number of concurrent users, as this drives the high water mark for system capacity.

The first vector is the type of event: a music festival, for example, will typically see a lower concurrent usage percentage than a more business-focused event such as an exhibition. This is driven by the immediacy of modern business working versus the more local experience of a festival, coupled with the need at a festival to conserve battery life, so Wi-Fi is turned off unless actually required. Interestingly, though, over the course of a multi-day festival a higher percentage of attendees will use the Wi-Fi at some point than at a business-focused exhibition. In our experience we would not expect concurrent usage at a festival to be more than 10-20%, whereas an exhibition may be closer to 30-40%.

The second vector is the duration of the event: crudely, the shorter the event, the higher the percentage of concurrent users. This dynamic is partly down to the battery life concern at multi-day events, in contrast to the ‘in the moment’ social media nature of a short event, which is likely to have a single focal point and may see concurrent usage rise above 50%.

The last vector is the hardest to predict – the marketing and messaging from the event itself. A smartphone app, Twitter walls, content, streaming, promotions and campaigns can all drive up usage significantly and need to be understood as part of the planning cycle. Public Wi-Fi providing a low-key email and internet access service is very different to the launch of a new 150MB smartphone app with rich content that everyone needs to download in the first hour of an event!
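
To make the vectors above concrete, the sketch below combines them into a rough capacity estimate. The attendee counts, concurrency fractions and per-user bandwidth figure are illustrative assumptions drawn from the ranges discussed, not fixed rules.

```python
# Back-of-the-envelope estimate of peak concurrent Wi-Fi users and
# aggregate backhaul demand. All figures are planning assumptions
# based on the concurrency ranges discussed above, not guarantees.

def estimate_peak_load(attendees, concurrency, kbps_per_user=250):
    """Return (peak concurrent users, required backhaul in Mbps).

    concurrency   -- fraction of attendees online at once (e.g. 0.15)
    kbps_per_user -- assumed average throughput per active user
    """
    concurrent_users = round(attendees * concurrency)
    backhaul_mbps = concurrent_users * kbps_per_user / 1000
    return concurrent_users, backhaul_mbps

# A 20,000-person festival at 15% concurrency versus a smaller
# 5,000-person exhibition at 35% concurrency:
print(estimate_peak_load(20000, 0.15))  # (3000, 750.0)
print(estimate_peak_load(5000, 0.35))   # (1750, 437.5)
```

Note how the smaller exhibition still generates more than half the festival’s demand – concurrency, not headline attendance, drives the high water mark.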

Will it work? What will it cost?

This brings us to the technical aspect and the associated cost. The big factors are the coverage area, the user density and the internet backhaul required. High density Wi-Fi is a very different beast from normal Wi-Fi – it involves much more complex design, with sector-based antennas, high-end Wi-Fi access points, very careful spectrum (radio) management and various networking approaches to ensure the system does not saturate and grind to a halt. In front of a crowded stage with 10,000 people it takes a lot of Wi-Fi magic to deliver an acceptable service.

Coverage area adds an additional, non-linear cost increase, especially in a green-field environment, simply down to the practicalities of deployment and connecting the entire network together. A typical device such as a smartphone will only work reasonably if it is within about 100m of a Wi-Fi access point, so if you are trying to cover 200 acres that is a lot of access points, all of which need to be connected together and have a source of power.
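
As a rough illustration of why coverage area drives cost, the sketch below counts the access points needed to blanket a site. The ~100m radius and the overlap factor are hypothetical planning assumptions; real designs depend heavily on terrain, crowd density and antenna choice.

```python
import math

ACRE_M2 = 4046.86  # square metres per acre

def estimate_ap_count(acres, ap_radius_m=100, overlap_factor=1.5):
    """Rough count of Wi-Fi access points to cover a green-field site.

    Treats each AP as covering a circle of ap_radius_m, then pads by
    overlap_factor because circular cells cannot tile an area without
    overlapping (and obstacles shrink real-world range).
    """
    site_m2 = acres * ACRE_M2
    cell_m2 = math.pi * ap_radius_m ** 2
    return math.ceil(site_m2 / cell_m2 * overlap_factor)

# The 200-acre example above works out at around forty APs, each
# needing a network connection and a power source:
print(estimate_ap_count(200))  # 39
```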

Behind all of this there has to be suitable internet connectivity (backhaul); many deployments are let down by not having enough backhaul or by having the wrong type. Some methods of internet connectivity are simply not suited to a public Wi-Fi deployment where there may be thousands of users all chatting away simultaneously.

This may all seem a little overwhelming, but it shouldn’t be: a well-planned and thought-through deployment can be very successful. It does, however, need to be a larger discussion than just the practicalities of making it work, including those who lead areas such as marketing and sponsorship. The demands on connectivity at events will only continue to increase, and the best way to service that need is a clear approach to public Wi-Fi which forms part of the overall event strategy rather than a costly bolt-on.

No Wi-Fi Here

Another week, another big event, another Twitter stream full of complaints about Wi-Fi. Rightly or wrongly, Wi-Fi is touted above food, toilets, queuing, decoration and just about everything else as being critical to an event. It has been the same for several years now with seemingly little progress; how can that be the case?

The first response is typically to blame the technology and there are certainly plenty of cases where poor designs and implementations are part of the problem. Building an effective, reliable and performing wired and wireless network is complex but not impossible. These days the main issues tend to lie elsewhere.

The first issue is cost. Delivering a true high capacity, high density network requires significant investment, with a large chunk of the cost down to the internet bandwidth required. The price of low quality consumer bandwidth like ADSL and FTTC may be at an all-time low, but high capacity business quality fibre circuits are still very expensive, especially for short-term use. The usage patterns of attendees have also changed over the last few years, with current demand as much about upload as download, which, coupled with richer content, continues to drive demand for more bandwidth.

You can provide the best Wi-Fi on the planet, but if it is not backed up by appropriate internet bandwidth then users will have a poor experience. There is no magic here: if you want 10,000 users to have a good experience you need multiple high capacity business-grade links, yet most organisers see the cost of this bandwidth as top of the list for cutting, well above other items which, ironically, users complain far less about.

The second problem is particularly significant in the exhibition and conference arena – rogue Wi-Fi. The Achilles heel of Wi-Fi is its unlicensed nature, which on one hand has allowed Wi-Fi to become pervasive across the globe rapidly, but on the other hand is slowly killing it. Wi-Fi currently operates in two relatively narrow frequency bands – 2.4GHz and 5GHz. These two bands are divided into a number of channels which are shared by all Wi-Fi (and some other) devices. The problem is there are not enough channels available, especially at 2.4GHz, so in a high density environment managing the channels which are available is critical to success. That in itself is hard enough, but now add in all the exhibitors who have brought in their own Wi-Fi access points, then all the Mi-Fi devices, and to top it off all the Bluetooth noise (which also operates at 2.4GHz), and you end up with a large conference hall full of thousands of devices all shouting at each other to the point where no one can be heard because it is just a mass of interference.

The idea that all of these devices can share the wireless spectrum effectively is simply not true in a dense environment. To make matters worse it is a vicious circle – the more often an attendee or exhibitor has a bad experience, the more likely they are to bring their own device next time, further adding to the problem. Even worse, every new Mi-Fi device has a little more power, and those with their own Wi-Fi think more power and more access points will make things better, raising the interference and noise further.

Those who work in this area have known for some time that 2.4GHz as a client access frequency at an event was a lost cause and the only hope was to move people to 5GHz as laptops, tablets and smartphones increasingly supported it. The extra channel capacity at 5GHz, no Bluetooth interference and fewer 5GHz Mi-Fi devices made for ‘cleaner’ air; unfortunately that is rapidly changing and soon 5GHz will be as crowded as 2.4GHz.

There are only a couple of solutions to this problem, the first is long term and probably unlikely. Wi-Fi needs more spectrum and there are various discussions and proposals for increasing the spectrum available but it also needs to be managed – separating consumer type devices away from lightly licensed professional frequencies so that each has its own space. This will not happen quickly and would take many years to trickle down through devices but it could be the long term nirvana to truly offer a reliable Wi-Fi service.

The second solution is not really technical at all; it just requires event organisers to listen to, and take seriously, what event IT companies have been saying for years – the Wi-Fi spectrum at events must be managed. In the broadcast arena spectrum management has been taken seriously for years and it works very well. If we want event Wi-Fi to work then the same approach must be used. That means taking a hard line when an exhibitor wants to use their own device – it has to be pre-approved with specific parameters or rejected, and the agreement has to be enforced. No more rogue Wi-Fi – it ruins the experience for everyone.

This is easy to say. It requires trust that an official provider is going to deliver a good service, and I appreciate it is hard to enforce, requiring support from all levels, but it can be done (we have examples) and the difference it makes is considerable – everyone gets a working service. It doesn’t fix everything, but unless something is done across the industry to support this approach then paying money out for Wi-Fi is pointless and frustrates users more than if there was no Wi-Fi at all.

EE have launched their Wi-Fi Calling service and Vodafone are expected to follow shortly, along with other operators. With the prevalence of other VoIP-based calling such as Skype, WhatsApp, Viber, etc. you could be excused for wondering what all the fuss is about.

There are two big things about Wi-Fi Calling. The first is that it uses your normal mobile number, so even if you don’t have a mobile signal you can still receive and make calls on that number.

The second aspect of true Wi-Fi Calling is that it is seamless – you don’t have to launch an app and make a conscious decision to switch; it is handled directly by the phone. Here, though, lies an issue: only newer generation phones support this today, although it is expected that all future phones will adopt it. Seamless is also not truly seamless yet, in that active calls cannot currently roam from the mobile operator network to Wi-Fi or vice versa, but this is expected to be introduced in the future.

The other cheeky point to note is that operators are still likely to charge (or deduct from bundled minutes) for a call made over Wi-Fi even though they are not providing the network.

For event organisers Wi-Fi Calling sounds like a great development, as requests to improve mobile coverage and capacity are at the top of the list of things we are asked to fix most frequently, yet generally we are fairly powerless to address them, as the current system has been a closed environment controlled by the mobile operators.

At a high level this is a great development for event organisers, especially for production staff, who can easily be offered an alternative to the mobile network, but it throws up some challenges which need to be considered very carefully if it is to be used beyond production staff. Any event providing a Wi-Fi network for its attendees is now potentially going to see extra demand on that network – not so much in terms of capacity, as voice traffic is fairly small, but in terms of quality of service.

Voice traffic is not tolerant of congested networks. Previously, an attendee downloading some email might see the network as a bit slow, but it still worked; with voice it is a different story, with stuttering audio rendering the call unworkable and frustrating the user far more than slow email.

Event organisers will need to make conscious decisions about the use of Wi-Fi Calling and ensuring any network is capable of delivering it at a quality that is acceptable to users. This may mean high density design and increased internet capacity – both of which can push up costs.

For smaller events this is not likely to be that much of a problem but as you scale up to large outdoor events with thousands of people the challenge is a lot more significant. Wi-Fi Calling has the potential to help solve one of the big frustrations at festivals, arenas and sports events but without a good public Wi-Fi network it could make the frustration worse.

The interesting question is: if Wi-Fi Calling is adopted by users and becomes the norm in public Wi-Fi hotspots, will attendees increasingly expect it at events? And if so, who pays?

Etherlive and The Event Intelligence Group Announce Partnership to Deliver Suite of Services to Events

Etherlive and The Event Intelligence Group announced today a broad ranging partnership to deliver an exciting and compelling suite of services including integrated communications, HD CCTV, crowd analytics, location awareness and social media tracking to assist event organisers in managing large and complex events.

The partnership includes the sharing of resources, equipment and knowledge to deliver the best experience and value to event organisers. This unique approach of the two companies, combining resources and skills, will allow event organisers who require a single technology partner to get the best of both worlds with the best companies for the job, operating under unified project management.

With increasing complexity and pressure on aspects such as wireless spectrum management, as well as a need for a single view on all aspects such as access control, people flow and incident management, a centrally managed approach is essential.

Both companies have been developing next generation services in areas such as real-time crowd density analysis and event intelligence gathered from CCTV, mobile devices and social media usage and this partnership will further speed their development and adoption at events.

Chris Green, Managing Director at Etherlive, comments “Having worked with The Event Intelligence Group over the last year it was obvious that our visions on the value of technology at events were very similar and this partnership brings together a great skill base to deliver the best and most forward looking technology services.”

Chris Burke, Partner at The Event Intelligence Group, says “We’ve been working very hard at creating the next generation of surveillance and intelligence services for events, we are excited to create this relationship with a company that shares our passion for excellence and innovation.”

Existing customer relationships remain unchanged with the partnership being transparent to customers. As appropriate both company names will appear on quotes where both companies are involved. In the immediate term the partnership ensures that next generation Full HD CCTV and integrated communications can be delivered to a wider range of events during 2014.

 -Ends-

 For further information, please contact Triggerfish Communications 020 7233 9700

About Etherlive

  • Etherlive is one of the UK’s leading providers of IT and communications technology for the live event sector.
  • Etherlive is a specialist in large-scale indoor & outdoor events with multi-faceted communications needs, as well as being a major supplier to the corporate, conference, exhibition and general events market.
  • Etherlive partners to provide services for events such as G8 Summit, WOMAD festival, Green Man Festival, Goodwood Festival of Speed, Southampton Boat Show, London World Triathlon, Nokia, Amazon and Sony.
  • Website: www.etherlive.co.uk
  • Blog: www.etherlive.co.uk/blog
  • Twitter: www.twitter.com/etherlive
  • Head Office: Etherlive Limited, Brinkworth House, Chippenham, Wiltshire, SN15 4HQ
  • Tel: 01666 511862

 About The Event Intelligence Group

  • The Event Intelligence Group is North America’s premier event surveillance and intelligence services provider for the live events industry.
  • The Event Intelligence Group provides actionable intelligence for large-scale events. By partnering with event producers, security contractors and local authorities, EIG deploys event specific surveillance solutions using state-of-the-art technology.
  • The Event Intelligence Group provides services to events such as: Coachella, Chicago Marathon, Electric Daisy Carnival, V Festival Chelmsford, SXSW and Lollapalooza.
  • Website: intelevent.net
  • Head Office UK: The Event Intelligence Group, Unit B, Hadrian’s Business Park, Haltwhistle, Northumberland, NE49 0EX
  • Tel: 0191 300 6200

It has been said that there are two certainties in life – death and taxes. If you said the same about technology it would read ‘confusing acronyms and over-hyped performance claims’. Although 802.11ac, the latest in a long line of Wi-Fi standards, has been in development for several years, it was finally approved in January, and more client devices which support it are now appearing, including the Samsung Galaxy S4, newer iPads and the rumoured iPhone 6.


The ever growing list of Wi-Fi standards

With claimed speeds of 433Mbps up to 6.77Gbps, multi-user MIMO and beamforming, it would seem we should all be rushing to implement this technology as soon as we can to solve our Wi-Fi woes. For the home user a shiny new 11ac Wi-Fi router and compatible tablet may indeed offer some benefits, but if you look at the limiting performance factor in most households it is the broadband connection itself, not the Wi-Fi, which throttles everything to a crawl.

For those of us deploying large-scale, high density Wi-Fi, particularly at events and stadiums, the potential impact of 11ac is far more important and, if not considered carefully, could easily reduce performance rather than improve it. There are many enhancements and extensions within 11ac and, as before with 11n, it will take time for all the features to be implemented and used effectively.

One of the big changes is with MIMO, or Multiple Input Multiple Output, streams. MIMO is like moving from a single carriageway road to a dual carriageway or motorway – the data travels between your device and the Wi-Fi access point using multiple paths. Most business quality Wi-Fi access points have supported MIMO since 802.11n, but many handheld devices have only just started to implement it. It can provide better overall speed and improve coverage, especially where there are lots of obstacles. 11ac allows for up to 8 streams, whereas 11n is limited to 3; in reality, however, most devices will not implement more than 3, and most handheld devices will be limited to 1 or 2 because of the cost, complexity and extra power drain of adding more streams.

Those extra streams are not necessarily lost, though, as 11ac will eventually offer multi-user MIMO, where different streams can be directed to different clients, providing a much needed boost in situations such as events where the pinch point is the number of connected devices rather than absolute speed. Unfortunately the first wave of 11ac does not support multi-user MIMO, so we will have to wait another year or two for that.

Beamforming is another aspect addressed by 11ac – a technology which aims to optimise performance based on the direction of signals and provide higher interference rejection. Beamforming is already supported in 11n and, when combined with adaptive antenna arrays, is very powerful in ‘noisy’ environments like event sites; however, many wireless vendors do not implement it, so 11ac aims to standardise beamforming across clients and vendors, which over time will provide performance improvements.

So far it all sounds good, so what is the problem? To answer that we need to look at why we have a problem today. Wi-Fi is a shared medium: a Wi-Fi ‘access point’ has to simultaneously talk to a number of client devices and split the available capacity between all the devices it is talking to. For example, an 11n wireless access point (without MIMO) can at best deliver 150Mbps of capacity; if there are 100 users connected to it then each user would see a maximum speed of 1.5Mbps. This is the absolute best case – real-world performance would be far, far lower.
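
The shared-medium arithmetic is simple enough to sketch. The figures below just reproduce the 150Mbps / 100-user example; real-world throughput would be far lower once protocol overhead, retransmissions and interference are factored in.

```python
# Best-case per-user throughput on a shared Wi-Fi access point:
# the AP's usable capacity is divided among all connected clients.

def per_user_mbps(ap_capacity_mbps, num_users):
    return ap_capacity_mbps / num_users

# A 150Mbps 11n access point with 100 connected users:
print(per_user_mbps(150, 100))  # 1.5 Mbps each, best case
```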

To add more capacity, more wireless access points are used, but they all need their own ‘space’ to operate in, otherwise they would just interfere with each other like a room full of people shouting. To allow this, a number of standard ‘channels’ are defined and each Wi-Fi access point is assigned a channel. The most common form of Wi-Fi today runs at 2.4GHz, which has 14 channels, but these channels overlap and not all of them are usable in all countries – in fact there are really only 3 usable channels when it comes to designs for large-scale deployments. On top of this, 2.4GHz Wi-Fi has to contend with Bluetooth devices, baby monitors, microwave ovens and a whole host of other things which use the same frequency range!

At home, where there are likely to be only a few devices connecting to the Wi-Fi network, these issues are not generally a great concern, but on an event site where hundreds, or now more typically thousands, of users have to be connected simultaneously, the combination of the lack of capacity and interference creates a huge problem.

All is not lost though as there is a second Wi-Fi frequency range at 5GHz which offers 23 non-overlapping channels (although that is before you factor in indoor, outdoor, DFS and country restrictions) and much lower interference. Today most of the wireless backbone infrastructure on event sites uses 5GHz – this includes normal data transmission, CCTV cameras and other wireless devices such as video senders. There are enough channels to do this successfully provided it is all managed carefully.

Until recently most client devices did not support 5GHz, but now many do, meaning that client access can also be provided at 5GHz, avoiding the problems of 2.4GHz. The downside is that 5GHz is no longer the quiet frequency it used to be, with many domestic Wi-Fi routers and permanent wireless links now using it, all of which increases interference and limits the free channels available. 11ac, however, could make things far worse.

Whereas 802.11n was a standard for both 2.4GHz and 5GHz, 802.11ac is a 5GHz-only standard, which means we will see an acceleration in the adoption of 5GHz in all devices. This in itself is not a bad thing, but it will change the dynamics of Wi-Fi deployments, with more and more focus on 5GHz client access leaving less room for 5GHz backhaul. The likely result is that backhaul will have to move to licensed frequencies or higher unlicensed frequencies such as 24, 60 or 80GHz, but these bring their own cost and implementation considerations.

The second problem is that 11ac focuses on delivering more speed, and one of the ways it does this is by using a wider channel in which to send data. This is implemented by, in effect, ‘bonding’ channels, which reduces the number of independent channels available. 11n can already bond two channels, but 11ac can bond four, which could reduce the available channels by 75%, leading to interference issues.
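The effect of bonding on the channel pool can be sketched as below. The 23-channel starting figure is the one quoted earlier, before DFS and country restrictions are applied, so the real numbers on a given site will be lower.

```python
# Rough sketch: how channel bonding shrinks the pool of independent
# 5 GHz channels. Assumes the 23 non-overlapping 20 MHz channels quoted
# in the text, ignoring DFS and country restrictions.
BASE_CHANNELS = 23

def independent_channels(bonded):
    """Independent channels left when `bonded` 20 MHz channels are combined."""
    return BASE_CHANNELS // bonded

for label, bonded in [("20 MHz", 1), ("40 MHz (11n)", 2), ("80 MHz (11ac)", 4)]:
    print(f"{label}: {independent_channels(bonded)} independent channels")
# → 23, 11 and 5 channels respectively
```

Going from 20MHz to 80MHz channels drops the pool from 23 to 5 independent channels, which is the roughly 75% reduction described above.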

All of these factors are configurable and manageable, and the design for a large event site will be considerably different to, say, an office environment. For everything to work in harmony, though, there will need to be an even greater focus on ‘spectrum management’: ensuring that all parties using wireless equipment do so in a controlled and agreed manner. Without this structure and control, the user experience will deteriorate rather than improve. 11ac can bring benefits, albeit without the headline speed claims, but poor design carries greater risks.
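In practice, spectrum management on a site boils down to an agreed plan of who uses which channels. A minimal sketch, with entirely hypothetical channel assignments, might look like this:

```python
# Hypothetical spectrum plan: each party on site gets an agreed set of
# 5 GHz channels. The channel numbers here are illustrative only.
spectrum_plan = {
    "client access": [36, 40, 44, 48],
    "backhaul links": [52, 56, 60, 64],
    "CCTV / video senders": [100, 104],
}

# The basic sanity check any such plan must pass: no channel used twice.
all_channels = [ch for chans in spectrum_plan.values() for ch in chans]
assert len(all_channels) == len(set(all_channels)), "channel clash!"
print("plan is clash-free")
```

However simple, a written, checked plan like this is the difference between controlled coexistence and a site full of equipment shouting over itself.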

We will be starting to deploy some 11ac access points in a controlled manner over the coming months, working closely with vendors to optimise designs for the challenging needs of event sites.

The last couple of years have seen a major shift to ‘cloud computing’, driven by a combination of the expectation of connectivity everywhere and the explosion of users wishing to share content easily between several devices. On the whole, the use of cloud services such as Dropbox, Box, Google Drive and SkyDrive, and cloud-hosted business applications such as Salesforce, Office365, Adobe and even Sage, to name a few, can be great for companies, provided the well-documented concerns around security, data protection and availability are addressed appropriately.

There is, however, a grey lining to the cloud when it comes to events: the impact cloud services have on the event infrastructure. The internal infrastructure, in terms of Wi-Fi and wired connections, is not the concern; any well-designed network should be able to handle that. It is the internet connectivity aspect. The additional load that cloud services create is having a significant impact on the capacity required, and many events and customers are not factoring this into plans and budgets.

Cloud services change the usage characteristics of internet connections, creating much more demand on upload capacity compared to traditional browsing. This means ‘symmetric’ connections need to be used, where the upload and download speeds are closely matched, unlike ADSL where the download is much faster than the upload. The overall capacity of the internet connection needs to be higher too, because everything is going back and forth to the internet. For example, a user sharing a 10MB site plan with ten others will generate 10MB of upload traffic and 100MB of download traffic, because the file synchronises to each user separately. This may not seem a lot, but scale it up to a whole site across all users (with multiple devices) and all services, and the numbers get very large very quickly. Scrimping on connectivity for a corporate training event where all 200 users are set to use a cloud service is a recipe for disaster, as the user experience will undoubtedly be poor.
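The site-plan example scales up as follows. This is a back-of-envelope sketch using the figures from the text: one push up to the cloud, then one separate download per recipient.

```python
# Back-of-envelope sketch of cloud sync traffic: one user uploads a file
# once, each recipient then pulls their own copy from the cloud.
def sync_traffic_mb(file_mb, recipients):
    upload = file_mb                  # single push up to the cloud service
    download = file_mb * recipients   # each recipient syncs separately
    return upload, download

up, down = sync_traffic_mb(10, 10)
print(up, down)  # → 10 100 (the 10MB site plan shared with ten people)

# Scale to the training-event example: 200 users each sharing one such file
total_down = sum(sync_traffic_mb(10, 10)[1] for _ in range(200))
print(total_down)  # → 20000 MB, i.e. roughly 20 GB of download sync traffic
```

Even this simplistic model, which ignores multiple devices per user and background re-syncs, lands at around 20GB of traffic for a single shared file per user.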

[Image: storm clouds] Cloud-based services can cause a storm with event internet requirements

Adding more capacity is generally the easiest route, but the jump in cost can be higher than customers expect because of the need to move to symmetric, low-contention services such as EFM (Ethernet First Mile) and fibre optic circuits. In some cases wired connectivity is limited, so additional wireless or satellite capacity is required, and these have their own requirements which need to be factored into planning early on. Newer services like BT Infinity do help in some cases, but it should be remembered that although the headline numbers look good, these are consumer-focused services which come with high contention ratios (meaning a high risk that performance will be much lower than stated at busy times) and no guarantee on the service.
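The contention-ratio point is worth making concrete. The figures below are illustrative, not taken from any specific product: a contended service shares its headline bandwidth between all the subscribers on the link, so the worst case at busy times is the headline speed divided by the contention ratio.

```python
# Illustrative sketch: what a contention ratio means for a consumer
# service's headline speed. Figures are assumed, not from any real product.
def worst_case_mbps(headline_mbps, contention_ratio):
    """Per-subscriber throughput if everyone on a contended link is active."""
    return headline_mbps / contention_ratio

print(worst_case_mbps(80, 20))  # → 4.0 Mbps on an 80 Mbps, 20:1 service
print(worst_case_mbps(80, 1))   # → 80.0 Mbps on an uncontended circuit
```

The real-world figure usually sits somewhere between the two, but an event full of simultaneous users is exactly the ‘busy time’ where the worst case bites.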

Cloud-based service providers are slowly realising there is a need to distribute load, and some are starting to offer local caches of data, which will help over time, but we are some way off this being easily available for most services. In the meantime, understanding which services are being used and undertaking capacity planning is essential so that the correct level of capacity can be put in place.

Overall cloud services can be very cost effective but as the saying goes there is no such thing as a free lunch!