WOMAD offers free public Wi-Fi to all attendees

The topic of public or attendee Wi-Fi at events creates more churn and discussion than just about any other aspect in the technology arena. Organiser questions come thick and fast: should we provide it? How should we charge for it? Will it work? Why does it cost so much? How many people will use it? The list goes on.

The approach to production, exhibitor and trader Wi-Fi is clear cut, but for public Wi-Fi, opinions on the approach, the need and the value flip on a regular basis. This is not entirely surprising given the confusing and often incorrect messaging which swirls around the industry, and the fact that the topic is more complex than it first looks.

If you are running an event in a location with little or no mobile coverage, the desire to provide connectivity for attendees is well placed: there is an expectation in today's world of ubiquitous connectivity, and attendees will quickly rally round to complain if they are cut off from the rest of the world.

Mobile 3G and 4G coverage at events is improving, but outside a select few venues the reality is that mobile networks are not designed to serve the volume of users at large events, which leads to sporadic or non-existent performance. Even where there is good mobile coverage, the drive to provide a public Wi-Fi network may come down to other factors, not least that a dedicated network is under the organiser's control, providing opportunities to gather statistics, target advertising, monitor usage and offer interactive services.

How do I pay for it?

Monetising the provision is, however, difficult, as directly charging for Wi-Fi access is not a good approach and sees very limited take-up. Users are offended by the idea that, after paying to attend an event, they are asked to pay extra for internet access, which in their view is a utility and a basic right, especially when in most other scenarios Wi-Fi access is 'free'. An organiser may accept that the provision is simply an overhead, the value being in the good feedback and enhanced social media presence such an offering generates, but in most cases there is an expectation of some direct value or cost recovery.

The key point is not to focus on the Wi-Fi connection itself but to look beyond it at what the connection delivers. That may be additional paid-for content, sponsorship and advertising, attendee interaction, geo-fencing and location services, add-on experiences sold through the network, payment systems or other value-add elements, all of which may be more readily accepted as a paid-for offering.

What capacity do I need?

One of the hardest things about public Wi-Fi at events is predicting the usage and capacity required. There are multiple vectors to this, but historical data and experience provide a good starting point. The key figure is the likely number of concurrent users, as this sets the high-water mark for system capacity.

The first vector is the type of event: a music festival, for example, will typically see a lower concurrent usage percentage than a more business-focused event such as an exhibition. This is driven by the immediacy of modern business working versus the more local experience of a festival, coupled with the need at a festival to conserve battery life, so Wi-Fi is turned off unless actually required. Interestingly, though, over the course of a multi-day festival a higher percentage of attendees will use the Wi-Fi at some point than at a business-focused exhibition. In our experience we would not expect concurrent usage at a festival to exceed 10-20%, whereas an exhibition may be closer to 30-40%.

The second vector is the duration of the event: crudely, the shorter the event, the higher the percentage of concurrent users. This dynamic is partly down to the battery life concern at multi-day events, in contrast to the 'in the moment' social media nature of a short event, which is likely to have a single focal point and may see concurrent usage rise above 50%.
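As an illustration, the rules of thumb above can be wrapped in a small helper. The percentage bands are the figures quoted in this article (the short-event upper bound is our assumption), and the function name is ours:

```python
# Rough concurrent-user estimate from expected attendance.
# The percentage bands are this article's rules of thumb, not measured
# constants; the short_event upper bound of 60% is an assumed figure.
USAGE_BANDS = {
    "festival": (0.10, 0.20),     # multi-day music festival
    "exhibition": (0.30, 0.40),   # business-focused exhibition
    "short_event": (0.50, 0.60),  # single focal point, 'in the moment'
}

def concurrent_users(attendance, event_type):
    low, high = USAGE_BANDS[event_type]
    return int(attendance * low), int(attendance * high)

# e.g. a 20,000-capacity festival
low, high = concurrent_users(20000, "festival")
```

A sketch like this is only a starting point for the planning discussion; the third vector below can push the numbers well outside these bands.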

The last vector is the hardest to predict: the marketing and messaging from the event itself. A smartphone app, Twitter walls, content, streaming, promotions and campaigns can all drive up usage significantly and need to be understood as part of the planning cycle. Public Wi-Fi providing a low-key email and internet access service is very different from the launch of a new 150MB smartphone app with rich content that everyone needs to download in the first hour of an event!

Will it work? What will it cost?

This brings us to the technical aspect and the associated cost. The big factors are the coverage area, the user density and the internet backhaul required. High density Wi-Fi is a very different beast from normal Wi-Fi: it involves much more complex design, with sector-based antennas, high-end Wi-Fi access points, very careful spectrum (radio) management and various networking approaches to ensure the system does not saturate and grind to a halt. In front of a crowded stage with 10,000 people, it takes a lot of Wi-Fi magic to deliver an acceptable service.

Coverage area adds a further non-linear cost increase, especially in a green-field environment, simply down to the practicalities of deployment and connecting the entire network together. A typical device such as a smartphone will only work reasonably if it is within about 100m of a Wi-Fi access point, so if you are trying to cover 200 acres that is a lot of access points, all of which need to be connected together and have a source of power.
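As a back-of-envelope illustration (our own sketch, not a design tool), the ~100m usable range gives a lower bound on the access point count for a given area; a real deployment needs more to allow for overlap, terrain, obstructions and user density:

```python
import math

# Lower-bound access point count for area coverage (not capacity).
# Assumes the ~100 m usable client range quoted above and flat,
# unobstructed ground; real designs need considerably more APs.
def coverage_aps(acres, ap_range_m=100.0):
    area_m2 = acres * 4046.86             # 1 acre is about 4046.86 m^2
    cell_m2 = math.pi * ap_range_m ** 2   # idealised circular cell
    return math.ceil(area_m2 / cell_m2)

# e.g. the 200-acre site mentioned above
aps_needed = coverage_aps(200)
```

Even this idealised figure runs to dozens of access points, each needing power and a network connection, which is where the non-linear cost comes from.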

Behind all of this there has to be suitable internet connectivity (backhaul). Many deployments are let down by not having enough backhaul or by having the wrong type: some methods of internet connectivity are simply not suited to a public Wi-Fi deployment where thousands of users may all be chatting away simultaneously.

This may all seem a little overwhelming, but it shouldn't be: a well-planned and thought-through deployment can be very successful. It needs to be a larger discussion than just the practicalities of making it work, though, including those who lead areas such as marketing and sponsorship. The demands on connectivity at events will only continue to increase, and the best way to service that need is a clear approach to public Wi-Fi which forms part of the overall event strategy rather than a costly bolt-on.

No Wi-Fi Here

Another week, another big event, another Twitter stream full of complaints about Wi-Fi. Rightly or wrongly, Wi-Fi is touted above food, toilets, queuing, decoration and just about everything else as being critical to an event. It has been the same for several years now with seemingly little progress. How can that be the case?

The first response is typically to blame the technology, and there are certainly plenty of cases where poor designs and implementations are part of the problem. Building an effective, reliable and high-performing wired and wireless network is complex but not impossible. These days the main issues tend to lie elsewhere.

The first issue is cost. Delivering a true high capacity, high density network requires significant investment, with a large chunk of the cost down to the internet bandwidth required. The price of low quality consumer bandwidth like ADSL and FTTC may be at an all-time low, but high capacity business quality fibre circuits are still very expensive, especially for short term use. The usage patterns of attendees have also changed over the last few years, with current demand as much about upload as download, which, coupled with richer content, continues to drive demand for more bandwidth.

You can provide the best Wi-Fi on the planet but if it isn’t backed up by the appropriate internet bandwidth then users will have a poor experience. There is no magic here, if you want 10,000 users to have a good experience you need multiple high capacity business grade links, yet most organisers see the cost of this bandwidth as top of the list for cutting, well above other items which ironically users complain far less about.
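To make the bandwidth arithmetic concrete, here is a minimal sketch; the 500kbps-per-active-user planning allowance is our illustrative assumption, not an industry standard:

```python
# Minimal backhaul estimate: concurrent users x per-user allowance.
# The 500 kbps default is an assumed planning figure for illustration.
def backhaul_mbps(concurrent_users, kbps_per_user=500):
    return concurrent_users * kbps_per_user / 1000

# e.g. 2,000 concurrent users
needed = backhaul_mbps(2000)
```

Under these assumptions 2,000 concurrent users already need on the order of a gigabit of backhaul, which is why a single consumer-grade line cannot serve a large event.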

The second problem is particularly significant in the exhibition and conference arena: rogue Wi-Fi. The Achilles heel of Wi-Fi is its unlicensed nature, which on one hand has allowed Wi-Fi to become pervasive across the globe rapidly but on the other hand is slowly killing it. Wi-Fi currently operates in two relatively narrow frequency bands, 2.4GHz and 5GHz. These bands are divided into a number of channels which are shared by all Wi-Fi (and some other) devices. The problem is that there are not enough channels available, especially at 2.4GHz, so in a high density environment managing the channels which are available is critical to success. That in itself is hard enough, but now add in all the exhibitors who have brought in their own Wi-Fi access points, then all the Mi-Fi devices, and to top it off all the Bluetooth noise (which also operates at 2.4GHz), and you end up with a large conference hall full of thousands of devices all shouting at each other to the point that no one can be heard: it is just a mass of interference.

The idea that all of these devices can share the wireless spectrum effectively is simply not true in a dense environment. To make matters worse, it is a vicious circle: the more often an attendee or exhibitor has a bad experience, the more likely they are to bring their own device next time, further adding to the problem. Worse still, every new Mi-Fi device has a little more power, and those with their own Wi-Fi think more power and more access points will make things better, raising the interference and noise further.

Those who work in this area have known for some time that 2.4GHz as a client access frequency at an event is a lost cause and that the only hope is to move people to 5GHz as laptops, tablets and smartphones increasingly support it. The extra channel capacity at 5GHz, no Bluetooth interference and fewer 5GHz Mi-Fi devices made for 'cleaner' air. Unfortunately that is rapidly changing, and soon 5GHz will be as crowded as 2.4GHz.

There are only a couple of solutions to this problem, the first is long term and probably unlikely. Wi-Fi needs more spectrum and there are various discussions and proposals for increasing the spectrum available but it also needs to be managed – separating consumer type devices away from lightly licensed professional frequencies so that each has its own space. This will not happen quickly and would take many years to trickle down through devices but it could be the long term nirvana to truly offer a reliable Wi-Fi service.

The second solution is not really technical at all; it just requires event organisers to listen to and take seriously what event IT companies have been saying for years: the Wi-Fi spectrum at events must be managed. In the broadcast arena spectrum management has been taken seriously for years and it works very well. If we want event Wi-Fi to work then the same approach must be used. That means taking a hard line when an exhibitor wants to use their own device: it has to be pre-approved with specific parameters or rejected, and the agreement has to be enforced. No more rogue Wi-Fi; it ruins the experience for everyone.

This is easy to say. It requires trust that an official provider is going to deliver a good service, and I appreciate it is hard to enforce, requiring support at all levels, but it can be done (we have examples) and the difference it makes is considerable: everyone gets a working service. It does not fix everything, but unless something is done across the industry to support this approach, paying out for Wi-Fi is pointless and frustrates users more than having no Wi-Fi at all.

EE have launched their Wi-Fi Calling service, and Vodafone are expected to follow shortly along with other operators. With the prevalence of other VoIP based calling such as Skype, WhatsApp and Viber, you could be forgiven for wondering what all the fuss is about.

There are two big things about Wi-Fi Calling. The first is that it uses your normal mobile number, so even if you have no mobile signal you can still make and receive calls on your usual number over Wi-Fi.

The second aspect of true Wi-Fi Calling is that it is seamless: you don't have to launch an app and make a conscious decision to switch, as it is handled directly by the phone. Here, though, lies an issue, in that only newer generation phones support this today, although it is expected that all future phones will adopt it. Seamless is also not truly seamless yet, in that active calls cannot currently roam from the mobile operator network to Wi-Fi or vice versa, but this is expected to be introduced in the future.

The other cheeky point to note is that operators are still likely to charge (or deduct from bundled minutes) for a call made over Wi-Fi even though they are not providing the network.

For event organisers Wi-Fi Calling sounds like a great development, as requests to improve mobile coverage and capacity are at the top of the list of things we are asked to fix most frequently, yet we are generally fairly powerless to address them because the current system has been a closed environment controlled by the mobile operators.

At a high level this is a great development for event organisers, especially for production staff, who can very easily be offered an alternative to the mobile network, but it throws up some challenges which need to be considered very carefully if it is to be used beyond production staff. Any event providing a Wi-Fi network for its attendees is now potentially going to see extra demand on that network, not so much in terms of capacity, as voice traffic is fairly small, but in terms of quality of service.

Voice traffic is not tolerant of congested networks. Previously, an attendee downloading some email might find the network a bit slow, but it still worked; with voice it is a different story, with stuttering audio rendering the call unworkable and frustrating the user far more than slow email ever did.

Event organisers will need to make conscious decisions about the use of Wi-Fi Calling and ensure any network is capable of delivering it at a quality acceptable to users. This may mean high density design and increased internet capacity, both of which can push up costs.

For smaller events this is not likely to be that much of a problem but as you scale up to large outdoor events with thousands of people the challenge is a lot more significant. Wi-Fi Calling has the potential to help solve one of the big frustrations at festivals, arenas and sports events but without a good public Wi-Fi network it could make the frustration worse.

The interesting question is that if Wi-Fi Calling is adopted by users and becomes the norm when in a public Wi-Fi hotspot will attendees increasingly expect it at events? And if so, who pays?

Etherlive and The Event Intelligence Group Announce Partnership to Deliver Suite of Services to Events

Etherlive and The Event Intelligence Group announced today a broad ranging partnership to deliver an exciting and compelling suite of services including integrated communications, HD CCTV, crowd analytics, location awareness and social media tracking to assist event organisers in managing large and complex events.

The partnership includes the sharing of resources, equipment and knowledge to deliver the best experience and value to event organisers. This unique approach of the two companies, combining resources and skills, will allow event organisers who require a single technology partner to get the best of both worlds with the best companies for the job, operating under unified project management.

With increasing complexity and pressure on aspects such as wireless spectrum management, as well as a need for a single view on all aspects such as access control, people flow and incident management, a centrally managed approach is essential.

Both companies have been developing next generation services in areas such as real-time crowd density analysis and event intelligence gathered from CCTV, mobile devices and social media usage and this partnership will further speed their development and adoption at events.

Chris Green, Managing Director at Etherlive, comments “Having worked with The Event Intelligence Group over the last year it was obvious that our visions on the value of technology at events were very similar and this partnership brings together a great skill base to deliver the best and most forward looking technology services.”

Chris Burke, Partner at The Event Intelligence Group, says “We’ve been working very hard at creating the next generation of surveillance and intelligence services for events, we are excited to create this relationship with a company that shares our passion for excellence and innovation.”

Existing customer relationships remain unchanged with the partnership being transparent to customers. As appropriate both company names will appear on quotes where both companies are involved. In the immediate term the partnership ensures that next generation Full HD CCTV and integrated communications can be delivered to a wider range of events during 2014.

 -Ends-

 For further information, please contact Triggerfish Communications 020 7233 9700

About Etherlive

  • Etherlive is one of the UK’s leading providers of IT and communications technology for the live event sector.
  • Etherlive is a specialist in large-scale indoor & outdoor events with multi-faceted communications needs, as well as being a major supplier to the corporate, conference, exhibition and general events market.
  • Etherlive partners to provide services for events such as G8 Summit, WOMAD festival, Green Man Festival, Goodwood Festival of Speed, Southampton Boat Show, London World Triathlon, Nokia, Amazon and Sony.
  • Website: www.etherlive.co.uk
  • Blog: www.etherlive.co.uk/blog
  • Twitter: www.twitter.com/etherlive
  • Head Office: Etherlive Limited, Brinkworth House, Chippenham, Wiltshire, SN15 4HQ
  • Tel: 01666 511862

 About The Event Intelligence Group

  • The Event Intelligence Group is North America’s premier event surveillance and intelligence services provider for the live events industry.
  • The Event Intelligence Group provides actionable intelligence for large-scale events. By partnering with event producers, security contractors and local authorities, EIG deploys event specific surveillance solutions using state-of-the-art technology.
  • The Event Intelligence Group provides services to events such as: Coachella, Chicago Marathon, Electric Daisy Carnival, V Festival Chelmsford, SXSW and Lollapalooza.
  • Website: intelevent.net
  • Head Office UK: The Event Intelligence Group, Unit B, Hadrian’s Business Park, Haltwhistle, Northumberland, NE49 0EX
  • Tel: 0191 300 6200

It has been said that there are two certainties in life – death and taxes. If you said the same about technology it would read ‘confusing acronyms and over hyped performance claims’. Although 802.11ac, the latest in a long line of Wi-Fi standards, has been in development for several years it was finally approved in January and more client devices are now appearing which support it, including the Samsung Galaxy S4, newer iPads and the rumoured iPhone 6.

The ever growing list of Wi-Fi standards

With claimed speeds from 433Mbps up to 6.77Gbps, multi-user MIMO and beamforming, it sounds like we should all be rushing to implement this technology as soon as we can to solve our Wi-Fi woes. For the home user a shiny new 11ac Wi-Fi router and compatible tablet may indeed offer some benefits, but if you look at the limiting performance factor in most households it is the broadband connection itself, not the Wi-Fi, which throttles everything to a crawl.

For those of us deploying large scale, high density Wi-Fi, particularly at events and stadiums, the potential impact of 11ac is far more important and if not considered carefully could easily reduce performance rather than improve it. There are many enhancements and extensions within 11ac and as before with 11n it will take time for all the features to be implemented and used effectively.

One of the big changes is with MIMO or Multiple Input Multiple Output streams. MIMO is like moving from a single carriageway road to a dual carriageway or motorway – the data travels from your device to the Wi-Fi access point using multiple paths. Most business quality Wi-Fi access points have supported MIMO since 802.11n but many handheld devices have only just started to implement it. It can provide better overall speed and improve coverage especially where there are lots of obstacles. 11ac allows for up to 8 streams, whereas 11n is limited to 3, however, in reality most devices will not implement more than 3 and in fact most handheld devices will be limited to 1 or 2 because of the cost, complexity and extra power drain of adding more streams.

Those extra streams are not necessarily lost, though, as 11ac will eventually offer multi-user MIMO, where different streams can be directed to different clients, providing a much needed boost in situations such as events where the pinch point is the number of connected devices rather than absolute speed. Unfortunately the first version of 11ac does not support multi-user MIMO, so we will have to wait another year or two for that.

Beamforming is another aspect which 11ac addresses, a technology which aims to optimise performance based on the direction of signals and provide higher interference rejection. Beamforming is already supported in 11n and, when combined with adaptive antenna arrays, is very powerful in 'noisy' environments like event sites; however, many wireless vendors do not implement it, so 11ac aims to standardise beamforming across clients and vendors, which over time will bring performance improvements.

So far it all sounds good, so what is the problem? To answer that we need to look at why we have a problem today. Wi-Fi is a shared medium: a Wi-Fi access point has to simultaneously talk to a number of client devices and split the available capacity between all the devices it is talking to. For example, an 11n wireless access point (without MIMO) can at best deliver 150Mbps of capacity; if there are 100 users connected to it, each user would see a maximum speed of 1.5Mbps. This is the absolute best case; real world figures would be far, far lower.
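The arithmetic in that example is simply the shared capacity divided across clients:

```python
# Wi-Fi is a shared medium: an access point's capacity is divided
# across all associated clients, so best-case per-user speed is a split.
def per_user_mbps(ap_capacity_mbps, users):
    return ap_capacity_mbps / users

# The example above: a 150 Mbps 11n access point with 100 users.
# Real-world figures are far lower once protocol overhead, retries
# and interference are taken into account.
best_case = per_user_mbps(150, 100)
```

This is why adding access points, rather than chasing headline per-AP speeds, is the lever that matters in dense deployments, which leads directly to the channel problem below.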

To add more capacity more wireless access points are used but they all need their own ‘space’ to operate in otherwise they would just interfere with each other like a room full of people shouting. To do this there are a number of standard ‘channels’ defined and each Wi-Fi access point is assigned a channel. The most common form of Wi-Fi today runs at 2.4GHz which has 14 channels but these channels overlap and not all of them are usable in all countries, in fact there are really only 3 usable channels when it comes to designs for large scale deployments. On top of this 2.4 GHz Wi-Fi has to contend with Bluetooth devices, baby monitors, microwave ovens and a whole host of other things which also use the same frequency range!

At home where there are likely to be only a few devices connecting to the Wi-Fi network these issues are not generally a great concern but on an event site where hundreds, or now more typically thousands of users have to be connected simultaneously the combination of the lack of capacity and interference creates a huge problem.

All is not lost though as there is a second Wi-Fi frequency range at 5GHz which offers 23 non-overlapping channels (although that is before you factor in indoor, outdoor, DFS and country restrictions) and much lower interference. Today most of the wireless backbone infrastructure on event sites uses 5GHz – this includes normal data transmission, CCTV cameras and other wireless devices such as video senders. There are enough channels to do this successfully provided it is all managed carefully.

Until recently most client devices did not support 5GHz, but now many do, meaning that client access can also be provided at 5GHz, avoiding the problems of 2.4GHz. The downside is that 5GHz is no longer the quiet frequency it used to be, with many domestic Wi-Fi routers now supporting it and permanent wireless links using it, all of which increases interference and limits the free channels available. 11ac, however, could make things far worse.

Whereas 802.11n was a standard for both 2.4GHz and 5GHz, 802.11ac is a 5GHz-only standard, which means we will see an acceleration in the adoption of 5GHz in all devices. This in itself is not a bad thing, but it will change the dynamics of Wi-Fi deployments, with more and more focus on 5GHz client access leaving less room for 5GHz backhaul. The likely result is that backhaul will have to move to licensed frequencies or higher unlicensed frequencies such as 24, 60 or 80GHz, but these bring their own cost and implementation considerations.

The second problem is that 11ac focusses on delivering more speed, but one of the ways it does this is by using a wider channel in which to send data, implemented by in effect 'bonding' channels, which reduces the number of independent channels available. 11n can already bond two channels, but 11ac can bond four, which could reduce the available channels by 75% and lead to interference issues.
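The channel arithmetic can be illustrated simply; the 23-channel base comes from the 5GHz figure earlier in the piece, before the DFS and country restrictions already noted:

```python
# Independent channels left after bonding. Idealised: real availability
# varies with DFS, indoor/outdoor rules and country restrictions.
def bonded_channels(base_channels=23, bond_width=4):
    return base_channels // bond_width

# No bonding: 23 channels; 11n two-channel bonding: 11;
# 11ac four-channel bonding: just 5 independent channels.
remaining = bonded_channels()
```

Five independent channels to share between client access, backhaul links, CCTV and every exhibitor's own kit is what makes spectrum management on an event site so unforgiving.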

All of these factors are configurable and manageable and the design for a large event site will be considerably different to say an office environment but for everything to work in harmony there will need to be an even greater focus on ‘spectrum management’ ensuring that all parties using wireless equipment do so in a controlled and agreed manner. Without this structure and control the user experience will deteriorate rather than improve. 11ac can bring benefits, albeit without the headline speed claims, but there are greater risks in terms of poor design.

We will be starting to deploy some 11ac access points in a controlled manner over the coming months, working closely with vendors to optimise designs for the challenging needs of event sites.

The last couple of years have seen a major shift to 'cloud computing', driven by a combination of the expectation of connectivity everywhere and the explosion of users wishing to share content easily between several devices. On the whole, the use of cloud services such as Dropbox, Box, Google Drive and SkyDrive, and cloud hosted business applications such as Salesforce, Office365, Adobe and even Sage, to name a few, can be great for companies, provided the well documented concerns around security, data protection and availability are addressed appropriately.

There is, however, a grey lining to the cloud when it comes to events: the impact cloud services have on the event infrastructure. The internal infrastructure, in terms of Wi-Fi and wired connections, is not the concern, as any well designed network should be able to handle that; it is the internet connectivity aspect. The additional load that cloud services create is having a significant impact on the capacity required, and many events and customers are not factoring this into plans and budgets.

Cloud services change the usage characteristics of internet connections, creating much more demand on upload capacity compared to traditional browsing. This means 'synchronous' connections need to be used, where the upload and download speeds are closely matched, in contrast to ADSL where the download is much faster than the upload. The overall capacity of the internet connections needs to be higher too, because everything is going back and forth to the internet. For example, a user sharing a 10MB site plan with ten others will generate 10MB of upload traffic and 100MB of download traffic, because it will synchronise to each user separately. This may not seem a lot, but scale it up to a whole site across all users (with multiple devices) and all services and the numbers get very large very quickly. Scrimping on the connectivity for a corporate training event where all 200 users are set to use a cloud service is a recipe for disaster, as the user experience will undoubtedly be poor.
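The site plan example scales like this (a sketch of the arithmetic above; the function name is ours):

```python
# Traffic generated when one user shares a file that a cloud service
# then synchronises separately to each recipient on site.
def sync_traffic_mb(file_mb, recipients):
    upload = file_mb                 # one upload to the cloud service
    download = file_mb * recipients  # a full copy down to each recipient
    return upload, download

# The example above: a 10 MB site plan shared with ten others
plan_up, plan_down = sync_traffic_mb(10, 10)
```

Multiply that pattern across every user, every device and every service on site and the asymmetry between cloud traffic and a typical ADSL profile becomes clear.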

Cloud based services can cause a storm with event internet requirements

Adding more capacity is generally the easiest route, but the jump in cost can be higher than customers expect because of the need to move to synchronous, low contention services such as EFM (Ethernet in the First Mile) and optic fibre. In some cases wired connectivity is limited, so additional wireless or satellite capacity is required, and these have their own requirements which need to be factored in early in planning. Newer services like BT Infinity do help in some cases, but although the headline numbers look good these are consumer focused services which come with high contention ratios (meaning a high risk that performance will be much lower than stated at busy times) and no guarantee on the service.

Slowly, cloud based services are realising there is a need to distribute load and are starting to offer local caches of data, which will help over time, but we are some way off this being easily available for most services. In the meantime, understanding what services are being used and undertaking capacity planning is essential so that the correct level of capacity can be put in place.

Overall cloud services can be very cost effective but as the saying goes there is no such thing as a free lunch!

One of the most common requests we get is ‘Can you fix the mobile phone coverage at my event?’ It may be a simple question but the answer is not. There are many factors involved – signal coverage, network capacity, availability of mobile wireless spectrum and the cost of temporary masts to name a few. For events held at a temporary site like a festival the permanent infrastructure put in place by mobile operators is simply not designed to deal with 10,000 or more attendees descending for a short period.

The current approach for bigger events is the deployment of temporary mobile masts, but this is not generally a good solution: the masts are costly to deploy, separate masts are required for each operator, they do not offer much additional capacity and they have limited spectrum available for use. The result is normally that the experience during the event remains poor.

Some operators have offered small ‘femtocells’ which provide a small area of mobile phone coverage using a broadband connection, however, they have been very limited in terms of how many users they can support and have to be registered at locations to be used. They also require all users to be pre-registered which limits their usability.

Becoming a thing of the past?

These ongoing challenges with mobile coverage at events make the announcement last week by O2/Telefonica of the launch of TuGo all the more interesting. On the surface it looks like another VoIP app such as Viber or Skype, but the difference is that it uses your existing mobile number, so it doesn't matter to the caller whether you are on the normal mobile network or a Wi-Fi network. With Wi-Fi coverage at events under the control of the organiser, this finally means that "mobile" coverage can be extended across event sites, either just for crew or for attendees too. This can be scaled up or down based on need and tied to the existing provision for event production teams, making it far more efficient than large temporary mobile masts.

There is a catch, in that voice minutes used over Wi-Fi still count against your normal allowance, but given the way most mobile contracts are structured these days this is not such an issue considering the potential for improved coverage. At present only O2 have launched an app to do this, but hopefully, with the pressure from services such as Skype and now TuGo, the other operators will follow suit and offer similar services.

Maybe at last we will see mobile operators treat Wi-Fi as an extension of their offering rather than a competitor.

A $500 million event that happens once a year watched by 111.3 million people, supported by some of the world’s biggest sponsors, is put on hold for 30 minutes by a power outage. When this kind of failure can happen at the Super Bowl it’s not surprising that those who run and support events are kept awake at night worrying about what can go wrong – you only get one chance to get it right.

Power outages can happen to the biggest and best events, no matter what the location and with just about everything relying on power to some degree it’s important to look at how to mitigate any issues if the lights do go out.

The first step is to identify what power you have and the risks associated with it (it’s very easy to take for granted especially when in a permanent building), closely followed by identifying what services rely on it. From a technology point of view this list can be very long – access control, internet, telephony, two-way radio boosters, ticket systems, CCTV, Wi-Fi to name a few.

Each service should be reviewed for impact and with this information decisions made on whether to employ mechanisms to minimise risk. It’s also important to understand the interdependencies, for example a decision may be made to have a back-up generator for Event Control but if the phones and radio communication cease to function due to power loss elsewhere on site then the operation could still be impacted.

These days box offices and entrances struggle to operate without power as they rely on real-time ticket scanning and electronic payment. In these key areas it’s important to not only have a power backup plan but also a contingency plan to continue operating if the power plan fails – even if that involves manual checks over the radios or using runners.

Events don't have this option

Many events now rely on a network for many of their systems, from ticketing and phones through to CCTV. That network needs to be designed with redundancy and power failure in mind. All key points should be protected by a monitored UPS (Uninterruptible Power Supply); the monitoring is important so that central control knows, if power fails, how long the battery within the UPS can continue to operate, especially as it can take some time for a power issue to be diagnosed and rectified on a large site. For critical areas, such as servers and core networking, the UPS needs a significant operational time, which may involve the ability to 'hot swap' batteries to extend run-time indefinitely.
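A crude runtime estimate of the kind central control needs can be sketched as follows; the 0.9 inverter efficiency is an assumption of ours, and vendor runtime charts for the specific UPS should always be preferred:

```python
# Crude UPS runtime estimate from battery capacity and attached load.
# The 0.9 inverter efficiency is an assumed figure; check vendor data,
# and remember batteries age and derate in cold event-site conditions.
def ups_runtime_minutes(battery_wh, load_w, efficiency=0.9):
    return battery_wh * efficiency / load_w * 60

# e.g. a 1 kWh UPS feeding a 300 W network cabinet
runtime = ups_runtime_minutes(1000, 300)
```

An estimate like this, fed by UPS monitoring, is what lets central control decide whether there is time to diagnose the fault or whether a generator needs dispatching immediately.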

Modern VoIP telephones, CCTV cameras and other network equipment can be operated using PoE (Power Over Ethernet) which means they take their power from the network itself rather than a mains supply. The benefit of this is that the power required can be centralised and protected with a UPS so that the impact of local power outages in cabins and offices can be minimised.

Events will always have to deal with the unexpected happening – it’s part of the excitement and challenge of the live industry but sensible planning and preparation can minimise the impact.