Event Technology Myths

For our third myth-busting article Wi-Fi takes the focus, covering the relationship between microwave ovens, water and Wi-Fi, wireless signal propagation, and Wi-Fi security.

My microwave oven stops my Wi-Fi from working properly – TRUE (but not always)

For the non-technical, the idea that warming up a bowl of soup in the microwave oven could stop you browsing the internet over Wi-Fi seems bizarre, but it can indeed be true. The reason is quite straightforward: the frequency of the microwaves used in a microwave oven is around 2.4GHz, which is the same frequency used by one of the two Wi-Fi bands. The issue occurs because microwave ovens are not always perfectly shielded, so some of the microwaves can leak out (harmlessly) and interfere with the Wi-Fi. Industrial microwave ovens tend to be more of an issue as they use higher power.

The good news is that the 5GHz Wi-Fi band, now more commonly supported in devices, is not affected by microwave ovens – although it can be affected by radar, but that’s another story!

My Wi-Fi works through walls but not through trees – TRUE

The way wireless signals propagate through objects is quite a complex area, but there are some general rules. The first relates to 2.4GHz Wi-Fi and, interestingly, links back to microwave ovens. Microwave ovens operate around 2.4GHz because water absorbs energy very efficiently at this frequency (through dielectric heating rather than a true resonance): bombard water with 2.4GHz microwaves and the molecules vibrate vigorously, so the water (or your food, which contains water) heats up. This is great when you want to cook bacon quickly, but not so good when you want to pass a 2.4GHz Wi-Fi signal through trees, which contain lots of water – the signal is simply absorbed.

It is very important to note that Wi-Fi signals are extremely low power compared to a microwave oven, so you will not cook yourself by absorbing Wi-Fi signals! On event sites trees can become a real bane for the IT engineers trying to run wireless links, which is why you will hear them talking about ‘line of sight’.

When it comes to walls it depends on the type: a basic plasterboard or normal brick wall will only absorb some of the Wi-Fi signal, while a more substantial wall will absorb more. Walls with metal mesh in them will often block Wi-Fi altogether. On the whole, though, a strong Wi-Fi signal will pass through most normal walls. Windows can help or hinder depending on the type of glass, as modern thermal insulating glass can block Wi-Fi signals quite effectively.

Temporary structures at event sites are a whole case in themselves. Some temporary cabins are near enough transparent to Wi-Fi, but others, particularly the newer well-insulated variety, are just about impervious, requiring Wi-Fi access points in each cabin. Marquees and other temporary structures often exhibit different behaviour, being transparent in good weather but more opaque when it starts raining! The water coats the marquee or structure, creating a reflective layer and absorbing signal, so less gets through.

The second element relates to the frequency of the Wi-Fi: with wireless signals, the lower the frequency, the better the propagation. This is seen most obviously with dual band Wi-Fi operating at 2.4GHz and 5GHz. The lower frequency 2.4GHz signal will travel further than the 5GHz signal, which becomes an important point when designing Wi-Fi coverage (along with lots of other factors!)
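
To put a number on it, the free-space path loss formula shows how loss grows with frequency; a minimal Python sketch (the constant 32.44 applies when distance is in km and frequency in MHz):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    # Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same 100m link: the 5GHz signal loses roughly 6.4dB more than 2.4GHz
print(fspl_db(0.1, 2400))  # ~80.0 dB
print(fspl_db(0.1, 5000))  # ~86.4 dB
```

Obstacles add further loss on top of this, but that constant ~6dB gap at any given distance is why 5GHz coverage cells are smaller, all else being equal.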

All Wi-Fi networks are insecure – BUSTED

Because Wi-Fi is a broadcast technology that passes through the open air, anyone with the right equipment can pick up the signal. For this reason it is very important that these signals are encrypted, to avoid information being intercepted by the wrong people. One of the most common ways of encrypting a Wi-Fi network is a technology called WPA2 – Wi-Fi Protected Access 2.

WPA2 is commonly set up with a Pre-Shared Key (PSK). This alphanumeric string should only be known by those who need access to the network, and they enter it when connecting. The potential problem with this approach is that the PSK is used to generate the encryption key, and if you use a weak key the network is left open to a fairly simple attack which can gain access within minutes.
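
For the curious, the standard WPA2-PSK derivation turns the passphrase and network name (SSID) into a 256-bit master key using PBKDF2 with 4096 rounds of HMAC-SHA1 – reproducible in a few lines of Python:

```python
import hashlib

def wpa2_pmk(passphrase: str, ssid: str) -> bytes:
    # WPA2-PSK: PMK = PBKDF2-HMAC-SHA1(passphrase, ssid, 4096 iterations, 32 bytes)
    return hashlib.pbkdf2_hmac('sha1', passphrase.encode(), ssid.encode(), 4096, 32)

# The SSID acts as a salt, so the same passphrase yields a
# different key on a network with a different name.
print(wpa2_pmk('correct horse battery staple', 'EventWiFi').hex())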

The solution is simple – longer and more complex keys! Every character added multiplies the cracking effort enormously. The question is how long is long enough. There is no agreed answer, as it depends on how random the key is: a truly random key of 10 alphanumeric characters is actually very hard to break, taking many years, but a similar length key built from dictionary words could be broken very quickly.
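
The underlying arithmetic is easy to sketch; the guess rate below is an illustrative assumption for offline cracking, not a benchmark:

```python
# Rough search-space arithmetic; the guess rate is an illustrative
# assumption for offline cracking, not a benchmark.
GUESSES_PER_SEC = 1e9

alnum = 62  # a-z, A-Z, 0-9
for length in (8, 10, 12):
    seconds = alnum ** length / GUESSES_PER_SEC
    print(f"{length} random chars: ~{seconds / (3600 * 24 * 365):,.0f} years worst case")

# A key built from two common dictionary words falls fast by comparison:
print(f"2 of 100,000 words: ~{100_000 ** 2 / GUESSES_PER_SEC:.0f} seconds")
```

At these assumed rates, 8 random characters fall within days, 10 take decades, and 12 take many millennia – while a two-word dictionary phrase lasts seconds.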

To be safe we normally recommend a minimum of 12 characters with typical password rules – upper and lower case, numeric characters, special characters and no dictionary words unless they have character replacements.

Of course, a strong key only remains strong whilst it is known only by those who should know it, and this is a weakness of the shared key approach: if the key is leaked, security across the whole network is compromised. There are additional measures that can improve security further – for example, a technique called Dynamic Pre-Shared Key (D-PSK) issues a dynamic, unique key to each user, so a leaked key compromises only that user rather than the whole network.

We will cover Wi-Fi and general network security in more depth in a later blog, but with the right set-up Wi-Fi networks can be made very secure – more so than many wired networks!

In the Community

Since our move to Royal Wootton Bassett in 2015 we thought we should look for ways to help the local community. Etherlive has always had an active community programme, much of it in education, including working with organisations such as Young Enterprise, Bath University, Bristol University, Bucks New University and several Wiltshire schools & colleges.

Whilst continuing to support a range of activities, we thought a locally focussed IT Drop-In Centre could benefit those living in the area. This free service is open to all members of the local community and will offer advice and support on personal IT issues, ranging from removing malware and viruses from a laptop to advice on social media privacy and online safety. You can learn the key points of safely using Facebook, Twitter, Instagram and LinkedIn; or perhaps it is applications like BBC iPlayer and All 4 that you would like help with. Got questions about your home Wi-Fi or broadband? We are experts in this area! We cater for complete new starters and the more experienced alike.

No pre-booking is required – just drop in any time from 5pm until 8pm during one of the monthly sessions and chat to our friendly engineers, who deal with troubleshooting problems every day. Bring along your laptop or tablet and we will do what we can to help whilst you enjoy a cup of tea or coffee and discuss the ups and downs of the modern world of technology.

The first of these monthly sessions will be held in our offices in Royal Wootton Bassett on Thursday March 17th from 5pm onwards. More details can be found on our community page.


Event Technology Myths

In the second part of our myth busting we look at satellite, high density Wi-Fi and broadband speed.

Satellite is the best all round solution for quick event deployment – BUSTED

Over the last few years Ka band satellite has become a cheap option for temporary internet access. It can be a great solution in certain cases, but there are many where it is not suitable. Satellite suffers from high latency due to the distance to the satellite: every piece of data takes around 600ms to make the round trip through space. That delay might not seem much, but it is crippling to services such as VPNs (Virtual Private Networks), VoIP, video calls, online gaming and any application which requires rapid two-way data traffic. It is great, however, for large file uploads and video streaming – although it is important to watch data usage, as this can rack up significant additional costs.
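
That latency comes straight from geometry rather than poor engineering; a quick sketch of the minimum round trip via a geostationary satellite:

```python
# Minimum round-trip time to a geostationary satellite, from first principles
C_KM_PER_S = 299_792        # speed of light
GEO_ALTITUDE_KM = 35_786    # geostationary orbit altitude

one_way = GEO_ALTITUDE_KM / C_KM_PER_S        # one ground-to-satellite hop
round_trip = 4 * one_way                      # up/down to the server, up/down back
print(f"{round_trip * 1000:.0f} ms minimum")  # ~477 ms before any processing delay
```

Add gateway and modem processing on top of that ~477ms physical floor and you arrive at the ~600ms figure – a delay no amount of extra bandwidth can remove.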

Satellite is also a poor solution for wide-scale access such as public Wi-Fi. This is because of a technology it uses to try and boost speed, a side effect of which limits the number of simultaneous users who can connect to one satellite service. Most Ka satellite services also have high contention ratios, which can reduce the advertised 18Mbps/6Mbps type speeds to something considerably lower – a similar trick is used with home broadband services. Uncontended services are available, but the cost is much higher and, other than for short durations (it is normally sold in 15 minute slots), it is not competitive with other solutions.

Satellite can absolutely be the right approach, and we deploy lots of satellite solutions, but understanding the user requirements and explaining what the user experience will be like are extremely important to avoid disappointment and frustration.

Better Wi-Fi just means using more Wi-Fi access points – BUSTED

One of the most common problems with Wi-Fi networks is too many access points and a poor design. A typical response to a user complaining about Wi-Fi is to deploy another access point to ‘improve coverage’, yet frequently this just makes matters worse. Large scale and high density Wi-Fi requires very careful design to avoid what is known as Co-Channel Interference (CCI), where multiple wireless access points are in effect shouting over each other and slowing the whole network down.

Using fewer high capacity managed wireless access points with a detailed radio spectrum design, often with focused antennas, can deliver much higher capacity and a better user experience than a thick blanket of access points. Good Wi-Fi design is a technical art requiring some very detailed knowledge – though the output is pretty much invisible to the normal user until it doesn’t work!
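
To make the co-channel problem concrete, here is a toy sketch (a greedy pass over a hypothetical site layout, not a real RF planning tool): at 2.4GHz only three channels do not overlap, so any AP that can hear neighbours on all three must share a channel and contend.

```python
# Toy channel planner: greedily assign non-overlapping 2.4GHz channels
# so that no two neighbouring APs share one (a sketch, not an RF tool).
NON_OVERLAPPING = [1, 6, 11]

# Hypothetical site: which APs can hear which (adjacency list)
neighbours = {
    'AP1': ['AP2', 'AP3'],
    'AP2': ['AP1', 'AP3', 'AP4'],
    'AP3': ['AP1', 'AP2'],
    'AP4': ['AP2'],
}

assignment = {}
for ap in neighbours:
    used = {assignment[n] for n in neighbours[ap] if n in assignment}
    free = [ch for ch in NON_OVERLAPPING if ch not in used]
    assignment[ap] = free[0] if free else None  # None = CCI is unavoidable here
print(assignment)  # e.g. {'AP1': 1, 'AP2': 6, 'AP3': 11, 'AP4': 1}
```

Blindly adding a fifth AP within earshot of the other four leaves no free channel – exactly the situation created by ‘improving coverage’ one box at a time.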

20Mbps of broadband speed is always the same – BUSTED

It would be nice if the experience and speed of all broadband services were the same, so that when you are told you have 20Mbps that is what you get. Reality is somewhat different, and more complex, due to a number of factors:

  • Contention Ratio – Nearly all providers contend their services, effectively sharing the capacity between multiple users. This can be as much as 50:1, whereby your 20Mbps is shared between 50 users! More normally 20:1 is seen, then 5:1 on more business (and expensive) orientated services, up to the perfect 1:1 (no contention) – see the sketch after this list.
  • Asymmetric / Symmetric – ADSL and FTTC (known as BT Infinity but also sold under different names) services are asymmetric, meaning the download speed is not the same as the upload speed. The original principle was that people need more download than upload, but with modern cloud services, video calls and general rich media this has changed considerably, and a low upload speed can be more crippling than the download speed. For example, you may have a ‘20Mbps ADSL service’ but typically the upload is only 768kbps, and if the upload is at capacity the download becomes throttled too, because the acknowledgement packets TCP/IP relies on get delayed. Services such as true optic fibre (also sometimes called leased lines) are symmetric.
  • Connection Speed / Throughput Speed – This is primarily an issue for ADSL/FTTC but can be seen with other services too. The speed advertised by an ADSL modem when it connects is only the theoretical speed of the link between the modem and the local exchange. The real throughput or speed depends on the entire route from your computer to the location you are connecting to – this is a complex web of routers, fibre and ‘internet peering’. Different parts of that route may suffer congestion and reduce the overall speed of the connection. Choice of Internet Service Provider (ISP) is an important factor as the good ones have better peering and higher capacity links to reduce the risk of congestion and optimise routing.
  • Latency – Every device, cable and piece of fibre through which data has to pass introduces an element of latency, or delay – that’s physics. The amount of delay depends on distance (hence why satellite is a problem), quality of links (a poor link needs more error correction, which adds delay), utilisation of links (high utilisation adds delay) and the number of routers, switches, etc. in the path. Good services may only add a few milliseconds of latency, poor ones several hundred, and that can make a big difference to user experience.
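
To make the first two factors concrete, a back-of-envelope sketch (the 20Mbps/768kbps figures come from the list above; the everyone-busy-at-once worst case is an assumption):

```python
# Back-of-envelope effective speeds under contention. Worst case assumes
# every subscriber is busy at once; real experience varies with load.
advertised_down_mbps = 20.0
advertised_up_mbps = 0.768   # typical ADSL upload

for contention in (50, 20, 5, 1):
    worst_case = advertised_down_mbps / contention
    print(f"{contention}:1 -> {worst_case:.1f} Mbps download worst case")

# Asymmetry bites on upload: one busy cloud backup can saturate this
print(f"upload ceiling: {advertised_up_mbps} Mbps")
```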

That’s it for issue 2. Next time: does my microwave really break my Wi-Fi? How come Wi-Fi works through walls but not through trees? And should you worry about network security?

For event organisers, life on the road – in and out of venues, holed up in damp cabins and questionable hotels – means the technology they carry and the software tools they use are critical to their day to day job. It’s an ever changing landscape and, to some degree, a personal preference, but there are a few key items to think about to ensure teams stay productive at a sensible cost.

The Laptop – Personal & Critical

Although smartphones and tablets have been the most talked about items of the last few years, it is still the trusty laptop that sits at the core of the road warrior’s armoury. It is the item not to skimp on: buying too cheaply will cost more in the longer term, but at the same time there is no sense in buying at the top end – the best value is in the middle.

Choosing a proper business laptop rather than a cheaper consumer model is a wise move – they survive better on the road and focus on the things that matter to an intensive user: battery life, keyboard feel, screen quality, lighter weight, and so on. Size is important. There is no need for a massive 17” screen; you are better off sticking with a smaller screen and using an external monitor when you really need the extra area – the saving in weight, and the fact you can then use your laptop on a train or plane, is worth far more. Be wary of ultra-high resolutions on smaller displays, as these often frustrate users by being hard to read.

Hard drive failure just before an event is not something you want. To minimise the risk, select an SSD (Solid State Drive) instead of a traditional hard drive – SSDs are not immune to failure, but they are far more tolerant of being bashed about in the event world and they are much faster.

In terms of performance the marketing always suggests the latest, fastest and most expensive processor is the way to go. However, overall laptop performance is the sum of the parts, so there is no point in buying a high end processor which is then crippled by a slow hard drive, limited memory and weak graphics. These days processors are so good that, unless you have some very specific needs, you are better off buying a mid-range processor with plenty of memory, an SSD, decent graphics and good build quality. In the Intel range, for example, avoid the low end Core i3 and pick a Core i5; unless you have a specific, very demanding usage case there is little point in the extra cost of a Core i7.

Ultrabooks (extra thin and lightweight laptops) are worth the expense for the highly mobile, but be careful in your selection as many no longer have a physical network connector built in – they rely purely on wireless connections, with an external adapter as the typical workaround. Similarly, many Ultrabooks have dropped older generation connectors such as VGA in favour of HDMI and mini-HDMI – not necessarily a bad thing, but you need to think ahead when presenting!

The type of wireless the laptop supports is very important and it is almost essential that you choose one which supports both 2.4GHz and 5GHz frequencies. The 2.4GHz range is typically so crowded on event sites that it is often unusable, whereas 5GHz has more capacity and provides a much better experience.

Should you buy an Apple Mac or a Microsoft Windows based laptop? In my view it doesn’t really matter – they both share the same core components and each suffers from similar types of failures and security issues. It is more about what sort of user experience you want and if you are already used to one or the other do not underestimate the initial loss in productivity if you switch!

Productivity Tools – Too Many to Choose From

The emergence of cloud services has led to an explosion in productivity tools, particularly ones that work well across distributed teams. Dropbox, Box, Office 365, Google Drive, Evernote, Google Docs, Microsoft OneDrive, Skype, WhatsApp – the list goes on and on. They all have pros and cons and most will meet the needs of the majority of users. It’s not so much about which tools you choose, but about how many and how you manage them.

With a distributed team, especially one that includes freelancers, it is far too easy for everyone to do their own thing and productivity drops because no one knows where anything is or which version is the current one. It is really important to agree the tools and stick to them – less is more!

Offerings such as Office 365, where email, office applications and project sites are all delivered as a single SaaS (Software as a Service), allow rapid scaling and shrinking of licences, which is very effective for dynamic teams. There are additional benefits too: since they are hosted in the cloud, there are no VPN (Virtual Private Network) complexities for users connecting back to a central office whilst on an event site.

The downside of modern cloud services is that they require connectivity – not an issue when you are in the office, but on event sites the impact is a lot more significant. The background synchronisation from your laptop, phone and tablet all consumes bandwidth, and this has increased the connectivity demand on event sites significantly, which must be factored into event plans.
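
As a rough illustration of the maths (all figures here are assumptions for the example, not measurements):

```python
# Rough aggregate demand from background cloud sync on an event site.
# Per-device rate and headcount are illustrative assumptions.
crew = 40                  # people on site
devices_each = 3           # laptop, phone, tablet
sync_kbps_per_device = 50  # assumed average background sync rate

total_mbps = crew * devices_each * sync_kbps_per_device / 1000
print(f"~{total_mbps:.0f} Mbps of background traffic before anyone 'uses' the network")
```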

Security – Ignore at Your Peril

Distributed teams, a need to share lots of information, contractors, freelancers and a ‘just get it done’ culture provide a mix which is an IT security nightmare. Information access, control and protection get more complex every day, and sadly the leakage of sensitive information and hacking are very real problems.

It all starts with the humble login and password, still the way that nearly all systems are accessed. We all hate them and we all get lazy with them. A few golden rules to start with:

  • Never use ‘shared’ logins – the moment you use shared credentials you lose all ability to audit and control. If you suffer a breach you will not be able to trace it and the only way to stop it involves impacting everyone.
  • Do not use the same password on multiple accounts – People hate this one, but it is increasingly important. The reason is simple: the majority of systems use your email address as the login ID, so if one system gets hacked (which is all too common) and login details are compromised, the hacker knows that trying the same login ID / password combination on other systems is more than likely to work. What starts as an annoying but manageable breach on a harmless website becomes an exposure to financial data, banking, customer information and more.
  • Strong passwords – It’s incredible that the most popular password is still 123456, and the second most popular is password. In a business environment that should be treated as irresponsible and a possible disciplinary offence. Password cracking methods have moved way beyond the old ‘brute force’ attacks, which means even fairly complex passwords fall surprisingly quickly. If you can remember your password easily then it is probably too simple! (A quick way to generate a strong one is sketched below.)
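
As a minimal sketch, Python’s secrets module can generate a password from a cryptographically secure random source (it samples uniformly rather than enforcing every character class, which is fine for random keys):

```python
import secrets
import string

# Generate a 16-character password from letters, digits and symbols
# using a cryptographically secure random source.
alphabet = string.ascii_letters + string.digits + "!$%&*+-=?@"
password = ''.join(secrets.choice(alphabet) for _ in range(16))
print(password)
```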

The last two points above are at the core of the issue which blights confidence in computer security – realistically no human can manage dozens and dozens of different, complex passwords so the weak ones persist and play straight into the hacker’s hands.

At first the solution seems counter-intuitive: password managers. Utilities such as LastPass and 1Password manage all of your passwords, allowing you to have a unique, complex password for every system you use. You then have just one password to remember – the one that unlocks the password manager.

Surely this is a bigger risk as that one password gets access to everything? Potentially yes, but there are reasons why this risk is smaller than the risk of not using a password manager.

Firstly, you are far more likely to remember one complex password than lots of them. Secondly, the password manager (or at least a good one) stores its vault locally on your devices, so to attack the master password a hacker needs access to your actual device rather than an online website – another layer of defence. A password manager is infinitely more secure than yellow sticky notes stuck to your screen.

To go a step further, particularly for a password manager, using ‘two factor authentication’ is wise. Two factor authentication provides an additional layer of security in a similar way to the card readers used by many banks for online banking, but instead of a card reader they use an application on your computer or smartphone. Products such as Google Authenticator are now supported by many password managers and directly by other online services.
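
Under the hood these authenticator apps generate time-based one-time passwords (TOTP, RFC 6238). A minimal sketch using only the Python standard library – the secret shown is a made-up example, not a real one:

```python
import base64, hmac, hashlib, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    # Decode the base32 shared secret (the format QR-code enrolment uses)
    key = base64.b32decode(secret_b32, casefold=True)
    # Count 30-second intervals since the Unix epoch
    counter = struct.pack(">Q", int(time.time()) // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # made-up example secret
```

Because both sides derive the code from a shared secret plus the current time, a stolen password alone is no longer enough to log in.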

Passwords are a key part of security, but there are a few other aspects which need watching carefully. Most security breaches are still caused by employees or contractors – both intentionally and unintentionally. With documents and information bouncing between people and systems at an alarming rate, knowing who has access to information, and where it is stored, is crucial.

Thankfully the majority of staff and contractors are trustworthy, but it only takes one. Using unique logins for all staff, as mentioned above, makes closing down access much more straightforward when it is no longer required, and provides traceability. Most systems now provide granular access control so that not everyone gets access to everything. A clearly owned ‘leaver process’ is also important, to make sure logins are removed and content deleted from sharing locations.

Effective technology usage can make a big difference to productivity but it is too easy to overcomplicate. We now have an amazing array of systems with which to share content and communicate but when the pressure is on ‘old fashioned’ email still comes out on top as it is simple and dependable. The same thought should hold true for the other aspects; event road warriors require simple and dependable solutions that do not distract them from what they need to do – run events!

Only a few years ago communication from attendees at an event consisted of the occasional phone call (if you could make one) and maybe a text or two. The phenomenal success of smart phones and social media has changed all that at a pace no one was expecting.

We raced through textual commentary and on to photo commentary within a couple of years – a development which brought a huge shift in network demand on event sites, the few bytes of a text message replaced by megabytes of high resolution photos.

Now we are seeing the next shift into video: initially a ‘record and upload’ approach, but rapidly moving to live video streaming, with some current generation smartphones capable of 4K ultra HD video at more than 50Mbps! Last week Facebook announced Live Video, its offering in the live video streaming arena to compete with Periscope and Meerkat – a reflection of how fast personal video streaming is moving. The key point is that Facebook Live Video is integrated into the main Facebook application, removing a barrier to usage and fuelling more rapid adoption across its massive existing user base.

The data demands of such use are vast, especially in a high density environment such as an event, and for the moment this is likely to restrict growth to some degree. It is happening, though, and with it comes a significant shift in thinking about how content from live events is managed.
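
A back-of-envelope sketch shows the scale (attendance, streaming share and bitrate are illustrative assumptions):

```python
# Illustrative sketch of uplink demand from live streaming at an event.
# Attendance, streaming share and bitrate are assumptions, not data.
attendees = 20_000
streaming_share = 0.01   # 1% streaming at once
bitrate_mbps = 5         # a modest HD stream; 4K can exceed 50

uplink_needed = attendees * streaming_share * bitrate_mbps
print(f"~{uplink_needed:,.0f} Mbps of uplink")  # ~1,000 Mbps
```

Even at a conservative 1% of attendees streaming modest HD, the uplink demand rivals the entire backhaul of many event sites.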

Putting aside the technical pressures on event networks, the real question is about the content. We have seen the shift from ‘no cameras’ to a reluctant acceptance that controlling photos from event sites is pretty much impossible, even though many artists dislike the sea of people watching events through their phones. Increasingly there is some acceptance of time-shifted amateur video appearing on YouTube, but the idea of real-time video streaming takes content management to a new level.

For organisers, promoters and artists the question is whether to try and stop it, using either technical or physical approaches, or to accept it and turn it to advantage. Technically restricting video streaming on an event-managed and controlled Wi-Fi network is perfectly feasible. On the 3G/4G mobile networks, however, it would come down to whether the operators were prepared to block streaming from specific cell towers during an event, which is unlikely – although in reality, at present, those networks are rarely reliable enough to support a video stream anyway.

Alternatively, rather than trying to block at source, a continuous scan of live streaming services to identify and remove streams could be employed, but the effort required would be huge, and it would not succeed without the support of the service providers such as Facebook and Periscope.

Physically trying to control it becomes a question of how firmly you identify and remove offenders – an approach which can cause tension between fans and artists, would require additional resource to police and would never be entirely successful.

Is the answer then to accept and adopt, finding ways to benefit from this new communications channel? Does it really harm the event or the artists to have boundless unmanaged content strewn across the internet? On the other hand, do you want someone wandering around backstage streaming everything? Or someone in the front row streaming what could be a surprisingly high quality and atmospheric video? It’s a copyright nightmare but potentially provides massive exposure.

Technology is not only the source of this issue; it is likely to be part of the solution too, but not before event organisers decide on an approach. One thing we do know from the last twenty years is that technology cannot be ignored – it is the ultimate disruptor and will always find a way of winning through.

Your Vote Counts!

Exhibition News has opened up nominations for their ‘Elite list’ which identifies the best suppliers in the events industry. In the past we have been proud winners of the Wiltshire Business Innovation and Growth Award, an Event Production Award and finalists in the Event Technology Awards amongst others.

If you think we have done a great job, please spare a moment to vote for us in category 3 – Technology Provider – by clicking the link below.

Thanks!

NFC contactless payments are the next evolution

“Smartcard to wipe out cash”, a headline which sounds like it may have been used any time in the last few years, is actually taken from the Evening Standard in 1993 just before the launch of Mondex, one of the earliest smart card cashless payment systems.

Launched in Swindon, UK, Mondex promised to revolutionise payments using what today is known as a ‘closed loop’ system, where money is transferred to a smart card containing a chip and the card is then used to pay for items via a special reader until the virtual cash is used up.

It sounded great and launched to much fanfare, but four years later it quietly disappeared, never to be heard of again. Its lack of success is generally put down to the hassle of loading the cash, the limited locations at which it could be used and the infrastructure required to support it. Soon afterwards chip & pin started to emerge, offering an ‘open loop’ solution whereby the cash is debited directly from your bank account, and within a few years this became the norm.

Skip forward twenty years and it feels like we are seeing history repeat itself.

For the last five years or so the talk of cashless payment at events has fuelled many a debate, but implementation and adoption, in the UK at least, have been very slow and fraught with issues. The basic idea has been the same as Mondex all those years ago – a closed loop system with the chip (now wirelessly contactable) typically embedded in a wrist band rather than a card.

Many of the same challenges still exist today – the hassle of adding credit to the wristband, the dedicated infrastructure required, limited areas of acceptance and redeeming unused balances. Then there is the user aspect: many of the benefits are for the organiser and promoter rather than the attendee. This is coupled with attendees’ concerns about too much information on their purchases and payments being made available to the event.

The aforementioned issues with closed loop systems have allowed the next generation of open loop contactless systems to gain adoption at a much faster rate. Open loop contactless, using an existing debit or credit card, is a natural progression from chip & pin and removes many of the hurdles seen with closed loop. It is quite telling that one of the world’s largest closed loop systems – Transport for London’s Oyster card – is now moving to an open loop approach.

What is interesting is that some countries have seen higher adoption of closed loop – the US, for example. The US was much later to the chip & pin party and has been behind the curve on contactless, which may have left a window for closed loop in the short term.

The question is where this leaves events, which have several drivers to move to a cashless environment. With the rapid adoption of open loop contactless in day to day life, coupled with disrupters like Apple Pay, Android Pay and PayPal Here – all of which use an open loop approach with NFC (Near Field Communication) embedded in smart devices – the modern generation of event goers will move to the trusted services, and closed loop will quietly die away.

What remains is the challenge at events in terms of how to deal with smart reader based payments as the infrastructure cost can be a hurdle to adoption. There are several components to this:

Universal Payment Terminals – The banking world needs to move faster in providing good quality payment terminals certified across multiple methods of communication (wired, Wi-Fi & mobile data) and multiple payment methods (chip & pin, contactless & NFC). Today, different terminals have to be used depending on the connection method and payment type, which means merchants have to hire terminals for events because they cannot use their normal one. A universal terminal would also make deployment on event sites much easier and more cost effective.

Merchant IDs – Many smaller traders at events do not have the magic ‘Merchant ID’ required to set up card based payment terminals. Merchant IDs are controlled by payment houses and can be costly and complex for very small businesses, so a better mechanism is needed to give those traders access to open loop systems. This sounds like an easy area, but it has some complexities due to money laundering rules. Systems such as iZettle help with this but (generally) carry higher fees.

Access to Data – A difference between closed loop and open loop for a promoter or organiser is the ability to easily access usage data. As closed loop is in the control of the organiser, they get full visibility (although this can be seen as a negative by attendees). With open loop the data is held by the payment providers, so to get a better view across the entire event (involving many merchant IDs) some form of agreed consolidated reporting would remove organisers’ concerns about visibility.

Providing Infrastructure – Open loop systems tend to have a slightly higher requirement for readers to be connected to a network (although many closed loop systems are not as offline as promoted). A modern event has such high connectivity requirements in other areas that adding payment systems is not the concern it once was. It is now well accepted that providing contactless card based payments drives higher spend, so increased spend on infrastructure will reap returns overall.

In the last few years we have seen a rapid swing to providing a resilient payment environment across events and the feedback is very positive – fast and easy transactions, and an increased spend by attendees. It just needs more support from the banking world to resolve the last few issues and make the cashless (or near cashless) event a reality.

Notes:

Swindon Advertiser – How Smart was that?

We may be heading into the UK winter, but Etherlive are already busy planning for the summer of 2016. We are always on the lookout for new recruits, with opportunities for students and seasoned professionals over the peak summer season. Our environment does not suit everyone – this is IT with a difference: tight deadlines, indoors and outdoors across the UK and Europe, with events ranging from business briefings to large music festivals. These are roles for people who thrive on challenge, can demonstrate problem solving and have excellent communication skills.

Our teams deploy technology services which support some of the largest and greatest events throughout the UK and Europe. Etherlive is the power behind the scenes which helps the production crews, security, artists and broadcasters bring their events to the millions who attend.

For some roles excellent technical knowledge is required, especially around wireless technologies and core IT such as TCP/IP, networking and server management; for others it is customer skills and project management that are key. Knowledge of the events industry, particularly outdoor events and festivals, is an advantage but not essential.

We will be running a series of wide ranging skills and training workshops throughout the winter and spring so whatever your skillset or experience we are keen to hear from energetic and enterprising candidates with a passion for working in the events industry.

Those looking to apply should be ready to spend periods working away from home, sometimes several weeks. In the first instance please send your CV and a covering letter to introduce yourself to jobs@etherlive.co.uk.