3G-4G-5G

Pretty much every day I’m either asked or told about the way 5G is going to ‘change everything’. I’m currently sitting on a train on my way into Paddington; my 4G connection has dropped multiple times, and when it is connected the speed varies from a trickle to occasional bursts at the speeds I would expect of 4G. We all crave ubiquitous connectivity at a good speed but the reality is somewhat behind the hype.

I’m reading yet another article saying that 5G will deliver speeds in excess of 20Gbps and I’ll be able to download an HD movie in a few seconds. Ridiculous claims accompany most technology launches – the claims for 4G were 1Gbps, but the experience, even in the best case, is at least an order of magnitude lower.

We have come a long way from GPRS, when any data access was painfully slow, but each technology step forward becomes more complex and more costly, and delivers less of a step change to the user. In reality, when 5G is deployed into the world it will meet a mixture of technical and commercial challenges which will reduce the hype to more of a ‘4G evolution’ – it will be better, and that’s great, but not the revolution promised, at least not initially.

One of the problems is that 5G needs to operate across multiple frequencies. This is not a new technical challenge – the same approach is used for 3G and 4G networks – however, some of the frequencies required to make 5G work are much higher up the spectrum, known as millimetre waves (mmWave). These very high frequencies are where the headline speeds of 20Gbps come from, but there is a problem: they require line of sight to the mobile device and are heavily impacted by rain, trees, buildings, etc. Millimetre waves also have a short range, so deploying the infrastructure is complex and requires many more antennas (base stations or cell towers) than the lower frequencies of today. It’s also worth noting that new handsets will be required to use these new frequencies, so there will be a long period during which many devices will not have the circuitry inside to make use of the higher speeds.

If you consider that today a good example of 4G may deliver 10-50Mbps depending on various factors, it is expected that 5G may reach 100-200Mbps in similar circumstances – that’s a good increase but it isn’t 20Gbps, and it certainly will not be widely available, not in the short to medium term. As for remote locations, they will be stuck on the lower frequencies, which will restrict what can be offered.

As with Wi-Fi, the connection speed is only half the problem; the backhaul internet connection is just as important, and supporting many users all expecting 5G speeds of 100Mbps requires a massive upgrade to the supporting fibre and wireless backhaul networks. Ignoring the significant cost implications of all this, the practicalities of deploying the infrastructure are going to take time and it is expected that it will be at least 2022 before we see any serious progress on 5G.
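
To get a feel for the backhaul arithmetic, here is a rough back-of-envelope sizing sketch (the attendee count and contention ratio are illustrative assumptions, not figures from this article):

```python
# Rough backhaul sizing sketch. Illustrative figures only: a contention
# ratio assumes only a fraction of users draw their peak rate at once.

def backhaul_gbps(users: int, per_user_mbps: float, contention: float) -> float:
    """Aggregate backhaul needed, in Gbps, for `users` each expecting
    `per_user_mbps`, with only 1-in-`contention` active at peak."""
    return users * per_user_mbps / contention / 1000

# 10,000 attendees each expecting 100Mbps, at a 20:1 contention ratio:
needed = backhaul_gbps(10_000, 100, 20)
print(f"{needed:.0f} Gbps of backhaul")  # 50 Gbps
```

Even with generous contention assumptions the aggregate quickly outruns what a typical venue uplink can supply, which is the point the paragraph above is making.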

So what does this mean for events?

The first thing to note is that on mobile devices over 60% of data traffic is carried over Wi-Fi networks rather than the mobile networks, and that figure has grown, not shrunk, over the last few years. With mobile operators still recovering the cost of 4G infrastructure, and now faced with an even more expensive 5G build-out, data plans are not going to get any cheaper, so the cost-conscious consumer will still hunt out Wi-Fi wherever possible.

Although a 5G speed of 200Mbps sounds great, a good Wi-Fi network today can deliver speeds well over 200Mbps and already approved Wi-Fi standards go much further (the existing 802.11ac standard goes up to 3.45Gbps), so for the really serious data users Wi-Fi will remain the primary choice. It’s not a case of 5G versus Wi-Fi; both will co-exist – they are designed differently for different purposes – but in the event space Wi-Fi will continue to be a pre-requisite for many years yet.

As with 4G, the initial deployments of 5G will be focused on cities with a high density of users. Less dense areas will be some way off, and of course the countryside fields used for many outdoor events will be no better served than they are today, as commercially it makes no sense unless a particular sponsor wants to put in temporary masts.

With the low-range nature of the high-frequency 5G spectrum, indoor venues such as conference centres could well be a target for early deployments, given the business-orientated footfall who are more likely to have handsets capable of using the 5G network. Another good example would be large stadiums but, in a similar way to Wi-Fi, in these locations the cost of infrastructure to support the density of users is significant so it will come down to who will pay.

Venue owners who want good mobile coverage in their venue but expect the mobile operators to pay have to consider that operators can struggle to recoup the costs of a dedicated install, as they generate no additional revenue from it. The use of shared infrastructure between mobile operators may help, but it doesn’t change the fact that someone has to pay.

Once 3G reached wide deployment it became a viable option for providing the backhaul internet connection in some circumstances, with Wi-Fi as the access medium. 4G continued that trend and 5G will go further, but limitations still apply. 5G is a shared medium, so overall performance will depend on other users unless dedicated bandwidth is offered (something that is hard to do on 4G); in an event environment, as the number of attendees goes up the performance goes down.

With 4G, operators have been reluctant to offer dedicated bandwidth over their infrastructure – something that would be fantastic for events. Perhaps this will change with 5G, but even then, with the price of dedicated fibre circuits continuing to fall, shared services such as FTTP (Fibre To The Premises) now becoming available and Wireless Internet Service Provider (WISP) offerings improving, the pricing from the mobile operators would have to be very competitive.

Without any guaranteed bandwidth, 5G is no different to 4G, which means that it is fine for general use, but if performance is critical then there are no guarantees. You would not want to risk your ticketing, payment systems, video streams and core control systems on a mobile network over which you have no control, but for less critical services it may be an option.

Over time 5G will provide another option in the event toolbag as a step forward from 4G, but its real benefit will be to the truly mobile individual user. Then, hopefully, on a train journey into London I will be able to get a consistent 20Mbps all the way – though I fear that ideal is still quite a few years away.

A Career in Events

You can get a great buzz out of attending a good event, and if you organise it the buzz is even better. Every successful event requires a team of dedicated professionals with a range of skills but often the event industry isn’t considered as a career choice.

At the recent ESSA (Event Supplier and Services Association) AGM a fantastic video was shown about working in the events industry. It really speaks to how passionate those of us in the industry are about making event magic happen, mostly behind the scenes working with festivals, exhibitions, corporate launches, sporting events and one-off special events like a Royal Wedding!

We welcomed our 4th full year intern last year, and this year we will be taking on two more to support our growing engineering teams in the field. For the first time this year we also have an apprentice, a great way to get exposure in the events world, so if you know anyone looking to take that first step don’t forget about the events industry.

Of course, those with more experience are just as welcome; our company, like many others, is built on years of experience gained from all quarters, including large and small companies.

We do all have one thing in common, we are one team and we love events!

GDPR, CCTV and Events

Is this yet another GDPR article? Yes, but before you click on past, this article is a bit more specific, focussing on Event Organisers and a few important aspects relating to them.

If you have somehow managed to miss the basics, here is a quick recap (otherwise skip the next two paragraphs). GDPR, the General Data Protection Regulation, comes into force on May 25th 2018 and is as dull as the name suggests, but that doesn’t mean it should be ignored. GDPR is, in effect, a beefier version of the Data Protection Act and there are some significant aspects which have changed.

First off, the fines if you are found to be breaching the regulations could be huge – up to 4% of annual worldwide turnover or €20 million, whichever is greater. Secondly, the onus with GDPR is focussed much more on how and why, with supporting documentation – no more simply ticking a box to say you comply. Lastly, and perhaps most significantly, with GDPR there is much more appetite to enforce, along with more resources to audit.

Nearly all of the material currently circulating is focussed around the more obvious areas of customer, supplier and employee data; everything from email addresses to bank accounts and the harvesting of information from websites, social media and direct mail campaigns. This is all valid and needs to be considered seriously, however, for events there are some additional areas which could too easily be overlooked.

CCTV

CCTV is not necessarily something that initially comes to mind when it comes to GDPR but it is very much part of it. The holding and releasing of CCTV footage is already well controlled, but the new regulations go much further, requiring information on camera placement, field of view and the reasoning for the coverage, coupled with proof of deployment and signage. This is a significant uplift for events compared to the current approach and will need to be factored into planning and deployment from the start.

It is also important to note that ANPR (Automatic Number Plate Recognition), drone and body-worn cameras will all need to be assessed too.

In practical terms, we are now expecting all temporary CCTV installations at events to undergo an audit during the build phase documenting the camera locations and reasoning for those locations. Field of view into public areas external to the event is especially important.

Agreement on how long footage is held for, the release process and who can receive the footage will also need to be under much tighter control.

Public Internet Access

Many events allow public access to the internet on an event Wi-Fi network after a ‘splash page’ which may capture details such as an email address to be used after the event to send marketing information. Under GDPR this information is more tightly controlled, and explicit ‘opt-in’ consent must be obtained before the email address can be used.

Even the logging of an IP address (the identifier used when a device connects to the network) coupled with the user information is governed by GDPR, however, this information is required to be stored under the Investigatory Powers Act 2016 (aka the Snoopers’ Charter) so the way it is stored and who has access to it is very important.

For events which offer public internet access the method of access and what information is captured and stored will need to be reviewed, with likely changes to the Terms & Conditions and opt-in statements.

Supplier & Volunteer Registration Systems

Employee and customer data is called out in nearly all GDPR overviews but it is important to remember that GDPR covers all data including anything recorded for suppliers and volunteers. Any system (paper or electronic) which stores personal information must be assessed including aspects such as what information is stored, where it is stored, how it is stored, how it is used, how the owner can remove it and who has access.

Visitor/Attendee Information

Any personally identifiable information gathered on attendees, such as an email address, falls under the same regulations – this could be via initial ticket purchase, attendee registration or at the event itself. Particular attention must be paid to any direct marketing, as the attendee must explicitly opt in to any future communications and have the means to update or remove their information.

Many of these areas are likely to require a Data Protection Impact Assessment (DPIA), a new tool and process which must be used when new technology is deployed or when there is a high risk to individuals.

The new regulations also broaden the scope of ‘personal information’ to cover just about everything from a name, email or social information through to genetic, economic and cultural information. The holding of this information has to be shown to have positive, clear consent from the individual using ‘plain English’ type agreements.

An individual must be given the ability to view and update information, and importantly has the right ‘to be forgotten’, which means complete removal from all systems.
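
As a sketch of what explicit consent plus the right ‘to be forgotten’ implies for a registration system, consider the following (all field and class names are hypothetical illustrations, not a compliance recipe):

```python
# Illustrative sketch only: an explicit, timestamped opt-in record plus
# full removal on request. Names here are hypothetical, not a real API.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Store *why* the data is held and an explicit opt-in, never an assumed one."""
    email: str
    purpose: str                                # e.g. "post-event marketing"
    opted_in: bool = False
    consent_timestamp: Optional[datetime] = None

    def opt_in(self) -> None:
        self.opted_in = True
        self.consent_timestamp = datetime.now(timezone.utc)

class Registry:
    def __init__(self) -> None:
        self._records: dict = {}

    def add(self, record: ConsentRecord) -> None:
        self._records[record.email] = record

    def can_contact(self, email: str) -> bool:
        rec = self._records.get(email)
        return rec is not None and rec.opted_in   # no opt-in, no marketing

    def forget(self, email: str) -> None:
        """Right to be forgotten: complete removal from the system."""
        self._records.pop(email, None)
```

The key point the sketch makes is that consent defaults to ‘no’ and erasure is a first-class operation, not an afterthought.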

These changes may initially look very onerous, however, a lot can be covered by a sensible review and improvements to existing processes. The important thing is not to ignore it – the changes are coming and a lack of preparation will not be a defence if you are found to be in breach.

For events we work on we will be working closely with organisers to assist and make sure all aspects are covered, providing templates and guidance wherever possible. If you would like to discuss any aspect of GDPR impact on your event then please contact us and we will be happy to help.

October 18th-19th 2017 sees Etherlive exhibiting at the Showman’s Show for our 10th consecutive year – the only show we have exhibited at every year since our formation. It’s been a journey, so we thought we would take a look back through our archives.

2008 – New Kids on the Block

We still have some of the 2008 mugs we had made as a giveaway in daily use at the office!

Amazing what you can squeeze onto a 3m x 2m stand

Less than a year after forming Etherlive we exhibited at the Showman’s Show for the first time in October 2008. We focused on a message of enabling events with good connectivity, reliable Wi-Fi, VoIP & CCTV.

Much has changed over the last ten years but that message is still just as important today, if not more so. We believed demand was going to grow hugely but I don’t think even we expected to move from the norm of a few ADSL lines to multi-gigabit fibre so rapidly on event sites.

2009 – Innovation in Tower Lights

In conjunction with Ace Plant & TCP we launched the Ecolite P+ at Showman’s 2009. The Ecolite P+ was a more eco-friendly tower light using metal halide discharge lamps coupled with CCTV, Wi-Fi and a PA system. The light was palletised so that more units could be transported on a trailer and had a generator which could run for several days on its internal fuel tank.

The fleet of Ecolites remained operational until 2017 when they were retired, replaced by newer units in which the technology has evolved requiring less customisation. Tower lights with CCTV and Wi-Fi remain a popular option for certain locations on event sites and construction sites.

2010 – Integrated Comms

Back in 2010 access to high capacity services was still relatively limited so often multiple services needed to be aggregated, such as ADSL, satellite and 3G. Intelligent routing, balancing and bonding across these services became a key factor.

Today everyone has at least a backup 4G unit and balanced or bonded 4G is commonplace for smaller pop-up style events. It is not the solution for critical services but it has its place and it won’t be long before we are demonstrating 5G solutions.

2011 – Look to the Sky

The launch of Ka band satellite internet access in 2011 radically changed the use of satellite at events. Previously satellite internet was a relatively costly and difficult proposition which didn’t give a good user experience. The Ka band internet service was designed for broadband services, rather than broadcasters, using smaller dishes and allowing multiple users to share bandwidth to keep costs down. It has proved hugely popular and is now a common sight at events. It still suffers the important limitation of all satellite services – high latency – so can only be used for certain services and is a very different user experience to other types of connectivity.

2011 also saw us demonstrate a beta version of an outdoor femtocell which provided mobile coverage from an internet connection; sadly the units were never brought to production. A high-end DECT based phone system was also on display offering wireless roaming across an event site. This was used heavily in our deployment for the London Media Centre in 2012 and is still in use at events today.

2012 – The Holy Grail of Cashless & RFID

Since 2010 we had been working on RFID solutions with a crew catering system and later an accreditation/management system. In 2012 we worked with Intelligent Venue Solutions to bring a number of RFID related activities under one umbrella at a spin-out stand at Showman’s.

Cost remained a barrier to wide-scale adoption and, with the rapid rise of open-loop contactless payments, over time we have moved into an enabling function for other providers. Today, for example, we deploy thousands of contactless payment terminals and support ticket/wristband scanning for all the main providers.

2013 – Live Event Footfall Analytics

In 2012 we introduced a people counting solution which we refined and extended into Live Event Footfall Analytics (LEFA) for Showman’s 2013. Structured counting, for example in well-contained routes such as exhibitions, is now well established, whereas unstructured crowd density analysis is still an evolving area but shows great potential longer term for helping manage crowd flows and density.

2013 also saw us demonstrate 4G solutions and a Teenage Registration system designed for WOMAD to assist with the identification of under-18s and their associated responsible adults. Our first use of HD-CCTV cameras was also during 2013, with demonstrations at Showman’s. We are now moving some cameras to QHD and testing 360-degree cameras.

2014 – The Intelligent Event

With more and more technology being deployed at events the next logical step is the integration of technology and communication between all aspects providing a rich level of event intelligence. From HD-CCTV, ticket scanning information and device location information, through to geo-tagged social media posts, a picture can be built up of what is happening on site and fed live into event control.

2015 – Meeting the Increasing Demands

Ever increasing requirements, demand for higher speeds and more capacity had driven a rise in the amount of fibre deployed around sites. Bigger sites were typically run with a fully routed design providing improved redundancy, increased speeds and better isolation between areas. This was often coupled with multiple gigabit internet connections which were diversely routed for redundancy.

2015 also saw deployment of the latest generation of 802.11ac wireless access points, providing more speed and capacity to wireless clients.

2016 – Monitoring & Management

Extending the Intelligent Event theme, additional components were demonstrated including Automatic Number Plate Recognition (ANPR) and the Etherlive Monitoring Unit (EMU). The EMU is a type of IoT (Internet of Things) device which can be used to monitor a variety of things. It is particularly useful for monitoring power, giving us early warning of power issues across a large event site.

In partnership with AirBox Systems, we demonstrated an incident management system, developed initially for the emergency services but just as applicable to event sites.

And so to 2017…

A light-hearted surfing theme for 2017

In a way it is ‘more of all of the above’, demand and complexity continues to increase, more and more devices are becoming network enabled and the broader event eco-system drives new requirements. The importance of reliable networking, Wi-Fi, CCTV, VoIP, ticket scanning and payment systems is now recognised as one of the keys to event success.

With a snap general election called and the current political landscape in flux I wouldn’t be prepared to make any predictions on the outcome but it is safe to say that come election day town halls, sports halls and conference centres across the country will be buzzing with activity. A stream of reporters, candidates and supporters will be desperately trying to send and receive information through every media channel available.

The demand for good connectivity at election counts and associated events has mushroomed over the last ten years. We were first involved in a general election in 2010, providing services for the first televised leader debates, events which demonstrated the power and reach of Twitter in politics for the first time. Since then things have moved on with 140-character messages supplemented by live video services in the form of Periscope and Facebook Live.

Come June 8th social media channels will be pumping vast amounts of data around the UK and beyond, the difference this time is that with such short notice it will be all too easy for venues to get caught out and provide frustratingly inadequate connectivity.

For anyone tasked with running an event, be it a count, press and media centre, or political party gathering, it is critical to act now and ensure you have the right communications infrastructure in place. For anything above the smallest of gatherings, relying on data connections from the normal mobile phone operators is a risky strategy, as they are not designed to deal with the surge in demand and load at a media-centric event.

Some venues may well have an element of existing Wi-Fi infrastructure, but does it have the capacity to support an increased number of users? Does it have the internet backhaul to support the high data volumes and video streams which today’s press and media demand? These aspects can be assessed quickly and used to drive requirements, which may include additional Wi-Fi capacity and associated internet access.

Other ‘pop-up’ locations may have no existing service and require a full temporary provision. The trickiest part of this is normally the internet backhaul, which with short notice may have to be provided by satellite (not as expensive as people tend to think) or a wireless link from another location.

It is sometimes possible to provision additional ADSL and FTTC (BT Infinity) internet services at quite short notice depending on the location but these services need to be booked as soon as possible. Fibre optic services take at least three months so new services are not feasible in this scenario, however, existing services may be capable of a ‘burst’, temporarily increasing speed for a short period.

With campaigning well under way, smaller pop-up gatherings often also require additional support, which may be in the form of specialised 3G/4G bonding units providing localised Wi-Fi coverage while on the road in the campaign bus.

For larger events, especially those with a big broadcast media presence, there are additional challenges in the form of interference disrupting Wi-Fi due to people bringing their own Mi-Fi devices or other broadcasting equipment. These events need centralised spectrum management, or preferably a single central provider, to ensure everyone co-exists harmoniously when it comes to connectivity.

Even with the connectivity there are other factors – security being a key one. Understanding how to control and manage a network is essential to avoid any embarrassing information leaks. Additional phones may be required which can be deployed quickly using VoIP telephony rather than traditional copper lines. It may also be important to have some printing capability available.

In the modern social media driven political landscape, ensuring the connectivity & IT works flawlessly is just as important as the tables, chairs, tea, coffee, campaign bus and printed propaganda.

Tucked away in the corner of a bustling Christmas market is one of fifteen nondescript little grey boxes. Like most IT equipment it’s pretty drab on the outside; the clever bit is hiding inside. We call it an EMU – an Etherlive Monitoring Unit – and it is in effect part of the emerging world of the ‘internet of things’, but with an events twist.

The EMU only measures 4cm x 6cm x 3cm but inside it is packed with an array of features. At the Christmas market it is busy monitoring the power being delivered from the generators, tracking the voltage, frequency and, critically, the amount of run-time available from the uninterruptible power supply if the generator should fail for any reason. It monitors this information in real-time and relays it back across the on-site network to the central management system. If a problem is detected the system generates alerts, sending out emails and SMS messages to key contacts as well as displaying information on a management console.

It can even assist with detecting wiring issues, noticing for example when a fault causes a floating neutral situation which results in dangerously high voltages being delivered to power outlets, protecting both equipment from damage and people from potential injury.

With real-time monitoring of power the risk of outages in systems which rely on power, such as the core IT infrastructure delivering phones, CCTV, ticket scanning and payment systems, can be minimised, which is critical on a busy event site.

The EMU, though, can do a lot more than just watch the power; it is a general-purpose monitoring tool capable of taking inputs from a range of sensors – heat, smoke, fluid, sound, light or in fact pretty much anything that can be monitored via a sensor capable of producing an analogue or digital output. When the EMU powers up it connects to the site network, identifies its location and checks into the central management system, which makes it simple to deploy.
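
The power checks described above boil down to simple threshold logic. A minimal sketch, assuming UK mains is nominally 230V/50Hz (the exact thresholds and alert wording are illustrative, not the EMU’s real implementation):

```python
# Illustrative EMU-style power check, nominal UK supply 230V / 50Hz.
# A floating neutral typically shows up as a sustained over-voltage.

VOLT_LOW, VOLT_HIGH = 216.0, 253.0   # roughly the 230V -6%/+10% statutory band
FREQ_LOW, FREQ_HIGH = 49.5, 50.5     # Hz

def check_reading(volts: float, freq_hz: float) -> list:
    """Return a list of alert strings for one sample (empty = healthy)."""
    alerts = []
    if volts > VOLT_HIGH:
        alerts.append(f"over-voltage {volts:.0f}V - possible floating neutral")
    elif volts < VOLT_LOW:
        alerts.append(f"under-voltage {volts:.0f}V")
    if not FREQ_LOW <= freq_hz <= FREQ_HIGH:
        alerts.append(f"frequency {freq_hz:.1f}Hz out of range")
    return alerts

print(check_reading(231.0, 50.0))  # [] - healthy sample
print(check_reading(312.0, 50.1))  # flags a floating-neutral style over-voltage
```

In the real system, any non-empty alert list would be relayed over the site network and fanned out as email and SMS notifications, as described above.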

Increasingly we are deploying EMUs across event sites as part of our ‘Intelligent Event’ strategy, combined with other data sources such as CCTV, mobile device heat mapping, access control, crowd analytics and even social media data. As the number of data sources increases, a detailed real-time picture of the event can be seen in event control, speeding up decision making and enabling pro-active management of aspects such as water reserves, fuel levels and area capacities. The intelligent event system can also help identify cost savings and optimisation opportunities on site.

The ‘internet of things’ can sometimes be seen as marketing hype but go beyond the fluff and with the right components and approach, it can help solve real problems very effectively.

Payment Terminal

Technology is entwined with much of our day-to-day lives, with no better example than the growth of smartphone adoption, a device now seen as a must-have. Payment and banking are almost unrecognisable from ten years ago, with online banking, mobile apps, ‘chip and pin’, contactless payments and online payments.

At events, however, many attendees find that trying to make a simple credit/debit card payment can be a frustrating and unreliable experience. For us as technology providers, ‘credit card machines’, or PDQs as they are known, come top of the list of complaints from event organisers, traders and exhibitors.

These problems not only cause frustration for attendees but also present a serious issue in terms of financial return for traders and exhibitors, and their desire to be present at events. It is well documented that the ability to take contactless and chip & pin payments at events increases takings, reduces risks from large cash volumes and can improve flow and trackability.

So why is it such a problem? Much of the issue comes down to poor communication and misinformation on top of what is already a relatively complex environment. Card payments and the machines which take them are highly regulated by the banking industry, meaning they tend to lag behind other technology; however, this can be overcome, and a properly thought-through approach can deliver large-scale, reliable payment systems.

Bad Terminology

A lot of the confusion around PDQ machines comes from the design and terminology used. Although the machines all look the same, there are differences in the way they work. Nearly all PDQs use the design of a cradle/base station with a separate handheld unit; the handheld part connects to the base station using Bluetooth. This is where the confusion starts, as people often describe these units as ‘wireless’ because of the Bluetooth; however, their actual method of connectivity to the bank may be one of four different types:

  • Telephone Line (PSTN – Public Switched Telephone Network) – This is the oldest and, until a few years ago, the most common type of device; it requires a physical telephone line between the PDQ modem and the bank. It’s slow, difficult and very costly to use at event sites because of the need for a dedicated physical phone line; however, once it is working it is reliable.
  • Mobile PDQ (GPRS/GSM) – Currently the most common form of PDQ, it uses a SIM card to connect to a mobile network, using GSM or GPRS to reach the bank. Originally seen as the go-anywhere device, in the right situation they are excellent; however, they have limitations, the most obvious being that they require a working mobile network to operate. At busy event sites the mobile networks rapidly become saturated and the devices cannot connect reliably. As they use older GPRS/GSM technology they are also very slow – it doesn’t make any difference if you use the device in a 4G area, it can only work over GPRS/GSM. As they use the mobile operator networks they may also incur data charges.
  • Wi-Fi PDQ – Increasingly common, this version connects to a Wi-Fi network to get its connectivity to the bank. On the surface this sounds like a great solution, but there are some challenges: firstly, it needs a good, reliable Wi-Fi network. Secondly, many Wi-Fi PDQs still operate on the 2.4GHz Wi-Fi spectrum, which on event sites is heavily congested and suffers lots of interference, making the devices unreliable. This is not helped by the relatively weak Wi-Fi components in a PDQ compared to, say, a laptop. It is essential to check that any Wi-Fi PDQ is capable of operating in the less congested 5GHz spectrum.
  • Wired IP PDQ – Often maligned because people assume it has no ‘wireless’ handset, but these are actually the same as all the others and do have a wireless handset; it is the base station that uses a physical wire (cat5) to connect to a network. In this case the network is a computer network using TCP/IP and the transactions are routed in encrypted form across the internet. If a suitable network is available on an event site then this type of device is the fastest and most reliable, and there are no call charges.

All of these units look very similar and in fact can be built to operate in any of the four modes; however, because banks ‘certify’ units, they generally only approve one type of connectivity in a particular device. This is slowly starting to change, but the vast majority of PDQs in the market today can only operate on one type of connectivity, and this is not user configurable.

On top of these aspects there is also the difference between ‘chip & pin’ and ‘contactless’. Older PDQs typically can only take ‘chip & pin’ cards whereas newer devices should also be enabled for contactless transactions.
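
The four connectivity types above can be captured in data form. A hypothetical lookup like this (the keys, fields and the site-audit helper are my own illustration, not a real tool) is handy when working out which terminals will actually function on a given site:

```python
# Hypothetical summary of the four PDQ connectivity types described above.
PDQ_TYPES = {
    "pstn":     {"link": "telephone line",
                 "needs_on_site": "dedicated phone line"},
    "mobile":   {"link": "GPRS/GSM",
                 "needs_on_site": "uncongested mobile network"},
    "wifi":     {"link": "Wi-Fi (ideally 5GHz)",
                 "needs_on_site": "reliable Wi-Fi network"},
    "wired_ip": {"link": "cat5 to a TCP/IP network",
                 "needs_on_site": "wired event network"},
}

def site_suitable(pdq_type: str, available: set) -> bool:
    """Crude audit check: does the site provide what this PDQ depends on?"""
    return PDQ_TYPES[pdq_type]["needs_on_site"] in available

# A site with a wired event network but a saturated mobile network:
site = {"wired event network", "reliable Wi-Fi network"}
print(site_suitable("wired_ip", site))  # True
print(site_suitable("mobile", site))    # False
```

The point of modelling it this way is the one the article makes: the handset is always wireless via Bluetooth, so what matters for reliability is the base station’s link to the bank, and that must be matched to what the site can actually provide.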

Myth or Fact

Alongside confusion around the various types of PDQs there is a lot of conflicting and often inaccurate information circulated about different aspects of PDQs. Let’s start with some of the more common ones.

I have a good signal strength so why doesn’t it work?

The reporting of signal strength on devices does nothing but create frustration. Firstly because it is highly inaccurate and crude, and secondly because it means very little – a ‘good’ signal indicator does not mean that the network will work!

The issue is that signal strength does not mean there is capacity on the network. It is frequently the case at event sites that a mobile phone will show full signal strength because a temporary mobile mast has been installed, but there is not enough data capacity to service the devices, so the network does not work. A useful analogy is a very busy motorway: you can get on, but you won’t necessarily go anywhere. The same can be true on a poorly designed Wi-Fi network, or on a well-designed Wi-Fi network which doesn’t have enough internet capacity.

In fact you can have a low signal strength and still get very good data throughput on a well-designed network. Modern systems also use a technique known as ‘beam-forming’, where the access point focuses its signal towards a device once that device is actively transferring data, so a device may show a low signal strength which then increases when it is doing something.

On the flip side, your device may show a good signal strength but the quality of the signal may be poor. This could be due to interference, poor design or sometimes even weather and environmental conditions!

Wi-Fi networks are less secure than mobile networks

There are two parts to this. Firstly, all PDQs encrypt their data no matter what type of connection they use – they have to in order to meet banking standards (PCI-DSS) and protect against fraud. Secondly, a well-designed Wi-Fi network is as secure, if not more secure, than a mobile network. A good Wi-Fi network will use authentication, strong encryption and client isolation to protect devices, and all PDQs should be connected to a separate ‘virtual network’ to isolate them from any other devices.
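As an illustration of what ‘client isolation’ and a separate virtual network can look like in practice, here is a minimal sketch of a hostapd-style access point configuration for a dedicated payment SSID. The interface name, SSID, bridge name and passphrase are hypothetical examples, not a production setup:

```ini
# Minimal hostapd-style configuration sketch (illustrative only).
# Interface, SSID, bridge name and passphrase are hypothetical.
interface=wlan0
bridge=br-pdq              # bridge onto a dedicated network segment for PDQs
ssid=EVENT-PDQ             # separate SSID used only by payment devices
hw_mode=a                  # 5GHz band, away from the congested 2.4GHz spectrum
channel=36
wpa=2                      # WPA2 with a strong pre-shared key
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=ChangeMeToAStrongSecret
ap_isolate=1               # client isolation: PDQs cannot see each other
```

A real deployment would typically also map this segment onto its own VLAN at the switch and firewall it so the PDQs can only reach the payment gateway.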

You have to keep logging into the Wi-Fi network

Wi-Fi networks can be configured in many ways but for payment systems there should be no need to keep having to log in. This problem tends to be seen when people are trying to use a payment system on a ‘Public Wi-Fi network’ which will often have a login hijack/splash page and a time limit.

A multi-network M2M GPRS/GSM SIM is guaranteed to work

Sadly, this is not true. Although a PDQ with a SIM card which can roam between mobile networks and use GPRS or GSM may offer better connectivity, there is no guarantee. Some event sites have little or no coverage from any mobile operator, and even where there is coverage, capacity is generally the limiting factor.

Mobile signal boosters will solve my problem

Mobile signal boosters, or more correctly signal repeaters, are used professionally by mobile operators in some circumstances – for example inside large buildings, to create coverage where the construction (perhaps a lot of glass or metal) reduces signals from outside. In the UK the purchase and use of them by anyone other than a mobile operator is illegal, as they can cause more problems with interference. For temporary event sites they provide little benefit anyway, as it is typically a capacity issue which is the root cause of problems.

A Personal hotspot (Mi-Fi) will solve my problem

Personal hotspots or Mi-Fi devices work by connecting to a mobile network to get connectivity and then broadcasting a local Wi-Fi network for devices to connect to. Unfortunately, at event sites where the mobile networks are already overloaded these devices offer little benefit, and even if they can get connected to a mobile network the Wi-Fi aspect struggles against all the other wireless devices. On top of that these devices cause additional interference for any existing on-site network making the whole situation even worse.

The Next Generation & the Way Forward…

The current disrupters in the payment world are the mobile apps with devices such as PayPal Here and iZettle. Although they avoid the traditional PDQ they still require good connectivity, either from the mobile networks or a Wi-Fi network, and hence the root problem still exists.

Increasingly, exhibitors are also using online systems to extend their offerings at events via tablets and laptops, which also require connectivity. An even better connection is required for these devices as they are often transferring large amounts of data, placing more demands on the network. Even virtual reality is starting to appear on exhibitors’ stands, so there is no doubt that the demand for good connectivity will continue to increase year on year.

What the history of technology teaches us is that demand always runs ahead of capacity. This is especially true when it comes to networks. For mobile operators to deliver the level of capacity required at a large event is costly and complex, and in some cases just not possible due to limits on available wireless spectrum.

4G is a step forward but still comes nowhere close to meeting the need in high demand areas such as events, and that situation will worsen as more people move to 4G and the demand for capacity increases. Already the talk is of 5G but that is many years away.

For events, realistically, the position for the foreseeable future is a mixed one. For a small event with limited requirements, in a location well serviced by mobile networks, 3G/4G can be a viable option, albeit with risks. No mobile network is guaranteed, and because it is a shared medium, performance will always drop as the volume of users increases. There are no hard and fast rules around this as there are many factors, but in simple terms, the more attendees present, the lower the performance!

For any sizeable event the best approach is a dedicated event network serviced with appropriate connectivity providing both Wi-Fi and wired connections. This solution facilitates usage for Wired IP-based PDQs, Wi-Fi PDQs, iZettle and other new payment devices, as well as supporting requirements for tablets, laptops and other mobile devices, each managed by appropriate network controls.

With the right design this approach provides the best flexibility and reliability to service the ever-expanding list of payment options. What is particularly important is that an event network is under the control of the event organiser (generally via a specialist contractor) and not a mobile operator, as this removes a number of external risks. For those without existing compatible PDQs the option of rental of a wired or Wi-Fi PDQ can be offered at the time of booking.

The key in all of this is planning and communication. Payment processing has to be tightly controlled from a security point of view, so it is important that enough time is available to process requests, especially where temporary PDQs are being set up, as they often require around 10 working days.


The problem with modern IT is that on the whole it just works. Its reliability has made us lazy and overly confident so that when it does fail the pain is all the more intense. Twenty years ago a damaged floppy disk might have lost you 1.44 MB of data, now even a humble USB memory stick can have 64 GB of data on it.

The loss of some data is one thing but nearly all businesses are incredibly dependent on their IT systems, laptops, smartphones and internet connectivity. Businesses spend many thousands of pounds deploying systems which become integral to the operation of the business but frequently do not spend any time considering the what-if disaster scenarios or any approaches to mitigating those risks.

With many small and medium-sized businesses now moving to cloud-based solutions there seems to be an even more relaxed attitude, due partly to the belief that cloud systems are 100% reliable. Unfortunately, the cloud is no more than a buzzword behind which sits computer and networking equipment no different from any other IT system, and in the same way it will fail from time to time.

Hardware is very reliable and, with redundant systems, the physical side can be designed very effectively. However, there will be a single point of failure somewhere, and more often than not today that point of failure is human – typically when making a configuration or software change.

Even an outage of a few hours can cost a business large amounts, from lost sales, production delays, shipping delays and a host of other aspects depending on the business type. It’s not just the IT systems directly though – fire, flood, terrorism, loss of building access, cyber-attack, loss of internet access, etc. all can have a potentially devastating impact albeit with varying degrees of probability.

Whether you are a sole trader or a large corporation, a sensible approach to business continuity and disaster recovery is essential. For a small business it may be very straightforward, but nonetheless it is important that the risks are reviewed and appropriate actions taken.

The first step is to identify the risks and run through each scenario noting down the potential impact. Each scenario can then be scored based on probability of occurrence and impact to the business. The next step is then to mitigate these risks as much as possible, looking at aspects such as processes, system design and environmental factors, from which a prioritised list of actions can be generated based on feasibility and cost.
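The scoring step described above can be sketched in a few lines of Python. The scenarios, scales and scores below are invented examples purely for illustration; a real review would use the business’s own risk register:

```python
# Sketch of a simple risk-scoring exercise (illustrative figures only).
# Probability and impact are each scored 1 (low) to 5 (high); their
# product gives a crude priority score for ordering mitigation work.

scenarios = [
    ("Building fire",                1, 5),
    ("Internet outage",              3, 4),
    ("Failed software change",       4, 3),
    ("Basement flood (network kit)", 2, 4),
    ("Laptop theft",                 3, 2),
]

# Score each scenario and sort highest priority first.
scored = sorted(
    ((name, p, i, p * i) for name, p, i in scenarios),
    key=lambda row: row[3],
    reverse=True,
)

for name, probability, impact, score in scored:
    print(f"{score:>2}  {name}  (p={probability}, impact={impact})")
```

The output is a prioritised list: a likely-and-damaging scenario such as a failed software change ranks well above a rare one such as a building fire, which is where the contingency planning (rather than prevention) effort belongs.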

This process is broad, covering physical building and operational aspects through to the much-maligned data backups, but it is important that everything is looked at, as it will always be the smaller details which cause the problems. One common issue, for example, is that in many buildings the internet connectivity comes in via the basement, with sensitive networking equipment located in the area most at risk from flooding!

Not all risks can be eliminated so for those that remain the next step is to look at contingency. For example, a building fire or flood is likely to necessitate a relocation so a disaster recovery plan should be in place which details the steps and actions to be taken in the event such a disaster occurs. This may include pre-identified space in which to move to, stand-by equipment and a recovery plan for bringing services back online.

The biggest risk for most disaster recovery and business continuity plans is that they frequently do not get tested. Only when disaster strikes is it discovered that the system backups have been failing all along! (Yes, I have seen that happen.) Checking and testing plans on a regular basis is a key part of the process, just like a fire drill.
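As a trivial example of the kind of automated check that catches silently failing backups, a scheduled script can verify that the latest backup file exists, is recent and is non-empty. The path and age threshold here are hypothetical, and note that only a test restore proves a backup is actually restorable:

```python
# Sketch of a scheduled backup sanity check (path/threshold are examples).
# It does not prove the backup is restorable - only a test restore does -
# but it catches the common silent failures: missing, stale or empty files.
import os
import time

BACKUP_PATH = "/var/backups/nightly.tar.gz"   # hypothetical location
MAX_AGE_HOURS = 26                            # daily backup, with some slack

def check_backup(path=BACKUP_PATH, max_age_hours=MAX_AGE_HOURS):
    """Return a list of problems found; an empty list means the check passed."""
    problems = []
    if not os.path.exists(path):
        problems.append(f"backup file missing: {path}")
    else:
        age_hours = (time.time() - os.path.getmtime(path)) / 3600
        if age_hours > max_age_hours:
            problems.append(f"backup is stale: {age_hours:.1f}h old")
        if os.path.getsize(path) == 0:
            problems.append("backup file is empty")
    return problems

if __name__ == "__main__":
    issues = check_backup()
    print("OK" if not issues else "; ".join(issues))
```

Run from a scheduler, with the result emailed or alerted on failure, this is the automated equivalent of the fire drill: a small routine check that surfaces problems long before a disaster does.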

Disaster recovery and business continuity planning is not necessarily as big a job as it might be perceived to be, but without it the reality of a disaster is all the more painful. Relying on an ‘it won’t happen to us’ strategy is not good business practice.