3G-4G-5G

Pretty much every day I’m either asked or told about the way 5G is going to ‘change everything’. I’m currently sitting on a train on my way into Paddington; my 4G connection has dropped multiple times, and when it is connected the speed varies from a trickle to occasional bursts of the speeds I would expect from 4G. We all crave ubiquitous connectivity at a good speed, but the reality is somewhat behind the hype.

I’m reading yet another article saying that 5G will deliver speeds in excess of 20Gbps and that I’ll be able to download an HD movie in a few seconds. Ridiculous claims like these accompany most technology launches – the claims for 4G were 1Gbps, but the experience, even in the best case, is at least an order of magnitude lower.

We have come a long way from GPRS, when any data access was painfully slow, but each technology step forward becomes more complex, more costly and delivers less of a step change to the user. In reality, when 5G is deployed into the world it will meet a mixture of technical and commercial challenges which will reduce the hype to more of a ‘4G evolution’ – it will be better, and that’s great, but not the revolution promised, at least not initially.

One of the problems is that 5G needs to operate across multiple frequencies. This is not a new technical challenge – the same approach is used for 3G and 4G networks – however, some of the frequencies required to make 5G work are much higher up the spectrum, in the range known as millimetre waves (mmWaves). These very high frequencies are where the headline speeds of 20Gbps come from, but there is a problem: they require line of sight to the mobile device and are heavily impacted by rain, trees, buildings, etc. Millimetre waves also have a short range, so the deployment of the infrastructure is complex and requires many more antennas (base stations or cell towers) than the lower frequencies of today. It’s also worth noting that new handsets will be required to use these new frequencies, so there will be a long period during which many devices will not have the required circuitry inside to make use of the higher speeds.

If you consider that today a good example of 4G may deliver 10-50Mbps depending on various factors, it is expected that 5G may reach 100-200Mbps in similar circumstances – that’s a good increase, but it isn’t 20Gbps, and it certainly will not be widely available in the short to medium term. As for remote locations, they will be stuck on the lower frequencies, which will restrict what can be offered.

As with Wi-Fi, the connection speed is only half the problem; the backhaul internet connection is just as important, and supporting many users all expecting 5G speeds of 100Mbps requires a massive upgrade to the supporting fibre and wireless backhaul networks. Even ignoring the significant cost implications, the practicalities of deploying the infrastructure are going to take time, and it is expected to be at least 2022 before we see any serious progress on 5G.

So what does this mean for events?

The first thing to note is that on mobile devices over 60% of data traffic is carried over Wi-Fi networks rather than the mobile networks, and that figure has grown, not shrunk, over the last few years. With mobile operators still recovering the cost of a 4G infrastructure, and now faced with an even more expensive 5G infrastructure, data plans are not going to get any cheaper, so the cost-conscious consumer will still hunt out Wi-Fi wherever possible.

Although a 5G speed of 200Mbps sounds great, a good Wi-Fi network today can deliver speeds well over 200Mbps, and already approved Wi-Fi standards go much further (the existing 802.11ac standard goes up to 3.45Gbps), so for the really serious data users Wi-Fi will remain the primary choice. It’s not a case of 5G versus Wi-Fi – both will co-exist, as they are designed for different purposes – but in the event space Wi-Fi will continue to be a prerequisite for many years yet.

As with 4G, the initial deployments of 5G will be focused on cities with a high density of users. Less dense areas will be some way off, and of course the countryside fields used for many outdoor events will be no better served than they are today, as commercially it makes no sense unless a particular sponsor wants to put in temporary masts.

Given the short range of the high-frequency 5G spectrum, indoor venues such as conference centres could well be a target for early deployments, given their business-orientated footfall who are more likely to have handsets capable of using the 5G network. Another good example would be large stadiums but, in a similar way to Wi-Fi, in these locations the cost of infrastructure to support the density of users is significant, so it will come down to who will pay.

Venue owners who want good mobile coverage in their venue but expect the mobile operators to pay have to consider that operators can struggle to recoup the costs of a dedicated install, as it generates no additional revenue for them. The use of shared infrastructure between mobile operators may help, but it doesn’t change the fact that someone has to pay.

Once 3G reached wide deployment it became a viable option for providing the backhaul internet connection in some circumstances, with Wi-Fi as the access medium. 4G continued that trend and 5G will go further, but limitations still apply. 5G is a shared medium, so overall performance will depend on other users unless dedicated bandwidth is offered (something that is hard to do on 4G); in an event environment, as the number of attendees goes up the performance goes down.

With 4G, operators have been reluctant to offer dedicated bandwidth over their infrastructure, something that would be fantastic for events. Perhaps this will change with 5G but even then, with the price of dedicated fibre circuits continuing to fall, shared services such as FTTP (Fibre To The Premises) now becoming available and Wireless Internet Service Provider (WISP) offerings improving, the pricing from the mobile operators would have to be very competitive.

Without any guaranteed bandwidth, 5G is no different from 4G: fine for general use, but with no guarantees where performance is critical. You would not want to risk your ticketing, payment systems, video streams and core control systems on a mobile network over which you have no control, but for less critical services it may be an option.

Over time 5G will provide another option in the event toolbag as a step forward from 4G, but its real benefit will be to the truly mobile individual user. Then, hopefully, on a train journey into London I can get a consistent 20Mbps all the way – though I fear that ideal is still quite a few years away.

‘Robust Internet for Events’ was our topic of choice when I sat with Craig Mathie, MD of the Bournemouth Rugby Sevens event, a few weeks ago for an open forum at Event Tech Live in the Truman Brewery. Around 30 people joined us from a range of events to cover what many people think is the ‘boring’ side of internet at events, primarily because it should already be sorted, just like power and water, but in many cases it is not.

One thing that I love about events, but is also the most challenging, is that most of the time we are starting, in some respect, from scratch. The event team might know how to plan an event but it’s probably a new venue, or a new team in the venue, or a new green field site, or a new sponsor. Something is always changing and because of that, even on the ‘basics’ side it is critical to consider key elements.

From the discussion I have summarised some of the key points below:

What’s the best type of internet for my event?

Rather like asking what’s the best type of car for my family, it depends on what you are trying to do and by when. I went through some examples of venues which are well known for their events and so have invested in high-speed connectivity, such as a leased line, which over a three-year term works out cheaper for the event while delivering the best speeds. For a new site, or one with short notice, we tend to bring in internet wirelessly from another location, or use copper services (which are mostly aimed at the consumer market but can work in a pinch) or satellite.

Does whole venue or site wide Wi-Fi work?

Yes. If you want site-wide Wi-Fi it can be delivered, no problem. It is technically difficult but hey – that’s why you work with Etherlive! Within a venue it’s generally simpler to install; outdoors can be a bit more complicated but still very achievable if planned appropriately.

Is contactless payment reliable?

Yes. There is no reason for any technical issues with deploying contactless systems. Reports of issues are generally related to the devices themselves rather than the connectivity.

Isn’t 5G going to remove the need for Wi-Fi on site?

No. 5G, like 4G and others before it, is a great technology, just with a better and bigger marketing budget. Cellular technology (of which 5G is the latest standard) is designed to provide high-speed connections for large populations, like towns, so it’s not designed to handle a very high peak of people in a specific area. It also isn’t really ‘supported’ in the way events need (have you ever tried to phone a mobile phone operator, complain about service in an area, and get an engineer to attend?). One advantage when 5G is introduced will be devices having another network option to connect to – this will free up some of the 4G (and 3G) service.

A Career in Events

You can get a great buzz out of attending a good event, and if you organise it the buzz is even better. Every successful event requires a team of dedicated professionals with a range of skills but often the event industry isn’t considered as a career choice.

At the recent ESSA (Event Supplier and Services Association) AGM a fantastic video was shown about working in the events industry. It really speaks to how passionate those of us in the industry are about making event magic happen, mostly behind the scenes working with festivals, exhibitions, corporate launches, sporting events and one-off special events like a Royal Wedding!

We welcomed our 4th full year intern last year, and this year we will be taking on two more to support our growing engineering teams in the field. For the first time this year we also have an apprentice, a great way to get exposure in the events world, so if you know anyone looking to take that first step don’t forget about the events industry.

Of course, those with more experience are just as welcome; our company, like many others, is built on years of experience gained from all quarters, including large and small companies.

We do all have one thing in common, we are one team and we love events!

Tucked away in the corner of a bustling Christmas market is one of fifteen nondescript little grey boxes. Like most IT equipment it’s pretty drab on the outside; the clever bit is hiding inside. We call it an EMU – an Etherlive Monitoring Unit – and it is in effect part of the emerging world of the ‘internet of things’, but with an events twist.

The EMU only measures 4cm x 6cm x 3cm but inside it is packed with an array of features. At the Christmas market it is busy monitoring the power being delivered from the generators, tracking the voltage, frequency and, critically, the amount of run-time available from the uninterruptible power supply if the generator should fail for any reason. It monitors this information in real-time and relays it back across the on-site network to the central management system. If a problem is detected the system generates alerts, sending out emails and SMS messages to key contacts as well as displaying information on a management console.

It can even assist with detecting wiring issues, noticing for example when a fault causes a floating neutral situation which results in dangerously high voltages being delivered to power outlets, protecting both equipment from damage and people from potential injury.

With real-time monitoring of power the risk of outages in systems which rely on power, such as the core IT infrastructure delivering phones, CCTV, ticket scanning and payment systems, can be minimised, which is critical on a busy event site.

The EMU, though, can do a lot more than just watch the power. It is a general-purpose monitoring tool capable of taking inputs from a range of sensors – heat, smoke, fluid, sound, light or in fact pretty much anything that can be monitored via a sensor capable of producing an analogue or digital output. The fact that when the EMU powers up it connects to the site network, identifies its location and checks into the central management system makes it simple to deploy.
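
To make the behaviour concrete, here is a minimal sketch of that kind of threshold-based monitoring loop: read each sensor, compare it against an acceptable band and raise an alert when a reading falls outside it. The sensor names, thresholds and alert transport below are illustrative assumptions, not the EMU’s actual firmware or interfaces.

```python
import time

ALERT_CONTACTS = ["duty-manager@example.com"]  # hypothetical contact list

# Acceptable bands per reading: (low, high); None means no limit on that side.
LIMITS = {
    "voltage_v":       (216.0, 253.0),  # roughly the UK single-phase tolerance band
    "frequency_hz":    (49.5, 50.5),
    "ups_runtime_min": (10.0, None),    # warn if UPS runtime drops below 10 minutes
}

def read_sensor(name: str) -> float:
    """Stand-in for reading an attached analogue/digital sensor."""
    simulated = {"voltage_v": 241.0, "frequency_hz": 50.1, "ups_runtime_min": 42.0}
    return simulated[name]

def send_alert(contact: str, message: str) -> None:
    """Stand-in for the email/SMS relay back to the central management system."""
    print(f"ALERT to {contact}: {message}")

def check_once() -> None:
    for name, (low, high) in LIMITS.items():
        value = read_sensor(name)
        if (low is not None and value < low) or (high is not None and value > high):
            for contact in ALERT_CONTACTS:
                send_alert(contact, f"{name} out of range: {value}")

if __name__ == "__main__":
    for _ in range(3):  # a real unit would poll continuously and report back over the site network
        check_once()
        time.sleep(5)
```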

Increasingly we are deploying EMUs across event sites as part of our ‘Intelligent Event’ strategy, combined with other data sources such as CCTV, mobile device heat mapping, access control, crowd analytics and even social media data. As the number of data sources increases, a detailed real-time picture of the event can be seen in event control, speeding up decision making and enabling pro-active management of aspects such as water reserves, fuel levels and area capacities. The intelligent event system can also help identify cost savings and optimisation opportunities on site.

The ‘internet of things’ can sometimes be seen as marketing hype but go beyond the fluff and with the right components and approach, it can help solve real problems very effectively.

Payment Terminal

Technology is entwined with much of our day-to-day lives, with no better example than the growth of the smartphone, a device now seen as a must-have. Payment and banking are almost unrecognisable from ten years ago, with online banking, mobile apps, ‘chip and pin’, contactless payments and online payments.

At events, however, many attendees find that trying to make a simple credit/debit card payment can be a frustrating and unreliable experience. For us as technology providers, ‘credit card machines’, or PDQs as they are known, come top of the list of complaints from event organisers, traders and exhibitors.

These problems not only cause frustration for attendees but also present a serious issue in terms of financial return for traders and exhibitors, and their desire to be present at events. It is well documented that the ability to take contactless and chip & pin payments at events increases takings, reduces risks from large cash volumes and can improve flow and trackability.

So why is it such a problem? Much of the issue comes down to poor communication and misinformation on top of what is already a relatively complex environment. Card payments, and the machines which take them, are highly regulated by the banking industry, meaning they tend to lag behind other technology; however, this can be overcome, and a properly thought-through approach can deliver large-scale, reliable payment systems.

Bad Terminology

A lot of the confusion around PDQ machines comes from the design and terminology used. Although the machines all look the same, there are differences in the way they work. Nearly all PDQs use the design of a cradle/base station with a separate handheld unit, with the handheld part connecting to the base station using Bluetooth. This is where the confusion starts, as people often describe these units as ‘wireless’ because of the Bluetooth; however, their actual method of connectivity to the bank may be one of four different types:

  • Telephone Line (PSTN – Public Switched Telephone Network) – This is the oldest and, until a few years ago, the most common type of device; it requires a physical telephone line between the PDQ modem and the bank. It’s slow, difficult and very costly to use at event sites because of the need for a dedicated physical phone line; however, once it is working it is reliable.
  • Mobile PDQ (GPRS/GSM) – Currently the most common form of PDQ, it uses a SIM card and the GPRS/GSM mobile network to connect back to the bank. Originally seen as the go-anywhere device, in the right situation they are excellent; however, they have limitations, the most obvious being that they require a working mobile network to operate. At busy event sites the mobile networks rapidly become saturated, meaning the devices cannot connect reliably. As they use older GPRS/GSM technology they are also very slow – it doesn’t make any difference if you try and use the device in a 4G area, it can only work using GPRS/GSM. As they use the mobile operator networks they may also incur data charges.
  • Wi-Fi PDQ – Increasingly common, this version connects to a Wi-Fi network to get its connectivity to the bank. On the surface this sounds like a great solution, but there are some challenges. Firstly, it needs a good, reliable Wi-Fi network. Secondly, many Wi-Fi PDQs still operate in the 2.4GHz Wi-Fi spectrum, which on event sites is heavily congested and suffers lots of interference, making the devices unreliable. This is not helped by the relatively weak Wi-Fi components in a PDQ compared to, for example, a laptop. It is essential to check that any Wi-Fi PDQ is capable of operating in the less congested 5GHz spectrum.
  • Wired IP PDQ – Often maligned because people think they don’t have a ‘wireless’ handset, but they are actually the same as all the others: the handset is still wireless, it is only the base station that uses a physical wire (Cat5) to connect to a network. In this case the network is a computer network using TCP/IP, and the transactions are routed in encrypted form across the internet. If a suitable network is available on an event site then this type of device is the most reliable and fastest, and there are no call charges.

All of these units look very similar and in fact can be built to operate in any of the four modes, however, because banks ‘certify’ units they generally only approve one type of connectivity in a particular device. This is slowly starting to change but the vast majority of PDQs in the market today can only operate on one type of connectivity and this is not user configurable.

On top of these aspects there is also the difference between ‘chip & pin’ and ‘contactless’. Older PDQs typically can only take ‘chip & pin’ cards whereas newer devices should also be enabled for contactless transactions.

Myth or Fact

Alongside confusion around the various types of PDQs there is a lot of conflicting and often inaccurate information circulated about different aspects of PDQs. Let’s start with some of the more common ones.

I have a good signal strength so why doesn’t it work?

The reporting of signal strength on devices does nothing but create frustration. Firstly because it is highly inaccurate and crude, and secondly because it means very little – a ‘good’ signal indicator does not mean that the network will work!

The issue is that signal strength does not mean there is capacity on the network. It is frequently the case at event sites that a mobile phone will show full signal strength because a temporary mobile mast has been installed, but there is not enough data capacity to service the devices, so the network does not work. A useful analogy is comparing networks to a very busy motorway: you can get on, but you won’t necessarily go anywhere. The same can be true on a poorly designed Wi-Fi network, or a well-designed Wi-Fi network which doesn’t have enough internet capacity.

In fact you can have a low signal strength and still get very good data throughput on a well-designed network. Modern systems also use a technique known as ‘beam-forming’, where a device is not prioritised until it is actually transmitting data, which means it may show a low signal strength that increases when it is doing something.

On the flip side, your device may show a good signal strength but the quality of the signal may be poor; this could be due to interference, poor design or sometimes even weather and environmental conditions!

Wi-Fi networks are less secure than mobile networks

There are two parts to this. Firstly, all PDQs encrypt their data no matter what type of connection they use – they have to, so that they meet banking standards (PCI-DSS) and protect against fraud. Secondly, a well-designed Wi-Fi network is as secure, if not more secure, than a mobile network. A good Wi-Fi network will use authentication, strong encryption and client isolation to protect devices, and all PDQs should be connected to a separate ‘virtual network’ to isolate them from any other devices.

You have to keep logging into the Wi-Fi network

Wi-Fi networks can be configured in many ways but for payment systems there should be no need to keep having to log in. This problem tends to be seen when people are trying to use a payment system on a ‘Public Wi-Fi network’ which will often have a login hijack/splash page and a time limit.

A multi-network M2M GPRS/GSM SIM is guaranteed to work

Sadly this is not true. Although a PDQ with a SIM card which can roam between mobile networks and use GPRS or GSM may offer better connectivity, there is no guarantee. Some event sites have little or no coverage from any mobile operator, and even where there is coverage, capacity is generally the limiting factor.

Mobile signal boosters will solve my problem

Mobile signal boosters, or more correctly signal repeaters, are used professionally by mobile operators in some circumstances, for example inside large buildings, to create coverage where signal strength is very weak due to their construction (perhaps there is a lot of glass or metal which can reduce signals from outside). In the UK the purchase and use of them by anyone other than a mobile operator is illegal (they can cause more problems with interference). For temporary event sites they provide little benefit anyway, as it is typically a capacity issue which is the root cause of problems.

A Personal hotspot (Mi-Fi) will solve my problem

Personal hotspots, or Mi-Fi devices, work by connecting to a mobile network for connectivity and then broadcasting a local Wi-Fi network for devices to connect to. Unfortunately, at event sites where the mobile networks are already overloaded these devices offer little benefit, and even if they can get connected to a mobile network the Wi-Fi side struggles against all the other wireless devices. On top of that, these devices cause additional interference for any existing on-site network, making the whole situation even worse.

The Next Generation & the Way Forward…

The current disrupters in the payment world are the mobile apps with devices such as PayPal Here and iZettle. Although they avoid the traditional PDQ they still require good connectivity, either from the mobile networks or a Wi-Fi network, and hence the root problem still exists.

Increasingly, exhibitors are also using online systems to extend their offerings at events via tablets and laptops, which also require connectivity. An even better connection is required for these devices as they are often transferring large amounts of data, placing more demands on the network. Even virtual reality is starting to appear on exhibitors’ stands, so there is no doubt that the demand for good connectivity will continue to increase year on year.

What the history of technology teaches us is that demand always runs ahead of capacity. This is especially true when it comes to networks. For mobile operators to deliver the level of capacity required at a large event is costly and complex, and in some cases just not possible due to limits on available wireless spectrum.

4G is a step forward but still comes nowhere close to meeting the need in high demand areas such as events, and that situation will worsen as more people move to 4G and the demand for capacity increases. Already the talk is of 5G but that is many years away.

For events, realistically, the position for the foreseeable future is a mixed one. For small events with limited requirements in a location well serviced by mobile networks, 3G/4G can be a viable option, albeit with risks. No mobile network is guaranteed, and because it is a shared medium performance will always drop as the volume of users increases. There are no hard and fast rules around this as there are many factors, but in simple terms the more attendees present, the lower the performance!

For any sizeable event the best approach is a dedicated event network serviced with appropriate connectivity providing both Wi-Fi and wired connections. This solution facilitates usage for Wired IP-based PDQs, Wi-Fi PDQs, iZettle and other new payment devices, as well as supporting requirements for tablets, laptops and other mobile devices, each managed by appropriate network controls.

With the right design this approach provides the best flexibility and reliability to service the ever-expanding list of payment options. What is particularly important is that an event network is under the control of the event organiser (generally via a specialist contractor) and not a mobile operator, as this removes a number of external risks. For those without existing compatible PDQs the option of rental of a wired or Wi-Fi PDQ can be offered at the time of booking.

The key in all of this is planning and communication. Payment processing has to be tightly controlled from a security point of view, so it is important that enough time is available to process requests, especially where temporary PDQs are being set up, as they often require around 10 working days.


The problem with modern IT is that on the whole it just works. Its reliability has made us lazy and overly confident so that when it does fail the pain is all the more intense. Twenty years ago a damaged floppy disk might have lost you 1.44 MB of data, now even a humble USB memory stick can have 64 GB of data on it.

The loss of some data is one thing but nearly all businesses are incredibly dependent on their IT systems, laptops, smartphones and internet connectivity. Businesses spend many thousands of pounds deploying systems which become integral to the operation of the business but frequently do not spend any time considering the what-if disaster scenarios or any approaches to mitigating those risks.

With many small and medium sized businesses now moving to cloud based solutions there seems to be an even more relaxed attitude due partly to the belief that cloud systems are 100% reliable. Unfortunately the cloud is no more than a buzz word behind which sits computer and networking equipment no different to any other IT system, and in the same way it will fail from time to time.

Hardware is very reliable, and with redundant systems the physical side can be designed very effectively; however, there will be a single point of failure somewhere, and more often than not today that point of failure is human – typically when making a configuration or software change.

Even an outage of a few hours can cost a business large amounts, from lost sales, production delays, shipping delays and a host of other aspects depending on the business type. It’s not just the IT systems directly though – fire, flood, terrorism, loss of building access, cyber-attack, loss of internet access, etc. all can have a potentially devastating impact albeit with varying degrees of probability.

Whether you are a sole trader or a large corporation, a sensible approach to business continuity and disaster recovery is essential. For a small business it may be very straightforward, but nonetheless it is important that the risks are reviewed and appropriate actions taken.

The first step is to identify the risks and run through each scenario noting down the potential impact. Each scenario can then be scored based on probability of occurrence and impact to the business. The next step is then to mitigate these risks as much as possible, looking at aspects such as processes, system design and environmental factors, from which a prioritised list of actions can be generated based on feasibility and cost.
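
As a simple illustration of that scoring step, the sketch below rates each scenario for probability and impact on a 1–5 scale and multiplies the two to give a rough priority order for mitigation. The scenarios and numbers are purely illustrative, not a recommended register.

```python
# Score each what-if scenario: risk = probability x impact (both rated 1-5 here).
scenarios = [
    ("Failed configuration change", 4, 4),
    ("Loss of internet access",     4, 3),
    ("Cyber-attack",                3, 5),
    ("Building flood",              2, 5),
]

scored = sorted(
    ((name, prob * impact, prob, impact) for name, prob, impact in scenarios),
    key=lambda row: row[1],
    reverse=True,
)
for name, score, prob, impact in scored:
    print(f"{name:<30} probability={prob} impact={impact} risk score={score}")
```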

This process is broad, covering physical building and operational aspects through to much-maligned data backups, but it is important that everything is looked at, as it will always be the smaller details which cause the problems. One common issue, for example, is that in many buildings the internet connectivity comes in via the basement, leaving sensitive networking equipment located in the area most at risk from flooding!

Not all risks can be eliminated, so for those that remain the next step is to look at contingency. For example, a building fire or flood is likely to necessitate a relocation, so a disaster recovery plan should be in place which details the steps and actions to be taken in the event such a disaster occurs. This may include pre-identified space to move to, stand-by equipment and a recovery plan for bringing services back online.

The biggest risk for most disaster recovery and business continuity plans is that they frequently do not get tested. Only when a disaster strikes does it get discovered that the system backups have been failing all along! (Yes, I have seen that happen) Checking and testing plans on a regular basis is a key part of the process, just like a fire drill.

Disaster recovery and business continuity planning is not necessarily as big a job as it might be perceived to be, but without it the reality of a disaster is all the more painful. Relying on an ‘it won’t happen to us’ strategy is not good business practice.

Virtual reality has been around for many, many years. From early home console attempts which could be bolted onto a Super Nintendo (anyone remember the VictorMaxx?), to those massive virtual reality units which appeared for a while in arcades. Until recently, however, the desire was a long way ahead of the technology. Possibly we are now seeing a point where technology has caught up enough to make virtual reality the next big thing.

With the UK release of the Oculus Rift, a project which has been exciting the technology community for some time, many argue that home virtual reality has finally arrived. Cheaper projects like Google Cardboard demonstrate that even those who do not want to splash out mega bucks can get themselves a piece of the action. Alongside full virtual reality there has also been an explosion in competitively priced 360-degree cameras, bringing much broader accessibility to a more immersive video experience.

This, of course, is only ‘virtual’ step one. To many, true virtual reality is a completely immersive experience in which you can walk, move and interact – liken it to the famous Holodeck of the Star Trek series. What we do have now, though, is a major step up from the past and a critical mass to iron out those remaining issues.

All this is very nice, but what does that matter for events?

Today’s technology could, and undoubtedly will over time, have an impact on what we currently understand as an event experience.

What if you can sell a virtual ticket for your event to those with a disability which would normally impede their ability to attend? The experience might not quite be the same for a music event, nor can it ever really replace the handshake of a networking meeting, but facilitating access to events beyond the physical capacity is an exciting prospect for both the event organiser and attendee.

Or how would sponsors feel if they could replicate the excitement of a big product launch or latest artist after the event as part of their premium content? For training events there are many possibilities, enabling interactive and immersive sessions particularly suited to things like incident management and response, and also making virtual/remote training a more inclusive experience.

Needless to say, at this stage, the possibilities are exciting, things could go, virtually, anywhere (sorry!).

It may sound obvious, but defining a clear specification for technical services means you’ll get what your event needs, when you want it, at the right price.

Defining a clear specification in turn sets clear expectations. Without it, assumptions will be made which can cause difficulties once work starts. From the organiser’s perspective, without documented specifications it can be challenging to tie a supplier down if they fail to deliver. It also ensures that pricing can be compared ‘apples to apples’; otherwise each supplier will make different assumptions, leading to very different responses.

Customers often fear talking to suppliers prior to asking for formal quotes, yet using a supplier’s knowledge and experience can be a useful way to clarify what the actual requirements are.

The breadth of a scope or requirements document will vary based on the size of the project, but there are a few key areas which come up on nearly all projects:

  • Total Internet Capacity – Calculate how many people will need to use the internet concurrently. This may be lower than the total number of users who will connect to the Wi-Fi at some point during the event. For example, if users are downloading an app, will they most likely do it when they first arrive? Are all the users going to be streaming? Or is most of the usage email, social media and browsing, which typically has lower concurrency? (A rough worked example follows this list.)
  • Wireless (Wi-Fi) coverage – Site plans make this much easier, especially when coupled with an overlay showing which areas need coverage. Events taking place within venues can usually supply dimensions of the space.
  • Engineer support – If the technology systems are a critical part of the event an engineer on site will mean help is on hand if anything goes wrong, or if plans change. Detailing the hours required removes any assumptions as to when support is expected on-site.
  • Build schedules – Everyone in the events industry knows that things change, but a concise, documented plan of when things are happening can really help for quotes.
  • Integration – It is often the case that Wi-Fi and internet access will be used for multiple services from different suppliers and it really helps for this to be specified at quote stage to enable the technical supplier to understand what integration may be required – for example mobile apps, social media experiences, payment systems, streaming, etc.
  • Other services – There are also other areas which may help the event run smoothly, such as desk phones (really helpful when you need a reliable, secure phone with a number that can be published prior to the event), printers, or a CCTV system (noting the areas which need to be covered).
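
As promised in the first bullet, here is a rough back-of-envelope sketch of how concurrent usage translates into a total capacity figure. The attendee count, concurrency shares and per-user bandwidth numbers are illustrative assumptions only, not recommendations.

```python
attendees = 5000

# For each activity: (share of attendees doing it at the same moment, average Mbps each)
activity_profile = {
    "email / social media / browsing": (0.20, 0.5),
    "app download on arrival":         (0.05, 2.0),
    "video streaming":                 (0.02, 4.0),
}

total_mbps = sum(attendees * share * mbps for share, mbps in activity_profile.values())
print(f"Estimated peak internet capacity required: {total_mbps:.0f} Mbps")
# With these example figures the answer is roughly 1,400 Mbps at peak.
```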

We help lots of customers design their specification before taking it to market, to make sure they are sourcing the right services, whether as part of a brief phone call or a formal tender development engagement. A clear list of requirements helps the customer enforce what they asked for, and the supplier understand what they are providing. A little time invested in getting the specification correct reduces the risk of confusion and issues later on.