Every year around early January, the tech press gets obsessed with looking into their crystal balls and trying to predict what the big things of the year ahead will be; and 2019 is naturally no exception.
On the one hand, the technology sector always needs to be forward-looking and ready for what's around the corner, as this is not an industry where it's easy to play catch-up. But on the other hand, for everyone who got into Apple on the ground floor, there are plenty more who end up with the 21st century equivalent of a storage unit full of Betamaxes.
Clearly, predicting which technologies will truly shape the future and which will be just another passing fad is easier said than done. With that in mind, we've put together a list of some of the key trends you'll undoubtedly be hearing about this year, why they'll be the next big thing - and why they might not.
1. Blockchain-based privacy
What is it? Blockchain is built on the idea of decentralization. Rather than a single authoritative copy of a database holding the 'truth', the ledger is replicated across multiple locations, making it far harder to tamper with. Because every transaction related to the data is tracked, recorded and verified on a publicly visible ledger, the result is more secure.
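The tamper-evidence at the heart of this can be sketched in a few lines: each block commits to the hash of the block before it, so altering any earlier record breaks every later link. This is a minimal, illustrative Python sketch, not how any production blockchain works (real systems add consensus, signatures and peer-to-peer replication):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_chain(records):
    """Build a chain where each block commits to the one before it."""
    chain = []
    prev = "0" * 64  # genesis placeholder
    for record in records:
        block = {"data": record, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain):
    """Re-derive every hash; any tampering breaks the links."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = make_chain(["alice->bob: 5", "bob->carol: 2"])
assert verify(chain)
chain[0]["data"] = "alice->bob: 500"  # tamper with an early record
assert not verify(chain)              # the chain no longer validates
```

Because no single copy is authoritative, every participant can run this verification independently, which is where the privacy and integrity claims come from.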
Why it's the future: In today's security-conscious environment, understanding of the importance of protecting personal data is higher than ever. Blockchain offers a solution to this that ensures individuals retain ultimate control of their data, rather than entrusting it to third parties; something that may be the only way to guarantee privacy in the coming years.
Why it's a fad: Despite the fact it has a range of applications, blockchain still conjures up associations with the volatility of cryptocurrencies like Bitcoin, where it first made its name, and this could undermine its trustworthiness in the eyes of many users. And with suggestions that its security may not be all it's cracked up to be, it would take a brave firm to take the plunge at this stage.
2. Data science
What is it? The number one trend for many IT pros, data science is a catchall term for the management, analysis and interpretation of the vast amounts of information that businesses now collect. With so much to handle, being able to quickly and cost-effectively dive into this and get relevant answers will be essential in ensuring the right decisions are made in all parts of a business.
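At its simplest, the day-to-day work is turning raw records into answers a decision-maker can act on. A toy group-and-summarize sketch, using made-up sales records (the field names and figures are illustrative, not from any real dataset):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical transaction records, for illustration only
sales = [
    {"region": "north", "amount": 120.0},
    {"region": "south", "amount": 80.0},
    {"region": "north", "amount": 200.0},
    {"region": "south", "amount": 140.0},
]

# Group amounts by region, then summarize each group
by_region = defaultdict(list)
for row in sales:
    by_region[row["region"]].append(row["amount"])

summary = {region: {"total": sum(v), "average": mean(v)}
           for region, v in by_region.items()}
# summary now answers "how is each region doing?" at a glance
```

The real skill, of course, is doing this at a scale and speed where four rows become four billion.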
Why it's the future: The amount of data generated today is huge, and it's not going to get any smaller. In fact, as trends such as the Internet of Things become more commonplace, the amount of information created is set to shoot up even further, and this will quickly make traditional analytics methods unworkable at such large scales. Therefore, companies that don't have the right skills in place to handle this will find themselves drowning in a sea of unusable data.
Why it's a fad: As the amount of data increases, this will inevitably mean more automation and lead to AI being more of a requirement. Therefore, in years to come, the chances are that being able to program and instruct these solutions could well become much more of a priority for firms than looking for developers with the skills to work on data directly. These people will still be required, but data science talents alone won't be enough to get them noticed.
3. Deep learning
What is it? An advanced form of artificial intelligence, deep learning involves algorithms that learn from large volumes of data with minimal human input, building up a repository of knowledge and then making decisions about the best course of action to take. This could be anything from identifying and classifying an image in an archive to controlling robotics.
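Real deep learning stacks many layers of simple units, but the basic ingredient - a unit that adjusts its own weights from examples rather than being explicitly programmed - can be shown with a single perceptron learning the logical AND function. This is a deliberately tiny sketch, not a deep network:

```python
# Training examples: inputs and the AND of those inputs
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]  # weights, adjusted from the data rather than hand-coded
b = 0.0         # bias
lr = 0.1        # learning rate

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Repeatedly nudge the weights toward whatever reduces the error
for _ in range(20):
    for x, target in data:
        error = target - predict(x)
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

assert all(predict(x) == t for x, t in data)  # it has learned AND
```

Deep networks replace this single unit with millions, trained the same way: by correcting errors against examples instead of following hand-written rules.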
Why it's the future: Deep learning helps make business decisions clear without running the risk of any human bias or errors clouding judgement. And as time passes and the algorithms gain experience, the results they offer should only get faster and more accurate, allowing businesses to have confidence in their decision-making and powering many other key aspects of AI such as natural language processing and image recognition.
Why it's a fad: While the accuracy levels of deep learning can be impressive, many firms may be pushing towards the technology without a full understanding of what it is, and is not, useful for. Like many hyped projects, this is likely to lead to disillusionment as companies come to realize it's not the panacea they'd hoped for. Ultimately, deep learning needs to work in conjunction with other AI tools, and may only be a stepping stone on the path to more complete, well-rounded artificial intelligence.
4. Edge computing
What is it? Traditionally, most processing in any network is done at the data center, and this means sending data from wherever it is captured or created back to a central hub. Edge computing, also known as 'intelligent networks', challenges this idea and instead conducts the necessary functions as close as possible to the original source of the data, right at the very edge of a company's infrastructure.
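The idea can be illustrated with a sketch of an edge node that acts on readings locally and forwards only a compact summary, rather than streaming every raw reading back to the data center. The threshold and field names here are assumptions for illustration:

```python
# Hypothetical sensor readings arriving at an edge node
readings = [21.0, 21.2, 35.0, 21.1, 20.9]
THRESHOLD = 30.0  # assumed alert level

# Edge-side processing: react to anomalies immediately, locally
alerts = [r for r in readings if r > THRESHOLD]

# Only a compact summary - not the raw stream - crosses the network
summary = {
    "count": len(readings),
    "mean": sum(readings) / len(readings),
    "alerts": len(alerts),
}
```

The latency win comes from `alerts` being handled on the spot, and the bandwidth win from `summary` being a few bytes instead of a continuous feed.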
Why it's the future: For many firms today, milliseconds matter, and the time taken to send data to a central server, process it, then act on the results can be costly and make true real-time processing impossible. For instance, if you've got a self-driving car, you can't waste time sending images from its sensors back to a data center to determine if the obstacle in front of you is a pedestrian. Moving these operations to the edge boosts speed and efficiency as network sprawl continues to grow.
Why it's a fad: For the vast majority of applications, this need for split-second results is likely to be overkill, and businesses will have to ask whether the marginal gains it offers will really be worth the extra complexity and management required.
5. IoT security
What is it? Everything today is connected to the network. TVs, cars, energy monitors and yes, even the traditional example of connected 'things', i.e. the refrigerator, are hooked into the internet. The problem this presents is that hackers suddenly have a huge number of new potential weak points to exploit.
Why it's the future: IoT is big, and getting bigger all the time. Some estimates suggest there may be as many as 10 billion connected devices in use around the world by the end of the decade. But with so many touch points, it's a goldmine for hackers, and many IoT devices won't have strong security measures enabled by default. Therefore, it will be up to IT pros to make sure they're safe - and the consequences for not doing so could be severe.
Why it's a fad: Let's be clear, the idea of IoT security itself is most definitely not a fad; if you're running any form of connected device, protecting it must be a top priority. But as for the IoT itself? That's less clear-cut. It could well be the case that, as in the past, we'll eventually determine we don't need all of these devices feeding us a constant stream of mostly useless information, especially if it makes us less secure.
6. Prescriptive analytics
What is it? Prescriptive analytics is a subset of big data analytics that not only runs queries on a dataset, but also offers a range of potential business decisions based on its interpretation. This therefore can offer advice and assessments on areas such as risk and potential gain as the data comes in, rather than having to go through lengthy waits while reports are compiled.
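In miniature, the 'prescriptive' step is mapping incoming figures to a recommended action rather than just a report. A toy sketch with hypothetical inventory thresholds (a real system would learn these from data rather than hard-code them):

```python
# Toy prescriptive step: turn incoming metrics into a recommended action.
# Thresholds and wording are illustrative assumptions, not a real product.
def recommend(stock_level, daily_demand):
    days_of_cover = stock_level / daily_demand
    if days_of_cover < 3:
        return "reorder now (high stock-out risk)"
    if days_of_cover < 7:
        return "schedule reorder this week"
    return "hold (sufficient cover)"

assert recommend(stock_level=20, daily_demand=10) == "reorder now (high stock-out risk)"
assert recommend(stock_level=50, daily_demand=10) == "schedule reorder this week"
assert recommend(stock_level=100, daily_demand=10) == "hold (sufficient cover)"
```

The difference from descriptive analytics is the output: an action with a rationale attached, available as the data arrives rather than after a report is compiled.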
Why it's the future: A common theme running through many of the trends for 2019 and beyond is speed. Cutting time to market is a top priority for many business executives, but until now, they have usually had to sacrifice accuracy in exchange for this speed. But with prescriptive analytics, they can enjoy both.
Why it's a fad: Descriptive, predictive, prescriptive - whatever the label, these are all iterations of the same wider goal: making business intelligence work better and faster than before. This is a constantly evolving area, and focusing too closely on one solution is a surefire way to get left behind. Businesses would do better to take a more holistic approach to their data; as tools like AI improve, hyped-up terms such as prescriptive analytics are likely to fall by the wayside as automation and advanced machine learning take over much of the workload.
7. Quantum computing
What is it? A fundamental change in how we approach technology, quantum computing replaces the 0 and 1 bits of traditional binary machines with 'qubits' that can exist in a superposition of both states at once. In a nutshell, this means a huge increase in the amount of processing power available, which means quantum computers can run calculations on a scale unimaginable for standard machines.
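The 'both states at once' idea can be made concrete by simulating a single qubit as a pair of complex amplitudes on an ordinary machine. This simulation only illustrates the math - it gains none of quantum hardware's speed, which is exactly why real qubits matter:

```python
import math

# A qubit is a pair of complex amplitudes: (alpha for |0>, beta for |1>).
# The probability of measuring each outcome is the squared magnitude.
def probabilities(state):
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

def hadamard(state):
    """The Hadamard gate: turns a definite state into an equal superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

zero = (1 + 0j, 0 + 0j)      # a definite 0, like a classical bit
superposed = hadamard(zero)  # now 'both states at once'
p0, p1 = probabilities(superposed)
# p0 and p1 are each ~0.5: measuring gives 0 or 1 with equal chance
```

Simulating n qubits this way needs 2**n amplitudes, which is why classical machines hit a wall that quantum hardware, in principle, does not.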
Why it's the future: Quantum computing promises a world where computing can do things that would simply not be possible today because the sheer amount of power required to run the calculations would fry a standard binary-based computer. Among its many projected use cases are much more detailed and accurate simulations that could help with everything from developing new medications to tackling world hunger.
Why it's a fad: 'Fad' may be a strong term for quantum, which has been described as the future of computing for decades. But the fact is that even after all this time, it's still in that 'five years from being mainstream' limbo. Its proponents argue that it is only now that the technology needed to make it a reality is finally catching up with the theory, but for others, it's a case of having heard it all before.
8. Robotic process automation
What is it? Robotic process automation, or RPA, is set to help take much of the monotony out of everyday activities that are slow, resource-intensive or just too boring for a human to complete effectively. By following a clear set of instructions, RPA bots can greatly speed up processes such as data capture and filling out standard forms.
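In code terms, an RPA step is a fixed rule applied to each record, with anything outside the rules escalated to a person. A toy sketch with hypothetical field names (real RPA tools drive existing applications' interfaces, but the follow-the-instructions logic is the same):

```python
# Toy RPA-style step: copy records into a standard form, escalating
# anything the instructions can't handle. Field names are illustrative.
REQUIRED = ("name", "email")

def fill_form(record):
    """Return a filled form, or None if a required field is missing."""
    if not all(record.get(f) for f in REQUIRED):
        return None  # outside the bot's instructions: needs a human
    return {
        "full_name": record["name"].strip().title(),
        "contact": record["email"].strip().lower(),
    }

records = [
    {"name": "ada lovelace", "email": "ADA@EXAMPLE.COM"},
    {"name": "", "email": "x@example.com"},  # incomplete: escalate
]
forms = [fill_form(r) for r in records]
# forms[0] is a cleanly normalized form; forms[1] is None (human review)
```

Note how the second record immediately hits the bot's limits - which is precisely the fad argument below.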
Why it's the future: When people talk about the robots taking over jobs, this is where it starts - but that's not necessarily a bad thing. By taking monotonous, repetitive tasks out of the hands of humans, workers can focus more of their time and talents on activities that actually create value for the business. At the same time, companies can enjoy greater efficiency, better accuracy and lower costs for their more mundane activities.
Why it's a fad: Ultimately, RPA bots are limited by the restrictions of their initial programming, and as such will only be useful for a relatively limited number of tasks. When confronted with inputs or data even slightly outside their understanding, such as incomplete fields in a form or inaccurate source data, they will still rely on human intervention to address any issues, so they won't be completely replacing human staff any time soon.
9. Virtual assistants
What is it? You've heard of Siri, Alexa, and even Cortana, but the smart virtual assistants that can manage your diary, read you your messages and book you a cab to the airport are getting smarter all the time, and it won't be long before they are genuinely able to manage every aspect of your working life.
Why it's the future: The next generation of virtual assistants can do a lot more than just queue up your Spotify playlist. As they learn how to interpret more business information, they'll be used to help teams meet, resolve problems, track productivity, and offer useful information for each worker, wherever they are. They can even phone businesses and make you an appointment on their own, making them more like a real assistant than ever.
Why it's a fad: The early days of consumer-level virtual assistants have been filled with privacy concerns and other issues related to errors in how they interpret instructions. Stories such as the creepily laughing speakers, or the device that sent its owner's personal details to a complete stranger, are still far too common. There are simply too many kinks to iron out before enterprise users are likely to trust them with anything more sensitive than a pizza order.