
What will the job of a computer engineer look like in 20 years? – Part 1

October 7, 2021
Many of the engineering jobs that will be common in 20 years have not been invented and they are going to be heavily influenced by the revolution that lies ahead.


One might think that if we look back into the past, we can glimpse the future. Perhaps, but if we want to really understand how things may change in the next 20 years, we need to look much further back – maybe 100 years – and take in the waves of disruption that happened during that time. Eons of evolution are being squeezed into a “short” period, and everything is moving like a gigantic, unstoppable wave. This wave is fed by daily discoveries, vast amounts of data and information, knowledge, people’s creativity, and so on.

So, how can one realistically predict what the future holds for a computer engineer? Well, we can make educated guesses based on current trends and emerging technologies. This article will look into just a few of those technologies and give my view on how things will turn out. There are many other exciting topics that I would like to discuss, like NoOps or no-code development, virtual/augmented reality, edge computing, 6G, etc., but I’ll leave them for another time. So, buckle up.

Artificial Intelligence and Machine Learning

Although Artificial Intelligence (AI) and Machine Learning (ML) were born much earlier, I gained particular awareness of them back in 1996, during the epic chess battle between Garry Kasparov and IBM’s Deep Blue. Kasparov won that first match in 1996, so the team behind Deep Blue went back and worked relentlessly to improve the AI. They increased the processing power, expanded the knowledge base and the database of past games, and much more.

In 1997, Deep Blue and Kasparov replayed their six-game match. The grandmaster won the first game, Deep Blue won the next one, the following three ended in draws, and the final one was won by Deep Blue again. What a match! Even today, many people ask how the machine managed to win against one of the greatest chess masters alive. Several texts outline how it did it, but my question is the exact opposite: how did Garry Kasparov beat Deep Blue, a computer capable of going through 200 million possible chess positions per second, with access to millions of games already played and an understanding of advanced playing techniques?

This “man vs. machine” event was twenty-four years ago, when ML was rapidly evolving and pushing the limits of AI. ML is a form of data analysis in which systems learn from data, identify patterns, and try to make informed decisions with little or no human intervention. Today we’re in the era of Deep Learning, where computers use advanced algorithms to create neural networks that can learn from data and make intelligent decisions independently. Computers are doing remarkable things with the sheer amount of data at their disposal. For instance, GPT-3 from OpenAI can write software applications based on natural language: you describe what you need, and the program generates the code all by itself. It’s awe-inspiring.
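To make the “learn from data, identify patterns, make decisions” idea concrete, here is a minimal sketch in Python using scikit-learn. The tiny dataset, the meaning of the features, and the choice of a decision tree are purely illustrative assumptions on my part, not anything used by Deep Blue or GPT-3.

```python
# Minimal supervised-learning sketch: no hand-written rules for the task,
# the model infers them from labelled examples (toy data, purely illustrative).
from sklearn.tree import DecisionTreeClassifier

# Hypothetical examples: [hours_studied, hours_slept] -> passed exam (1) or not (0).
X_train = [[1, 4], [2, 8], [6, 5], [8, 8], [3, 3], [9, 6]]
y_train = [0, 0, 1, 1, 0, 1]

model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X_train, y_train)              # learn patterns from the data

print(model.predict([[7, 7], [1, 2]]))   # informed decisions on unseen inputs
```

The same fit-then-predict pattern scales, conceptually, from this toy tree all the way up to the deep neural networks mentioned above; only the model and the volume of data change.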

AI/ML jobs are premium roles today, but they will be everywhere in twenty years. Engineers will need to understand and work with AI tools daily. There’ll be highly advanced and optimized chipsets for AI running inside your PC, car, mobile, TV, microwave, or any IoT device. AI will be omnipresent and, in some cases, omniscient.

Some say that we’re engineering ourselves out of existence. I don’t believe this, although we do need to exercise caution. Sophia, a social humanoid robot from Hanson Robotics, gave a canned response when asked how humans and robots will work together. “Robots can free humans from the most repetitive and dangerous tasks, so they can use their time in what they are best, being creative and solving complex problems. Robot intelligence does not compete with human intelligence; it completes it”.

That’s right. Creativity and imagination will be THE most valued assets of engineering roles in the future. You simply cannot automate them. That’s why, back then, Kasparov still managed to beat Deep Blue against all odds.

Quantum Computing

In 1965, an engineer named Gordon Moore noticed an interesting trend: the number of transistors in an integrated circuit doubled every 18 months. In layman’s terms, this meant that the processing power of computers was doubling every couple of years, and the cost of production was cut in half in the same period. Decades have gone by, but Moore’s Law continues to hold today, albeit steadily slowing down. There are, however, physical limits to the number of transistors that can fit into an integrated circuit.

So, the question is, what’ll happen next? Will computers stop advancing at a certain point? Certainly not. Present computer architectures will evolve and explore other directions, and new ones will appear or mature. Which ones? Let’s talk about quantum computing.

Quantum computers store information in qubits instead of bits. Like the bits in standard computers, qubits can represent data as zeros or ones, but they can also assume both values at the same time through a quantum-mechanical property called superposition. In simple terms, a qubit can be 35 percent zero and 65 percent one, for instance.

The result is that, while a conventional register of n bits holds only one of its 2ⁿ possible values at any given moment, a register of n qubits can be in a superposition of all 2ⁿ values at once. Furthermore, there’s also a law for quantum computing (Neven’s Law), which states that quantum computing power is achieving “doubly exponential growth relative to conventional computing”. That’s interesting: the law that says processing power doubles every couple of years in silicon-based computers already describes exponential growth, yet in the quantum world the growth is doubly exponential.
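As a hedged illustration of that state-vector picture, here is a tiny Python/NumPy sketch (not a real quantum SDK); the amplitudes are chosen only to reproduce the 35/65 example above, and the qubit counts are arbitrary.

```python
# Toy state-vector view of qubits (illustrative only; not real quantum hardware).
import numpy as np

# One qubit is described by two complex amplitudes; measurement probabilities
# are the squared magnitudes: here 35% for |0> and 65% for |1>.
qubit = np.array([np.sqrt(0.35), np.sqrt(0.65)], dtype=complex)
print(np.abs(qubit) ** 2)                      # -> [0.35 0.65]

# A register of n qubits needs 2**n amplitudes, while n classical bits hold
# just one of those 2**n values at a time.
for n in (10, 20, 30):
    print(f"{n} qubits -> {2 ** n:,} amplitudes to track")
```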

Unfortunately, quantum computing is so fundamentally different from conventional computing that it’ll take some time to gauge the benefits. “The differences between quantum computers and classical computers are even more vast than those between classical computers and pen and paper”, said Peter Chapman, CEO of quantum startup IonQ. While in a standard computer you take an input, apply an algorithm, and get a deterministic output, a quantum computer gives you an estimate of how probable each answer is.

So, we go from a deterministic to a stochastic model, which most programmers are not used to. However, this can be quite useful when working with problems that have many different inputs and complex scenarios to evaluate.
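To show what that stochastic model feels like in practice, here is a small sketch (plain Python/NumPy again, with made-up numbers) that mimics how quantum programs are typically used: prepare a state, measure it many times (“shots”), and read the answer off the observed frequencies.

```python
# From deterministic output to a probability estimate: sample the 35%/65%
# qubit from above many times and estimate the answer from the frequencies.
import numpy as np

rng = np.random.default_rng(seed=1)
probabilities = [0.35, 0.65]          # assumed measurement probabilities
shots = 1000                          # number of repeated runs ("shots")

outcomes = rng.choice([0, 1], size=shots, p=probabilities)
counts = np.bincount(outcomes, minlength=2)

print("measured 0:", counts[0], "times")      # roughly 350, never exactly
print("measured 1:", counts[1], "times")      # roughly 650
print("estimated P(1):", counts[1] / shots)   # an estimate, not a certainty
```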

There is a plethora of possibilities and areas that can take advantage of quantum computing, but I want to highlight just a couple of them:

  • Artificial intelligence – as we’ve already seen, AI is rapidly expanding due to the sheer amount of available data, evolution in algorithms, and the increase of processing power. Unfortunately, that processing power is still not enough for what engineers and scientists want to achieve. Jump to the quantum world, and the possibilities are unlimited.
  • Cryptography – today, security protocols rely on mathematical problems that are hard for classical computers to solve in order to protect keys and passwords. These problems are a perfect target for quantum computers, which can brute-force them dramatically faster (a toy sketch after this list illustrates the point). Researchers are already working on quantum-safe cryptography, but there’s a steep road ahead. Imagine the Y2K bug on steroids: instead of fixing a date issue, you need to replace an entire set of security algorithms in a billion programs around the world. In the end, however, we’ll have a much more secure infrastructure to support the years ahead.
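As a rough illustration of the brute-force point above: a classical attacker has to try keys one by one, while Grover’s algorithm on a quantum computer would need only on the order of the square root of that many steps. The key size and hash below are toy assumptions so the sketch runs in seconds; real key spaces are astronomically larger.

```python
# Toy brute-force key search (illustrative; not a real protocol or key size).
import hashlib
import math

KEY_BITS = 20                                   # tiny key space: 2**20 candidates
secret_key = 0x9A3F2                            # the "unknown" key to recover
target = hashlib.sha256(secret_key.to_bytes(3, "big")).digest()

# Classical attack: try every candidate until the hash matches.
tries = 0
for candidate in range(2 ** KEY_BITS):
    tries += 1
    if hashlib.sha256(candidate.to_bytes(3, "big")).digest() == target:
        break

print("classical brute force tried", tries, "keys")
# Grover's algorithm would need on the order of sqrt(2**KEY_BITS) quantum steps.
print("a Grover-style search would need roughly", math.isqrt(2 ** KEY_BITS), "steps")
```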

When all is said and done, quantum computers won’t be a commodity anytime soon. They need to be kept at temperatures close to absolute zero, which means it is very unlikely that you and I will have one under the desk in the foreseeable future. Still, it’s a perfect opportunity for cloud providers like Microsoft, IBM, Amazon, and Google. They already provide tools to explore quantum computing, so it’s not a question of “if”, but “when”.

In Part 2 of this article, we cover the domains of Blockchain and Cloud-based services. Stay tuned!

