2 trends you can’t miss in 2021

2020 is a unique year in which to make predictions. It has been a financial and emotional disaster for many, and it has tested us in ways we hope no year will again.

Even as a recent Deloitte survey showed that 77% of CEOs reported that the COVID-19 crisis accelerated their digital transformation plans, trying to predict anything right now seems almost inconceivable.

There are still too many questions: “How will we all bounce back from 2020?”, “How quickly will we get back to normal?” and “What will ‘normal’ look like?”

That’s why in this post, I’m looking at 2 trends that I think are worth stepping back from and taking a far-reaching view of: “Augmented Intelligence” and “Smart and Composable Applications.”

The case for “everything augmented”

At the end of October 2020, Gartner published its “AI Radar”, a research tool that highlights 24 technologies related to artificial intelligence that are sure to affect our future. You’ll find all kinds of predictions, from fully autonomous cars to smart biological enhancements. In my opinion, the most disruptive is the trend of “smart and composable applications,” which Gartner predicts will not materialize for another 6-8 years. I’ll explain below why I think you should analyze this trend, as I believe it will materialize faster than predicted.

First, let’s look at trend no. 1: Augmented intelligence.

This trend refers to the optimization of practices through the intelligent use of automation with Artificial Intelligence (AI). Take the data prep process – your people spend a disproportionate amount of time searching, gathering, and cleaning data before they can do anything with it. The process is usually very manual, error prone, and lengthy. Yet these tasks are repetitive and, for the most part, fairly easy to automate.

So it makes sense to invent software that inspects data, derives patterns from it, and applies intelligent resolution to it. For example, if an algorithm can detect that a piece of data is confidential (credit card numbers, say), it should automatically obfuscate it. If an algorithm detects that physical addresses are grouped in the same column, it might determine that separating the data into multiple columns would simplify later analysis. And once the data is separated, if specific records are missing information, you could enrich the address data by dynamically looking up external data sets – a good example of such a use case is zip code autocompletion based on street addresses.
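To make the first example concrete, here is a minimal sketch of one such “augmented” data-prep step: detecting values that look like credit card numbers and masking them automatically. The regex, function names, and masking policy are my own illustrative choices, not any vendor’s implementation; the Luhn checksum is the standard sanity check for card numbers.

```python
import re

# Values shaped like a card number: 13-16 digits, optionally separated
# by spaces or hyphens.
CARD_RE = re.compile(r"^(?:\d[ -]?){13,16}$")

def luhn_valid(number: str) -> bool:
    """Luhn checksum to reduce false positives on random digit runs."""
    digits = [int(d) for d in re.sub(r"[ -]", "", number)]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def obfuscate_if_confidential(value: str) -> str:
    """Mask all but the last four digits when a value looks like a card number."""
    if CARD_RE.match(value) and luhn_valid(value):
        digits = re.sub(r"[ -]", "", value)
        return "*" * (len(digits) - 4) + digits[-4:]
    return value
```

A real augmented-prep tool would run checks like this across every column and propose (or apply) the transformation without being asked – that is the “intelligent resolution” in action.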

If such a scenario sounds futuristic to you, believe me, it is not: the idea of “augmented” was popularized by Gartner’s Rita Sallam in mid-2017. Since then, the “augmented everywhere” trend has taken off, and a growing number of vendors and industry analysts have noted its impact. Sure, there have been some skeptics who feared that “augmented” would dangerously change our relationship with algorithms. Some have proposed that “augmented” could work both ways: machines could “augment” human tasks with speed and efficiency, while humans could “augment” machine tasks by applying bias auditing and judgment.

However, just last month, BARC, a Europe-based research company, found that “augmented” was actually being implemented much less than everyone expected. Integrating artificial intelligence into everything we do is proving more challenging than anticipated. But while we may question how quickly “augmentation” will be adopted, it has become difficult to believe that it will not happen.

Your IT environment is an ecosystem, and too many adjacent trends are sure to make augmentation stick: the cloud makes it easy to access computing power to process complex tasks at scale, and it also makes it affordable to find reliable information to fill in your data.

According to Gartner, in the next two years, public cloud services will be essential for 90% of data and analytics innovation. And in less than 4 years, Gartner believes that 75% of organizations will have AI operational, driving a 5x increase in streaming data and analytics infrastructures.

Which brings me to my second trend: Smart, composable apps.

In the previous section, we looked at how AI can intelligently respond to tasks within the context of a use case: we talked about “data preparation,” but we could just as well have meant applications like CRM or ERP. In fact, CRM applications are a packaged, repeatable way in which our industry has decided to “fit” a set of related tasks or jobs. Over the many decades these systems have existed, our industry has learned to make them faster and smarter.

One such trend has been the introduction of “smart applications,” or the infusion of AI into a known business process. Take, for example, using AI to assess your potential customers’ propensity to buy. Before smart apps, CRM users had to come up with their own method of predicting the likelihood that a particular prospect would be receptive to specific offers. Now, modern CRM applications automatically score and rank leads for users. Some even proactively approach prospects on behalf of users.
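To illustrate the idea, here is a toy sketch of what such a lead-scoring feature boils down to. The signals, weights, and names below are invented for illustration; a real smart CRM would learn the weights from historical conversion data rather than hard-code them.

```python
from dataclasses import dataclass

@dataclass
class Lead:
    name: str
    email_opens: int        # engagement signal
    site_visits: int        # intent signal
    is_decision_maker: bool # authority signal

def score(lead: Lead) -> float:
    """Toy propensity-to-buy score in [0, 1] from a weighted sum of signals."""
    raw = 0.05 * lead.email_opens + 0.1 * lead.site_visits
    if lead.is_decision_maker:
        raw += 0.3
    return min(raw, 1.0)

def rank_leads(leads: list[Lead]) -> list[Lead]:
    """Return leads sorted from most to least promising."""
    return sorted(leads, key=score, reverse=True)
```

The value of the “smart” version is that the user never builds or even sees this logic: the application scores and ranks leads on its own.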

We can imagine that these applications will continue to improve over the years. Still, it might be premature to think that the future of our applications will continue to be defined by the monolithic constructions we invented for ourselves decades ago.

Look at the graph below. The 2020 marketing technology landscape has 8,000 vendors. Scott Brinker, its creator, notes that when he started it in 2011, there were only 150.

One could argue that this is because building narrowly scoped apps has become easier. That is true. But what if the way we group use cases within our boxed apps has run its course? What if the future of applications is actually a set of services that are called up based on our needs?

In other words, what if asking salespeople to use a CRM system to do their job actually limits their potential, because the design of our CRM systems has become too constrained?

What if, in fact, a salesperson’s need is best served by consuming hundreds of services, assembled on the fly by AI, perhaps even ephemerally?

This, to me, is the potential that smart, composable applications present us with.
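As a purely illustrative sketch of what “composable” could mean in practice: instead of one monolithic CRM call, small single-purpose services are registered independently and assembled into a one-off pipeline on demand. Every name and behavior below is invented to make the concept tangible; the stubbed services stand in for real external calls.

```python
from typing import Callable

# A registry of small, single-purpose services, keyed by name.
SERVICES: dict[str, Callable[[dict], dict]] = {}

def service(name: str):
    """Decorator that registers a function as a named service."""
    def register(fn: Callable[[dict], dict]):
        SERVICES[name] = fn
        return fn
    return register

@service("enrich")
def enrich(ctx: dict) -> dict:
    ctx["company_size"] = "unknown"  # stub: would query an external data set
    return ctx

@service("score")
def score(ctx: dict) -> dict:
    ctx["score"] = 0.8  # stub: would call a propensity model
    return ctx

def assemble(need: list[str]) -> Callable[[dict], dict]:
    """Compose the requested services into a throwaway pipeline."""
    def pipeline(ctx: dict) -> dict:
        for name in need:
            ctx = SERVICES[name](ctx)
        return ctx
    return pipeline
```

In this picture, `assemble(["enrich", "score"])` exists only for the moment it is needed – which is the “assembled on the fly, perhaps even ephemerally” idea, with an AI rather than a developer choosing the list of services.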

Now, you may find that this trend is far-fetched for your organization. And you may find that you won’t have to worry about it for decades (Gartner predicts this won’t happen for 6 to 8 years for many of us).

However, I will say that you should watch it in 2021. In her latest Forbes article, Betsy Atkins calls “Low Code, No Code” the most disruptive trend of 2021. I think she’s right. And I also think we are witnessing the beginning of the disaggregation of packaged applications as we have known them for the last 2 or 3 decades.

One of the biggest advances I have seen with digital transformation is that it has not only allowed organizations to modernize their approach, but also to imagine things they couldn’t before. It has allowed them to see problems differently. And it has allowed us to find different solutions, because we realized that the problems we thought we were solving in the first place were different from the problems we should be solving.

I believe that, in the next 10 years, digital transformation will lead us to an era where most of our tasks will be handled more intelligently and fluidly. The popularization of the cloud and artificial intelligence will accelerate this trend. And if, in 2021, low-code/no-code allows more humans to change the application landscape, we can easily imagine that algorithms will be able to do that next.

When modern systems and approaches allow leaders to end the hassle of operating infrastructure, imagination and innovation ensue. I hope that the digital transformation we experienced this year will set us on course for the best possible end to this decade in 2030.

Here’s to innovation and the realization of your imagination!
