Artificial Intelligence

THOUGHT LEADERSHIP ARTICLE

How has the cloud unleashed the full potential of AI?

With the cloud, companies now have access to a wide range of innovative services previously reserved for the world of R&D. However, to take full advantage of these services, they need to put in place a methodological framework: a prerequisite for any industrialization. VISEO's experts share their vision.

After fifty years confined to research laboratories, artificial intelligence is now accessible to all companies, from start-ups to large corporations. Two revolutions are behind this democratization. First, Big Data has made it possible to absorb the exponential growth in the volume of data: not only structured data, but also more complex information such as images, sound or free text.


The other turning point came with the cloud transformation. By leveraging the scalability and computing power of their infrastructures, providers are offering an extensive catalog of innovative solutions in the form of managed services. Cognitive services such as speech recognition, natural language analysis, text classification, pattern recognition or automated translation, until recently reserved for R&D teams, are now available to any developer.

 

Innovative off-the-shelf services


A company can experiment with and deploy these innovative services without having to develop them itself or invest in dedicated infrastructure. These algorithmic models come packaged and pre-trained, and integrating them is as simple as calling an API, which shortens the time to production.
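
As a minimal sketch of what "simply calling an API" looks like in practice, the snippet below sends free text to a managed sentiment-analysis service over HTTP. The endpoint, API key and response fields are hypothetical placeholders, not any specific provider's API.

    # Minimal sketch: consuming a pre-trained cognitive service over HTTP.
    # The endpoint, key and response fields are hypothetical placeholders.
    import requests

    API_URL = "https://api.example-cloud.com/v1/sentiment"  # hypothetical endpoint
    API_KEY = "YOUR_API_KEY"

    def analyze_sentiment(text: str) -> dict:
        """Send free text to a managed NLP service and return its prediction."""
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"text": text},
            timeout=10,
        )
        response.raise_for_status()
        return response.json()  # e.g. {"label": "positive", "score": 0.97}

    if __name__ == "__main__":
        print(analyze_sentiment("The delivery was fast and the support team very helpful."))

The pre-trained model lives entirely on the provider's side; the developer only handles an HTTP request and a JSON response.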


Automated machine learning (AutoML) offerings go as far as suggesting the most relevant algorithm for a given data set or problem. This "off-the-shelf" AI also helps offset the shortage of data science experts.
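
The core idea behind AutoML can be illustrated with a simple model-selection loop: try several candidate algorithms on the same data set and keep the one with the best cross-validated score. Real AutoML services go much further (hyperparameter tuning, feature engineering); this is only an illustrative sketch using scikit-learn and a public data set.

    # Illustrative sketch of the idea behind AutoML: compare candidate
    # algorithms on the same data set and suggest the best-scoring one.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    candidates = {
        "logistic_regression": LogisticRegression(max_iter=5000),
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
        "gradient_boosting": GradientBoostingClassifier(random_state=0),
    }

    scores = {name: cross_val_score(model, X, y, cv=5).mean()
              for name, model in candidates.items()}

    best = max(scores, key=scores.get)
    print(f"Suggested algorithm: {best} (accuracy = {scores[best]:.3f})")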


The multi-cloud approach is clearly essential. It allows companies to benchmark cloud providers on their performance, service by service. One hyperscaler may be particularly strong in computer vision for a given context, another in natural language processing.
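
Benchmarking "service by service" can be as simple as running the same labeled sample through each provider's service and comparing accuracy. The sketch below is purely hypothetical: the provider functions are placeholders that would, in practice, wrap each hyperscaler's SDK or REST API.

    # Hypothetical sketch: benchmark the same cognitive service across
    # several providers on a small labeled sample.
    from typing import Callable, Dict, List, Tuple

    def benchmark(providers: Dict[str, Callable[[str], str]],
                  sample: List[Tuple[str, str]]) -> Dict[str, float]:
        """Return each provider's accuracy on (input, expected_label) pairs."""
        results = {}
        for name, predict in providers.items():
            correct = sum(1 for text, expected in sample if predict(text) == expected)
            results[name] = correct / len(sample)
        return results

    # Placeholder predictors standing in for two hyperscalers' NLP services.
    providers = {
        "provider_a": lambda text: "positive" if "great" in text else "negative",
        "provider_b": lambda text: "positive",
    }
    sample = [("great product", "positive"), ("poor support", "negative")]
    print(benchmark(providers, sample))  # e.g. {'provider_a': 1.0, 'provider_b': 0.5}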


The cloud offers another advantage for the necessary industrialization of AI projects: it makes it possible to move from a centralized vision of data, the data lake, to a decentralized environment, the data mesh. This data mesh de-silos the data warehouses while applying the same governance rules to each of them.
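
In a data mesh, each business domain publishes its data as a "data product" with an identified owner, a versioned contract and shared quality rules, rather than pushing everything into one central lake. The sketch below is only an illustration of such a contract; the fields are an assumption, not a standard schema.

    # Illustrative sketch of a domain-owned "data product" contract in a data mesh.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DataProduct:
        domain: str                  # owning business domain
        name: str                    # product name exposed to consumers
        schema_version: str          # versioned contract
        classification: str          # e.g. "public", "internal", "sensitive"
        quality_rules: List[str] = field(default_factory=list)  # shared governance rules

    sales_orders = DataProduct(
        domain="sales",
        name="orders_daily",
        schema_version="2.1",
        classification="internal",
        quality_rules=["no_null_order_id", "amount >= 0", "freshness < 24h"],
    )
    print(sales_orders)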


Setting up digital factories makes it possible to industrialize this process through data pipelines. This saves considerable time for data scientists, who spend up to 80% of their time on the preparatory phases of exploring, extracting and preparing data. The cloud, with its packaged processes and services, makes this automation possible.
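
A minimal sketch of such a pipeline, using scikit-learn on invented toy data: the preparatory steps (imputation, encoding, scaling) and the model are packaged into one reusable object, the kind of building block a digital factory standardizes across projects.

    # Minimal sketch: package data preparation and the model into one pipeline.
    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.impute import SimpleImputer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import OneHotEncoder, StandardScaler

    df = pd.DataFrame({
        "age": [34, None, 45, 29],
        "country": ["FR", "DE", "FR", "ES"],
        "churned": [0, 1, 0, 1],
    })

    preprocess = ColumnTransformer([
        ("numeric", Pipeline([("impute", SimpleImputer()), ("scale", StandardScaler())]), ["age"]),
        ("categorical", OneHotEncoder(handle_unknown="ignore"), ["country"]),
    ])

    pipeline = Pipeline([("prepare", preprocess), ("model", LogisticRegression())])
    pipeline.fit(df[["age", "country"]], df["churned"])
    print(pipeline.predict(df[["age", "country"]]))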


Edge computing enables information to be processed as close as possible to the point of collection, without having to go into the cloud to be analyzed. Processing - blood glucose analysis, for example - is done locally, but using algorithmic models trained in the cloud. This reduces latency and better protects sensitive data.
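
The pattern can be sketched in a few lines: the model is trained in the cloud and exported as a file, then loaded on the device so that each reading is scored locally, without sending the raw data to the cloud. The glucose values and thresholds below are invented for illustration.

    # Minimal sketch of the edge pattern: train in the cloud, score locally.
    import joblib
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # --- Cloud side: train on historical data and export the model ---
    X_train = np.array([[0.8], [1.0], [1.4], [2.1], [2.5]])  # glucose levels (g/L)
    y_train = np.array([0, 0, 0, 1, 1])                      # 1 = alert
    joblib.dump(LogisticRegression().fit(X_train, y_train), "glucose_model.joblib")

    # --- Edge side: load the exported model and score each reading locally ---
    model = joblib.load("glucose_model.joblib")
    reading = np.array([[2.3]])
    print("alert" if model.predict(reading)[0] == 1 else "normal")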

 

The POC (proof of concept) era is over


With all these advances, the POC era is definitely behind us. A company can now aim for production from the outset. However, this industrialization requires an ad hoc methodological framework. The DataOps approach improves the automation of data flows by promoting communication between data managers and data consumers: the data architect and the data engineer work together, in the same environment. Once the model is in production, good governance makes it possible to monitor it and ensure that it does not drift over time.
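
One simple way to watch for drift, sketched below with scipy: compare the distribution of an input feature seen at training time with what the model currently receives in production, and raise an alert when they diverge. The data here is simulated for illustration.

    # Illustrative sketch of drift monitoring on one input feature.
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    training_feature = rng.normal(loc=50, scale=10, size=5_000)    # reference data
    production_feature = rng.normal(loc=57, scale=10, size=1_000)  # recent requests

    statistic, p_value = ks_2samp(training_feature, production_feature)
    if p_value < 0.01:
        print(f"Data drift detected (KS statistic = {statistic:.3f}); consider retraining.")
    else:
        print("No significant drift detected.")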

 

More generally, a company must define a clear AI strategy. Some organizations have an idealized, even unrealistic vision of the benefits of machine learning and deep learning. We are still far from general AI: for the moment, these technologies solve well-identified problems within defined scopes. It is therefore necessary to demystify AI and explain its real contributions in an operational context. In the framing and ideation phase, understanding the business challenges helps identify the most relevant use cases.

 

Business involvement is a key success factor. A model can be highly effective and yet go unused simply because its results cannot be interpreted. To avoid the black-box effect, the explainability of the model is essential: it must be possible to explain the mechanisms by which it derives the output results from the input data. Indicators must be defined together with the future users. The goal is not to use AI for the sake of using AI, but to build operational products with AI inside.
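
One common way to reduce the black-box effect, sketched below with scikit-learn: measure how much each input feature contributes to the model's predictions with permutation importance, and share those indicators with the business users. This is only one technique among several (SHAP values, partial dependence, etc.).

    # Minimal sketch: explain which inputs drive a model's predictions.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    data = load_breast_cancer()
    X_train, X_test, y_train, y_test = train_test_split(data.data, data.target, random_state=0)

    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    importance = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

    # Print the five most influential features, most important first.
    for i in importance.importances_mean.argsort()[::-1][:5]:
        print(f"{data.feature_names[i]}: {importance.importances_mean[i]:.3f}")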

 

Another element to take into account from the very beginning of an AI project is its environmental impact. To reduce the carbon footprint, it is important to establish indicators that measure these impacts, and then, from modeling through to product implementation, to favor the algorithmic models that consume the least data and compute. This is one way to reconcile digital transformation with the ecological transition.
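
A very crude indicator can be computed from the training time of a job: wall-clock duration multiplied by an assumed average power draw and the carbon intensity of the local grid. Both constants in the sketch below are assumptions to adjust to your hardware and region; dedicated measurement tools give more accurate figures.

    # Crude, illustrative footprint indicator for a training job.
    import time
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import GradientBoostingClassifier

    ASSUMED_POWER_WATTS = 150    # assumed average draw of the machine (assumption)
    GRID_KG_CO2_PER_KWH = 0.06   # assumed carbon intensity of the grid (assumption)

    X, y = load_breast_cancer(return_X_y=True)
    start = time.perf_counter()
    GradientBoostingClassifier().fit(X, y)
    elapsed_hours = (time.perf_counter() - start) / 3600

    energy_kwh = ASSUMED_POWER_WATTS / 1000 * elapsed_hours
    print(f"Estimated energy: {energy_kwh:.6f} kWh, "
          f"CO2: {energy_kwh * GRID_KG_CO2_PER_KWH:.6f} kg")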

 

Contact us to discuss the digital transformation of your company!