The cloud is in the air: responsible AI

Artificial intelligence will be responsible, or it will not be.

One day, needing the French and English list of the 125 participants in an organization we work for, I went looking for the information …

No luck: all I found were pages of columns cluttered with useless annotations. I immediately realized it would take me 20 minutes to get usable data to import into our software.

So, with a heavy heart, I decided to use an artificial intelligence …

In search of lost time

First test, with Google, but I gave up immediately, because I had to accept terms of service "ad vitam", which frankly I found excessive.

Second test, with another American AI said to be among the best in the world … But crash: when I asked for the list, the "thing" stopped at the letter L.

Third test, I specified "full list" … The gadget ran, but with bullet points and without line breaks, whereas before it had inserted a line break after each name; impossible to make sense of it.

Fourth test, I further specified "no bullet points, no line breaks", and this time the expected (long-awaited) result finally arrived.

All that for that

The moral of this slightly annoying experience allowed me to draw a few conclusions:

1. The results varied (line breaks) without explanation

2. The understanding of "list" was insufficient (I had to say "full list")

3. I had spent 3 minutes instead of one …

Projecting this over millions of prompts at a time, I pictured thousands of gigawatt-hours consumed for nothing, simply because no logic seemed to animate the so-called "intelligence" from which I was trying to extract one unfortunate list that came back sometimes incomplete and sometimes (without reason, or reasoning) full of frills …

Do you find it smart?

This confirmed to me that no: it was neither reasonable, reasoned, coherent, nor responsible.

Why, then, this paper?

Because, without claiming to be Microsoft, Google or the others, we, a small French software publisher, have thought from the very start of our work about a responsible artificial intelligence model, or if you prefer, one that is friendly to the environment (and to our wallet).

Design

This is what we did:

1. After testing several GPU cards, we opted for a model that, for an equivalent result, consumes 50% less.

2. We also worked with the most efficient LLM models, optimized for size, which allowed us to save 30% of compute for an equivalent result.

3. We optimized requests and consumption, both in our own instructions and on the hosted model, and there again we saved 50%.

4. We continue working to devise hosted models that consume 50% less, but there we won't give away the trick …

5. We are working on scheduling constraints and requests so that they execute within a limited time window, which we believe should save between 30% and 50% of consumption.

But then, you will think, 50% + 30% + 50% + 50% + 30% makes 210%, hence negative consumption!

No; it is simply that, without optimization, we would have a system that would probably cost us three times more than it actually should.
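As a back-of-the-envelope check: successive savings compound multiplicatively, never additively, since each optimization applies to whatever consumption remains. The percentages below are the article's own claims; which of them combine to the "three times" figure is my own reading, not the author's accounting:

```python
# Successive savings compound multiplicatively: each optimization keeps
# a fraction (1 - saving) of the consumption that remains after the
# previous ones, so the percentages can never add up past 100%.

def combined_factor(savings):
    """Cost multiplier of the unoptimized system vs. the optimized one."""
    remaining = 1.0
    for s in savings:
        remaining *= 1.0 - s  # each saving applies to what is left
    return 1.0 / remaining

# Taking just the first two claimed savings, 50% (GPU choice) and
# 30% (model size), the combined factor already approaches three.
print(round(combined_factor([0.50, 0.30]), 2))  # → 2.86
```

The point of the sketch is only the shape of the calculation: stacking several partial savings yields a large multiplier on the unoptimized cost, not a sum that exceeds 100%.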

Method

Our goal, already achieved even before the launch of our AI offering, is therefore to consume three times less than if we had not paid attention to it.

And since we have only limited resources to manage well (our company's finances, our customers' bills) and only one habitable planet, it seems judicious to recall that AI has no future without a little intelligence applied to its design by software companies.

In summary? The cloud is in the air: responsible AI.

Jake Thompson
Growing up in Seattle, I've always been intrigued by the ever-evolving digital landscape and its impacts on our world. With a background in computer science and business from MIT, I've spent the last decade working with tech companies and writing about technological advancements. I'm passionate about uncovering how innovation and digitalization are reshaping industries, and I feel privileged to share these insights through MeshedSociety.com.
