The Friend connected pendant promises a constant presence to ward off loneliness, but its always-on listening conflicts with the General Data Protection Regulation (GDPR) and seriously worries the CNIL.
A friend who is always there. Present without ever imposing. Who listens without interrupting. And who says exactly what we want to hear. That is how Friend presents itself. A continuous presence, worn around the neck, discreet, almost reassuring. A digital pendant designed like a piece of jewelry by a start-up in a hurry, built to accompany us everywhere.
Since the end of January, Friend has taken over the Paris metro with minimalist posters: “I will never leave the dishes in the sink,” “I will watch every episode with you.” It looks like advertising for a dating app. What is being sold, however, is an always-available friend, worn like a badge of modern solitude.
Behind the project is Avi Schiffmann, a young entrepreneur already known for launching one of the first global Covid tracking dashboards in 2020. A success built on speed, intuition, and immediate usefulness. A very Silicon Valley principle. Usage first. Doubt later, if you insist.
A technology built on the desire for intimacy
Technically, Friend acts as a permanent probe. A Bluetooth microphone, paired with a smartphone, continuously captures the surrounding sound and transmits it to a mobile application, then to cloud servers located outside Europe. Analysis and response generation rely on third-party language models, in particular Google’s Gemini. In this architecture, the deliberate query disappears. Daily life becomes the raw material: conversations, silences, intonations, the rhythms of voices. The experience with this new friend is built on continuous listening, sometimes capturing people without their knowledge, transformed and exploited remotely.
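To make that data flow concrete, here is a minimal sketch of such a capture pipeline, in Python. Friend has published no source code, so everything here (the endpoint, the chunk size, the upload format) is an assumption for illustration, not the company’s actual implementation:

import queue

import requests           # HTTP upload to a cloud backend
import sounddevice as sd  # microphone capture

SAMPLE_RATE = 16_000      # a typical rate for speech models (assumption)
CHUNK_SECONDS = 5         # audio shipped in short chunks (assumption)
UPLOAD_URL = "https://api.example-backend.invalid/v1/audio"  # placeholder

chunks: "queue.Queue[bytes]" = queue.Queue()

def on_audio(indata, frames, time_info, status):
    # Called continuously by the audio driver: every sound in the
    # room lands here, not only deliberate requests.
    chunks.put(bytes(indata))

with sd.RawInputStream(samplerate=SAMPLE_RATE, channels=1, dtype="int16",
                       blocksize=SAMPLE_RATE * CHUNK_SECONDS,
                       callback=on_audio):
    while True:
        chunk = chunks.get()  # blocks until the next chunk of audio
        # The chunk leaves the device; transcription and analysis
        # happen server-side, via a third-party language model.
        requests.post(UPLOAD_URL, data=chunk,
                      headers={"Content-Type": "application/octet-stream"})

The structural point sits in the callback: everything within range of the microphone enters the pipeline, whether or not anyone asked for anything.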
In Europe, this mode of operation falls squarely within the scope of the GDPR. The primary user can consent via the terms of service, but the surrounding sound environment is never limited to that contractual relationship. Friends, colleagues, loved ones, strangers: the pendant picks up the voices of people who have signed nothing, acting like an open microphone in the social space. This capture happens without any physical mediation. No hardware cutoff button. No indicator light showing that listening is in progress. At the table, in a meeting, or around the coffee machine, nothing distinguishes recording from ordinary life. Friend’s answer is its user agreement, which transfers full responsibility for complying with local laws and obtaining third parties’ consent to the user. The design problem remains: convenient for the company, legally fragile. For Aurore Bonavia, an intellectual property lawyer at the Val-d’Oise bar, “between the microphone that records and the ear that does not know it is being captured, there is only a contract signed by one and never read by the other. Placing these obligations on the user is not legally valid.”
Sensitive data and emotional inference
The processing carried out by Friend goes beyond simple voice recognition. The device analyzes intonation, fatigue, and stress in order to infer emotional states, psychological fragilities, and health-related information. It thus handles sensitive data. This is where the European AI Act sets a safeguard: since February 2025, certain practices have been prohibited, notably the inference of emotions in workplace and educational settings. The mobility of the device makes this boundary harder to maintain. A pendant can be worn anywhere, at work or at school. The regulatory framework looks at how the system actually operates and at its concrete effects, beyond the commercial promise.
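To see why regulators treat this as more than voice recognition, consider how little it takes to turn prosody into a label about a person’s inner state. The sketch below is purely illustrative: Friend’s models are proprietary, and the features, thresholds, and labels are invented for the example.

import librosa
import numpy as np

def infer_state(path: str) -> str:
    """Label a speaker's state from voice alone (invented heuristics)."""
    y, sr = librosa.load(path, sr=16_000, mono=True)

    # Fundamental frequency: high pitch variability is often read as stress.
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)
    pitch_variability = float(np.nanstd(f0))

    # Loudness: low, flat energy is often read as fatigue.
    energy = float(librosa.feature.rms(y=y).mean())

    # Invented cut-offs. The point is that a claim about someone's
    # inner state is produced from their voice, without their consent.
    if pitch_variability > 40.0:
        return "stressed"
    if energy < 0.01:
        return "fatigued"
    return "neutral"

However crude, an inference of this kind already concerns a person’s psychological state, which is exactly the category of processing the AI Act fences off.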
Friend claims to encrypt exchanges and to limit data retention to a “context window”. In the absence of an independent public audit, these guarantees remain at the stage of promises. Audio streams pass through infrastructure located outside Europe, placing the data under other legal regimes and opening the way to secondary uses that are difficult to control.
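What a “context window” amounts to as a retention policy is itself unspecified. A rough sketch, assuming a simple rolling buffer (the window size and eviction rule are invented), shows why such a guarantee is narrower than it sounds:

from collections import deque
from dataclasses import dataclass

@dataclass
class Utterance:
    timestamp: float
    transcript: str

class ContextWindow:
    """Keep only the N most recent utterances for the model's prompt."""

    def __init__(self, max_items: int = 50):
        # deque(maxlen=...) silently drops the oldest item when full.
        self._items: deque = deque(maxlen=max_items)

    def add(self, utterance: Utterance) -> None:
        self._items.append(utterance)

    def prompt_context(self) -> str:
        # Only this sliding slice is fed back to the language model.
        return "\n".join(u.transcript for u in self._items)

Dropping an utterance from the prompt says nothing about whether the raw audio was deleted upstream, logged, or kept for training, which is precisely the question the CNIL raises below.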
Having taken up the case on its own initiative, the CNIL confirms to us: “The device can lead to a massive collection of data, possibly sensitive (health, political opinions, sexual orientation), depending on the content of the conversations, from both the user and their interlocutors. Many questions arise, in particular about the fate of the data, where it is stored, how it is secured, and whether it may be reused to train the AI system. The CNIL is analyzing this device and will contact the company Friend.com in order to examine its compliance with the GDPR.”
A launch strategy that says it all
Some ethical barriers hold. Others erode through repeated use, under the guise of innovation. Doubt becomes commonplace. The deployment of Friend in Europe, backed by significant advertising spending, follows a now-familiar sequence. Launch. Observe. Adjust. A method already proven in the American tech ecosystem, notably with Grok, X’s AI, and its generation of nude images without consent. The public space becomes a test bed, where reactions serve as measurement, tolerance as threshold, and habituation as validation.
Friend may well run into the European regulatory wall. Public opinion remains skeptical. Early sales, in the United States as elsewhere, remain marginal. The object intrigues without catching on. But over time, products ignored at launch can become established through use. Practices shift. Thresholds move. “Why not” becomes acceptable.
Friend is probably just one step. Other objects will follow, more integrated, more discreet, better fitted to existing frameworks. OpenAI is already working on one, in collaboration with Jony Ive, the former Apple designer. AI is leaving the screen. It is moving closer to the body, the brain, everyday life. The question goes well beyond Friend. Why do digital technologies still enjoy an exceptional regime? Why does the precautionary principle, applied without hesitation in the physical world, vanish when it comes to software and algorithms? Why authorize marketing before compliance, betting on late regulation?
Some call for relaxing the rules so as not to slow down innovation. It remains to be defined which innovation. And, above all, according to which values.