A mobile app that promises to record users' phone calls and pay them for the audio, which it resells to AI companies, has surprisingly become the second most popular app in the social networking category of the Apple US App Store.
The app, called Neon Mobile, is marketed as a money-making tool, promising users "hundreds or even thousands of dollars" per year in exchange for access to audio conversations.
According to Neon's website, the company pays 30 cents per minute when users call other Neon users, and up to $30 per day for calls to anyone else. The app also offers rewards for referring new users. According to data from app intelligence company Appfigures, the app ranked 476th in the social networking category of the US App Store on September 18, but it had jumped to 10th place by yesterday.
On Wednesday, Neon ranked second on the chart of free iPhone social apps. It also briefly became the seventh most popular app or game overall on Wednesday morning before rising to sixth place.
According to Neon's terms of service, the company's mobile app can capture both incoming and outgoing calls. However, Neon's marketing claims that it records only the user's side of the conversation unless the call is with another Neon user.
Neon's terms of service state plainly that this data is sold to "AI companies" to be used "for developing, training, testing, and improving machine learning models, artificial intelligence tools and systems, and related technologies."
The existence of such apps, and their approval by app stores, shows how deeply AI technology has already pushed into users' lives and into previously private spaces. The app's high ranking on the Apple App Store also shows that a portion of the market is willing to trade privacy for a small income, regardless of the larger costs to themselves or society.
Although Neon's privacy policy does address some of these points, its terms grant the company extremely broad permissions over user data. Neon grants itself "global, exclusive, irrevocable, transferable, royalty-free, fully paid rights and licenses (including multi-tiered licensing rights) to sell, use, host, store, transmit, publicly display, publicly perform (including through digital audio transmission), broadcast to the public, copy, reformat for display purposes, create derivative works, and distribute all or part of user recordings."
This gives Neon wide latitude in how it uses user data, far beyond what its marketing suggests.
The terms also include detailed provisions about beta features, which are offered without guarantees and may contain various issues and vulnerabilities.
Although the Neon app raises many red flags, it may technically be legal. Jennifer Daniels, a partner in the Privacy, Security & Data Protection department at the law firm Blank Rome, told TechCrunch: "Recording only one side of the call is intended to circumvent eavesdropping laws. According to many state laws, recording a conversation requires the consent of both parties... This is an interesting approach."
Peter Jackson, a cybersecurity and privacy attorney at Greenberg Glusker, agreed, telling TechCrunch that the wording about "one-sided transcription" sounds like a workaround: it implies Neon records users' calls in full but may simply strip out what the other person said from the final transcript.
Additionally, legal experts have raised concerns about the level of data anonymization.
Neon claims to remove users' names, email addresses, and phone numbers before selling data to AI companies. However, the company does not explain how its AI partners or other buyers may use that data. Voice recordings could be used to produce fake calls that sound like the user, or AI companies could use a user's voice to build synthetic voices of their own.
Jackson said, "Once your voice is over there, it could be used for fraud. Now this company has your phone number and enough information — they have your voice recording, which could be used to create a simulated version of your voice and commit various forms of fraud."
Even if the company itself is trustworthy, Neon does not disclose who its partners are or what those entities are allowed to do with user data downstream. And like any company holding valuable data, Neon faces the risk of a data breach.
In a brief test by TechCrunch, Neon gave no indication that it was recording the user's call, nor did it warn the person on the other end. The app works like any other VoIP app, and caller ID displays as usual on incoming calls.
Neon founder Alex Kiam did not respond to requests for comment. Kiam, identified on the company website only as "Alex," operates Neon from an apartment in New York, according to business documents.
LinkedIn posts show that Kiam raised funding for the startup from Upfront Ventures several months ago; as of publication time, the investor had not responded to TechCrunch's inquiry.
Legal experts note that AI agents now routinely join meetings to take notes, and AI-enabled recording devices are already on the market. At least in those cases, everyone involved consents to being recorded. Given how widely personal data is already collected and sold, some people may cynically conclude that if their data is going to be sold anyway, they might as well profit from it.