According to a recent report by The Wall Street Journal, Apple has blocked an update to the email app BlueMail over its new feature powered by the AI chatbot ChatGPT. The reported reason for the hold-up is concern that the app could generate content inappropriate for children.
The app’s new AI feature uses ChatGPT to help automate email writing, generating automatic replies and suggestions that match the context and tone of a message.
However, Apple appears unconvinced that the AI feature is safe for younger users. According to the report, the company has asked BlueMail to raise its age restriction to 17 and older before it will approve the update.
Apple’s move is seen as a precaution against the app inadvertently generating content unsuitable for younger users. BlueMail has not yet commented on the matter, but Apple’s concerns may stem from the fact that AI tools like ChatGPT can produce unpredictable output, which is not always desirable.
This is not the first time Apple has taken such a step to protect its users. The company has previously removed apps from the App Store for violating its guidelines, particularly apps found to be collecting user data without permission or posing a security risk.
The delayed update may frustrate BlueMail’s users, but it is a reminder that companies like Apple are taking measures to protect user privacy and safety. While AI-powered tools like ChatGPT can be incredibly useful, it is crucial that they are designed with safety and security in mind, especially when children may be using them.