
Enhanced Features Coming to iOS Chrome: Built-in Lens, Maps, and Calendar Integration

Google announced today that Chrome on iOS is getting a few new features, including built-in Lens support that will allow users to search using just their cameras. Although you can already use Lens in Chrome on iOS by long-pressing an image you find while browsing, you will soon also be able to use your camera to search with new pictures you take and existing images in your camera roll.

The company says the new integration is launching in the coming months. For context, Google Lens lets you search with images to do things like identify plants and translate languages in real time.

Image Credits: Google

Google also announced that when you see an address in Chrome on iOS, you no longer need to switch apps to look it up on a map. The company says that when you press and hold a detected address in Chrome, you will now see the option to view it in a mini Google Maps view right within Chrome.

In addition, users can now create Google Calendar events directly in Chrome without having to switch apps or copy information over manually. You just need to press and hold a detected date, and select the option to add it to your Google Calendar. Chrome will automatically create and populate the calendar event with important details like time, location and description.

Image Credits: Google

Lastly, Google announced that users can now translate a portion of a page by highlighting text and selecting the Google Translate option.

“As our AI models improve, Chrome has gotten better at detecting a webpage’s language and suggesting translations,” the company wrote in a blog post. “Let’s say you’re planning to visit a museum in Italy, but the site’s in Italian and you don’t speak the language. Chrome will automatically offer to translate the museum’s website into your preferred language.”

OpenAI Introduces ChatGPT App For iOS

OpenAI has taken the tech world by surprise with the sudden release of its ChatGPT app for Apple iOS. This unexpected move brings the power of generative AI to iPhones worldwide in a remarkably short timeframe, as it comes less than six months after the highly acclaimed chatbot made its debut on November 30. The availability of ChatGPT on iOS devices opens up new possibilities for users to engage in human-like conversations and harness the capabilities of AI directly from their iPhones. This strategic move by OpenAI demonstrates its commitment to providing accessible and innovative AI solutions to a broader audience, further expanding the reach of conversational AI technology.

According to a blog post, the company says that the ChatGPT app in the App Store “syncs your conversations, supports voice input, and brings our latest model improvements to your fingertips.”

OpenAI added that the app is free to use and syncs a user’s history across devices. It also integrates Whisper, the company’s open-source speech-recognition system, enabling voice input. 

In addition, ChatGPT Plus subscribers get exclusive access to GPT-4’s capabilities, early access to features and faster response times.

In any case, as ChatGPT-like clones have flooded the App Store, and since open-source LLMs have been shown to work on smaller devices, it’s clear that this is a big move that OpenAI needed to make quickly. Apparently that didn’t leave much time to detail any efforts around safety issues — the only thing the blog post says is “As we gather user feedback, we’re committed to continuous feature and safety improvements for ChatGPT.”

But there is good news for non-Apple users, according to the blog post: “Android users, you’re next! ChatGPT will be coming to your devices soon.”

Users can download the ChatGPT app from the App Store.

Apple Issues Urgent Warning to iPhone Users

Apple has informed millions of iPhone customers about a pop-up notification that appears when water is detected in the device’s charging port. Ignoring the notification can cause the pins on the Lightning port or the cable to corrode, resulting in permanent damage or connectivity issues.

There are two types of notifications, both marked with a yellow warning triangle and a blue water drop.

The first message reads “Charging Not Available,” while the second reads “Liquid Detected in Lightning Connector.” Except in an emergency, it is critical not to ignore either notification.

If you want to dry your iPhone, Apple has some suggestions: lightly tap it with the Lightning connector facing down to remove any excess liquid, place it in a dry area with adequate airflow, and wait about 30 minutes before charging it again.

If the notification appears again, liquid is still present, so leave the iPhone in a dry place with some airflow for up to a day before charging it or connecting a Lightning accessory.

Apple warns against using external heat sources or compressed air to dry out the iPhone and discourages inserting foreign objects into the Lightning port, such as cotton swabs or paper towels.

It is also advised not to put the iPhone in a bag of rice, as this can damage the device.

11-year-old girl develops app that detects eye diseases

The 11-year-old girl’s mobile app can analyse various parameters, such as light and colour intensity, to locate the eyes within the frame.

Meet Leena Rafeeq, an 11-year-old Dubai-based Malayali girl originally from Kerala, who has developed an AI application to detect eye diseases and other conditions through a unique scanning method using an iPhone. Rafeeq named the application “Ogler EyeScan” and began developing it when she was 10. In a video, she said that her application can analyse various parameters like colour and light intensity, distance and look-up points to locate eyes within the range of the frame using advanced computer vision and machine learning.

“Exciting news! I am thrilled to announce the submission of my new Artificially Intelligent mobile app, named Ogler EyeScan,” Rafeeq said in a LinkedIn post on Saturday, adding that she created the app when she was 10. The application also identifies any light-burst issues and whether the eyes are positioned exactly inside the scanner frame. She also said Ogler can identify conditions like Arcus, Melanoma, Pterygium and even Cataracts with the help of trained models.

Leena said that her app is currently under review on Apple’s App Store and that she is hopeful it will be approved soon. Ogler EyeScan is supported only on iPhone X and above running iOS 16 or later. She also said, “This App was developed natively with SwiftUI without any third-party libraries or packages, and it took me six months of research and development to bring this innovative app to life.”

Replying to some of the comments on her viral LinkedIn post, she said that the accuracy of her app is “nearly 70% at this moment.”