Google is pushing its AI-powered photo editing tools deeper into international markets, rolling out natural language editing capabilities to India, Australia, and Japan. The move brings the "Help me Edit" feature - which lets users describe photo changes in plain language rather than wrestling with sliders and filters - to millions of new Android users. First launched for Pixel 10 owners in the U.S. last August, the feature's expansion signals Google's aggressive strategy to dominate consumer AI applications ahead of rivals like Adobe and Apple.
Google just handed millions of users across India, Australia, and Japan something most photo editing apps still can't match: the ability to fix images simply by asking. The company announced Tuesday that it's expanding natural language editing in Google Photos to these three markets, taking conversational AI editing beyond the U.S. launch that kicked off with Pixel 10 devices last August.
The timing isn't accidental. While Adobe continues to charge premium subscriptions for AI-powered editing in Photoshop and Lightroom, and Apple slowly rolls out similar features across its devices, Google is making a calculated bet that free, accessible AI tools will lock users into its broader services ecosystem. India alone represents over 600 million smartphone users, many of them accessing premium software features for the first time through Google's free tier.
Here's how it works in practice: Users in the newly supported countries now see a "Help me Edit" box when they tap the edit option on any photo. From there, they can either pick from suggested prompts or type their own requests in plain language. Want to "remove the motorcycle in the background" or "reduce the background blur"? Just type it. Need a more ambitious restoration job? Try "restore this old photo" and watch the AI reconstruct faded details.
The feature handles surprisingly specific requests, according to demonstrations shared by Google. Users can ask it to adjust a friend's pose, remove their glasses, or even open someone's eyes in a shot where they blinked. Behind the scenes, Google's Nano Banana image model processes these transformations entirely on-device, meaning no internet connection is required once the capability has been downloaded. That on-device processing gives Google a crucial edge in markets like India, where connectivity remains inconsistent.
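Google hasn't published a developer API for Help me Edit itself, but the same kind of prompt-driven edit can be approximated with the Gemini image model through the public google-genai SDK, which runs server-side rather than inside the Photos app. A minimal sketch, assuming a GEMINI_API_KEY environment variable, a local photo.jpg, and the gemini-2.5-flash-image model name (all illustrative choices, not details confirmed by Google's announcement):

```python
# Rough sketch: prompt-driven photo editing via the public Gemini API,
# an analogue to Help me Edit rather than the Google Photos feature itself.
from io import BytesIO

from google import genai   # pip install google-genai
from PIL import Image      # pip install pillow

client = genai.Client()    # reads GEMINI_API_KEY from the environment

original = Image.open("photo.jpg")

# A plain-language instruction plus the source image, much like typing a
# request into the Help me Edit box.
response = client.models.generate_content(
    model="gemini-2.5-flash-image",  # assumed model ID for "Nano Banana"
    contents=["Remove the motorcycle in the background", original],
)

# The edited image comes back as inline bytes alongside any text the model adds.
for part in response.candidates[0].content.parts:
    if part.inline_data is not None:
        Image.open(BytesIO(part.inline_data.data)).save("edited.png")
```

Swapping the prompt for "restore this old photo" or "open his eyes" exercises the same call, which is roughly what the Photos edit box does on the user's behalf.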