Google Gemini’s Nano Banana AI saree trend, a viral sensation sweeping Instagram, transforms ordinary selfies into nostalgic 90s Bollywood-style portraits featuring flowing chiffon sarees, golden-hour lighting, and cinematic backdrops. However, a disturbing experience shared by Instagram user Jhalakbhawani has sparked concerns about the safety and privacy of AI-generated images.
The trend involves users uploading a photo to Google’s Gemini Nano Banana tool, paired with a prompt to create retro-inspired edits, such as polka-dot or black party-wear sarees with dramatic shadows and grainy textures reminiscent of classic Indian cinema. The process is simple: log into the Gemini app, select the “Try Image Editing” mode, upload a clear solo photo, and enter a viral prompt to generate a Bollywood-style portrait within seconds.
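For those who prefer scripting over the app, a similar edit can in principle be produced through Google’s Gemini API. The snippet below is a minimal sketch, not the app’s actual workflow: it assumes the google-genai Python SDK and the “gemini-2.5-flash-image-preview” model identifier, and the prompt text, file names, and API key are illustrative placeholders.

```python
# pip install google-genai pillow
# Minimal sketch of the saree-edit flow via the Gemini API.
# Assumptions: google-genai SDK, "gemini-2.5-flash-image-preview" model id,
# placeholder file names and API key -- adjust to your own setup.
from google import genai
from PIL import Image

client = genai.Client(api_key="YOUR_API_KEY")  # placeholder key

prompt = (
    "Turn this photo into a 1990s Bollywood-style portrait: flowing chiffon "
    "saree, golden-hour lighting, grainy film texture, cinematic backdrop."
)
source = Image.open("selfie.jpg")  # a clear solo photo

response = client.models.generate_content(
    model="gemini-2.5-flash-image-preview",
    contents=[prompt, source],
)

# The response can mix text and image parts; save any returned image bytes.
for part in response.candidates[0].content.parts:
    if part.inline_data is not None:
        with open("saree_edit.png", "wb") as out:
            out.write(part.inline_data.data)
```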
Jhalakbhawani, an Instagram user, shared her unsettling experience in a post after trying the trend. She uploaded a photo of herself in a green full-sleeve suit and used a prompt to generate a saree edit. The resulting image was striking, but she noticed an alarming detail. “There is a mole on my left hand in the generated picture, which I actually have in real life. The original picture I uploaded didn’t have a mole,” she wrote in her post. She questioned how the AI tool could know about a personal detail not visible in the uploaded photo, calling the experience ‘scary and creepy’ and urging followers to ‘stay safe’ when using AI platforms.
The incident, which gained significant attention online, has fuelled discussions about AI safety. While Google incorporates safeguards like SynthID, an invisible digital watermark, and metadata tags to identify AI-generated images, experts warn these measures have limitations. According to aistudio.google.com, SynthID helps verify an image’s AI origin, but the detection tool is not yet publicly available, limiting its effectiveness for everyday users.
Experts recommend caution when participating in such trends. Users should avoid uploading sensitive images, strip metadata such as location tags before sharing, and retain the original photos so they can detect unauthorised modifications.
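As one concrete precaution, EXIF metadata (which can include GPS coordinates and device details) can be removed from a photo before it is uploaded anywhere. The following is a minimal sketch using the Pillow library; the file names are illustrative.

```python
# pip install pillow
# Minimal sketch: drop EXIF metadata (GPS, device info) before sharing a photo.
from PIL import Image

original = Image.open("selfie.jpg")

# Copy only the pixel data into a fresh image, leaving all metadata behind.
clean = Image.new(original.mode, original.size)
clean.putdata(list(original.getdata()))
clean.save("selfie_no_metadata.jpg")
```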
