Google’s Firebase Studio lets you prototype apps in minutes — no code needed

Google is giving its mobile and web app development platform, Firebase, a vibe coding makeover — transforming it from a dev tool into a DIY AI app builder. It’s in a preview phase right now, but any individual Google user can give it a go.

Just as chatbots like ChatGPT generate text or images from your prompts, Firebase Studio endeavors to generate app prototypes. For businesses, the idea is to dramatically speed up the prototyping process, but the reason “vibe coding” is getting attention right now is that non-techie individuals can use these tools to build personal apps as well.

Cursor AI is one of the most popular apps for this use at the moment, gaining 360,000 paying subscribers and $200 million in revenue in around 12 months. Kevin Roose over at The New York Times, for example, seems to be having lots of fun vibe coding apps that help him pack his son’s school lunch or decide if something can fit in his car’s trunk. With Firebase Studio’s new features, it looks like Google wants in on the action.

You can use a template that’s close to your idea or start from scratch by simply typing in what you want. Firebase Studio will then send you back a plan describing the kind of features it will include, what the color scheme and layout will be, and what the app is called. You can tweak all of these details and click “prototype” to start building.

Now, I am no app developer, so what happened next looked a little crazy to me — just lines of code cascading down the screen while the AI model presumably did all of its generating. When it was done, the wall of code was replaced by a little web app prototype — I chose an app that would turn images of anime food into recipes, so the app was waiting for me to upload an image.

At this point, Firebase Studio prompts you for a “Gemini API key,” so in true vibe coding style, I just clicked the “Generate API key” button to see what would happen. It looked like it worked, but when I uploaded my image and clicked “Identify ingredients,” the app ran into another issue.
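
A quick aside on why the app wants that key: prototypes like this call the Gemini API directly from their own code, authenticated with whatever key you paste in (the “Generate API key” button appears to just provision one for you). Here’s a minimal sketch of what such a call might look like, assuming the generated app uses Google’s official @google/generative-ai JavaScript SDK; the model name, prompt, and identifyIngredients helper are illustrative, not Firebase Studio’s actual output:

```ts
import { GoogleGenerativeAI } from "@google/generative-ai";

// Hypothetical helper: send an uploaded food image to Gemini and ask it to
// list the ingredients it can see. Illustrative only, not the code Firebase
// Studio generated for my prototype.
async function identifyIngredients(
  apiKey: string,      // the "Gemini API key" the app prompts for
  imageBase64: string, // the uploaded image, base64-encoded
): Promise<string> {
  const genAI = new GoogleGenerativeAI(apiKey);
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" }); // illustrative model choice

  const result = await model.generateContent([
    "List the ingredients visible in this dish, one per line.",
    { inlineData: { data: imageBase64, mimeType: "image/png" } },
  ]);

  return result.response.text();
}
```

If anything in a chain like that is off (an invalid key, a quota limit, an unsupported image type), it surfaces as the kind of vague issue a prototype like mine runs into, which is presumably what the AI then offers to fix.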

The AI offered to try and fix the issue for me — but honestly, I was already done with this little experiment. To me, vibe coding is just clicking buttons without any real idea what they’ll do, and that isn’t very fun. But if you do want to try it, the idea is to basically keep asking the AI to fix things, keep asking it to make changes, and keep testing out the prototype on the left-hand side until you get what you want.

All that’s left now is to wait and see whether Google can attract some vibe coders or whether Firebase’s user base will stay mostly professional.
