If your app already integrates Firebase, Vertex AI for Firebase lets you access the Gemini API directly from your app.
Migration from Google client AI SDK
The Vertex AI for Firebase API is similar to the Google AI client SDK. If you've already integrated the Google AI client SDK into your app, you can migrate to Vertex AI for Firebase.
Getting started
Similar to the Google AI client SDK, you can experiment with prompts in Google AI Studio. Alternatively, if the Gemini API isn't available in your country or region (see list), you can use Vertex AI Studio.
Once you are satisfied with your prompts, go to Build with Gemini in the Firebase console, and then click the second card to launch a workflow that helps you do the tasks described in this document. If you don't see a card layout, then these tasks have already been completed.
In addition, do the following:
- Upgrade your project to use the Blaze pay-as-you-go pricing plan.
- Enable the following two APIs for your project:
  - aiplatform.googleapis.com
  - firebaseml.googleapis.com
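If you prefer the command line, both APIs can also be enabled with the gcloud CLI. This assumes the gcloud CLI is installed and your Firebase project is set as the active project:

```shell
# Enable the Vertex AI and Firebase ML APIs for the active project
gcloud services enable aiplatform.googleapis.com firebaseml.googleapis.com
```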
Add the Gradle dependency
Add the following Gradle dependency to your app module:
dependencies {
    ...
    implementation("com.google.firebase:firebase-vertexai:16.0.0-alpha02")
}
Initialize the Vertex AI service and the generative model
Start by instantiating a GenerativeModel by providing the model version:
val generativeModel = Firebase.vertexAI.generativeModel("gemini-1.5-pro")
You can learn more about the models available in Vertex AI for Firebase in the Firebase documentation. You can also configure model parameters.
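For example, model parameters can be set through a generation config when you create the model. The builder property names below (temperature, topK, maxOutputTokens) follow the Google AI client SDK conventions and should be verified against the Firebase documentation; treat this as a sketch rather than the definitive API:

```kotlin
// Sketch: configuring model parameters (verify names against the Firebase docs)
val config = generationConfig {
    temperature = 0.7f    // higher values produce more varied output
    topK = 40             // sample from the 40 most likely tokens
    maxOutputTokens = 500 // cap the length of the response
}

val generativeModel = Firebase.vertexAI.generativeModel(
    modelName = "gemini-1.5-pro",
    generationConfig = config
)
```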
Next, you're ready to interact with the Gemini API.
Generate text
To generate a text response, call GenerativeModel.generateContent() with your prompt. Note that generateContent() is a suspend function, which integrates well with existing Kotlin code.
scope.launch {
    val response = generativeModel.generateContent("Write a story about the green robot")
    // The generated text is available via response.text
}
Image prompting
To augment a text prompt with images, pass an image as a Bitmap when calling generateContent():
scope.launch {
    val response = generativeModel.generateContent(
        content {
            image(bitmap)
            text("What is the object in the picture?")
        }
    )
}
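The snippet above assumes you already have a Bitmap. One common way to obtain one, sketched here with a hypothetical drawable resource R.drawable.sample_image, is via Android's BitmapFactory:

```kotlin
// Decode a bundled drawable into a Bitmap for the image prompt
// (R.drawable.sample_image is a placeholder for your own resource)
val bitmap: Bitmap = BitmapFactory.decodeResource(resources, R.drawable.sample_image)
```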
Multi-turn chat
You can also support multi-turn conversations. Initialize a chat with the startChat() function, optionally providing a message history. Then call the sendMessage() function to send chat messages:
val chat = generativeModel.startChat(
    history = listOf(
        content(role = "user") { text("Hello, I have 2 dogs in my house.") },
        content(role = "model") { text("Great to meet you. What would you like to know?") }
    )
)

scope.launch {
    val response = chat.sendMessage("How many paws are in my house?")
}
Response streaming
To start displaying the response progressively as soon as the first tokens are produced, use generateContentStream() and collect the response stream:
scope.launch {
    var outputContent = ""
    generativeModel.generateContentStream("Write a story about the green robot")
        .collect { response ->
            outputContent += response.text
        }
}
Next steps
- Review the Firebase Vertex AI sample app on GitHub.
- Learn more about Vertex AI for Firebase in the Firebase documentation.