Prompting with an Apollo 11 transcript

This notebook provides a quick example of how to prompt Gemini using a text file. In this case, you'll use a 400-page transcript from Apollo 11.

Setup

Install the Google GenAI SDK

Install the Google GenAI SDK from npm.

$ npm install @google/genai

Set up your API key

You can create your API key using Google AI Studio with a single click.

Remember to treat your API key like a password. Don't accidentally save it in a notebook or source file that you later commit to GitHub. In this notebook, we store the API key in a .env file. You can also set it as an environment variable or use a secret manager.

Here’s how to set it up in a .env file:

$ touch .env
$ echo "GEMINI_API_KEY=<YOUR_API_KEY>" >> .env
Tip

Another option is to set the API key as an environment variable. You can do this in your terminal with the following command:

$ export GEMINI_API_KEY="<YOUR_API_KEY>"

Load the API key

To load the API key from the .env file, we will use the dotenv package. This package loads environment variables from a .env file into process.env.

$ npm install dotenv

Then, we can load the API key in our code:

const dotenv = require("dotenv") as typeof import("dotenv");

dotenv.config({
  path: "../.env",
});

const GEMINI_API_KEY = process.env.GEMINI_API_KEY ?? "";
if (!GEMINI_API_KEY) {
  throw new Error("GEMINI_API_KEY is not set in the environment variables");
}
console.log("GEMINI_API_KEY is set in the environment variables");
GEMINI_API_KEY is set in the environment variables
Note

In our particular case the .env file is one directory up from the notebook, hence we need to use ../ to go up one directory. If the .env file is in the same directory as the notebook, you can omit the path option altogether.

│
├── .env
└── examples
    └── Apollo_11.ipynb

Initialize SDK Client

With the new SDK, you now only need to initialize a client with your API key (or with OAuth if you are using Vertex AI). The model is set in each call.

const google = require("@google/genai") as typeof import("@google/genai");

const ai = new google.GoogleGenAI({ apiKey: GEMINI_API_KEY });

Select a model

Now select the model you want to use in this guide, either by selecting one from the list or writing its name below. Keep in mind that some models, like the 2.5 ones, are thinking models and thus take slightly more time to respond (cf. the thinking notebook for more details, and in particular to learn how to switch thinking off; a minimal sketch follows the next cell).

const tslab = require("tslab") as typeof import("tslab");

const MODEL_ID = "gemini-2.5-flash-preview-05-20";
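
If you want faster, non-thinking responses from a 2.5 model, a minimal sketch like the one below sets a zero thinking budget in the per-call config. Treat the thinkingConfig support as an assumption to verify for your chosen model, and see the thinking notebook for the full details:

// Sketch: disable thinking for a single call by setting the thinking budget to 0.
// Assumes the selected model supports thinkingConfig (e.g. the 2.5 Flash models).
const quickCheck = await ai.models.generateContent({
  model: MODEL_ID,
  contents: "In one sentence, who were the Apollo 11 crew members?",
  config: {
    thinkingConfig: {
      thinkingBudget: 0,
    },
  },
});
console.log(quickCheck.text ?? "No response text available");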

Download the Apollo 11 transcript

const fs = require("fs") as typeof import("fs");
const path = require("path") as typeof import("path");

const downloadFile = async (url: string, filePath: string) => {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Failed to download file: ${response.statusText}`);
  }
  fs.mkdirSync(path.dirname(filePath), { recursive: true });
  const bufferData = Buffer.from(await response.arrayBuffer());
  fs.writeFileSync(filePath, bufferData);
};
const TRANSCRIPT_URL = "https://storage.googleapis.com/generativeai-downloads/data/a11.txt";
const transcriptFilePath = path.join("../assets", "a11.txt");
await downloadFile(TRANSCRIPT_URL, transcriptFilePath);

Upload the file using the File API so it's easier to pass it to the model later on.

import { File, FileState } from "@google/genai";

async function deferredFileUpload(filePath: string, config: { displayName: string }): Promise<File> {
  const file = await ai.files.upload({
    file: filePath,
    config,
  });
  // Poll the File API until the file leaves the PROCESSING state.
  let getFile = await ai.files.get({ name: file.name ?? "" });
  while (getFile.state === FileState.PROCESSING) {
    console.log(`current file status (${getFile.displayName}): ${getFile.state ?? "unknown"}`);
    console.log("File is still processing, retrying in 5 seconds");

    await new Promise((resolve) => {
      setTimeout(resolve, 5000);
    });
    getFile = await ai.files.get({ name: file.name ?? "" });
  }
  // Check the most recently fetched state, not the stale upload response.
  if (getFile.state === FileState.FAILED) {
    throw new Error("File processing failed.");
  }
  return getFile;
}
const textFile = await deferredFileUpload(transcriptFilePath, {
  displayName: "Apollo 11 Transcript",
});
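
Once the upload helper returns, you can inspect the file's metadata, for example its name, URI, and state, to confirm it is ready to be referenced in a prompt:

// Log the uploaded file's identifying metadata; the uri is what the model call will reference.
console.log(`name: ${textFile.name ?? "unknown"}`);
console.log(`uri: ${textFile.uri ?? "unknown"}`);
console.log(`state: ${textFile.state ?? "unknown"}`);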

Generate Content

After the file has been uploaded, you can make ai.models.generateContent requests that reference the File API URI. You will then ask the model to find a few lighthearted moments.

const response = await ai.models.generateContent({
  model: MODEL_ID,
  contents: [
    "Find four lighthearted moments in this text file.",
    google.createPartFromUri(textFile.uri ?? "", textFile.mimeType ?? "text/plain"),
  ],
});
tslab.display.markdown(response.text ?? "No response text available");

Here are four lighthearted moments from the text:

  1. A Lost Bet Over Coffee:
    • 00 00 54 13 CMP And tell Glenn Parker down at the Cape that he lucked out.
    • 00 00 54 17 CC Understand. Tell Glenn Parker he lucked out.
    • 00 00 54 22 CMP Yes. He lucked out. He doesn't owe me a cup of coffee. This exchange reveals a personal bet between Michael Collins and Glenn Parker, adding a touch of everyday human interaction to the high-stakes mission.
  2. Crew Distracted by the View:
    • 01 03 15 30 CDR Yes, and he is eyeballing the Earth.
    • 01 03 15 32 CMP He's got his head out the window.
    • 01 03 15 35 CC I understand, I had trouble on 12 with him, too. The crew’s playful comments about a fellow astronaut “eyeballing the Earth” and the CAPCOMM’s relatable admission of having “trouble with him, too” highlight the human element of being captivated by the view from space.
  3. A Navy Term for Grayness:
    • 01 03 22 57 LMP Yes. Is there a Navy term for that?
    • 01 03 23 00 CC (Laughing.) A lot of gray paint. Buzz Aldrin asks a humorous, casual question about a “Navy term” for a visual phenomenon, and the CAPCOMM responds with laughter and a witty, simple answer, breaking the technical jargon.
  4. Zero-G Exercise and a TV Request:
    • 01 06 51 33 CMP Ever alert and ... Hey, you got any medics down there watching high grade? I'm trying to do some running in place down here, and I'm wondering just out of curiosity whether it brings my heart rate up.
    • 01 06 52 26 CC I'd like to see that sight. Why don't you give us a TV picture of that one. A crew member playfully asks about their heart rate while “running in place” in zero-g, leading the CAPCOMM to humorously request a TV broadcast of the unusual sight.

Delete File

Files are automatically deleted after 2 days, or you can manually delete them using files.delete().

await ai.files.delete({
  name: textFile.name ?? "",
});
DeleteFileResponse {}
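
If you want to confirm the file is gone (or see what else is stored in your project), a short sketch like this iterates over the files the File API still reports:

// List the files remaining in the project; the deleted transcript should no longer appear.
const remainingFiles = await ai.files.list({ config: { pageSize: 10 } });
for await (const file of remainingFiles) {
  console.log(`${file.displayName ?? "unnamed"} (${file.name ?? "unknown"})`);
}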

Learning more

The File API accepts files under 2GB in size and can store up to 20GB of files per project. Learn more about the File API here.
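
If you are unsure whether a local file fits under the 2GB per-file limit, a quick check with Node's fs module (a sketch, not part of the File API itself) can save a failed upload:

const fs = require("fs") as typeof import("fs");

// Sketch: verify a local file is under the File API's stated 2GB per-file limit before uploading.
const MAX_FILE_BYTES = 2 * 1024 * 1024 * 1024;
const { size } = fs.statSync(transcriptFilePath);
console.log(`${transcriptFilePath}: ${size} bytes (${size < MAX_FILE_BYTES ? "OK to upload" : "too large"})`);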