While command-line chats with local AI models are useful, the real power comes from scripting. This lesson walks you through the initial setup for creating powerful, local AI-driven scripts by combining Ollama, the Vercel AI SDK, and Zod.
You'll learn how to take a natural language command, like "Concatenate all of the markdown files in my Downloads folder into a single file," and use the generateObject function to intelligently extract structured, actionable data from it.
Workflow demonstrated in this lesson:
- Install the ollama-ai-provider, ai, and zod packages.
- Define a Zod schema describing the data to extract from the prompt (e.g., { command: string }).
- Use generateObject to pass a natural language prompt to the local model and receive a structured JSON object that matches your schema.
- Run the script with bun to see the local model successfully infer the "concatenate" command from a full sentence.

This foundational lesson unlocks the ability to build sophisticated, automated workflows by translating human language into machine-readable commands, all running entirely on your local machine.
[00:00] Head over to ollama.com. We're going to use Gemma 3 for our demo. After you install Ollama, if you just grab this command and drop it in any terminal, it will download and install that model. Since I already had it installed, I can just type any message, like hello, hit enter, and have a chat with a local AI model in my terminal. Now, while terminal chats are great, what's way more useful is scripting.
[00:23] So I'll clear this out, and we're going to install a couple of packages: the Ollama AI provider, ai (the Vercel AI SDK), and Zod to help us define what the AI should do. Let that run, and with everything installed, in my index file — which I'll just keep at the root of my project — I can import from Ollama, from ai, and from Zod. From the Ollama provider we'll need the ollama function, which we'll create our model with.
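The install step from this part of the lesson looks roughly like the following (assuming Bun as the package manager; npm or pnpm work the same way):

```shell
# Add the three packages: the Ollama provider for the AI SDK,
# the Vercel AI SDK itself, and Zod for schema definitions.
bun add ollama-ai-provider ai zod
```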
[00:55] We're going to use generateObject from ai and just z from Zod. We'll combine these pieces by creating a model, where ollama wraps gemma3 (we're just grabbing the latest tag). Then we can call generateObject and pass in the model, meaning this gemma3 model will be used when we generate an object, along with a schema built from z.object and a prompt, which we're going to hard-code above. Our prompt for right now is just going to be: concatenate all of the markdown files in my Downloads folder into a single file. Since we're not using any variables yet, I'll hit Command+Period and replace the template literal with a plain string literal just to make that red squiggly go away. Then, in our schema, we're going to extract the command from the prompt. I'll just hit Tab to autocomplete the entire thing, and then all we have to do is let this run and get the object out.
[01:49] Then if we just console.log the object, we can run this script in our terminal. I'm going to run it with Bun, since that's the easiest way to run a TypeScript script. Hit enter, and you'll see that the result is an object with a command of concatenate. So we were able to infer locally, from that full sentence, that concatenate is the command of this prompt.