Local Setup
What we're building in this step
By the end of this step:
- ✅ AI agent running
- ✅ Ready to add tools and authentication
- 🔄 Next: Add public stock tools to your agent
Set up an AI Inference Provider
AI agents use inference to generate responses, which requires access to a large language model, either running locally or through an API. For this workshop we'll use a hosted inference provider.
- Create an account: platform.openai.com/signup
- Get an API key: platform.openai.com/api-keys
You'll need an API key from your chosen provider. The starter kit uses the Vercel AI SDK to abstract over inference providers, so swapping providers only means changing the model configuration.
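For context, here is a minimal sketch (not the starter kit's actual code) of what an inference call looks like through the Vercel AI SDK. The model name `gpt-4o-mini` is just an example, and the `openai()` provider reads `OPENAI_API_KEY` from the environment by default:

```typescript
// Minimal sketch, assuming the Vercel AI SDK with the OpenAI provider.
// Swapping providers means changing the model import; the calling code stays the same.
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// The openai() provider reads OPENAI_API_KEY from the environment by default.
const { text } = await generateText({
  model: openai('gpt-4o-mini'), // example model name; use whichever model you prefer
  prompt: "What's the current price of WAYNE stock?",
});

console.log(text);
```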
Clone the workshop repository
First, let's clone the git repository to get the template and all the utilities for the workshop:
git clone https://github.com/ciamshrek/demo-trade-pro.git
cd demo-trade-pro
nvm use --lts # if using nvm
pnpm install # install dependencies
Start your template agent
Now let's add your AI API key and run the agent:
cd apps/agent
# Create .env.local file with your OpenAI API key
nano .env.local # or code .env.local, vim .env.local, etc.
# Add this line to the file:
# OPENAI_API_KEY=your_actual_openai_key_here
# Save and exit, then start the agent
pnpm dev # starts on port 3003
Test your template agent:
At this point your template agent is running and fully functional:
- Open http://localhost:3003
- Ask: "What's the current price of WAYNE stock?"
- Expected response: "I don't have access to real-time stock data..." 🤷‍♂️
Perfect! Your agent can chat about trading, but it has no tools or authentication yet.
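To see why it answers that way, here is a hedged sketch of a tool-less chat endpoint built on the Vercel AI SDK. The route path and details are assumptions, not necessarily what `apps/agent` contains; the point is that without a `tools` option the model can only answer from its training data:

```typescript
// app/api/chat/route.ts — hypothetical path; the starter kit's layout may differ.
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'), // example model; OPENAI_API_KEY comes from .env.local
    system: 'You are a stock trading assistant.',
    messages,
    // No `tools` registered yet, so the model has no way to fetch live stock
    // prices — hence the "I don't have access to real-time stock data" reply.
  });

  return result.toDataStreamResponse();
}
```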
🎉 Success! Your template agent is running, ready for tools and authentication!
Overview of what's included in your starter kit
This starter kit is a monorepo that includes:
- apps/agent - Template AI agent with no tools (port 3003)
- packages/agent-utils - Shared tools package (we'll use this next!)
- packages/ui - Prebuilt UI components
🚀 Ready for the next step: Adding public stock tools to your agent!