Leveraging The New Poe API
Operation: end the month with zero tokens
I’ve been a Poe user for several years. Early on, in the days of ChatGPT and Claude, I realized I didn’t want to pay $20/month to a bunch of different vendors, and I liked the idea of accessing a whole lot of disparate models for one price.
Poe quickly became my personal AI tool, used for varied things like generating fun images (and later video, as it became available), debating Buddhist philosophy, and quizzing myself before coding interviews.
Having access to both Claude Pro and Copilot during the day, I found that my Poe usage was nowhere near the 1-million-token monthly limit; I usually only touched 250k of that, even when I went on a video-generating spree.
I was very excited to hear earlier this week that Poe was introducing an API. This might be what was needed to start taking full advantage of my Poe subscription.
I did a quick experiment earlier this week, throwing together a Python script to query the API, but I couldn’t really do much with it besides kick the tires.
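That first script amounted to a few lines like the sketch below, which hits Poe's OpenAI-compatible chat completions endpoint using only the standard library. The base URL and the bot name are my assumptions from Poe's API announcement; substitute whatever bot you actually want to query.

```python
# Minimal sketch of querying Poe's OpenAI-compatible API with only the
# standard library. The base URL and model name are assumptions; check
# Poe's API docs for the current values.
import json
import os
import urllib.request

POE_BASE_URL = "https://api.poe.com/v1"  # assumed OpenAI-compatible endpoint


def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Assemble a chat completions request without sending it."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{POE_BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


if __name__ == "__main__":
    # Live call; requires a real key in the POE_API_KEY env var.
    req = build_chat_request(os.environ["POE_API_KEY"], "Claude-Sonnet-4", "Hello!")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the API is OpenAI-compatible, the official `openai` Python client also works if you point its `base_url` at Poe's endpoint.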
Toward the end of the week I remembered Simon Willison’s LLM project and thought it would be a good way to really use the Poe API without a lot of hardcoding models or learning the API’s quirks. The only issue was that nobody had released a Poe plugin for LLM yet.
I had 45 minutes before my next meeting, so I time-boxed myself: if Claude and I didn’t have a working plugin by then, I’d table the project for the weekend.
Surprisingly, the combination of clear documentation and good examples made this project go exceptionally fast. Other than getting tripped up on how to build the plugin, everything else went smoothly, and I had a functional prototype in about half an hour.
I spent a little more time polishing the plugin and released it on GitHub. It’s been a while since I open-sourced anything, and I’m excited to explore this further: getting it to support image models and some of LLM’s more advanced features.
Now that I had a command-line tool for accessing all of the Poe models, I still wanted to use the new API in local coding tools instead of risking becoming more dependent on Claude Code. After a quick survey of open-source tools I had already tried, like Avante, CodeCompanion, and Crush, a friend recommended opencode, and I liked the idea of learning a new tool along the way.
Since opencode already supported OpenAI-compatible APIs, I was pretty confident Poe would work here as well. Adding a custom provider was well documented, and I quickly got opencode working with a handful of hardcoded models I thought would work well.
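The custom provider setup boils down to a JSON fragment roughly like this in `opencode.json`. This is a sketch from memory of opencode's documented config shape, assuming the `@ai-sdk/openai-compatible` adapter and Poe's base URL; the exact field names and bot IDs should be checked against opencode's docs.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "poe": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Poe",
      "options": {
        "baseURL": "https://api.poe.com/v1"
      },
      "models": {
        "Claude-Sonnet-4": {},
        "GPT-4o": {}
      }
    }
  }
}
```

Each key under `"models"` is one hardcoded bot name, which is exactly the part I'd like to generate dynamically.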
Opencode is not quite 100% caught up with Claude Code performance, but it’s getting close, and I really appreciate the flexibility and ease of hooking it up to local models as well as a bunch of third-party AI providers. This way I’m not hostage to future Anthropic price increases in Claude, or to lazy/expensive design decisions like using the model itself to inspect code structure rather than incorporating an LSP the way Crush and opencode do.
I now have a lot of different ways to burn those Poe tokens. I look forward to working more on my LLM plugin to make more features of the different models easier to use, and I’m interested in finding a way to dynamically list models for opencode instead of hardcoding them in JSON.
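One plausible route for that dynamic listing is the standard `/models` endpoint that OpenAI-compatible APIs usually expose: fetch the model IDs and emit the `"models"` map that opencode's provider config expects. This is a hypothetical sketch; it assumes Poe serves that endpoint at the same base URL.

```python
# Hypothetical sketch: pull model IDs from an OpenAI-compatible /models
# endpoint and print them as the {"model-id": {}} map used in opencode's
# provider config. Endpoint availability on Poe is an assumption.
import json
import os
import urllib.request


def fetch_model_ids(api_key: str, base_url: str = "https://api.poe.com/v1") -> list:
    """Return the model IDs advertised by the /models endpoint."""
    req = urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return [m["id"] for m in json.load(resp)["data"]]


def to_opencode_models(model_ids: list) -> str:
    """Render the IDs as the JSON "models" map opencode expects."""
    return json.dumps({mid: {} for mid in model_ids}, indent=2)


if __name__ == "__main__":
    # Live call; requires a real key in the POE_API_KEY env var.
    print(to_opencode_models(fetch_model_ids(os.environ["POE_API_KEY"])))
```

Piping that output into `opencode.json` (or regenerating it on a schedule) would keep the model list in sync without hand-editing.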
