
What I learned using AI enough to write 4,000 novels

7 min read
Wesley Phillips
Systems Thinker & Builder
Claude
AI Writing Assistant

I recently checked my OpenRouter stats and discovered something surprising: I've used more credits than 98% of people on the platform. Not because I run an AI company or manage a team of engineers. Just me, using AI to augment my daily life.

Over the past year, I've accumulated hundreds of millions of tokens across different models as I upgraded, experimented, and built systems that work. That's roughly the equivalent of 4,000 novels' worth of text. All this testing and development has taught me a great deal about how AI models work together, what each one is good at, and what it really takes to build reliable AI systems.

The Systems I Built

Using more credits than 98% of people isn't about running massive queries for no reason. It's about building systems that compound value every single day. Here's what's generating all those tokens:

A Daily Newsletter That Actually Gets Read: Every morning, I process multiple newsletters, my personal notes from the previous day, and market data into a single coherent briefing. It's personalized to my interests and includes perspectives from a "council" of AI advisors who challenge my thinking. Six months in, I've used it every single day, and it has become an essential part of how I start my morning.
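To make that concrete, here's a minimal sketch of what this kind of pipeline looks like against OpenRouter's OpenAI-compatible API. It's not my production code: the model IDs are placeholders, and the newsletters, notes, and market data are assumed to be plain strings you've already collected.

```python
import os
from openai import OpenAI

# OpenRouter speaks the OpenAI chat-completions protocol at this base URL.
client = OpenAI(base_url="https://openrouter.ai/api/v1",
                api_key=os.environ["OPENROUTER_API_KEY"])

def ask(model: str, prompt: str) -> str:
    """One prompt in, one answer out."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def morning_briefing(newsletters: list[str], notes: str, market_data: str) -> str:
    # A fast, cheap model condenses each newsletter before the expensive synthesis step.
    summaries = "\n\n".join(
        ask("google/gemini-flash-1.5", f"Summarize the key points:\n\n{n}")
        for n in newsletters
    )

    # A stronger model weaves the summaries, yesterday's notes, and market data together.
    briefing = ask(
        "anthropic/claude-3.5-sonnet",
        "Write me a personal morning briefing from these inputs.\n\n"
        f"Newsletter summaries:\n{summaries}\n\n"
        f"Yesterday's notes:\n{notes}\n\n"
        f"Market data:\n{market_data}",
    )

    # The "council" pass: a second opinion that challenges the briefing's assumptions.
    critique = ask(
        "openai/gpt-4o",
        f"Play devil's advocate. What does this briefing get wrong or miss?\n\n{briefing}",
    )
    return f"{briefing}\n\n--- Council notes ---\n{critique}"
```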

A Trading Recommendation Engine: This is by far my heaviest usage, with around 80% of my credits going here. The system has multiple layers that process price data, news feeds, and sentiment to generate the recommendations I rely on every trading day. Getting this to work well required extensive testing with real credits, which taught me an important lesson: AI development is not cheap. You have to spend real credits testing and iterating until the system actually works reliably.
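The real engine is more involved than I can show here, but the layered shape looks roughly like this sketch, which reuses the ask() helper from the briefing example. The fetch_news, fetch_posts, and fetch_prices functions are hypothetical stand-ins for whatever data sources you wire in, and the model IDs are again placeholders.

```python
def daily_recommendation(ticker: str) -> str:
    # Layer 1: fast models condense the raw feeds (fetch_news, fetch_posts, and
    # fetch_prices are hypothetical helpers for your actual data sources).
    news_summary = ask(
        "google/gemini-flash-1.5",
        f"Summarize today's news about {ticker}:\n\n{fetch_news(ticker)}",
    )
    sentiment = ask(
        "x-ai/grok-beta",
        f"Summarize the sentiment in these posts about {ticker}:\n\n{fetch_posts(ticker)}",
    )

    # Layer 2: a stronger model reasons over the condensed inputs plus raw price data.
    return ask(
        "anthropic/claude-3.5-sonnet",
        "Given the price history, news summary, and sentiment below, suggest a position "
        "for today and explain the reasoning and the risks.\n\n"
        f"Prices:\n{fetch_prices(ticker)}\n\nNews:\n{news_summary}\n\nSentiment:\n{sentiment}",
    )
```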

A Custom Intelligence Chatbot: I've built a chatbot that integrates my task lists, calendar, web search capabilities, and personal knowledge base. Because it knows my context, it responds the way I want it to. The key advantage is that you can test and tweak a custom bot far more than any off-the-shelf assistant, making it truly personalized to how you work.
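A stripped-down version of the idea, reusing the client from the first sketch: the personalization comes from stuffing your own context into the system prompt, which is exactly the part you can keep tweaking. The get_tasks, get_calendar, and search_notes helpers are hypothetical placeholders for whatever tools you integrate.

```python
def personal_chat(question: str, history: list[dict]) -> str:
    # Hypothetical helpers; in practice they'd read a task manager, a calendar API,
    # and a local notes database.
    context = (
        f"Today's tasks:\n{get_tasks()}\n\n"
        f"Calendar:\n{get_calendar()}\n\n"
        f"Relevant notes:\n{search_notes(question)}"
    )
    messages = (
        [{"role": "system",
          "content": "You are my personal assistant. Ground every answer in this context:\n\n" + context}]
        + history
        + [{"role": "user", "content": question}]
    )
    resp = client.chat.completions.create(
        model="anthropic/claude-3.5-sonnet",
        messages=messages,
    )
    return resp.choices[0].message.content
```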

Note Summarization and Automation: I've built a custom voice capture tool that I use separately from OpenRouter, since OpenRouter only handles text at the moment. This tool extracts and sends data to the OpenRouter APIs for collection and structuring. Throughout the day, I capture thoughts, meeting notes, article highlights, and random ideas. As I enter these notes, an automated system summarizes them, categorizes them, identifies action items, extracts key insights, and creates connections between related ideas. What used to be a messy collection of fragmented thoughts becomes a structured knowledge base I can actually use. These summaries feed directly into my morning briefing, creating a continuous loop where yesterday's insights inform today's thinking.
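The structuring step is the simplest of the four systems: each captured note goes through a prompt that asks for a fixed JSON shape. This sketch reuses the ask() helper from earlier; the field names are just the ones I'd pick for an illustration, not a spec.

```python
import json

NOTE_PROMPT = """Return only JSON with these keys:
summary, category, action_items, key_insights, related_topics.

Note:
{note}"""

def structure_note(raw_note: str) -> dict:
    # A fast model is usually enough for extraction; in practice you'd validate
    # the result and retry on malformed JSON.
    reply = ask("google/gemini-flash-1.5", NOTE_PROMPT.format(note=raw_note))
    return json.loads(reply)
```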

Each of these systems runs daily. Each one processes significant amounts of data. And together, they explain how one person ends up using more credits than 98% of people on the platform.

Different Models for Different Jobs

Different AI models are good at different things, and I've learned to orchestrate them rather than rely on just one.

Gemini Flash is incredibly fast and efficient for high-volume tasks. I use it for the heavy lifting: processing multiple newsletters, summarizing large documents, and handling repetitive analysis. It's not always the most nuanced, but for quick throughput and cost-efficiency, it's unmatched. I've also used Gemini Pro for tasks requiring large context windows and advanced intelligence.

Claude Sonnet is my number one choice for extremely important tasks. When I need something written well, when I need careful analysis of complex situations, or when I need AI that truly understands context and subtext, this is where I go. When quality, reasoning, and nuance matter most, this is where I spend my credits.

Grok brings a different tone than the other models. Its exclusive access to X (Twitter) data gives it an edge in certain areas, especially financial markets, where many participants tweet their ideas and claims. This makes it particularly valuable for sentiment analysis and tracking market narratives.

GPT is solid and handles everyday tasks well, with a distinctive voice that sets the standard others follow. Still, I prefer Claude's responses: they're more readable and human-like, and better in almost every respect.

I don't use one model for everything. I orchestrate them: fast models for processing, smart models for reasoning, current models for real-time data, each one playing to its strengths.
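In code, the orchestration amounts to little more than a routing table in front of the ask() helper from the earlier sketch. The categories and model IDs below are my own shorthand for illustration, not a definitive mapping.

```python
# Illustrative routing table: categories and model IDs are placeholders, not a recommendation.
MODEL_FOR_TASK = {
    "bulk_processing":  "google/gemini-flash-1.5",     # fast, cheap throughput
    "careful_writing":  "anthropic/claude-3.5-sonnet",  # quality, reasoning, nuance
    "market_sentiment": "x-ai/grok-beta",               # real-time social data
    "general":          "openai/gpt-4o",                # solid default
}

def route(task_type: str, prompt: str) -> str:
    # Fall back to the general-purpose model for anything unrecognized.
    model = MODEL_FOR_TASK.get(task_type, MODEL_FOR_TASK["general"])
    return ask(model, prompt)
```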

Why This Compounds

What separates casual AI usage from using more credits than 98% of people isn't running a few big queries. It's building systems that generate value every single day.

My morning briefing alone has probably saved me 150+ hours over six months while making me better informed. My trading engine processes information I couldn't possibly analyze manually. My chatbot eliminates friction from dozens of daily decisions.

This is compounding automation. Each day builds on the last. I update the systems over time to make them better. Each interaction is more valuable because of all the previous ones.

The Environmental Cost

Seeing these stats made me realize something uncomfortable: AI usage has an environmental cost, and using this many credits means I have a responsibility to acknowledge it.

Running AI models requires massive computational power. Those token counts represent real electricity consumption, real carbon emissions, and real environmental impact. The more I use these tools, the more I benefit from them, the more I need to be honest about that cost.

I've started researching carbon offset options, specifically looking at Offset AI, a service designed to offset the environmental impact of AI usage. I'm not sure yet if carbon offsets are the complete solution, but I know ignoring the problem isn't an option.

If we're going to build lives augmented by AI, if we're going to use it this extensively, we need to think about sustainability. Not as an afterthought, but as a core part of how we use these tools.

What I've Learned

I do feel like I'm at the edge of something, and that excites me. I'm just one person figuring this out as I go. These tools are fundamentally changing how we work and live, and I want to be on the front edge of understanding and building with them.

I didn't set out to use this many credits. But as I kept going, the use cases came to me. These tools let me create things I never could have built on my own. It's like managing a team, each AI with its own voice and skill set: you assign the right model to the right job and build systems that work for you while you sleep, faster than any human could.

The biggest lessons are that using the right model for the right task matters (efficiency and quality both count), that you need to build systems that create real value rather than just burning credits, and that you have to be honest about the environmental cost. AI development isn't cheap, and you have to use real credits to test and iterate until things actually work.

The token usage is just a byproduct of building systems that actually work. If you're thinking about how to use AI more effectively, focus on building systems that compound value over time. Find the tasks you do every day that could be augmented. Pick the right models for the right jobs. And be thoughtful about the resources you're using.


What systems could you build that would generate value every single day? What would your life look like if you had AI tools perfectly tuned to how you think and work? And how can we do this responsibly?

Let me know what you think. I'm always interested in how others are using AI to augment their lives.