cameron pfiffer

I've been paying a lot of attention to generative AI stuff, mostly because of my work on @comindco. I read a lot, watch a lot of videos, I'm on an unsustainable number of Discord servers, etc.

But I'm also tapering off. I've noticed that there are a few broad categories of AI that are worth paying attention to from the perspective of a builder, and many that are not.

Here's my breakdown:

  1. Technical tools are my primary focus. This is the big one. Stuff like LangChain, LlamaIndex, Ollama, .txt, MemGPT, etc. are all EXTREMELY practical. Agent workflows and the core infrastructure are where attention pays off the most.

  2. Boring old stuff is very important: databases, distributed computation, containers, inference-as-a-service. This stuff is not the new hotness, with perhaps the notable exception of vector databases. But it remains easily the most important part of whatever you're building.

  3. Models are not worth following. Models are basically commodities. There are oodles of free models, big crazy models like Opus/GPT-4, etc. Paying attention to these is fun but not really useful, since you can basically just swap models whenever you want. A lot of people's attention goes here because it is fun and novel, but it's not really worth your energy if you're trying to build something, because they all mostly do the same thing.
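The "models are commodities" point is easy to enforce in your own code: keep the model behind one thin call site, and swapping becomes a config change rather than a rewrite. Here's a minimal sketch of that idea. The backend names and functions are hypothetical stand-ins, not real provider SDK calls:

```python
from typing import Callable, Dict

# Each "backend" is just a function from prompt -> completion. In practice
# these would wrap real SDK calls (a hosted API, a local Ollama server,
# etc.), but the calling code never needs to know which one it got.
def fake_hosted_model(prompt: str) -> str:
    return f"[hosted-model answer to: {prompt}]"

def fake_local_model(prompt: str) -> str:
    return f"[local-model answer to: {prompt}]"

BACKENDS: Dict[str, Callable[[str], str]] = {
    "big-hosted-model": fake_hosted_model,
    "small-local-model": fake_local_model,
}

def complete(prompt: str, model: str = "big-hosted-model") -> str:
    """Route a prompt to whichever backend the config names."""
    return BACKENDS[model](prompt)

# Swapping models is a one-string change, not an architectural commitment:
print(complete("What is Comind?"))
print(complete("What is Comind?", model="small-local-model"))
```

The point isn't the toy functions; it's that the rest of the app only ever calls `complete`, so chasing model releases stops mattering.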

  4. Random tech demos. These are cool and inspiring but often not useful. I spend a small amount of attention on these just because motivation is important as a solo founder, but they're rarely practical as things that could actually make it into Comind. At least not now, while the project is still in a relatively early phase.

– Cameron

Website built with Franklin.jl and the Julia programming language.