Over a decade, I shipped multiple product lines and made zero sales calls. The credibility came from the company I was working for. Trimble sold the clocks. Astera Labs sold the modules. My job was to make sure the product worked when it shipped. The feedback loop was clean. Ship, run, renew.
Then I found myself building an AI tool for a problem I was facing. myDamnVoice. It captures how you actually write. The patterns, the cadence, the linguistic fingerprint that makes your work recognizable before anyone checks the byline. It works. And I have absolutely no muscle for telling strangers it exists.
The marketing problem looks completely different.
Here's the thing nobody tells you about a decade in infrastructure. The firmware was trusted because Trimble was trusted. I was building under someone else's reputation the whole time and had no idea.
myDamnVoice is trusted because I say it is. Until someone uses it and agrees.
That gap is where marketing actually lives. It's a completely different engineering problem than anything I've worked on before.
In firmware, the spec sheet is the market. Hit the tolerance, ship the hardware, done. The customer is a procurement engineer with a checklist. They are not asking who wrote the code. They are asking whether the part holds the spec across temperature, voltage, and time. If it does, they buy. If it does not, they do not. The signal is binary. The vocabulary is shared.
In consumer tools, the market is a person who had three other tabs open when they landed on your page. They are not reading the spec. They are deciding in eight seconds whether you are worth the next eight. The signal is not binary. It is a noisy weighted average of every other tool they have tried, every claim that turned out to be a demo, every onboarding flow that asked for too much before showing anything. You are being graded against products you have never seen.
The other shift is the loop itself. In firmware, the feedback comes from the field. A bug report, a returned unit, a thermal failure under load. Slow but unambiguous. In a consumer tool, the feedback comes from people who never told you they tried it. They opened the page, scrolled to the demo, did not sign up, left. There is no return ticket. There is just an analytics row that says someone who looked like your target user did not convert.
So the engineering question is what you build to compress that loop. Not what you build to ship the product. That part is the easy part for a builder. What do you build to learn faster about the people who left?
This post is the experiment. I'm figuring out how to sell in public, starting now.
What that looks like in practice. I'm going to keep writing about the specific things I get wrong, the specific things I try, what moves and what doesn't. Not as a content strategy. As a feedback loop substitute. If I can't watch users, I can at least make my thinking legible enough that the right ones write back.
The credibility I had at Trimble and Astera Labs was earned across a decade of infrastructure that ships and stays shipped. I am not going to recreate that overnight by saying I did. The only thing that compounds the same way for a product under my own name is the same loop. Ship, run, renew. Different vocabulary. Same problem.
This is post one of figuring out the vocabulary.