Pete Sena
1 min read · Mar 18, 2023


Appreciate you reading and commenting — I'm excited to see it sparking conversation. A main point I was trying to illustrate here is the vast possibilities these tools can unlock when we get curious and start asking them better questions. While these examples showcased what might seem like generic output, the tools are capable of so much more. If you want to connect offline, I'd be happy to show you a ton of things that would blow your mind — for example, taking the synthesis from a live user interview and outputting key insights, additional detailed follow-up questions, and a synthesized capture of the findings.

Beyond that, by using the fine-tuning capabilities via their API (as opposed to the generic chat interface), you can train on your own mental models, voice, tone, and more to refine the output. For example, I fed tweets I had written into a fine-tuning model and started to see differences in the quality and the specific ways it communicated. The expanded multimodal capabilities of GPT-4 mean the future of product design will never be the same. I'm excited — how about you?
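As a rough sketch of the tweet fine-tuning step: the 2023-era OpenAI fine-tuning endpoint took JSONL files of prompt/completion pairs, so the first job is converting your tweets into that shape. The tweets, prompt wording, and file name below are illustrative, not from the original exchange:

```python
import json

# Illustrative sample tweets -- in practice these would be exported
# from your own account.
tweets = [
    "Design demand, don't chase it.",
    "Curiosity is the best prompt-engineering tool.",
]

def to_training_records(tweets, prompt="Write a tweet in my voice:"):
    """Convert raw tweets into the prompt/completion JSONL records the
    legacy OpenAI fine-tuning endpoint expected. The '###' separator and
    ' END' stop token follow the conventions from OpenAI's own
    fine-tuning data-preparation guidance."""
    return [
        {"prompt": f"{prompt}\n\n###\n\n", "completion": f" {t} END"}
        for t in tweets
    ]

records = to_training_records(tweets)
with open("tweets_finetune.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

# Uploading the file and starting the job (2023-era SDK, sketch only):
#   file = openai.File.create(
#       file=open("tweets_finetune.jsonl", "rb"), purpose="fine-tune")
#   openai.FineTune.create(training_file=file.id, model="davinci")
```

Once the job finishes, you call the resulting model ID instead of the base model, and the completions start to pick up your phrasing.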


I help founder-led businesses design demand 📈 ⚡️3X Founder / Operator / Investor | Ready to design demand for your brand? 👇 linktr.ee/Petesena