OpenAI's latest model in the family, o1, is expected to be more powerful and have better reasoning capabilities than previous models.
Prompting o1 is a bit different from prompting GPT-4 and GPT-4o. Because the model does more reasoning on its own, some of the usual prompt engineering techniques don't work as well. Previous models needed more guidance, and people took advantage of longer context windows to give the model more instructions.
According to OpenAI's API documentation, o1 models “work best with simple prompts,” while techniques such as few-shot prompting or instructing the model to think step by step “may not improve performance and may in some cases hinder performance.”
OpenAI advised o1 users to keep four points in mind when prompting the new models (a minimal example follows the list):
- Keep prompts simple and direct, and don't over-guide the model, because it understands instructions well on its own.
- Avoid chain-of-thought prompts, since o1 already performs this reasoning internally.
- Use delimiters such as triple quotes, XML tags, and section titles to make it clear which part of the input the model should interpret.
- Limit additional context in retrieval-augmented generation (RAG) tasks: OpenAI said that adding more context or documents can overcomplicate the model's responses.
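To make that advice concrete, here is a minimal sketch of a prompt that follows those four points, written against the official openai Python SDK. The "o1-preview" model name and the example passage are assumptions for illustration, not taken from OpenAI's documentation; substitute whichever o1 variant your account can access.

```python
# A minimal sketch of an o1-style prompt: simple and direct, delimited context,
# no chain-of-thought instructions, and only the most relevant retrieved passage
# instead of a full document dump.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# RAG advice: include only the passage judged most relevant, not everything retrieved.
retrieved_passage = "Quarterly revenue grew 12% year over year, driven by subscription renewals."

prompt = f"""Summarize the key financial takeaway for an executive audience.

<context>
{retrieved_passage}
</context>"""

response = client.chat.completions.create(
    model="o1-preview",  # assumption: swap in the o1 model available to you
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

Note what is absent: there is no “think step by step” instruction and no long preamble of guidance; the delimiters alone tell the model which part of the input is context.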
OpenAI's advice for o1 differs sharply from its earlier prompt engineering guidance. The company had previously recommended being very specific with its models, adding detail and providing step-by-step instructions, but o1 performs better when it “thinks” for itself about how to solve a query.
Ethan Mollick, a professor at the University of Pennsylvania's Wharton School, wrote in a blog post that his experience as an early o1 user showed the model works best for tasks that require planning, where it figures out how to solve the problem on its own.
Prompt engineering and easier model guidance
Of course, prompt engineering has become a way for people to dig into the details to get the responses they need from AI models – it’s not only a critical skill, but also a growing job category.
Other AI developers have released tools to make it easier to create prompts when designing AI applications: Google released Prompt Poet, built in partnership with Character.ai, which integrates external data sources to make responses more relevant.
Because o1 is still new, people are still figuring out exactly how to use it (myself included; I haven't landed on the right first prompt yet), but some social media users predict they'll have to change how they approach prompting ChatGPT.