Part 8/9:
Additionally, Shapiro emphasizes that GPT models already contain vast amounts of embedded knowledge. For instance, asking the model for OSHA regulation numbers often returns accurate citations even without spelling out the regulation's scope in the prompt, thanks to the model's extensive training data.
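One way to picture this is a bare-bones API call that asks for a citation without supplying any regulation text. The sketch below is illustrative rather than taken from the talk: the OpenAI Python client, the model name, and the exact question are all assumptions.

```python
# Illustrative sketch (not from the talk): ask a chat model for an OSHA
# citation without providing the regulation text, relying only on knowledge
# absorbed during training. Model name and prompt wording are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical choice; any capable chat model works
    messages=[
        {
            "role": "user",
            "content": (
                "Which OSHA regulation number covers fall protection "
                "in general industry? Answer with the citation only."
            ),
        }
    ],
    temperature=0,  # keep the answer deterministic for a factual lookup
)

print(response.choices[0].message.content)
# Embedded knowledge can still be wrong or outdated, so citations retrieved
# this way should be checked against the official source before use.
```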
Final Thoughts and Future Directions
Wrapping up, Shapiro acknowledges the limits of embedding multiple variables, such as tone or audience, in a single prompt. The model may not honor every instruction at once, which underscores the value of keeping prompts straightforward and testing them iteratively, as the sketch below suggests.
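As a rough illustration of why such prompts become brittle, the hypothetical template below packs tone, audience, and length into one instruction. None of the variable names or example values come from the talk; they only show how quickly a prompt accumulates instructions the model may not all follow, and how testing one variation at a time keeps the behavior inspectable.

```python
# Hypothetical sketch: a prompt template that embeds several variables at once.
# Every added slot is another instruction the model might ignore or blend.

TEMPLATE = (
    "Summarize the following incident report.\n"
    "Tone: {tone}\n"
    "Audience: {audience}\n"
    "Length: {length}\n\n"
    "Report:\n{report}"
)

def build_prompt(report: str, tone: str, audience: str, length: str) -> str:
    """Fill the template; each variable is one more instruction to verify."""
    return TEMPLATE.format(report=report, tone=tone, audience=audience, length=length)

# Iterative testing in the spirit of the talk: vary one instruction at a time
# and inspect the output, rather than trusting the model to balance them all.
variations = [
    {"tone": "formal", "audience": "safety officers", "length": "3 sentences"},
    {"tone": "plain-spoken", "audience": "new hires", "length": "3 sentences"},
]
for v in variations:
    print(build_prompt("Worker slipped on an unmarked wet floor.", **v))
    print("---")
```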