Early in my career, I once ran into an issue I could not solve. I asked an expert on my team to help fix it. He did not know how to resolve it either, so he did a Google search, got the results, and fixed the issue.
After he left, I thought: I had also searched Google, but could not get the right results. Maybe I needed to learn how to search correctly. I read a few articles about searching Google more effectively, but it did not make much difference. (Back then, courses on every conceivable topic were not as common as they are now.)
It took me quite some time to understand that there was nothing to learn about "Google Search" itself; I needed to learn the domain.
Whenever I see Prompt Engineering courses for ChatGPT, I feel exactly the same way.
Even though neither the expert nor I knew the answer to the issue, we were not at the same level. He knew everything around the issue except the issue itself, whereas I knew nothing.
When somebody knows everything around an issue, often all they need is an approach to point them in the right direction. Sometimes even a hint is enough. But for someone who knows nothing about the domain, those pointers are not enough.
The way an expert asks a question is very different from the way a novice does. A novice's question tends to be generic, while an expert's is very specific. Even if both ask the exact same question, they process the same results very differently: as soon as the expert gets a hint or an approach, he or she starts working on it, whereas the same hint leaves the novice stuck.
Whenever I asked a question in a domain where I had decent knowledge, I could use ChatGPT's reply easily and solve the problem quickly.
For questions where I had no domain knowledge, the experience was pretty horrible. For a few problems, it took me months to reach the optimal solution. [During those months, I was learning the domain.]
In general, I do not have a high opinion of Prompt Engineering courses. But if I were to offer one piece of advice, it would be this: treat ChatGPT like a very senior engineer in your organization. When I was a fresher, my organization at the time gave all the freshers a session on how to get help and how to ask questions.
If you are going to email a senior employee (probably in another timezone) asking for help, sending just a message like "That code is not working" is not enough.
One should give the full context of the problem. What is the issue? In which cases does it occur, and in which cases does it not? What steps have you tried? How did the behavior change when you tried those steps? A concrete example of the difference is below.
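For instance, instead of "That code is not working", a useful request might read something like: "The nightly export job fails with a timeout, but only when the input file is over 1 GB; smaller files finish fine. I increased the timeout from 30 to 120 seconds, and the job ran longer but still failed at the same step. What should I look at next?" (The specifics here are invented for illustration; the point is the structure: symptom, conditions, attempts, and observed changes.) The same structure works just as well in a ChatGPT prompt.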
Without full details, it is difficult for anyone to respond. When someone asks me for help without providing details, I generally ask all of these questions first to get the context, and then I help.
Unfortunately, as of today, ChatGPT does not ask clarifying questions back. It either assumes a few things (which may lead to an incorrect solution) or gives a solution for every possible assumption (which can be overwhelming).
Also, if we ask ChatGPT multiple independent questions in a single prompt, the answers are not as good as when we ask them separately. Of course, if the questions are related, they should be asked together to provide better context.
If you have absolutely zero knowledge of a domain and cannot get the correct solution from ChatGPT, I don't think a prompt engineering course will get you there. You will get the solution once you start understanding the domain (which you can learn from ChatGPT itself).