ChatGPT is not a search engine that returns results for a specific search. Instead, it creates "new content" by predicting the word most likely to come next (e.g., based on publicly available Internet sites as of 2022).
Large Language Models (LLMs), like ChatGPT, are designed to model human language. They use mathematical models to predict which word is most likely to come next based on what you are asking for.
Keep in mind: they don't think. They don't understand, read, choose, or give you the "best information." Sometimes it might feel or seem like they do, but this isn't how the technology works. They also won't tell you where the information they're pulling from comes from, or who is doing the work behind the scenes. Many, if not most, of these tools are unregulated and are shaped by how we all interact with them.
Note: CSUSB has not yet released an official policy regarding the use of AI tools like ChatGPT. It is recommended to avoid using these tools for any assignment, coursework, or research unless explicitly permitted by your instructor. Unauthorized use of AI tools can be considered a violation of academic integrity, as submitting work generated by AI tools as your own is a form of plagiarism. For theses, projects, and dissertations, written approval from your supervisor and supervisory committee is required if you plan to use any AI tools.
One of the main ways users interact with ChatGPT is to ask it a question or give it a prompt, and receive a quick answer.
How do they work?
Unlike a search engine, which searches and then gives results using information that already exists, Large Language Models (LLMs) make "new" content by predicting the word most likely to come next, based on a HUGE dataset -- publicly available Internet sites (which include racist sites, conspiracy sites, etc.) as of roughly 2022. They are designed to model human language and use mathematical models to predict which word is most likely to come next based on what you are asking for. Keep in mind -- they don't think. They do NOT understand, read, choose, or give you the "best information." Sometimes it might feel or seem like it, but this isn't how the technology works.
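To make "predicting the word most likely to come next" concrete, here is a deliberately tiny sketch in Python. It is not how ChatGPT is actually built (real LLMs use neural networks trained on billions of words), but it shows the core idea: count which word tends to follow which, then always pick the most likely continuation. The miniature "training data" below is made up for illustration.

```python
from collections import Counter, defaultdict

# Tiny made-up "training data" -- real LLMs learn from billions of words.
corpus = (
    "the library is open today . the library is closed on sunday . "
    "the library offers research help ."
).split()

# Count how often each word follows each other word (a simple bigram model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the word most likely to follow `word`, based only on the counts above."""
    candidates = next_word_counts.get(word)
    if not candidates:
        return None  # the toy model has never seen this word before
    return candidates.most_common(1)[0][0]

# Starting from "the", keep choosing the most likely next word.
word = "the"
sentence = [word]
for _ in range(4):
    word = predict_next(word)
    if word is None:
        break
    sentence.append(word)

print(" ".join(sentence))  # e.g. "the library is open today"
```

The output sounds fluent because it follows common word patterns, not because the program "knows" whether the library is actually open. That is the same reason ChatGPT can produce confident-sounding answers that are wrong.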
Prompts are what you write into the tool to try to get it to do what you want. Better prompts can help you get better outputs. These tools need very specific instructions, and they need you to verify and critically evaluate the information or output they give you (see the sketch below for a vague prompt compared with a more specific one). Learn more about prompts or take a course like Prompt Engineering for ChatGPT.
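As a rough illustration of why specific prompts tend to work better, here is a short sketch that sends the same question two ways using OpenAI's Python library. The model name, the API key setup, and the prompt wording are assumptions made for this example; you can compare prompts just as easily by typing them into the regular ChatGPT chat window.

```python
# Sketch only: assumes the `openai` package is installed and the
# OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

vague_prompt = "Tell me about climate change."

specific_prompt = (
    "I'm a first-year college student writing a 2-page paper. "
    "Summarize three major effects of climate change on coastal cities, "
    "in plain language, with one example for each effect."
)

for label, prompt in [("Vague", vague_prompt), ("Specific", specific_prompt)]:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; use whichever model you have access to
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {label} prompt ---")
    print(response.choices[0].message.content[:300], "...\n")
```

However you send it, the specific prompt gives the tool clearer constraints (audience, length, format), which usually produces output that needs less cleanup -- but you still have to verify whatever comes back.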