FAQ
In an attempt to be a helpful assistant, SciSeek can occasionally produce responses that are incorrect or misleading.
This is known as "hallucinating" information, and it's a byproduct of current limitations of frontier generative AI models like SciSeek. For example, in some subject areas SciSeek may not have been trained on the most up-to-date information and can get confused when prompted about current events. SciSeek can also produce quotes that look authoritative or sound convincing but are not grounded in fact. In other words, SciSeek may write things that appear correct yet are entirely wrong.
Users should not rely on SciSeek as their sole source of truth and should carefully scrutinize any high-stakes advice it gives.
When working with web search results, users should review SciSeek's cited sources. Original websites may contain important context or details not included in SciSeek's synthesis. Because the quality of SciSeek's responses depends on the underlying sources it references, checking the original content also helps identify information that may have been misinterpreted without its full context.
You can use the thumbs-down button to let us know if a particular response was unhelpful, or write to us via our contact page with your thoughts or suggestions.