re: Anyone with AI knowledge here?
Posted on 11/7/23 at 10:40 am to I Love Bama
Yes, it is possible. The cost-effective way to do it is to leverage an existing AI-enabled document search tool such as Azure Cognitive Search. Cognitive Search indexes your data and searches it using AI, but returns only the relevant documents.
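As a rough illustration of that retrieve-then-read flow: the index returns only the top hits, and you stitch those into a context block to send to the model instead of the whole corpus. This is a minimal sketch; the hit contents, the `content` field name, and the size budget are all made up for the example.

```python
# Sketch: combine only the relevant search hits into one context string.
# Field name "content" and the character budget are assumptions, not a
# real Azure Cognitive Search schema.

def build_context(hits: list[dict], max_chars: int = 4000) -> str:
    """Concatenate retrieved snippets until a size budget is reached,
    so only relevant material (not the whole corpus) goes to the model."""
    parts: list[str] = []
    used = 0
    for hit in hits:
        snippet = hit["content"]
        if used + len(snippet) > max_chars:
            break
        parts.append(snippet)
        used += len(snippet)
    return "\n---\n".join(parts)

# Pretend these came back from the search index, ranked by relevance.
hits = [{"content": "Doc A: widget max load is 50 kg."},
        {"content": "Doc B: widgets ship in crates of 10."}]
print(build_context(hits))
```

In a real setup the `hits` list would come from a search client query against your index; the point is that the model only ever sees the few documents the index judged relevant.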
To get an AI chat response, you can use either Azure's built-in OpenAI integration or call the OpenAI ChatGPT API directly. A call to that service needs to include the context, i.e. the "chat log" you see in ChatGPT. That is where you put the prompt (the question) along with the contents of the document.
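Assembling that "chat log" looks roughly like the sketch below. The document text, the question, and the model name are placeholders; the actual API call (shown commented out) needs the `openai` package and an API key.

```python
# Sketch: build the chat messages that carry the document as context plus
# the user's question. Document text and question are made-up examples.

def build_messages(document: str, question: str) -> list[dict]:
    """Assemble the 'chat log' the API expects: the document goes in as
    context, followed by the question about it."""
    return [
        {"role": "system",
         "content": "Answer using only the document below.\n\n" + document},
        {"role": "user", "content": question},
    ]

messages = build_messages("Widget spec: max load 50 kg.",
                          "What is the max load?")

# The actual call would look something like this (requires an API key,
# and the model name is an assumption):
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(model="gpt-4-turbo",
#                                        messages=messages)
# print(reply.choices[0].message.content)
```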
The result would be the ability to show ChatGPT a document and ask questions about it. As of now, at least for ChatGPT, this is the most cost-effective approach. Calls to the ChatGPT API cost around $0.01 per 1,000 tokens, which is roughly 750 words. Having it read 1,000 pages would cost several dollars, so doing that all at once wouldn't make much sense.
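The "several dollars" figure can be sanity-checked with back-of-the-envelope arithmetic. The words-per-page and words-per-token ratios below are rough assumptions, and the $0.01 per 1K tokens price is the one quoted in the post; check current OpenAI pricing before relying on it.

```python
# Rough cost estimate for feeding N pages through the chat API.
# All three constants are assumptions: ~500 words/page, ~0.75 words/token,
# and $0.01 per 1,000 input tokens (the price quoted above).

WORDS_PER_PAGE = 500
WORDS_PER_TOKEN = 0.75
PRICE_PER_1K_TOKENS = 0.01

def cost_for_pages(pages: int) -> float:
    """Estimated dollar cost to send `pages` pages of text as input."""
    tokens = pages * WORDS_PER_PAGE / WORDS_PER_TOKEN
    return tokens / 1000 * PRICE_PER_1K_TOKENS

print(round(cost_for_pages(1000), 2))  # → 6.67
```

So 1,000 pages comes out to roughly $6–7 per pass, which lines up with "several dollars" and explains why you would not resend the whole corpus with every question.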
Solutions like this already exist. If you follow this demo, you can set something like this up for yourself:
Azure Search + OpenAI
The only other way to train a model on new data is to have access to the actual model itself, i.e. to fine-tune it.