OpenAI created a GUI (graphical user interface) that allows average people to "educate a bot" with information they curate and give it custom instructions, so that in the end you have a custom GPT that you can share with other people or use for specific work.
This is something I have been waiting for, as I have a PDF collection on cannabis history that I feel would make a powerful bot for helping educate children (or anyone) about the history and cultural anthropology of cannabis.
One of the challenges to this project is that every GPT platform has a "limit" to the number of "tokens" it will accept in a single uploaded file. A token is the unit LLMs (large language models) use to break text into pieces for processing, and platforms cap the total tokens per upload to manage costs. In simplified terms, they limit the number of pages of a PDF you can upload. This is a huge limitation for a project that requires thousands of pages across several books.
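To get a feel for whether a book will fit under an upload cap, you can roughly estimate its token count before uploading. The sketch below uses the common rule of thumb that English text averages about four characters per token; this is an approximation, not the exact tokenizer any particular platform uses.

```python
# Rough token estimate for a chunk of extracted PDF text.
# The ~4 chars/token ratio is a rule of thumb for English prose,
# not an exact match for any specific model's tokenizer.
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Return an approximate token count for the given text."""
    return int(len(text) / chars_per_token)

# Example: a 500-page book at ~3,000 characters per page.
pages = 500
chars_per_page = 3_000
approx = estimate_tokens("x" * (pages * chars_per_page))
print(f"~{approx:,} tokens")  # ~375,000 tokens
```

If the estimate lands well over the platform's per-file limit, that tells you up front how many parts the PDF needs to be split into.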
I wanted to start with a single book that stands on its own as a tome of cannabis history. It is probably my favorite book, and if its contents were taught to children from a young age, the idea of cannabis prohibition and cannabis central-management would be snickered at.
I successfully cut the PDF "CANNABIS: EVOLUTION & ETHNO-BOTANY" into two parts and uploaded them both to the newly created GPT. I then gave it some "custom instructions" so that it would be less likely to "make things up" (hallucinate answers) and more likely to be accurate and succinct.
Below is a collection of videos I have crafted from the newly made GPT. On one hand, I am pressure-testing the bot; on the other, I am crafting a process for making compelling videos for adults about cannabis and cannabis history that mine gems from the book.
You should see the quality and content of the videos improve. I am not sure what my plan is for them yet: whether they should go on YT to be monetized, or whether there is some better way to profit from them.