TTI begins work on a custom LLM


The title is right! This week we selected a suitable open-source model to serve as the base for our own specialized fork: one we will not have to host ourselves and can easily connect to. Given our current expectations, we do not anticipate this model costing any more than OAI. While we won't be leveraging OAI for this process, we will continue to support OAI and Hugging Face models going forward.

