Integrating CryptoTalks with Your Software Development Environment

This guide shows you how to configure your development tools to use CryptoTalks as an OpenAI-compatible LLM service provider. If you're using a tool that expects to connect to OpenAI's LLM services, you can switch it over to CryptoTalks by updating its configuration settings. This guide walks through one such tool, but the process is similar for others.


Before you start, make sure you have:

- A CryptoTalks API key
- A development tool that supports custom OpenAI-compatible model providers

Step 1: Locate Your Configuration File

Most tools that allow customization of model providers store their settings in a configuration file; check your tool's documentation for the exact location.


Navigate to this file on your system.

Step 2: Edit the Configuration File

Open the config.json file in a text editor of your choice and add an entry like the following to the models list:

"models": [
  {
    "title": "CryptoTalks LLM Service",
    "provider": "openai",
    "model": "MODEL_NAME",
    "apiBase": "",
    "apiKey": "YOUR_API_KEY"
  }
]
MODEL_NAME is the name of the model you want to use; see the CryptoTalks documentation for the list of available models.

You'll need to modify the apiBase setting to point to the CryptoTalks API endpoint instead of OpenAI's servers.
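For context on what apiBase does: OpenAI-compatible servers conventionally expose the same route layout as OpenAI, so the tool appends paths like /chat/completions to whatever base URL you configure. A small sketch of that join (the helper is hypothetical, shown only to illustrate the convention):

```python
def chat_completions_url(api_base: str) -> str:
    """Join the configured apiBase with the standard chat-completions route."""
    # Trailing slashes are stripped so the join is predictable either way.
    return api_base.rstrip("/") + "/chat/completions"
```

This is why apiBase should point at the root of the API (often ending in /v1), not at a specific endpoint.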

apiKey is your CryptoTalks API key.
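Rather than hard-coding the key in config.json, some setups read it from an environment variable at startup. A short sketch, assuming a hypothetical CRYPTOTALKS_API_KEY variable name:

```python
import os

def get_api_key() -> str:
    """Read the CryptoTalks API key from the environment, failing loudly if absent."""
    key = os.environ.get("CRYPTOTALKS_API_KEY")
    if not key:
        raise RuntimeError("Set CRYPTOTALKS_API_KEY before starting your tool")
    return key
```

Keeping the key out of the config file makes it harder to accidentally commit it to version control.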

That's it! Have fun!