Integrating CryptoTalks with Your Software Development Environment

This guide shows you how to configure your development tools to use CryptoTalks.ai as an OpenAI-compatible LLM service provider. If you're using a tool like continue.dev that expects to connect to OpenAI's services, you can switch to CryptoTalks.ai simply by updating your configuration settings. The steps below walk through continue.dev, but the process is similar for other tools.

Prerequisites

Before you start, make sure you have:

- A CryptoTalks.ai account and API key
- continue.dev (or a similar OpenAI-compatible tool) installed and working

Step 1: Locate Your Configuration File

Most tools that allow customization of model providers will store their settings in a configuration file. For continue.dev, this is typically found at:

~/.continue/config.json

Navigate to this file on your system.
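
If you're not sure whether the file exists on your machine yet, here's a quick way to check from Python, using the path above:

from pathlib import Path

# ~/.continue/config.json is where continue.dev keeps its settings.
config_path = Path.home() / ".continue" / "config.json"
print(config_path, "exists" if config_path.exists() else "is missing")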

Step 2: Edit the Configuration File

Open the config.json file in a text editor of your choice and add an entry like the following to the "models" array:

{"models": [
    {
        "title": "CryptoTalks.ai LLM Service",
        "provider": "openai",
        "model": "MODEL_NAME", 
        "apiKey": "YOUR_CRYPTOTALKS_API_KEY",
        "apiBase": "https://cryptotalks.ai/v1"
    }
]}

In this entry:

- model is the ID of the model you want to use. You can find the list of available model IDs at https://cryptotalks.ai/v1/models (see the sketch below).
- apiKey is your API key from CryptoTalks.ai.
- apiBase is what points requests at CryptoTalks.ai instead of OpenAI's servers, so leave it set to https://cryptotalks.ai/v1.
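
Because the service is OpenAI-compatible, you can query that models endpoint with any OpenAI-style client. Here's a minimal sketch using the official openai Python package (v1 or later), assuming CryptoTalks.ai implements the standard /v1/models response format:

from openai import OpenAI  # pip install openai

# Reuse the same values as in config.json.
client = OpenAI(
    api_key="YOUR_CRYPTOTALKS_API_KEY",
    base_url="https://cryptotalks.ai/v1",  # same value as "apiBase"
)

# Print the ID of each available model; use one of these as MODEL_NAME.
for model in client.models.list():
    print(model.id)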

That's it! If you'd like to confirm everything works before relying on it inside your editor, you can send a test request straight to the endpoint.
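
This sketch again assumes the standard OpenAI chat completions endpoint; replace MODEL_NAME with one of the IDs printed above.

from openai import OpenAI  # pip install openai

client = OpenAI(
    api_key="YOUR_CRYPTOTALKS_API_KEY",
    base_url="https://cryptotalks.ai/v1",
)

# Send a one-off test message through CryptoTalks.ai.
response = client.chat.completions.create(
    model="MODEL_NAME",  # an ID from the models list above
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)

If you get a sensible reply back, your key, base URL, and model name are all correct. Have fun!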