Foundational Models

Introduction to the foundational models available and their use cases.

PropulsionAI offers a range of powerful foundational models that can be used as the starting point for your AI projects. These models have been pre-trained on vast datasets and are ready to be fine-tuned for your specific needs. Alternatively, if the pre-trained model meets your requirements, you can deploy it directly without any further fine-tuning.

Available Foundational Models

  1. Meta Llama Series

  2. Mistral AI Series

  3. Google Gemma Series

    • google/gemma-2-2b-it: A compact and efficient model with 2 billion parameters, suitable for lightweight applications and tasks.

    • google/gemma-2-9b-it: A more powerful model in the Gemma series, featuring 9 billion parameters for handling more complex tasks.

  4. Microsoft Phi Series

  5. Qwen Series

    • Qwen/Qwen2-0.5B-Instruct: A smaller but highly efficient model with 0.5 billion parameters, suitable for focused instruction-following tasks that require less computational power (used in the illustrative snippet after this list).
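
The identifiers above follow the Hugging Face org/model naming convention for the underlying open-weight models. As a quick, illustrative sketch (not the PropulsionAI workflow itself), the snippet below loads the smallest listed model, Qwen/Qwen2-0.5B-Instruct, locally with the Hugging Face transformers library and generates a short reply; it assumes transformers and PyTorch are installed.

    # Illustrative only: loads the underlying open-weight model directly with
    # Hugging Face transformers; fine-tuning and deployment on PropulsionAI
    # do not require running models locally.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/Qwen2-0.5B-Instruct"  # smallest model listed above

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Build a chat-formatted prompt and generate a short completion.
    messages = [{"role": "user", "content": "In one sentence, what is a foundational model?"}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output_ids = model.generate(input_ids, max_new_tokens=64)
    print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))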

Using Foundational Models

These foundational models provide you with a solid base for your AI projects. You can choose to:

  • Fine-tune: Customize the model to better suit your specific data and requirements by fine-tuning it with your datasets.

  • Deploy Directly: If the pre-trained model fits your use case, deploy it directly without any additional fine-tuning. This can save time and resources, especially for applications where the model's general capabilities are sufficient (see the sketch at the end of this section).

Whether you’re looking to fine-tune a model to achieve specific results or deploy a foundational model as-is, PropulsionAI provides the tools and flexibility to meet your needs.
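
As a rough sketch of the direct-deployment path, the snippet below sends a chat request to a deployed model over HTTP. The base URL, route, payload shape, and environment variable name are illustrative assumptions, not PropulsionAI's documented API; consult the API reference for the actual request format.

    import os
    import requests

    # All names below are placeholders; the real PropulsionAI endpoint,
    # route, and payload may differ. See the API reference.
    API_KEY = os.environ["PROPULSIONAI_API_KEY"]      # assumed env var name
    BASE_URL = "https://api.example.com/v1"           # placeholder base URL
    DEPLOYMENT = "my-gemma-deployment"                # hypothetical deployment name

    response = requests.post(
        f"{BASE_URL}/deployments/{DEPLOYMENT}/chat",  # hypothetical route
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "messages": [{"role": "user", "content": "Hello!"}],
            "max_tokens": 64,
        },
        timeout=30,
    )
    response.raise_for_status()
    print(response.json())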

Requesting a Model

If you need a specific model that isn’t currently offered as a foundational model, we’re here to help! You can request the addition of new models to the PropulsionAI platform by reaching out to our team.

We’re committed to continuously expanding our offerings to meet your needs, so don’t hesitate to reach out with your requests!
