
What is BYOM?

BYOM (Bring Your Own Model) is a GPTfy feature that lets your organization connect AI models of your choice—either prepackaged options like Azure OpenAI, Amazon Bedrock, and Google Vertex, or custom models you’ve purchased or host internally—directly within Salesforce.

With BYOM, you can leverage existing AI investments while benefiting from GPTfy’s seamless Salesforce integration capabilities.

Why is BYOM Important?

BYOM is important for several reasons:

  • Flexibility: Organizations can use AI models they’ve already invested in or have specific expertise with.
  • Customization: Companies can use specialized models that may be trained on industry-specific data or optimized for particular use cases.
  • Data Security: Organizations with strict compliance requirements can maintain control over their AI infrastructure while still leveraging GPTfy’s integration capabilities.
  • Cost Management: Companies that already have licenses for specific AI models can avoid duplicate costs.
  • Future-Proofing: As new AI models emerge, organizations can adopt them without waiting for GPTfy to include them in their package.

How is it integrated?

The AI Model record tells GPTfy which AI model to invoke when generating responses. Creating this record is a one-time activity.

Step 1: Navigate to the AI Model Section

  1. Open the Cockpit in GPTfy.
  2. Locate and click on the AI Model tile.
  3. Select the Create your own tile from the list of available models.


Step 2: Fill in your AI Model Details

After you select the Create your own tile, a window appears prompting you to enter the details of the new model. Fill in the following required fields:

  • Label: The name of your AI Model (e.g., “OpenAI GPT-4”).
  • Icon Source and Icon Details: You can choose between two options:
    • Static Resource: Upload your model’s logo as a static resource and enter its API name in the Icon Details field.
    • SLDS Icon: Select a standard Salesforce Lightning icon from the dropdown.
  • Sequence: A number that determines the order in which this tile will appear in the AI Model list.
  • Description: A short description that is displayed on the tile (e.g., “Used for generating chat summaries”).

After entering all the required details, continue to the next step to save and validate them.

Step 3: Validate and Save

Click the Save button to validate the entered details.

Once saved:

  • A new tile will appear in the AI Model section.
  • A custom metadata record under GPTfy Card Configurations will be created but remain inactive.


Ensure the list view is set to AI Model.
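If you prefer to confirm the GPTfy Card Configurations record from Apex instead of the UI, custom metadata types can be queried with SOQL. The sketch below is a minimal example; ccai__Card_Configuration__mdt is a hypothetical API name used only for illustration, so confirm the actual type name under Setup > Custom Metadata Types first.

// Anonymous Apex sketch: list GPTfy card configuration records.
// ccai__Card_Configuration__mdt is a hypothetical API name; confirm the actual
// custom metadata type name in Setup > Custom Metadata Types before running.
List<ccai__Card_Configuration__mdt> cards = [
    SELECT DeveloperName, MasterLabel
    FROM ccai__Card_Configuration__mdt
];
for (ccai__Card_Configuration__mdt card : cards) {
    System.debug(card.DeveloperName + ' : ' + card.MasterLabel);
}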


Step 4: Activate the Model

To activate the AI Model:

  1. Click on the newly created model tile.
  2. Navigate to the Connection Details tab.
  3. Provide the following information:
  • Model: Auto-filled by the system.
  • Platform: Picklist field to define the platform the model runs on.
  • Named Credential: Use a secure credential (e.g., GPTfy) included in the GPTfy package.
  • AI Technology: Select the appropriate AI provider.
  • Type: Specify if this is a Voice, Chat, or Standard model.
  • Top P: Sets the cumulative probability threshold (e.g., 0.75) to control output diversity.
  • Temperature: Adjusts response randomness: lower = more focused, higher = more creative.
  • Max Tokens: Limits the number of tokens in each response.


Click Save after entering the details.
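The values you enter here are stored on the underlying AI Connection record and passed to the model at run time; the sample processing class later in this article reads the same fields. If you want to double-check them from Apex, a quick query such as the sketch below can help. It assumes the record's Name matches the label you entered in Step 2, which may differ in your org.

// Anonymous Apex sketch: inspect the connection settings saved in Step 4.
// Object and field API names are taken from the sample classes later in this article;
// the Name filter ('OpenAI GPT-4') is only an example value.
ccai__AI_Connection__c conn = [
    SELECT Name, ccai__Top_P__c, ccai__Temperature__c, ccai__Max_Tokens__c, ccai__EndPoint_URL__c
    FROM ccai__AI_Connection__c
    WHERE Name = 'OpenAI GPT-4'
    LIMIT 1
];
System.debug('Top P: ' + conn.ccai__Top_P__c);
System.debug('Temperature: ' + conn.ccai__Temperature__c);
System.debug('Max Tokens: ' + conn.ccai__Max_Tokens__c);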

Step 5: Save and Activate the Model

After saving:

  1. Open the model tile again.
  2. Click Activate.


The system will validate your configuration. Once successful:

  • The model will be marked as enabled in GPTfy Card Configurations.
  • A green tick will appear on the model tile to confirm activation.


By following these steps, you can successfully create an AI Model in GPTfy.

Sample Connector Class

Note: The sample code here is provided ‘as-is’ for reference and informational purposes only. You can refer to this to understand how the model can be authenticated using a connector class.


global class GPTfyopenAIConnector implements ccai.AIAuthenticationInterface {
    // Builds the HTTP request that GPTfy sends to the model.
    // The callout: syntax delegates authentication to the GPTfy Named Credential,
    // so no keys or tokens are handled in code.
    global HttpRequest getAuthenticatedRequest(ccai__AI_Connection__c aiModel) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:GPTfy/api/gptfyall');
        return req;
    }
}
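Because the endpoint uses the callout: syntax, authentication is resolved by the Named Credential, so no secrets appear in code. For a custom model that expects an explicit API key header, a variant like the hypothetical sketch below could be used. It assumes a Named Credential (called OpenAI here) that stores the key in its password field and has Allow Merge Fields in HTTP Header enabled.

// Hypothetical variant: attach an API key stored on a Named Credential as a Bearer token.
// Assumes a Named Credential named 'OpenAI' with 'Allow Merge Fields in HTTP Header'
// enabled and the API key saved in its Password field.
global class SampleKeyedConnector implements ccai.AIAuthenticationInterface {
    global HttpRequest getAuthenticatedRequest(ccai__AI_Connection__c aiModel) {
        HttpRequest req = new HttpRequest();
        // The Named Credential supplies the base URL; the path comes from the AI Model record.
        req.setEndpoint('callout:OpenAI' + aiModel.ccai__EndPoint_URL__c);
        // Salesforce resolves the merge field at callout time, so the key is never hard-coded.
        req.setHeader('Authorization', 'Bearer {!$Credential.Password}');
        return req;
    }
}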

Sample Processing Class

Note: The sample code here is provided ‘as-is’ for reference and informational purposes only. You can refer to this to understand how the request is built and the model’s response is returned using a processing class.


global class SampleAIProcessingClass implements ccai.AIProcessingInterface {
    global HttpResponse getProcessedResponse(ccai__AI_Connection__c aiModel, ccai__AI_Prompt__c aiPrompt, String promptCommand, String extractedData) {
        // Combine the prompt command and the extracted Salesforce data into a single
        // Human/Assistant-style message.
        String message = 'Human: ' + promptCommand + ' ' + extractedData + ' \nAssistant:';

        // Build the request body from the values configured on the AI Model record
        // (Max Tokens, Temperature, Top P).
        Map<String, Object> reqBodyMap = new Map<String, Object>{
            'prompt' => message,
            'max_tokens_to_sample' => aiModel.ccai__Max_Tokens__c,
            'temperature' => aiModel.ccai__Temperature__c,
            'top_p' => aiModel.ccai__Top_P__c
        };
        String strBody = JSON.serialize(reqBodyMap);

        // Route the callout through the AWSBedrock Named Credential and append the
        // endpoint path stored on the AI Model record.
        HttpRequest req = new HttpRequest();
        String endPoint = 'callout:AWSBedrock' + aiModel.ccai__EndPoint_URL__c;
        req.setEndpoint(endPoint);

        req.setMethod('POST');
        req.setTimeout(120000);
        req.setHeader('Content-Type', 'application/json');
        req.setHeader('Accept', 'application/json');
        req.setBody(strBody);

        // Send the request and return the raw response for GPTfy to process.
        Http api = new Http();
        HttpResponse response = api.send(req);
        return response;
    }
}
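The processing class returns the raw HttpResponse; GPTfy handles the downstream parsing. If you need to inspect the body while testing, a small anonymous Apex sketch like the one below can help. It assumes an Anthropic-style Bedrock response where the generated text is returned under a completion key; other providers return different JSON shapes.

// Anonymous Apex sketch: call the processing class and inspect the raw response.
// Assumes an Anthropic-style Bedrock response with a 'completion' key; adjust the
// key for other providers.
ccai__AI_Connection__c aiModel = [
    SELECT ccai__Max_Tokens__c, ccai__Temperature__c, ccai__Top_P__c, ccai__EndPoint_URL__c
    FROM ccai__AI_Connection__c
    LIMIT 1
];
HttpResponse res = new SampleAIProcessingClass()
    .getProcessedResponse(aiModel, null, 'Summarize this record:', 'Account: Acme, Industry: Manufacturing');
Map<String, Object> body = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
System.debug('Status: ' + res.getStatusCode());
System.debug('Completion: ' + body.get('completion'));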