
Advanced Services Setup Guide

A guide to configuring AI providers (OpenAI, Anthropic, Google, Groq), cloud storage services, and external issue trackers. Supercharge your platform's capabilities by integrating LLMs for automated reasoning and AI-assisted workflows.

Last updated: 04/14/2026, 02:12 PM
<div class="features-wrapper">
  <h2>Supercharging System Intelligence</h2>
  <p>Advanced services let Orbitus go beyond raw data management by adding Large Language Models (LLMs) for automated tasks, cloud buckets for large-scale file storage, and issue-tracker synchronization for developers.</p>
  <hr>
  <h2>1. Artificial Intelligence (AI) Models</h2>
  <p>Orbitus supports four major AI providers to enable intelligent features. You can switch between providers depending on your preference for model reasoning capability, speed, or cost. Connect <strong>one</strong> provider to enable AI workflows globally.</p>
  <h3>Available AI Providers</h3>
  <ul>
    <li><strong>OpenAI:</strong> Uses GPT-4o by default. Excels at complex tool calling and logic.</li>
    <li><strong>Anthropic:</strong> Uses Claude 3.5 Sonnet. Best in class for writing nuance and large context windows.</li>
    <li><strong>Google AI:</strong> Uses Gemini 2.0 Flash. Exceptionally fast multimodal reasoning.</li>
    <li><strong>Groq:</strong> Runs open models (e.g., GPT-OSS) on LPU chips for sub-second inference speeds.</li>
  </ul>
  <h3>Step-by-Step Setup</h3>
  <ol>
    <li>Navigate to the API key dashboard for your chosen provider:
      <ul>
        <li><a href="https://platform.openai.com/api-keys" target="_blank">OpenAI API Keys</a></li>
        <li><a href="https://console.anthropic.com/settings/keys" target="_blank">Anthropic Console</a></li>
        <li><a href="https://aistudio.google.com/app/apikey" target="_blank">Google AI Studio</a></li>
        <li><a href="https://console.groq.com/keys" target="_blank">Groq Console</a></li>
      </ul>
    </li>
    <li>Generate a new API secret key. For OpenAI, for example, it begins with <code>sk-...</code></li>
    <li>In Orbitus, go to <strong>Integrations > AI Integration</strong>.</li>
    <li>Select your preferred provider from the top tab selector (e.g., "Anthropic").</li>
    <li>Paste the API key into the input field.</li>
    <li>Click <strong>Test Connection</strong>. The system dispatches a miniature prompt to the API to verify the key.</li>
    <li>Once you see the green <em>Connection Successful</em> alert, click <strong>Save Integration</strong>. The chosen LLM now powers your workspace.</li>
  </ol>
  <div class="usecase-example">
    <h3>Data Privacy Consideration</h3>
    <p>We strongly recommend configuring "Zero Data Retention" policies on your enterprise API accounts (particularly for OpenAI and Anthropic). Ensure your organization's API tier prevents the provider from using your customer data to train future public models.</p>
  </div>
  <hr>
  <h2>2. Cloud Storage Providers</h2>
  <p>If you prefer storing media, documents, and system backups in an external bucket rather than on local disks, connect an object storage provider.</p>
  <p><em>In Integrations > Storage, select the S3-Compatible protocol. You will need the <strong>Region</strong>, <strong>Bucket Name</strong>, <strong>Access Key ID</strong>, and <strong>Secret Access Key</strong> provided by AWS, DigitalOcean Spaces, Cloudflare R2, or a similar S3 provider.</em></p>
  <hr>
  <h2>3. Issue Tracker Connectors</h2>
  <p>Connect development tracking tools (Jira, Linear, or GitHub) to automatically sync support tickets with developer environments.</p>
  <p><em>In Integrations > Issue Tracker, enter your team's endpoint domain (e.g., <code>org.atlassian.net</code>) and an API bridging token. This allows Orbitus to convert escalated "Bug Report" tickets directly into Jira Epics.</em></p>
</div>
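The "Test Connection" step above amounts to sending a one-token prompt with the candidate key. As a rough sketch of what such a probe looks like for two of the providers (the `build_probe` helper and the model names are illustrative, not Orbitus internals; the endpoints and header formats are the providers' documented public ones):

```python
import json

# Public endpoints for a minimal "is this key valid?" probe.
# Model names are illustrative; substitute whatever your account supports.
PROVIDERS = {
    "openai": {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": lambda key: {"Authorization": f"Bearer {key}",
                                "Content-Type": "application/json"},
        "payload": {"model": "gpt-4o", "max_tokens": 1,
                    "messages": [{"role": "user", "content": "ping"}]},
    },
    "anthropic": {
        "url": "https://api.anthropic.com/v1/messages",
        "headers": lambda key: {"x-api-key": key,
                                "anthropic-version": "2023-06-01",
                                "Content-Type": "application/json"},
        "payload": {"model": "claude-3-5-sonnet-latest", "max_tokens": 1,
                    "messages": [{"role": "user", "content": "ping"}]},
    },
}

def build_probe(provider: str, api_key: str):
    """Return (url, headers, body) for a miniature one-token test prompt."""
    spec = PROVIDERS[provider]
    return spec["url"], spec["headers"](api_key), json.dumps(spec["payload"])
```

A 200 response to this request confirms the key; a 401 means the key was rejected, which is essentially what the green or red connection alert reports.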
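The four S3 settings listed in section 2 can be sanity-checked before saving. A minimal sketch, assuming nothing about Orbitus internals — the `S3Config` class is hypothetical and only shows which fields are required and which endpoint URL an S3-compatible client would derive from them:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class S3Config:
    region: str
    bucket: str
    access_key_id: str
    secret_access_key: str
    # Non-AWS S3-compatible providers (Spaces, R2) need an explicit endpoint.
    endpoint: Optional[str] = None

    def missing_fields(self) -> list:
        """Return names of required fields left blank (empty list means OK)."""
        return [f for f in ("region", "bucket",
                            "access_key_id", "secret_access_key")
                if not getattr(self, f)]

    def base_url(self) -> str:
        """Virtual-hosted-style URL an S3 client would address."""
        host = self.endpoint or f"s3.{self.region}.amazonaws.com"
        return f"https://{self.bucket}.{host}"
```

For DigitalOcean Spaces or Cloudflare R2, set `endpoint` to the host your provider's dashboard shows; for plain AWS S3 the regional default suffices.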
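For Jira specifically, the "API bridging token" in section 3 is a Jira Cloud API token used with basic authentication. A sketch of the request a connector might compose when escalating a ticket (the `jira_epic_request` helper is hypothetical; the REST v3 endpoint and payload shape follow Atlassian's public API):

```python
import base64
import json

def jira_epic_request(domain: str, email: str, api_token: str,
                      project_key: str, summary: str):
    """Compose a Jira Cloud REST v3 'create issue' request for an Epic."""
    url = f"https://{domain}/rest/api/3/issue"
    # Jira Cloud basic auth: base64("email:api_token")
    token = base64.b64encode(f"{email}:{api_token}".encode()).decode()
    headers = {"Authorization": f"Basic {token}",
               "Content-Type": "application/json"}
    payload = {"fields": {"project": {"key": project_key},
                          "summary": summary,
                          "issuetype": {"name": "Epic"}}}
    return url, headers, json.dumps(payload)
```

POSTing that body to the returned URL creates the Epic; the connector would then store the issue key from the response alongside the originating support ticket.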
