Product updates

Tailor your AI cloud with Ori AI Fabric’s granular service catalog

Learn how Ori AI Fabric transforms your service catalog into the cockpit of your AI cloud so you can design, price, and govern cloud services from one place.
Deepak Manoor
Posted: October 14, 2025

    If you operate an AI cloud, your product isn’t just infrastructure; it’s also the service catalog that defines it: GPU-based compute, ML services, model libraries, pricing, availability, storage, and networking options. This catalog is what your users experience every day. Think of catalog management as the cockpit of your business: a single place to design the offerings of your public or private cloud, set regional availability, define pricing, and manage your operational workflows. Adding new cloud services, updating pricing, managing user quotas, and distributing compute between services are among the many user-facing operations that catalog management enables.

    The ability to evolve your catalog quickly, enrich it with new services, and enhance how it is presented determines how well you meet the needs of AI builders in an industry advancing at unprecedented speed. Ori AI Fabric delivers a powerful, flexible catalog management system that lets you configure, price, and govern every element of your AI platform.

    What catalog management looks like with Ori AI Fabric

    Ori AI Fabric’s catalog management features help you deliver the flexibility and comprehensive cloud services needed to serve a broad range of AI builders, from startups and AI labs to enterprises and public sector organizations:

    • Turn into a one-stop AI cloud with GPU and ML Services: With GPU Instances, Supercomputers, Serverless Kubernetes, and Inference Endpoints, you can design an end-to-end private AI cloud for experimentation, training, and inference workloads. Ori Model Registry and Fine-tuning Studio help your users customize models effortlessly through tight integration with infrastructure services.
    • Customizable Model Library: Add popular open-source models that customers can deploy with one click and let users bring their own models which can be stored and versioned in Model Registry.
    • Fine-grained pricing control: With per-minute billing, the flexibility to offer both token-based and dedicated GPU inference, on-demand as well as committed-spend rates, and regional price books, you can compete effectively in the AI cloud market.
    • Onboard Third-party & Private Compute: One of the key strengths of Ori AI Fabric is how easy it is to add third-party capacity, helping you win and retain customers, especially when compute is needed at short notice. Conversely, you can also enable your customers to add their own compute to the fleet managed by Ori AI Fabric.
    • Comprehensive SKU Management: Add more capacity or introduce new GPU models and compute vendors easily with Ori’s automated capacity management.
    • Dynamic Service Allocation: Partition and rebalance your compute clusters between GPU Instances, Supercomputers, Kubernetes and Inference services effortlessly, so you can prioritize the services that generate the most value for you.
    • Approval and Reservation Workflows: Administrators can seamlessly manage quotas and approve limit increases. Similarly, they can also manage reservations to allocate compute nodes exclusively for organizations, which helps deliver strict multi-tenancy for sensitive workloads.
    • Init-Script Catalog: Customize GPUs and Supercomputers with init scripts that best serve your customers, from automated driver installation to pre-integrated frameworks such as PyTorch and TensorFlow. You can also manage your own library of custom init scripts.

    • Change Auditing: Log and monitor every user-generated event across all compute services, with details such as service type, resource status, user ID, organization ID, and the timeline of events.
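To make the pricing model above concrete, here is a minimal sketch of how per-minute billing with an optional committed-spend discount could be computed. The function name, rates, and discount figures are hypothetical illustrations, not Ori AI Fabric's actual API or pricing.

```python
# Hypothetical sketch of per-minute GPU billing with an optional
# committed-spend discount. All rates are invented examples.
def gpu_cost(minutes_used: int, per_minute_rate: float,
             committed_discount: float = 0.0) -> float:
    """Bill usage per minute, applying a committed-spend discount."""
    gross = minutes_used * per_minute_rate
    return round(gross * (1.0 - committed_discount), 2)

# On-demand: 90 minutes at an example rate of $0.05/min
on_demand = gpu_cost(90, 0.05)        # 4.5
# Committed spend: same usage with an example 20% discount
committed = gpu_cost(90, 0.05, 0.20)  # 3.6
```

Per-minute granularity means a user who runs a GPU for 90 minutes pays for exactly 90 minutes, rather than being rounded up to the next hour.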

    Build your own AI cloud with Ori AI Fabric, the platform that powers our cloud.

    License Ori AI Fabric

    Turn your service catalog into a growth driver

    Accelerate time-to-market: Ori AI Fabric gives you full control over how you use your compute, enabling you to pivot quickly towards market demand. As businesses race to bring AI models and apps to market faster, support for third-party compute, seamless onboarding, and faster provisioning help you win more opportunities.

    Delight your users with comprehensive services: Serve customers with diverse needs, from training to fine-tuning and inference, with services that are proven to work at scale. Designed for simplicity, Ori’s AI and ML services make it easier for you to deliver developer-friendly experiences across use cases.

    Reduce operational costs: Dynamic allocation helps you shift compute to the services that are the most profitable, and bin packing helps maximize GPU utilization. With simplified and automated catalog management, your CloudOps team can be more nimble and efficient.
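To illustrate how bin packing improves utilization, here is a minimal first-fit-decreasing sketch: workload GPU requests are placed onto the fewest nodes of fixed capacity. This is a generic illustration of the technique, not Ori AI Fabric's scheduler.

```python
# Hypothetical first-fit-decreasing bin packing: place workloads
# (GPU counts requested) onto nodes of fixed GPU capacity.
def pack(requests: list[int], node_capacity: int) -> list[list[int]]:
    nodes: list[list[int]] = []
    for req in sorted(requests, reverse=True):  # largest requests first
        for node in nodes:
            if sum(node) + req <= node_capacity:
                node.append(req)  # reuse an existing node
                break
        else:
            nodes.append([req])  # open a new node
    return nodes

# Six requests packed onto 8-GPU nodes: 3 nodes instead of 6
print(len(pack([5, 3, 4, 2, 6, 1], node_capacity=8)))  # 3
```

Packing six requests onto three 8-GPU nodes rather than dedicating one node per workload is the utilization gain the paragraph above refers to.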

    Be future-ready: A multi-service catalog and multi-vendor compute help you address a variety of use cases and large segments of the AI cloud market. With a catalog that you fully control, you can add compute and models to match the pace of the fast-evolving AI cloud and compute markets.

    Deploy your AI cloud with Ori AI Fabric

    For AI clouds, competitive advantage doesn’t come from raw capacity alone; it also depends on how precisely you design, price, and govern the catalog that exposes the capabilities of your infrastructure.

    Ori AI Fabric’s catalog management turns your AI cloud into a fully productized platform where every GPU, ML service, and model can be monitored and managed with precise control. It empowers platform operators to adapt instantly to market demand, onboard new capacity or services faster, and deliver a consistent, enterprise-grade experience to every customer. Speak with our team to see how you can power the future of AI with Ori AI.
