Fireworks provides open-source AI models optimized for various use cases, enabling rapid experimentation and deployment on the Fireworks Inference Cloud.

Sub-processors: 15
Headquarters: San Francisco, United States
Founded: 2021
Size: 201-1000
Market: B2B
Public: Private
Products: Open-source AI models, Fireworks Inference Cloud, Model lifecycle management, Enterprise-grade security, Real-time multimodal processing

All Sub-Processors

Name | Category
Amazon Web Services | Cloud Infrastructure
Anthropic | AI & Machine Learning
CloudFlare | Content & Localization
CoreWeave | Cloud Infrastructure
Crusoe | Cloud Infrastructure
Discord | Communication
Google Cloud Platform | Cloud Infrastructure
Lambda Labs | Cloud Infrastructure
Linear | Collaboration
Oracle Cloud Infrastructure | Cloud Infrastructure
Pylon | Customer Support
Slack | Collaboration
Vercel | Cloud Infrastructure
Voltage Park | Cloud Infrastructure
Vultr | Cloud Infrastructure

Data Processing Locations

United States: 11 sub-processors
United States, Japan: 1 sub-processor
Data center closest to the End User: 1 sub-processor
United States, Iceland: 1 sub-processor
United States, Japan, United Kingdom, Germany: 1 sub-processor

Frequently Asked Questions

How many sub-processors does Fireworks AI use?
Fireworks AI uses 15 sub-processors (third-party data processors) as disclosed on their public sub-processor page.
What are Fireworks AI's main sub-processors?
Fireworks AI's sub-processors include Amazon Web Services (cloud infrastructure and services), Anthropic (AI services), CloudFlare (content delivery network), CoreWeave (infrastructure provider), Crusoe (infrastructure provider), and 10 more.
Where does Fireworks AI process data?
Fireworks AI's sub-processors are located in the United States; the United States and Japan; the United States and Iceland; and the United States, Japan, the United Kingdom, and Germany. One sub-processor performs processing at the data center closest to the End User.