🎉 After 47 releases, Spice.ai OSS has reached production readiness with the 1.0-stable milestone!
The core runtime and features such as query federation, query acceleration, catalog integration, search, and AI inference have all graduated to stable status, along with key component graduations across data connectors, data accelerators, catalog connectors, and AI model providers.
- Stable Data Connectors: The following data connectors have graduated to Stable:
- Stable Data Accelerators: The following data accelerators have graduated to Stable:
- Unity Catalog Connector: Graduated to Stable.
- Databricks (`mode: spark_connect`) Data Connector: Graduated to Beta.
- Beta Catalog Connectors: The Iceberg and Databricks catalog connectors graduated to Beta.
- OpenAI Model & Embeddings Provider: Graduated to Release Candidate (RC).
- Alpha Model Providers: The Anthropic and xAI (Grok) model providers graduated to Alpha.
- Default Runtime Version: The CLI now installs the GPU-accelerated, AI-capable runtime by default (when supported) on `spice install` or `spice run`. To force-install the non-GPU version, run `spice install ai --cpu`.
- Default OpenAI Model: The default OpenAI model has been updated to `gpt-4o-mini`. A `spicepod.yaml` sketch for pinning a model explicitly is included after this list.
- Identifier Normalization: Unquoted identifiers such as table names are no longer normalized to lowercase; they now retain their exact case as provided. For example, an unquoted reference to `MyTable` now resolves to `MyTable` rather than `mytable`.
- Sandboxed Docker Image: The Runtime Docker Image now runs the `spiced` process as the `nobody` user in a minimal chroot sandbox.
- Insecure S3 and ABFS Endpoints: The S3 and ABFS connectors now enforce endpoint security checks, rejecting plain-HTTP endpoints unless `allow_http` is explicitly enabled. Refer to the documentation for details; a configuration sketch is included after this list.
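
For reference, a minimal `spicepod.yaml` sketch that pins the OpenAI model explicitly rather than relying on the new `gpt-4o-mini` default. The component name and the secret reference are illustrative placeholders, not prescribed values:

```yaml
models:
  # Pin the OpenAI model explicitly instead of relying on the runtime default.
  - name: assistant                # illustrative component name
    from: openai:gpt-4o-mini       # the new default; substitute another model to override it
    params:
      openai_api_key: ${ secrets:OPENAI_API_KEY }  # illustrative secret reference
```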
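
Similarly, a sketch of an S3 dataset that explicitly opts in to a plain-HTTP endpoint via `allow_http`. The bucket, path, and endpoint parameter are illustrative assumptions; consult the S3 Data Connector documentation for the exact parameter names:

```yaml
datasets:
  - from: s3://my-bucket/data/                  # illustrative bucket and path
    name: my_data
    params:
      s3_endpoint: http://minio.internal:9000   # plain-HTTP endpoint (illustrative)
      allow_http: true                          # must be explicitly enabled for HTTP endpoints
```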