
Snowflake Expands Capabilities for Enterprises to Deliver Trustworthy AI into Production

Nov 12, 2024 - financialpost.com
Snowflake, the AI Data Cloud company, has announced new advancements aimed at helping organizations easily and efficiently integrate AI into their operations. The new features will allow developers to build conversational apps for structured and unstructured data with high accuracy, run batch large language model (LLM) inference for natural language processing pipelines, and train custom models with GPU-powered containers. The advancements also include built-in governance, access controls, observability, and safety guardrails to ensure AI security and trust.
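To make the batch LLM inference idea concrete, such a job can be expressed as a single SQL statement that applies Snowflake's `SNOWFLAKE.CORTEX.COMPLETE` function to every row of a table. The sketch below only builds that statement; the table and column names (`support_tickets`, `ticket_text`) and the prompt are illustrative assumptions, not part of the announcement:

```python
def build_batch_summarize_sql(table: str, text_column: str,
                              model: str = "mistral-large") -> str:
    """Construct a SQL statement that runs LLM inference over every row of
    `table` via Snowflake's SNOWFLAKE.CORTEX.COMPLETE function.
    Table, column, and model names here are illustrative placeholders."""
    prompt_prefix = "Summarize the following text in one sentence: "
    return (
        f"SELECT {text_column}, "
        f"SNOWFLAKE.CORTEX.COMPLETE('{model}', "
        f"CONCAT('{prompt_prefix}', {text_column})) AS summary "
        f"FROM {table}"
    )

# The statement can then be executed with any Snowflake client session,
# e.g. session.sql(build_batch_summarize_sql("support_tickets", "ticket_text"))
print(build_batch_summarize_sql("support_tickets", "ticket_text"))
```

Because the function is applied set-wise in SQL, the warehouse handles parallelism across rows rather than the client looping over documents one at a time.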

The company also introduced more customization options for large batch text processing, enabling data teams to build natural language processing pipelines that run at high speed and scale. Snowflake ML now supports Container Runtime, which lets users efficiently execute distributed ML training jobs on GPUs, and the company unveiled Model Serving in Containers, enabling teams to deploy both internally and externally trained models to production on distributed CPUs or GPUs.

Key takeaways:

  • Snowflake has announced new advancements that accelerate the path for organizations to deliver easy, efficient, and trusted AI into production with their enterprise data.
  • With Snowflake’s latest innovations, developers can effortlessly build conversational apps for structured and unstructured data with high accuracy, efficiently run batch large language model (LLM) inference for natural language processing (NLP) pipelines, and train custom models with GPU-powered containers.
  • Snowflake is unveiling more customization options for large batch text processing, so data teams can build NLP pipelines with high processing speeds at scale, while optimizing for both cost and performance.
  • Snowflake ML now supports Container Runtime, enabling users to efficiently execute distributed ML training jobs on GPUs. This is a fully managed container environment accessible through Snowflake Notebooks and preconfigured with access to distributed processing on both CPUs and GPUs.
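The cost/performance trade-off mentioned for batch text processing ultimately comes down to how documents are grouped before inference: larger batches amortize per-call overhead, smaller ones reduce per-call latency. A minimal, tool-agnostic sketch of that grouping step (the character cap is an illustrative assumption, not a Snowflake parameter):

```python
from typing import Iterable, Iterator, List

def batch_documents(docs: Iterable[str],
                    max_batch_chars: int = 8000) -> Iterator[List[str]]:
    """Group documents into batches bounded by total character count.
    Larger batches favor cost (fewer calls); smaller batches favor
    latency. The 8000-character cap is an illustrative default."""
    batch: List[str] = []
    size = 0
    for doc in docs:
        # Start a new batch once adding this document would exceed the cap.
        if batch and size + len(doc) > max_batch_chars:
            yield batch
            batch, size = [], 0
        batch.append(doc)
        size += len(doc)
    if batch:
        yield batch

docs = ["a" * 3000, "b" * 3000, "c" * 3000, "d" * 500]
batches = list(batch_documents(docs))
print([len(b) for b in batches])  # → [2, 2]
```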