Seoul, South Korea – Google Cloud announced that NCSOFT, a global game developer and publisher, will be using Google Cloud’s AI infrastructure to power its in-house large language model (LLM) set, VARCO LLM.

Using VARCO LLM, powered by Google Cloud’s AI infrastructure, NCSOFT will develop a variety of AI-powered game services, including dynamically generated game content, enhanced player-engagement features, and customer service chatbots.

This latest collaboration builds on NCSOFT’s long-standing relationship with Google Cloud; the company has steadily expanded its global services and advanced its technology using Google Cloud solutions such as Vertex AI and BigQuery.

Previously, NCSOFT leveraged Google Cloud Tensor Processing Units (TPUs) to develop VARCO LLM and has been training its models on them for the past year. By utilising Google Cloud TPUs, NCSOFT was able to optimise the performance and cost of large-scale AI training workloads.

Speaking on the partnership, Lee Yeon-soo, head of the NLP Center at NCSOFT, said, “NCSOFT has been a leader in bringing AI technology to the gaming industry for over a decade. We are committed to using AI to create new and innovative experiences for our players. We look forward to continuing our partnership with Google Cloud to further explore the potential of generative AI in games.”

Meanwhile, Jack Buser, director for games, Google Cloud, commented, “Through this partnership, NCSOFT is able to bring new kinds of immersive experiences to players, underpinned by our powerful AI infrastructure that makes it easy to train and scale VARCO LLM.”

Australia – Global commercial software company SnapLogic has announced its partnership with Amazon Web Services (AWS) to expand the capabilities of SnapGPT, SnapLogic’s generative integration interface. 

As part of this collaboration, SnapLogic will make Anthropic’s large language model (LLM) Claude 2, offered through Amazon Bedrock, available within SnapGPT, which became generally available earlier this year.

Once integrated, SnapGPT will empower users to apply Claude 2 to build new integrations, define existing pipelines, create new expressions, and more using natural language prompts. This gives users access to Claude 2’s advanced natural language processing and generation abilities, which can accept prompts of up to 100,000 tokens, or roughly 75,000 words.
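SnapGPT’s internals are not public, so as a minimal sketch only: the snippet below shows how a service might build a Claude 2 request for Amazon Bedrock from a natural language integration instruction. The prompt text, parameters, and helper function are illustrative assumptions, not SnapLogic’s actual implementation.

```python
import json

# Claude 2 model identifier on Amazon Bedrock.
MODEL_ID = "anthropic.claude-v2"

def build_claude2_request(user_prompt: str, max_tokens: int = 1024) -> dict:
    """Build the JSON body Bedrock expects for Claude 2.

    Claude 2 uses the Human/Assistant prompt convention and accepts
    contexts of up to 100,000 tokens (~75,000 words).
    NOTE: hypothetical helper for illustration only.
    """
    return {
        "prompt": f"\n\nHuman: {user_prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
        "temperature": 0.2,
    }

# Example natural language instruction, as a SnapGPT-style co-pilot
# might receive from a user (hypothetical).
request_body = build_claude2_request(
    "Create a pipeline that reads rows from Salesforce and writes them to Snowflake."
)

# Actually sending the request requires AWS credentials and boto3, e.g.:
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(modelId=MODEL_ID,
#                                  body=json.dumps(request_body))
print(json.dumps(request_body, indent=2))
```

The Human/Assistant framing and the `max_tokens_to_sample` field follow the Claude 2 request format used by Bedrock; any production integration would add error handling, streaming, and prompt templates.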

SnapGPT also leverages Claude 2’s larger prompt size to offer a more powerful integration co-pilot that improves performance across a wider range of applications. This ease of use will allow businesses to roll out citizen integration, clear backlogs, and drive business agility.

Specifically, the key benefits of SnapLogic using Amazon Bedrock include allowing customers to:

- generate integration pipelines directly from natural language instructions, making integration more accessible;
- create complex data mapping configurations using natural language inputs;
- generate SQL queries from natural language queries;
- describe intricate data pipelines in plain language; and
- use a new chat-based Q&A feature.

Talking about the collaboration, Jeremiah Stone, CTO of SnapLogic, said, “SnapLogic is committed to providing our customers with the most advanced language models available, and our work with Amazon Bedrock and the integration of Anthropic’s model Claude 2 into SnapGPT is a testament to that commitment.”

Meanwhile, Rich Geraffo, vice president of North America at AWS, commented, “We are excited to work with SnapLogic to bring the power of Amazon Bedrock to SnapGPT users. This collaboration will enable organisations to leverage the capabilities of Amazon Bedrock and AWS’s generative AI expertise to drive innovation and enhance their data processing workflows.”

SnapLogic’s integration with Claude 2 is available to SnapGPT users immediately, with comprehensive documentation and support to ensure a seamless transition.