Kasmo

Enhancing POS Analytics Through Query Optimization: A Snowflake Intelligence Implementation Success Story


About the Client

The client is a software solutions provider serving the convenience retail ecosystem, with a strong focus on helping independently owned businesses grow and operate more effectively. They deliver a comprehensive POS and back-office platform built to support both single-store operators and complex multi-location networks, including franchises, brands, and distributors. The solution helps users access critical business functions from desktops or mobile devices. Key capabilities include inventory management, custom reporting, advanced transaction-level analytics, POS-integrated camera systems, and more. This helps businesses streamline operations, improve visibility, and support multi-location growth.

Business Challenges Faced by Client

The client faced significant query performance limitations in their existing PostgreSQL environment, which handled both high-volume transactional workloads and complex analytical queries. Transactional processing alone placed heavy demand on the system, and running analytics alongside it led to severe performance degradation: queries took several minutes to return results. This inability to meet performance expectations impacted reporting and data-driven decision-making. To address these challenges, the client decided to adopt Snowflake, a purpose-built analytics platform.

Kasmo’s Solution


Migration and Query Modernization on Snowflake

Our team migrated the client’s analytical sales data from PostgreSQL to Snowflake and converted the existing reporting queries into Snowflake SQL. This ensured functional parity while enabling the use of Snowflake’s native analytical capabilities. Once migrated, all queries were reviewed to establish a scalable foundation for ongoing query optimization.
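Conversions like this typically involve replacing PostgreSQL-specific constructs with Snowflake equivalents. As an illustrative sketch (the table and column names here are hypothetical, not taken from the client's schema), PostgreSQL's `DISTINCT ON` has no direct counterpart in Snowflake and is commonly rewritten with `QUALIFY` and a window function:

```sql
-- PostgreSQL: latest transaction per store (DISTINCT ON is Postgres-specific)
SELECT DISTINCT ON (store_id) store_id, txn_id, txn_ts
FROM pos_transactions
ORDER BY store_id, txn_ts DESC;

-- Snowflake equivalent: QUALIFY filters on a window function result
SELECT store_id, txn_id, txn_ts
FROM pos_transactions
QUALIFY ROW_NUMBER() OVER (PARTITION BY store_id ORDER BY txn_ts DESC) = 1;
```

Rewrites of this kind preserve the result set while letting the query run natively on Snowflake's engine.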

Performance Optimization Using Dynamic Tables

A detailed analysis of the converted queries revealed repeated use of complex and resource-intensive Common Table Expression (CTE) logic across multiple reports. To eliminate redundant computations, our team refactored shared CTEs into Snowflake Dynamic Tables. This allowed heavy joins and aggregations to be precomputed, reducing query execution overhead. Downstream reporting queries were simplified to read directly from the Dynamic Tables, improving both maintainability and performance.
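The pattern can be sketched as follows. A shared aggregation that previously appeared as a CTE in many reports becomes a single Dynamic Table that Snowflake keeps refreshed automatically; the names below (tables, columns, warehouse) are illustrative assumptions, not the client's actual objects:

```sql
-- Shared CTE logic (daily sales aggregates) precomputed once as a Dynamic Table;
-- TARGET_LAG controls how stale the data may get before an automatic refresh
CREATE OR REPLACE DYNAMIC TABLE daily_store_sales
  TARGET_LAG = '15 minutes'
  WAREHOUSE = analytics_wh
AS
SELECT
  store_id,
  txn_ts::DATE    AS sale_date,
  SUM(line_total) AS gross_sales,
  COUNT(DISTINCT txn_id) AS txn_count
FROM pos_transactions
GROUP BY store_id, txn_ts::DATE;

-- Downstream reports now read the precomputed result directly
SELECT store_id, sale_date, gross_sales
FROM daily_store_sales
WHERE sale_date >= DATEADD(day, -30, CURRENT_DATE());
```

Because the expensive joins and aggregations run once per refresh rather than once per report execution, every consuming query becomes both simpler and cheaper.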

Data Scan Reduction and Cost-Efficient Scaling

To further optimize performance, we applied clustering strategies to the Dynamic Tables based on frequently filtered columns. This reduced data scan volumes and improved query response times, all without increasing costs or the size of the Snowflake warehouse.
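In Snowflake this is done by defining a clustering key on the columns that most reports filter by, so the engine can prune micro-partitions instead of scanning the full table. A minimal sketch, reusing the hypothetical `daily_store_sales` object from above:

```sql
-- Cluster on the columns most queries filter by (date and store),
-- enabling micro-partition pruning on those predicates
ALTER DYNAMIC TABLE daily_store_sales
  CLUSTER BY (sale_date, store_id);

-- Inspect how well the data is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('daily_store_sales', '(sale_date, store_id)');
```

Choosing low-cardinality, frequently filtered columns for the clustering key is what keeps the pruning benefit high while keeping reclustering costs modest.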

Hybrid Architecture and Analytics

As requirements evolved, it was determined that certain inventory-related datasets would remain in PostgreSQL rather than being migrated to Snowflake. Kasmo adapted to this by redesigning Snowflake queries to deliver only core analytical outputs such as aggregates and counts. Additional contextual and inventory-related data is appended at the application layer using Node.js, allowing Snowflake to function strictly as an analytics engine.

Additionally, we implemented a Dynamic Table for sold products to support future reverse ETL scenarios. This table provides a structured, analytical view of sold products and enables seamless synchronization of sales insights back into PostgreSQL for downstream inventory and operational workflows.
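A Dynamic Table of this kind might look like the sketch below; the object and column names are assumptions for illustration. The key idea is that the table maintains a per-product sales summary that a reverse ETL job could later push back into PostgreSQL:

```sql
-- Analytical view of sold products, kept fresh as a Dynamic Table so sales
-- insights can be synced back to PostgreSQL for inventory workflows
CREATE OR REPLACE DYNAMIC TABLE sold_products
  TARGET_LAG = '1 hour'
  WAREHOUSE = analytics_wh
AS
SELECT
  product_id,
  store_id,
  SUM(quantity)   AS units_sold,
  SUM(line_total) AS revenue,
  MAX(txn_ts)     AS last_sold_at
FROM pos_transaction_lines
GROUP BY product_id, store_id;
```

Because the table is refreshed automatically, any downstream synchronization process only needs to read it on a schedule rather than recompute the aggregates itself.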

Snowflake Intelligence for Agentic Analytics

Kasmo implemented Snowflake Intelligence using Cortex Analyst to enable natural-language analytics on POS data, reducing reliance on manual querying. An agent layer was implemented on top of Cortex Analyst to understand business intent and accurately route analytical queries. This process included:

  • Designed semantic intelligence across 14 POS tables, covering multiple retail domains.
  • Optimized the intelligence scope to the Line and Events tables based on the client's performance needs.
  • Enabled analytics across retail domains, including sales and performance, margins, inventory, promotions, payments, cash control, and multi-store insights.
  • Translated business-friendly questions automatically into optimized Snowflake SQL, eliminating manual query optimization.
  • Returned query responses in business-readable terms such as store names, dates, and metrics rather than technical identifiers.
  • Implemented intelligent time-frame inference to handle queries where date ranges were not explicitly specified.
  • Delivered governed insights from Snowflake with a scalable foundation for future expansion.
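Cortex Analyst is driven by a semantic model defined in YAML, which maps business terms to physical tables, columns, and measures. The fragment below is a hedged sketch of what one entry for a POS table might look like; the database, schema, column names, and synonyms are illustrative assumptions, not the client's actual model:

```yaml
# Illustrative Cortex Analyst semantic model fragment (hypothetical names)
name: pos_analytics
tables:
  - name: line
    base_table:
      database: POS_DB
      schema: ANALYTICS
      table: LINE
    dimensions:
      - name: store_name
        expr: store_name
        data_type: varchar
        synonyms: ["location", "shop"]
    time_dimensions:
      - name: sale_date
        expr: txn_ts
        data_type: timestamp
    measures:
      - name: gross_sales
        expr: line_total
        data_type: number
        default_aggregation: sum
```

It is this mapping layer that lets questions phrased in business terms ("gross sales by location last month") resolve to correct, optimized SQL over the underlying tables.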

Key Results Achieved

The following results highlight the measurable improvements achieved through the optimized Snowflake-based analytics architecture.

  • Converted PostgreSQL-based reporting queries to Snowflake SQL to use its native analytical capabilities.
  • Optimized query performance by refactoring complex logic into Snowflake Dynamic Tables.
  • Reduced data scan volumes through effective clustering strategies.
  • Significantly improved query performance while continuing to use a Small (S) Snowflake warehouse.
  • Built a query architecture that is cleaner, modular, and easier to maintain.
  • Established a clear separation between analytics processing in Snowflake and data enrichment handled through Node.js.
  • Built a strong foundation to support future reverse ETL workflows and downstream data synchronization.
  • Achieved over 95% accuracy and consistency with Snowflake Intelligence, delivering business-ready analytics across POS data.


Interested in learning more? Talk to our experts.