AI/ML Solution Engineer

About the position

Snowflake is seeking an AI/ML specialist to join the Solutions Engineering team supporting the Retail & Consumer Goods sector. In this role you will provide AI/ML technical guidance, work with technical and business executives on the design and architecture of the Snowflake Cloud Data Platform, and partner with the sales team to understand customer AI/ML needs, drive winning sales cycles, deliver value-based demonstrations, support enterprise proofs of concept, and close business. The role emphasizes multi-cloud data architecture expertise and the ability to communicate Snowflake’s data platform capabilities across data ingestion, transformation, and lakehouse workloads.

Responsibilities

- Present Snowflake technology and vision to executives and technical contributors at prospects, customers, and partners.
- Work hands-on with prospects and customers to demonstrate and communicate Snowflake’s value throughout the sales cycle, from demo to proof of concept to design and implementation.
- Maintain a deep understanding of competitive and complementary technologies and position Snowflake appropriately.
- Collaborate with Product Management, Engineering, and Marketing to improve Snowflake’s products and marketing.
- Support sales with demos, POCs, and technical guidance to close deals.

Requirements

- 5+ years of data engineering experience in the enterprise data space; deep, hands-on Snowflake architecture experience strongly preferred.
- 3+ years working with AI/ML technologies.
- Outstanding presentation skills for both technical and executive audiences.
- Ability to connect customers’ business problems to Snowflake solutions.
- Ability to perform deep discovery of a customer’s existing architecture and map it to Snowflake data architecture.
- Broad experience with large-scale database and data warehouse technology, ETL, analytics, and cloud technologies (Data Lake, Data Mesh, Data Fabric).
- Hands-on development experience with SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies.
- Deep understanding of ETL/ELT data integration tools such as Apache NiFi, Matillion, Fivetran, Qlik, or Informatica.
- Familiarity with streaming technologies (Kafka, Flink, Spark Streaming, Kinesis) and real-time/near-real-time use cases such as change data capture (CDC).
- Experience designing interoperable data lakehouse architectures and working with Iceberg, Delta, and Parquet.
- Strong architectural expertise in data engineering.
- Bachelor’s degree in computer science, engineering, mathematics, or a related field, or equivalent experience.
- Adherence to confidentiality and security standards for handling sensitive data.
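As an illustration of the hands-on Python/Pandas skill set referenced above, a minimal ELT-style transformation might look like the sketch below. The data, column names, and aggregation are hypothetical examples chosen for this posting, not artifacts of the role itself:

```python
import pandas as pd

# Hypothetical retail order data (illustrative only).
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "region": ["NA", "EU", "NA", "EU"],
    "amount": [120.0, 80.0, 40.0, 60.0],
})

# A typical transformation step: total revenue per region.
revenue_by_region = (
    orders.groupby("region", as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "total_revenue"})
)
```

A prototype like this could later be expressed in Snowflake SQL or PySpark with modest changes, which is the kind of portability across tools the requirements describe.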

Nice-to-haves

- Experience working under commission plans and sales compensation structures in solution engineering roles.
- Prior experience with an enterprise sales motion and supporting sales teams to close deals.
- Familiarity with additional complementary technologies and vendors in the data and AI/ML ecosystem.