
Snowflake Data Architect

PMCS Services, Inc. · Austin, TX · Close: Jan 2nd 2025
Term: Full time · Work: Onsite
Type: Employee / Contract
We are looking for a full-time contractor or employee for a Snowflake Data Architect role.

  • Design the overall data structure, ensuring that Snowflake's features (e.g., data sharing, scalability, and secure data exchange) are fully utilized to meet the business requirements.
  • Create a blueprint for how data will be stored, processed, and accessed within the Snowflake platform.
  • Optimize data pipelines and workflows for performance, scalability, and cost-efficiency.
  • Design ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes, and optimize queries and data storage strategies.
  • Integrate with other cloud services (e.g., AWS, Azure, GCP), third-party tools, and on-premises data systems.
  • Design and implement strategies to control access to sensitive data, applying encryption, role-based access control, and data masking as necessary.
  • Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand their requirements and ensure the Snowflake environment meets those needs.
  • Monitor the performance of the Snowflake environment, identifying bottlenecks, and ensuring optimal query performance.
  • Automate administrative tasks using Snowflake SQL and scripting languages like Python or Shell scripting.
  • Implement data loading methods: bulk loading with COPY INTO, real-time ingestion with Snowpipe, and external tables (see the loading sketch after this list).
  • Clone databases and schemas using Snowflake's zero-copy cloning capabilities.
  • Configure and manage Snowflake virtual warehouses, including scaling, resizing, and auto-suspend/resume settings (see the warehouse and cloning sketch after this list).
  • Implement roles and privileges for managing secure access using Snowflake RBAC (Role-Based Access Control); see the access-control sketch after this list.
  • Integrate Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management.
  • Configure alerts and monitor data pipeline failures, resource spikes, and cost thresholds.
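
For illustration, a minimal Python sketch of the bulk and continuous loading patterns above, using the snowflake-connector-python package to run Snowflake SQL. The connection parameters, the @raw_stage stage, and the claims table are placeholders, and AUTO_INGEST additionally assumes cloud event notifications are configured for the stage location.

import snowflake.connector

# All connection parameters are placeholders.
conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    role="SYSADMIN",
    warehouse="LOAD_WH",
    database="RAW",
    schema="PUBLIC",
)
cur = conn.cursor()

# One-time bulk load of files already uploaded to the (hypothetical) stage.
cur.execute("""
    COPY INTO claims
    FROM @raw_stage/claims/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Continuous ingestion: the pipe re-runs the same COPY whenever new files
# land (AUTO_INGEST requires cloud event notifications on the stage).
cur.execute("""
    CREATE PIPE IF NOT EXISTS claims_pipe
      AUTO_INGEST = TRUE
      AS COPY INTO claims
         FROM @raw_stage/claims/
         FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

cur.close()
conn.close()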
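
In the same spirit, a sketch of the warehouse configuration and cloning duties; the warehouse and database names are hypothetical, and the sizing values are examples rather than recommendations.

import snowflake.connector

# Placeholder credentials; a role with warehouse and database privileges is assumed.
conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    role="SYSADMIN",
)
cur = conn.cursor()

# Warehouse that suspends after 60 s of inactivity and resumes on the next query.
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS transform_wh
      WITH WAREHOUSE_SIZE = 'XSMALL'
           AUTO_SUSPEND = 60
           AUTO_RESUME = TRUE
           INITIALLY_SUSPENDED = TRUE
""")

# Resize up for a heavy batch window, then scale back down to control cost.
cur.execute("ALTER WAREHOUSE transform_wh SET WAREHOUSE_SIZE = 'LARGE'")
cur.execute("ALTER WAREHOUSE transform_wh SET WAREHOUSE_SIZE = 'XSMALL'")

# Zero-copy clone of a production database for a dev/test environment.
cur.execute("CREATE DATABASE IF NOT EXISTS dev_analytics CLONE prod_analytics")

cur.close()
conn.close()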
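
Finally, a sketch of the access-control and masking duties; the role, user, database, schema, table, and column names are all placeholders, a role with the necessary grant and policy privileges is assumed, and masking policies require Snowflake Enterprise Edition or higher.

import snowflake.connector

# Placeholder credentials; ACCOUNTADMIN stands in for a governance role here.
conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    role="ACCOUNTADMIN",
)
cur = conn.cursor()

# Read-only role scoped to one reporting schema.
cur.execute("CREATE ROLE IF NOT EXISTS analyst")
cur.execute("GRANT USAGE ON DATABASE prod_analytics TO ROLE analyst")
cur.execute("GRANT USAGE ON SCHEMA prod_analytics.reporting TO ROLE analyst")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA prod_analytics.reporting TO ROLE analyst")
cur.execute("GRANT ROLE analyst TO USER jane_doe")

# Column-level masking: only a privileged role sees the raw value.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS prod_analytics.reporting.mask_ssn
      AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
           ELSE '***MASKED***'
      END
""")
cur.execute("""
    ALTER TABLE prod_analytics.reporting.members
      MODIFY COLUMN ssn SET MASKING POLICY prod_analytics.reporting.mask_ssn
""")

cur.close()
conn.close()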

Required Skills:
  • Experience with data modeling, data integration, data warehousing, data governance, and data security
  • Experience with Oracle and/or PostgreSQL in high-availability (HA) deployments, and expertise in data storage
  • Proficiency in Snowflake architecture and its components.
  • Hands-on experience with Snowflake objects such as Databases, Procedures, Tasks, and Streams.
  • Expertise in using Snowflake’s cloning capabilities for databases and schemas.
  • Proven experience in managing Snowflake Warehouses and optimizing performance for efficient query execution.
  • Proficiency in Snowflake RBAC (Role-Based Access Control), including implementation of roles and privileges.
  • Experience with integrating Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management.
  • Experience working with data integration tools like Informatica and Azure Data Factory (ADF) for seamless ETL/ELT processes.
  • Ability to automate administrative tasks using Snowflake SQL and scripting languages like Python or Shell scripting.
  • Expertise in monitoring and troubleshooting Snowflake environments, including usage tracking and query profiling (see the usage-tracking sketch after this list).
  • Strong understanding of Snowflake’s security features such as data masking, encryption, and network policies.
  • Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), the Microsoft Office suite (Word, Excel, and PowerPoint), and MS Project.
  • Experience on an agile sprint team
  • Experience with JIRA software
  • Experience working with multiple teams concurrently, being able to prioritize and complete work on time with high quality
  • Knowledge of Informatica 10.5
  • Experience developing reports in Cognos Analytics 11.1
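
As an illustration of the usage-tracking and query-profiling items above, a small Python sketch that reads recent query and credit history from the SNOWFLAKE.ACCOUNT_USAGE share; the credentials are placeholders, the role must have access to that share, and ACCOUNT_USAGE views can lag real time by an hour or more.

import snowflake.connector

# Placeholder credentials; ACCOUNT_USAGE access is assumed for the role.
conn = snowflake.connector.connect(
    account="YOUR_ACCOUNT",
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    role="ACCOUNTADMIN",
    warehouse="ADMIN_WH",
)
cur = conn.cursor()

# Slowest queries in the last 24 hours (candidates for query profiling).
cur.execute("""
    SELECT query_id, user_name, warehouse_name,
           total_elapsed_time / 1000 AS elapsed_s
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD(hour, -24, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 20
""")
for query_id, user_name, warehouse_name, elapsed_s in cur.fetchall():
    print(f"{elapsed_s:8.1f}s  {warehouse_name or '-':<15} {user_name:<20} {query_id}")

# Credits consumed per warehouse over the last 7 days (cost-threshold checks).
cur.execute("""
    SELECT warehouse_name, SUM(credits_used) AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits DESC
""")
for warehouse_name, credits in cur.fetchall():
    print(f"{credits:10.2f} credits  {warehouse_name}")

cur.close()
conn.close()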

Preferred Skills:
  • Familiarity with CI/CD pipelines and version control for managing Snowflake code deployments.
  • Prior experience in the Healthcare Industry
  • Prior experience with an HHS agency
  • Prior experience working with PII or PHI data
  • Prior experience working with HL7 data
  • Prior experience with Azure
  • Bachelor's degree in Computer Science, Information Systems, or Business, or equivalent experience.

Updated on January 2, 2025