Data Engineer


Job Description:

We are seeking a talented Data Engineer to join our team and play a pivotal role in transforming raw data into actionable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and ETL processes, ensuring seamless data flow from various sources to our data warehouse. Your expertise in AWS technologies and BI tools will be instrumental in delivering high-quality data solutions that drive informed decision-making.

Required Skills:

  • Data Extraction, Transformation, and Loading (ETL): Develop and implement efficient ETL pipelines to extract data from diverse legacy systems, transform it to meet business requirements, and load it into our data warehouse or data lake.
  • Data Security: Develop security procedures and build systems to keep data secure.
  • Data Warehousing and Data Lake: Design and build scalable data warehouses and data lakes to store and manage large volumes of structured and unstructured data.
  • Data Quality: Ensure data quality by implementing data validation and cleansing processes, identifying and resolving data inconsistencies.
  • Data Modeling: Create and maintain data models that accurately represent the business domain, ensuring data integrity and consistency.
  • AWS Technologies: Leverage AWS services like AWS Glue, S3, Redshift, EMR, and Lambda to build and manage data pipelines effectively.
  • BI Tools: Integrate data with BI tools (e.g., Tableau, Power BI) to enable data-driven insights and reporting.
  • Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand their data requirements and deliver solutions that meet their needs.

Preferred Skills:

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field.
  • 6+ years of experience in data engineering or a similar role.
  • Experience building ETL pipelines from multiple database systems using SQL, Spark, Python, Java, or C# in Unix environments.
  • Strong proficiency in SQL and Python or other scripting languages.
  • In-depth knowledge of ETL processes and tools.
  • Hands-on experience with AWS technologies, including S3, Redshift, EMR, and Lambda.
  • Familiarity with data warehousing and data lake concepts.
  • Experience with RDBMSs such as Oracle 12c, MS SQL Server, and DB2.
  • Experience with Agile development.
  • Experience working with legacy systems and data from various sources.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Experience with data governance and data quality frameworks.
  • Knowledge of cloud-native data platforms and technologies.
  • Certifications in AWS or other relevant technologies.

Note: Candidates must have resided in the USA for at least 3 years to be eligible for this position, due to background check requirements.
