Software Engineer

KFI

Permanent

Job description

Data Engineer Overview:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and optimizing data pipelines, ensuring efficient data processing, and enabling seamless data integration across various platforms. You will work closely with data scientists, analysts, and stakeholders to support data-driven decision-making within the organization.

Key Responsibilities:

  • Data Pipeline Development: Design, build, and maintain scalable and efficient ETL (Extract, Transform, Load) processes and data pipelines to ingest and process large datasets from multiple sources.
  • Database Management: Develop and optimize databases (SQL, NoSQL) to store and manage structured and unstructured data, ensuring high availability, reliability, and performance.
  • Data Integration: Implement and maintain data integration solutions to enable seamless data flow between various systems, applications, and platforms.
  • Data Quality: Ensure the accuracy, consistency, and quality of data by implementing data validation, cleansing, and transformation processes.
  • Performance Tuning: Monitor and optimize the performance of data pipelines and databases to handle large-scale data processing efficiently.
  • Data Warehousing: Develop and maintain data warehouses, data lakes, and other data storage solutions to support analytics and reporting needs.
  • Collaboration: Work closely with data scientists, analysts, and software engineers to understand data requirements, provide technical solutions, and deliver actionable insights.
  • Documentation: Document data architecture, data models, and ETL processes to ensure clarity and maintainability.
  • Security and Compliance: Ensure data security, privacy, and compliance with relevant regulations (e.g., GDPR, HIPAA) by implementing appropriate data governance practices.

Job requirements

Qualifications:

  • Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology, or a related field.
  • Experience:
    • Proven experience as a Data Engineer or in a similar role, with expertise in data pipeline development, data warehousing, and database management.
    • Hands-on experience with SQL and NoSQL databases such as MySQL, PostgreSQL, and MongoDB.
    • Proficiency in data processing tools and frameworks (e.g., Apache Spark, Hadoop, Apache Kafka).
    • Experience with cloud platforms (e.g., AWS, Google Cloud, Azure) and data services (e.g., BigQuery, Redshift, Snowflake).
  • Programming: Strong programming skills in languages such as Python or R for data manipulation and automation.
  • Tools: Familiarity with ETL tools (e.g., Apache NiFi, Airflow) and data visualization tools (e.g., Metabase, Tableau, Power BI).
  • Data Modeling: Solid understanding of data modeling concepts, including relational and dimensional modeling.
  • Soft Skills: Strong problem-solving skills, attention to detail, and the ability to work in a collaborative team environment.
  • Preferred:
    • Experience with machine learning pipelines and MLOps.
    • Knowledge of data governance and data security best practices.

Benefits

  • Personal laptop
  • Medical benefits
  • Other internal HC self-development programs

Job information

Education

No Qualification

Experience level

-

Minimum experience

-

Gender

No Qualification

Age range

17 years - 65 years

Published date

22 Aug 2024

Powered by

Mekari Talenta